Data Center Knowledge | News and analysis for the data center industry - Industr's Journal
Wednesday, January 21st, 2015
| Time | Event |
| 1:00p |
Digital Realty to Offer Tenants Free Renewable Energy Credits This is shaping up to be a big week for renewable energy in data centers. Following Tuesday’s announcement of a big wind power purchase agreement by Amazon Web Services, Digital Realty said it will give every customer that signs a new lease in any of its facilities around the world Renewable Energy Credits to apply to the energy their data center space consumes for the first year of their lease.
Few companies that own and operate their data centers choose to pay the premiums on their energy rates to make their IT operations carbon neutral. Even fewer companies that lease their data center space from the likes of Digital Realty do it, making the San Francisco-based real estate investment trust’s announcement stand out.
The move indicates that there is an increased interest among potential data center customers in renewable energy. Data center providers have very little business incentive to pay extra for renewable energy, since being able to offer the lowest energy rates to their customers is one of the main ways they compete. Digital Realty’s willingness to foot the bill for the tenant’s first year indicates that the company has seen enough demand in the market to make it worth the extra operating cost.
Clean Energy Makes You Look Good
Good publicity is another obvious benefit for the provider. Last year, Greenpeace started including Digital Realty and its peers, such as DuPont Fabros and Equinix, on its annual scorecard for environmentally friendly energy practices by data center operators. Until last year’s report, only the big Internet companies, cloud service providers, and hardware vendors were included. Digital Realty received Ds in every category: for energy transparency, renewable energy commitment and site selection policy, energy efficiency and mitigation, and renewable energy deployment and advocacy.
None of the five data center providers listed did exceptionally well, but Digital Realty’s score was the lowest on the widely publicized report. It is likely to do much better on this year’s report, now that it has the new REC program.
Giving Clients an ‘Easy Button’
Aaron Binkley, the company’s director of sustainability, said every customer that signed a new lease would receive RECs automatically. Digital Realty will procure them on the client’s behalf at no additional cost to the client. After their first year ends, they can choose to start paying for the credits or opt out. The company will source RECs near the customer’s data center location wherever feasible.
The company’s leadership hopes the program will demonstrate to customers that the renewable energy premiums are not as high, and that the procurement process is not as complicated “as the market has been led to believe,” Binkley said.
Perceived high costs and complexity have been major deterrents for data center customers, many of whom say, “We’d love it if there was an easy button,” he said. So Digital Realty will provide that easy button. Energy procurement at scale is a big part of its operations, and the company has developed a lot of expertise and connections in the space. Once the initial year expires, all the legwork necessary to get and apply the credits to the energy the customer consumes will already have been done for them, lowering the barrier to entry. “The longer-term objective here is to really spur the growth of this within our client base,” Binkley said.
More to Come
Asked whether he and his colleagues have considered doing away with the one-year limit and just providing renewable energy to tenants indefinitely, he said it had been a discussion point, but “we wanted to walk before we run.” Right now, the objective is to show that it’s not so expensive and difficult. “This is a first step in what we think will be a number of steps.” | | 3:00p |
Rackspace to Offer Managed Private VMware vCloud Rackspace has added dedicated VMware vCloud to its portfolio of managed services. The managed cloud service essentially offloads day-to-day tasks like backing up VMs and applying patches to the operating system from IT staff’s plate, so they can “focus on higher-value projects,” Arrian Mehis, general manager of VMware Practice at Rackspace, wrote in a blog post Wednesday.
“As we saw throughout 2014, IT budgets and headcount remained flat; however, businesses still expected central IT to help drive growth and innovation,” Mehis wrote.
In a market where cloud infrastructure capacity is a commodity that’s rapidly shrinking in cost, Rackspace has chosen to differentiate by providing a wide variety of highly hands-on managed cloud services. Last November, the company announced managed Microsoft private clouds. Earlier last year, Rackspace expanded its managed services portfolio extensively and introduced pay-as-you-go billing for managed services.
According to Wednesday’s announcement, Rackspace will provide a hosted, single-tenant vCloud environment, including automation, self-service capabilities, hosted catalogs, access to the vCloud API and vCloud web portal, plus the company’s 100 percent network uptime and one-hour hardware replacement guarantees. The hardware hosting each private cloud will be completely isolated, from firewall to storage.
The vCloud API (application programming interface) will enable a customer to integrate third-party orchestration tools and policy-based governance.
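As an illustration of what that integration looks like from an orchestration tool’s side, here is a minimal sketch of building the login request for vCloud Director’s REST API, which authenticates a session with HTTP Basic credentials in the `user@org` form. The API version string and all names here are assumptions for illustration, not details from the announcement:

```python
import base64

# Assumed API version header; actual values vary by vCloud release.
VCLOUD_ACCEPT = "application/*+xml;version=5.5"

def vcloud_session_headers(user: str, org: str, password: str) -> dict:
    """Build the headers a third-party orchestration tool would send to
    POST https://<host>/api/sessions to open a vCloud API session."""
    creds = base64.b64encode(f"{user}@{org}:{password}".encode()).decode()
    return {"Authorization": f"Basic {creds}", "Accept": VCLOUD_ACCEPT}
```

The server’s response would carry a session token (in an `x-vcloud-authorization` header) that subsequent API calls replay in place of the Basic credentials.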
Customers with existing vCloud environments hosted in-house will be able to extend them into Rackspace’s data centers.
Rackspace has previously provided managed VMware vCenter services. | | 4:30p |
File and Object Based Storage to Reign Supreme in Big Data Future Steve Wojtowecz is Vice President of Storage Software Development for a suite of solutions offered by IBM Cloud & Smarter Infrastructure.
Snapchat. Vine. Instagram. Tumblr. While these names might have meant little to most of us just a few years ago, these social networks are growing in adoption and popularity at an exponential pace. As a consequence, they are now some of our richest sources of consumer preferences, trends and styles across numerous demographics and markets.
Their common thread: they are all driven largely by the creation and consumption of videos, photos, audio files and other rich media, otherwise known as unstructured data.
The Rise of Unstructured Data
Unstructured data represents an entirely new set of obstacles to businesses’ already mounting big data challenges. As its name implies, it’s unorganized, large and unpredictable – making it almost impossible to collect, store and analyze without advanced technologies such as cloud computing and analytics.
Simultaneously, file and object based storage (FOBS) is skyrocketing in criticality as unstructured and enormous structured files continue to explode onto the big data scene. Analyst firm IDC predicts the worldwide FOBS market will continue to gain momentum and reach $38 billion by 2017, signaling the future direction of data storage technology as organizations craft strategies to cope with big data.
While traditional storage, such as block, has been around for a while, it will not be able to keep up. As cloud brings a seemingly infinite amount of computing power and flexibility, and analytics technology crunches data faster and smarter, organizations want a simple way to store the new, data-laden workloads these technologies create, as well as the ability to use rich and user-defined metadata – an area in which block falls short.
Taming the Beast: Managing the Data
FOBS helps to solve these unstructured, big data problems in a few ways:
- Uncorking analytics bottlenecks: Performance slowdowns in crunching unstructured data can be a costly challenge, and block storage does little to speed analytics along. Object storage, however, can store data at relatively low cost and without bound – while letting users define objects as they like, a rare attribute in storage. That flexibility opens up the opportunity to apply advanced analytics far more effectively than block storage allows.
- Object storage is cloud’s best friend: A cloud-based workload inherently generates object data, such as scripts and images. Object storage is therefore a necessity when dealing with data extracted from the cloud, allowing it to be managed, protected, and analyzed.
- Harvesting big data when it’s ripe: File-based storage also presents useful tools for dealing with videos, photos and other multimedia-rich data. Both file and object storage work well in conjunction with cloud and analytics technologies, and allow organizations to quickly and effectively capture and store data at the peak of its value: immediately after it’s created, while it’s still relevant to consumers. This enables businesses to act on emerging trends and market shifts in near-real time – a capability whose importance is hard to overstate in today’s fickle markets, where opinions can turn in an instant.
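The user-defined metadata point is the key architectural difference. A toy in-memory sketch (all names hypothetical, not any vendor’s actual API) shows how an object store pairs each blob with caller-chosen attributes that queries and analytics can use – something block storage’s fixed-size sectors cannot express:

```python
class ObjectStore:
    """Toy in-memory object store, for illustration only.

    Each object pairs opaque data with arbitrary user-defined metadata,
    the property the article contrasts with block storage."""

    def __init__(self):
        self._objects = {}

    def put(self, key, data, **metadata):
        """Store a blob under a key, with any metadata the caller chooses."""
        self._objects[key] = {"data": data, "meta": metadata}

    def get(self, key):
        """Return the raw data for a key."""
        return self._objects[key]["data"]

    def find(self, **criteria):
        """Return keys whose metadata matches all given criteria,
        e.g. every clip tagged with a particular camera."""
        return [key for key, obj in self._objects.items()
                if all(obj["meta"].get(k) == v for k, v in criteria.items())]
```

For example, `store.put("clip-001", b"...", content_type="video/mp4", camera="lobby")` followed by `store.find(camera="lobby")` retrieves the clip by an attribute the user invented – no schema or filesystem hierarchy required.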
Together, file and object based storage give organizations a better understanding of what their growing mountains of unstructured data need, and how to manage the new workloads without an entirely new infrastructure. Using FOBS eliminates the need for costly add-ons, enabling organizations to expand their knowledge of and insights into their customers.
With the need to scale to very high capacity and store files numbering in the billions, FOBS can better equip organizations to manage data-heavy workloads through cloud and analytics, whether they come from mobile, social or other emerging networks. And giving businesses faster access to and insights from this data means happier customer interactions, intelligence-driven marketing, and better business.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library. | | 5:49p |
Equinix Buys Cloud Professional Services Firm Nimbo Equinix, a Redwood City, California-based global colocation and interconnection giant, has acquired Nimbo, a professional services company that helps enterprises make the jump from legacy IT to modern infrastructure.
Nimbo’s expertise includes things like moving legacy business applications from being hosted strictly in the customer’s own data centers to a more agile hybrid infrastructure that combines their data centers with cloud services. The New York-based company is a certified Amazon Web Services and Microsoft Azure partner.
The acquisition price has not been disclosed.
For Equinix, which since the start of the year has been operating as a real estate investment trust, the deal is a step in a broader initiative to build out its professional services capabilities. The company wants to be able to support customer data center migrations, help them set up modern, optimized WANs, and build hybrid infrastructure for their applications.
Cloud services and their rising popularity with users have been a major growth driver for Equinix. The company’s role isn’t that of a cloud provider, however. Its data centers act as points where customers can easily connect to a multitude of providers.
Equinix has traditionally placed focus on enabling interconnection between service providers and end users in its facilities. That, combined with a massive global footprint and a large number of customers, has made its data centers very effective hubs for accessing cloud services.
Recently, Equinix has been investing in technological capabilities to enable this exchange of services at a more sophisticated level via a platform called the Equinix Cloud Exchange. The platform enables on-demand secure connections to participating cloud providers, including Azure, AWS, Google Cloud Platform, and IBM SoftLayer, among others. Users can connect to as many service providers as they need through virtual circuits on a single physical port.
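The “many virtual circuits on a single physical port” idea can be sketched as a simple allocation model – this is an illustrative toy, not Equinix’s actual system, and the tag scheme and class names are assumptions: each circuit gets its own tag (VLAN-style) and a slice of the port’s bandwidth, so one cross-connect can reach several cloud providers at once.

```python
class CloudExchangePort:
    """Toy model of one physical port carrying multiple virtual circuits,
    each tagged and pointed at a different cloud provider."""

    def __init__(self, capacity_mbps: int):
        self.capacity_mbps = capacity_mbps
        self.circuits = {}      # provider -> (tag, bandwidth_mbps)
        self._next_tag = 100    # hypothetical VLAN-style tag allocator

    def add_circuit(self, provider: str, bandwidth_mbps: int) -> int:
        """Provision a virtual circuit to a provider, enforcing that the
        sum of circuit bandwidths never exceeds the port's capacity."""
        used = sum(bw for _, bw in self.circuits.values())
        if used + bandwidth_mbps > self.capacity_mbps:
            raise ValueError("port capacity exceeded")
        tag = self._next_tag
        self._next_tag += 1
        self.circuits[provider] = (tag, bandwidth_mbps)
        return tag
```

A 1 Gbps port could then carry, say, a 500 Mbps circuit to AWS and a 300 Mbps circuit to Azure simultaneously, each isolated by its tag.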
With Nimbo’s cloud professional services capabilities, Equinix will be able not only to provide the platform to connect to cloud providers but also to help customers figure out which providers and what kind of architecture are best for them and their applications.
Pete Hayes, chief sales officer at Equinix, said Nimbo had numerous Fortune 100 companies on its customer roster. “With our acquisition of Nimbo, we have expanded our ability to help customers leverage our unique cloud-density and Cloud Exchange value propositions and assist with the design and deployment of hybrid cloud solutions,” he said in a statement. | | 8:06p |
Report: Google Data Center Spend Spurs Major Fund to Cut Stake in Giant Data center construction and operations are some of the largest operating costs Google has. As the company has grown, so has the amount of money it has spent on its infrastructure.
The most recent increases in Google data center costs have led the largest mutual fund investor in its stock to reduce its stake in the Internet giant. Fidelity Contrafund, a $110 billion giant, has sold off some of its roughly $7 billion worth of Google stock, Reuters reported citing the fund’s recent update for investors.
“We trimmed the position based on our view that the stock’s short-term performance could remain choppy as investors digest the company’s increased investment in data centers, as well as other initiatives not directly tied to its core search business,” Contrafund’s statement read.
Contrafund holds an “overweight” position in Google, but the Mountain View, California-based giant’s choppy stock performance over the course of 2014 has hurt it. Google’s share price today is about 10 percent lower than it was last April, according to Bloomberg.
Google’s Data Center Billions
Fifteen years ago, Google ran slightly more than 100 servers; today it spends about $5 billion per quarter on building and operating its data center infrastructure. The company’s data center costs have grown continuously, and its leadership expects them to continue on the same trajectory.
News of the most recent Google data center investment came in December, when a Taiwanese government body announced the company would spend $66 million to expand its data center in the country.
In the third quarter of 2014, for example, Google’s cost of revenue increased by about $1.29 billion, more than half of which was an increase in data center costs, according to its SEC filing.
Its total cost of revenue for the quarter was about $6.69 billion. About $3.35 billion of that was traffic acquisition costs. The rest went to data center operation expenses, hardware costs, and other expenses, such as transaction processing and content acquisition costs.
Those are operating expenses. They do not include the money Google spends on buying real estate, building data centers, and buying the IT gear to put inside them.
Contrafund is a Key Tech Investor
Contrafund is managed by well-known portfolio manager Will Danoff, who specializes in technology stocks.
It has not disclosed how much it has reduced its stake in Google, but said the stock remained one of its biggest positions and an overweight holding. The fund’s other top holdings include Apple, Facebook, Microsoft, and Disney. | | 8:30p |
Ravello Raises $28M for Cloud Mobility Tech Ravello Systems, a Silicon Valley startup with technology that gives applications cloud mobility, announced it has closed a $28 million funding round, led by Qualcomm and SanDisk. This brings the total funds raised to date to $54 million.
Founded by the team that created the KVM hypervisor, Ravello offers what it calls a nested virtualization-powered cloud service, enabling businesses to re-create their data centers in the public cloud. The company also announced a major release of its HVX nested hypervisor, which it says wraps complex application environments in self-contained capsules that can run on any cloud.
The almost-two-year-old company launched its beta product in 2013, alongside an initial $26 million funding round from Sequoia Capital, Norwest Venture Partners, and Bessemer Venture Partners, all of whom also participated in this round.
As an enabler of cloud mobility, the company says it works to bridge the divide between public cloud and enterprise VMware-virtualized data centers. Ravello plans significant expansion of its marketing and sales operations around the world with the new funding.
The promise of encapsulating an entire application stack is familiar from the container movement and the success of Docker. Ravello takes the idea a step further, saying it can run VMware workloads, Android emulators, and entire OpenStack labs on Amazon Web Services or Google Cloud Platform.
“The ability to use leading public clouds seamlessly is increasingly becoming a need for enterprises,” Mony Hassid, senior director, Qualcomm Ventures, said in a statement. “There is huge complexity in re-creating enterprise application environments in other clouds, and Ravello’s approach breaks down those technical barriers with incredible speed and simplicity.”
Ravello says the new release of HVX features support for virtualization extensions, such as Intel VT and AMD SVM, and running third-party hypervisors, such as KVM on top of AWS or Google’s cloud. New networking functionality in the HVX release gives full support for VLANs and mirror ports on top of AWS and Google cloud. Finally, Ravello has done a complete refresh of the user interface and included a unified repository of all compute, storage, and networking resources. | | 9:00p |
Physical Breach Highlights Security Holes at Canada’s Electronic Surveillance Agency 
This article originally appeared at The WHIR
The new headquarters of Canada’s electronic surveillance agency had an “extreme vulnerability” which was inadvertently breached by firefighters responding to an emergency call, the Toronto Star reports. The Canadian Communications Security Establishment (CSE) revealed the vulnerability by sending uncensored documents in response to an access to information request by the Star about the fire.
The sensitive information contained in the documents was highlighted, but not censored, compounding one security breakdown with another.
During the construction of the $800 million CAD (about $660 million USD) building for the CSE, a routine call in response to a small fire led local firefighters to a different entrance than the one at which they were expected. Finding no one there, they cut a padlock to access the building.
The documents also reveal vulnerabilities such as inoperative security cameras and a long-missing visitor pass. At least some of those vulnerabilities have since been addressed, and the agency told the Star that the construction access point used in the incident no longer exists, now that the building is complete and occupied.
The documents also included the identities of several CSE employees, which are usually kept secret, along with contact information.
ICANN revealed in December that a network security breach started with a successful spear phishing email, as low-tech exploitation of human decision-making continues to be a major factor in information security.
“Careless and untrained insiders” were blamed for 42 percent of breaches at US federal agencies in a 2014 survey by SolarWinds.
This article originally appeared at: http://www.thewhir.com/web-hosting-news/physical-breach-highlights-security-holes-canadas-electronic-surveillance-agency | | 11:54p |
Oracle Intros Next-Gen Line of its Big Data Center Appliances Updated with comment by Cisco
Oracle announced five new appliances for a variety of applications Wednesday. The company’s “engineered systems” are hardware-software bundles optimized to work together.
The company announced a new version of its virtual compute appliance, a converged infrastructure package that competes with the likes of vBlock, a system by VCE that combines Cisco UCS servers and Nexus network switches with EMC storage. VCE started as a joint venture between Cisco and EMC, but EMC took full control of the company last October.
The new Oracle Virtual Compute Appliance X5 can be paired with the company’s FS1 flash storage system for a complete converged infrastructure system.
In its announcement, Oracle said the system was 50 percent cheaper and easier to deploy than the Cisco-EMC solution. Cisco responded with an emailed statement by Paul Perez, vice president and general manager of Cisco’s UCS business, touting Cisco’s position in the converged infrastructure market but not addressing the price difference mentioned by Oracle.
“We feel pretty good about our hand in the converged infrastructure market,” Perez said. “This is a market Cisco created with EMC back in 2009 with a joint venture that became one of the most successful in IT history. Since then we’ve been positioned as the number-one vendor for integrated infrastructure systems for multiple years, and Cisco UCS has also become the system of choice for integrated infrastructure offerings from NetApp, Red Hat, HDS, and most recently IBM. Oracle has a lot of catching up to do.”
The other additions to the Oracle appliance portfolio are two database appliances, a big data appliance, and a data protection appliance.
“We’re going to compete for that core data center business,” Larry Ellison, the company’s chairman and CTO, said in a statement. “Our customers want their data centers to be as simple and as automated as possible.”
Ellison, the company’s founder and its sole CEO until September 2014, has yielded the chief executive role to Safra Catz, Oracle’s former CFO, and Mark Hurd, its former president.
The sixth-generation Oracle Exadata Database Machine X5 has 50 percent faster processors and 50 percent more memory than its predecessor. It includes an all-flash storage server using PCIe flash drives, Non-Volatile Memory Express flash protocol, and scale-out capabilities via InfiniBand.
Oracle also announced a new Database Appliance X5, meant for distributed and branch office deployments. The new version of the appliance has flash caching, more compute cores, more storage, and integrated InfiniBand connectivity.
Oracle is pitching its new Big Data Appliance X5 as an alternative to custom-built compute clusters for Hadoop and NoSQL. It comes with twice the RAM and 2.25 times more CPU cores than its predecessor. Customers can use the appliance to run Oracle Big Data SQL, which gives users the ability to run SQL queries against data stored in Hadoop and NoSQL databases.
Finally, the vendor’s new Zero Data Loss Recovery Appliance X5 is a data protection solution integrated with Oracle Database. It has faster processors, more capacity per rack, faster recovery, higher throughput, and better database backup consolidation than the previous-generation version of the product. |