Data Center Knowledge | News and analysis for the data center industry
Thursday, July 3rd, 2014
12:30p |
QTS Acquires McGraw Hill’s New Jersey Data Center for $75M
QTS Realty Trust has bought a data center in East Windsor, New Jersey, for $75 million from McGraw Hill Financial.
The financial services firm will continue to occupy the data center as a tenant and customer of QTS services. It signed a 10-year lease with QTS, which includes a 15-year extension option.
This type of deal is commonly referred to as a “sale-leaseback,” where a facility’s owner sells it but continues to use it as a tenant, essentially transferring responsibility for data center management to a specialist.
In this case, however, McGraw Hill is effectively outsourcing data center services to two companies. As part of the announcement, QTS said it had partnered with French IT outsourcing firm Atos, and the two companies would provide services to the tenant together.
QTS will provide the data center and data center management services, while Atos will provide IT outsourcing. Technically, QTS will be providing its services to Atos, which will combine them with its own services in a comprehensive package it will deliver to McGraw Hill.
QTS indicated that the partnership with Atos will extend beyond the New Jersey data center.
The data center itself is a fairly large facility that came with a low price tag. The building’s total size is 560,000 square feet, and the data center floor measures about 58,000 square feet.
Its total current power capacity is 12 megawatts, which QTS said could be expanded to 20 MW. The building can also accommodate an expansion of the raised floor area to 100,000 square feet.
The facility also comes with a massive (50-acre) “solar field,” whose power generation capacity is 14.1 MW, according to QTS.
QTS CEO Chad Williams said the deal was consistent with the company’s strategy to grow its business with enterprise customers by buying large facilities in strategic markets.
“We focus on infrastructure rich facilities, acquired at a low basis, with significant capacity to continue to support our growth in a low-risk, cost-efficient model,” he said.

1:00p |
VMware Adds New Jersey Data Center to Support Cloud Services
Building out its “second DNA” as a service provider, VMware has added a data center in Jersey City, New Jersey, to the fleet of locations supporting the VMware vCloud Hybrid Service.
Located in close proximity to New York and New England markets, this is the sixth data center supporting the service to date, and the company plans to add more in the near future.
VMware virtualization and tools form the basis of many enterprise clouds. The company saw an opportunity to provide its wares and expertise as a service and began its push to become a cloud provider. Its focus is on hybrid deployments – not necessarily the mixing of on-premises and cloud environments, but giving customers the choice and flexibility to deploy how they want, whether on-premises or in the cloud.
Expanding to all key markets
Initial infrastructure for vCloud Hybrid Service was based in Las Vegas at the Switch SuperNAP facility. Everything ran out of Las Vegas until the service reached general availability in mid-2013.
The company then added data center space in Virginia and California to cover the coasts, as well as Dallas to cover the middle of the country. The first international expansion was in Slough, UK, in February. The New Jersey data center fills the void in the New York metro area, one of the top data center markets in the country.
Non-disruptive cloud on-ramp
The service, built on VMware vSphere, enables customers to extend the same applications, networking, management, operations and tools across both on-premise and off-premise environments. Customers can manage and automate vCloud Hybrid Service from their vSphere console, vCloud Automation Center, vCloud Application Director and their own tools using the vCloud API.
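For example, automation against the vCloud API starts with a session request. Here is a minimal sketch in Python, assuming vCloud API version 5.1 conventions; the hostname and credentials are placeholders:

```python
# Minimal sketch: authenticating to the vCloud API and listing organizations.
# Host, org and credentials are placeholders; version 5.1 headers assumed.
import requests

VCLOUD_HOST = "vchs.example.com"          # hypothetical endpoint
ACCEPT = "application/*+xml;version=5.1"  # vCloud API version header

# Log in: vCloud uses HTTP Basic auth in the form user@org against /api/sessions.
resp = requests.post(
    f"https://{VCLOUD_HOST}/api/sessions",
    auth=("admin@MyOrg", "password"),     # placeholder credentials
    headers={"Accept": ACCEPT},
)
resp.raise_for_status()

# The session token comes back in the x-vcloud-authorization response header
# and must accompany every subsequent request.
token = resp.headers["x-vcloud-authorization"]

# List the organizations visible to this session (XML response).
orgs = requests.get(
    f"https://{VCLOUD_HOST}/api/org/",
    headers={"Accept": ACCEPT, "x-vcloud-authorization": token},
)
print(orgs.text)
```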
“Our hybrid approach helps enterprise customers use the public cloud with an infrastructure that matches their existing architectures and data location, giving IT all the freedom of the public cloud with the manageability and security they expect from their existing data center or private cloud,” wrote the vCloud team on the company’s official blog. “For IT departments, a hybrid cloud can remove traditional barriers to innovation and radically change the relationship between IT and the business.”
VMware also recently began beta testing the next version of vSphere, its virtualization platform, which takes a more open approach than the company has in the past, when it kept its proprietary software close to its chest.
Amazon recently released a management portal for vCenter, which some pundits saw as a way for Amazon to poach some of VMware’s business.
VMware was a runaway success in the early virtualization wars, but it now steps into a new arena of voracious cloud providers. The new data center shows its commitment to continued investment in the vCloud Hybrid Service, and it is hitting all the major markets first.
Angelos Kottas, director of product marketing, detailed the plans for VMware’s evolution last April. “Our second DNA is we will become a cloud service provider. We won’t get out of selling package software, but our first and primary route will become as-a-service delivery.”

1:30p |
Interxion Launches Private Links to Azure and AWS Across Europe
European carrier-neutral colocation provider Interxion is now offering private connections to both the Amazon Web Services and Microsoft Azure clouds. Interxion’s new private connections will span all 37 of its data centers across 11 countries.
The service enables customers in multi-tenant data centers to connect directly to public clouds, bypassing the public Internet. It offers greater security, higher throughput, better reliability and lower latency between customer data centers and public cloud services.
Connections are provided through AWS Direct Connect partners and Microsoft Azure ExpressRoute partners. Network partners for AWS and Azure include Level 3, Verizon, AT&T, IX Reach and BT, all of which have points of presence in Interxion’s data centers.
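On the AWS side, for instance, a Direct Connect port is requested through the API, and the physical cross-connect is then completed at the facility by a partner. A minimal sketch using the boto3 SDK; the facility code and connection name are placeholders:

```python
# Minimal sketch: requesting an AWS Direct Connect connection via boto3.
# The location code and connection name are placeholders; in practice a
# colocation/network partner completes the physical cross-connect.
import boto3

dx = boto3.client("directconnect", region_name="eu-west-1")

# List Direct Connect locations available in this region.
for loc in dx.describe_locations()["locations"]:
    print(loc["locationCode"], loc["locationName"])

# Request a 1 Gbps port at a chosen facility (hypothetical location code).
conn = dx.create_connection(
    location="EqLD5",              # placeholder facility code
    bandwidth="1Gbps",
    connectionName="colo-to-aws-private-link",
)
print(conn["connectionId"], conn["connectionState"])
```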
Interxion’s cloud-neutral strategy
Interxion has maintained a “cloud-neutral” stance, indicating that it has no intention of moving up the stack and offering cloud services itself, instead focusing on connecting customers to whatever clouds they want.
By not offering cloud services itself, the company avoids stepping on the toes of customers who provide cloud services out of its data centers. Interxion has noted high growth in colocation demand from European cloud providers, which has required it to accelerate expansion in Amsterdam and Frankfurt.
Local cloud providers have done well in Europe, as customers in each country want local clouds that conform to national data laws. Offering private connections to the leading public clouds extends Interxion’s cloud connectivity services, raising its appeal among enterprises looking to use hybrid infrastructures.
“Our data centers act as a true Cloud Hub, enabling interconnection between cloud services providers, their customers and channel partners, making it attractive to customers building and managing reliable and high performance hybrid IT solutions,” said Vincent in’t Veld, director of cloud segment at Interxion. “Thanks to our rich carrier community, our customers are provided with a choice to order services from. For example, in London, Frankfurt, Paris and Amsterdam we have at least six AWS Direct Connect and all four network partners for Microsoft Azure ExpressRoute.”
Competitor Equinix recently rolled out Azure ExpressRoute globally, including in Europe. Retail colocation providers in general are recognizing the benefits of offering public cloud through private connections, treating cloud as not a threat to colocation, but as a complementary offering.
Interxion plans on enabling similar solutions with other cloud providers in the future. The company has also partnered with OnApp and Dell for CloudPOD, a quick way for customers to stand up clouds in its data centers.

5:06p |
Oracle’s Latest VM Release Aims to Unify App Management Across x86 and SPARC
Oracle VM Release 3.3 looks to deliver enterprise-scale performance enhancements for x86 and SPARC architectures and to expand support for both Oracle and non-Oracle workloads.
The new release of the Oracle server virtualization platform features deep integration with Oracle’s own application-driven architecture and with Microsoft Windows, but it also works with Red Hat and SUSE Linux and supports OpenStack. In May, Oracle introduced a technology preview of an OpenStack distribution that allows Oracle Linux and Oracle VM users to work with the open source cloud software.
Oracle VM Server for x86 leverages Oracle’s Unbreakable Enterprise Kernel Release 3, allowing customers to use the same technology that powers Oracle Linux and Oracle engineered systems. To enhance network and disk I/O throughput in Microsoft Windows guest OS environments, Oracle developed Oracle VM PV (paravirtualization) drivers for Windows.
Oracle lists several VM 3.3 enhancements aimed at enterprise-scale performance and more flexible automation of virtualization management, including:
- A new HTML5 virtual machine console – eliminating the need for a Java virtual machine to be installed.
- VM Manager comes bundled with MySQL Enterprise Edition for automated database backup and integrated tools for database object consistency checking.
- VM Manager provides a fully supported web services API that includes both SOAP and REST interfaces to enable more automation and interoperability (a usage sketch follows this list).
- Tighter security controls built into VM Manager, including a reduction in the number of required open ports and a certificate-based authentication model.
- Improved SPARC virtualization management and availability, with support for Fibre Channel, iSCSI, ZFS volume and local disk.
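For a sense of what automation against the REST interface might look like, here is a minimal sketch, assuming the documented Oracle VM Manager 3.3 base path and JSON responses; the host, port, credentials and response field names are placeholders to verify against a live manager.

```python
# Minimal sketch: listing VMs through the Oracle VM Manager 3.3 REST API.
# Host, port and credentials are placeholders; the /ovm/core/wsapi/rest base
# path follows Oracle's documentation, and response field names should be
# verified against a live manager.
import requests

BASE = "https://ovm-manager.example.com:7002/ovm/core/wsapi/rest"

session = requests.Session()
session.auth = ("admin", "password")                # placeholder credentials
session.headers.update({"Accept": "application/json"})
session.verify = False                              # self-signed certs are common

# Enumerate the virtual machines known to this manager instance.
for vm in session.get(BASE + "/Vm").json():
    print(vm["id"]["value"], vm.get("name"), vm.get("vmRunState"))
```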
“Oracle VM 3.3 continues to refine ease-of-use capabilities, provides more flexibility in network design and delivers a new, intuitive VM console,” said Wim Coekaerts, senior vice president, Linux and Virtualization Engineering, Oracle. “This new release will allow customers to more easily deploy, manage, and maintain enterprise-scale applications across both x86 and SPARC environments in a unified way.”

7:25p |
Equinix to Deploy Ciena Gear to Enable On-Demand Network Provisioning Globally
Networking vendor Ciena has sold its switching platform for core metro networks to Equinix, the world’s largest data center colocation provider, which will use it to enable on-demand network provisioning for a variety of connectivity services it offers to its customers.
The products include the Ciena 6500 platform, which provides 100Gbps capability today and a path to 400Gbps in the future, as well as the vendor’s V-WAN network resource broker and scheduling application. The systems provide multi-layer Software-Defined Networking (SDN) control for rapid creation and provisioning of high-speed connections.
Ciena’s coherent optical technology and software will support Equinix’s Cloud Exchange, which automates interconnection between cloud, network and managed service providers and enables service orchestration. The technology gives Equinix customers the ability to create private 1Gbps or 10Gbps connections to transfer data between their own infrastructure and multiple cloud providers.
Building on its leadership as a multi-tenant data center provider, Equinix is leveraging its global facilities (it has presence in 32 markets around the world) and interconnections to lure more enterprise customers, with direct access to multiple cloud services. Equinix has developed key partnerships with Amazon Web Services, Microsoft Azure, AT&T and others to capture the enterprise opportunities.
From video services for sporting events to automated cloud interconnections, there is growing demand for sophisticated networking platforms to speed on-demand global network services.
“Many Equinix customers are looking for the ability to expand their networks globally and they require high-speed connections within a metro, or globally, between our IBX data centers,” Equinix CTO Ihab Tarazi said. “Through our relationship with Ciena, we can offer customers a dynamic and automated interface to the hundreds of network and cloud service providers inside our data centers and meet their needs for next-generation connectivity solutions.”

7:42p |
Report: Retail Colo Growth in Europe Outpaces Wholesale Data Center Growth
European colocation market growth is being driven by retail carrier-neutral services, according to a recent report by Synergy Research Group.
While wholesale growth has typically been higher than retail growth, the research shows that wholesale growth rates have been dropping. Carrier-neutral retail saw a huge bump in the latest quarter, reaching almost 14 percent year-on-year growth, the highest rate the region has seen in five quarters.
In total, the European colocation market grew 11 percent in the quarter.
The research not only indicates healthy growth for retail colocation services in Europe, it also shows that telecoms without carrier neutrality are struggling to keep pace. Retail bandwidth provider growth rates are well below average.
The UK remains the largest market, accounting for 27 percent of Q1 revenue, followed by Germany, France and the Netherlands. Together, the top four countries account for 65 percent of regional revenue.
Equinix leads the carrier-neutral retail segment, closely followed by European-centric providers TelecityGroup and Interxion. Those three companies make up the bulk of total revenue, followed by Telehouse and Colt, both of which are “a long way behind the top three,” according to the analysts.
“Despite the presence of all the major telcos in the European colocation market, the carrier-neutral segment is far larger than the bandwidth provider segment,” Synergy analyst John Dinsdale said. “Clearly customers see benefits in having a choice of carriers and consequently the relatively open European market has enabled the growth of pan-European colocation specialists.”

7:52p |
Critical Facilities Summit
The Critical Facilities Summit will be held September 29 – October 1 at the Charlotte Convention Center in Charlotte, North Carolina.
Critical Facilities Summit is an educational conference and exhibition for facilities professionals, consulting engineers and design/build firms responsible for the design, construction and ongoing operation of data centers and other mission critical facilities. The Summit is an exclusive gathering of senior-level professionals working in data centers, labs, hospitals, financial firms, e-tailers and other mission critical facilities.
Critical Facilities Summit provides a deep dive into the infrastructure and operational aspects of mission critical facility management. Get more information and register today!
To view additional events, return to the Data Center Knowledge Events Calendar.

8:06p |
Verizon Launches CDN Geared Specifically for E-Commerce
Since acquiring Content Delivery Network (CDN) provider EdgeCast Networks, Verizon Digital Media Services has undertaken a major expansion of the CDN, adding presence in São Paulo and more than 20 other cities around the world. The company has also introduced an integrated solution targeted specifically at e-commerce customers, with EdgeCast at the core.
At the time of acquisition, EdgeCast had more than 6,000 accounts and served some of the largest web brands for global media delivery and acceleration services. Under the Verizon Digital Media Services umbrella, more investment was pumped into the quickly growing CDN.
The performance of an e-commerce website ties directly to revenue. Slow browsing and check-out can often lead to shopping cart abandonment and lost dollars. There are also credit card data security issues to consider.
Customers browse retail websites on a variety of devices, meaning a misconfigured page might look awful on a particular mobile device. Verizon’s new e-commerce solution enables retailers to cost-effectively accelerate transactions with faster website loading, efficient video streaming, accurate content targeting based on shoppers’ geography and redirection of mobile users to mobile versions of websites.
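As an illustration of that last piece, edge platforms typically match the User-Agent header against device signatures and answer with a redirect before a request ever reaches the origin. A minimal sketch of that logic; the hostnames and keyword list are illustrative, not Verizon’s actual rules:

```python
# Minimal sketch: User-Agent-based mobile redirect logic of the kind an edge
# platform applies before a request reaches origin. The hostnames and the
# keyword list are illustrative placeholders.
MOBILE_HINTS = ("iphone", "android", "ipad", "mobile")

def edge_redirect(host: str, path: str, user_agent: str) -> str | None:
    """Return a redirect URL for mobile clients, or None to pass through."""
    ua = user_agent.lower()
    if host == "www.example-shop.com" and any(h in ua for h in MOBILE_HINTS):
        return f"https://m.example-shop.com{path}"
    return None

# Example: an iPhone shopper hitting the desktop site gets sent to m.*
print(edge_redirect("www.example-shop.com", "/cart",
                    "Mozilla/5.0 (iPhone; CPU iPhone OS 7_1 like Mac OS X)"))
```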
Several vendors solve these problems individually, but the market for such security and performance services is, for the most part, fragmented.
The package heavily leverages the EdgeCast CDN Verizon acquired last December and takes a page out of the EdgeCast playbook.
EdgeCast launched Transact, an e-commerce-specific CDN, just a short time before it was acquired. Verizon’s new service looks a lot like Transact with additional bells and whistles.
It includes VDMS products delivering acceleration, security, edge intelligence and device detection. At the core of the solution is Verizon Transact, a global PCI-certified acceleration network dedicated to e-commerce customers and partitioned from the global Internet for extra security. The pieces of the solution include:
- Transact: a PCI-certified global e-commerce delivery platform
- Protect: advanced content protection
- Compute: logic and intelligence at the edge
- Analyze: 360-degree visibility into content performance and user experience
- Develop: custom logic at the edge
- Support: 24/7 year-round customer support
- Store: replicating origin content across a geographically distributed footprint
- Route: DNS- and Anycast-based routing and traffic management (see the sketch below)
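To illustrate the routing piece, DNS-based traffic management answers each lookup with the point of presence considered best for the querying resolver. A toy sketch of the idea; the PoP table, coordinates and flat-distance metric are placeholders, and real CDNs combine geo databases, Anycast and live health and latency data:

```python
# Toy sketch of DNS-based traffic routing: answer each query with the PoP
# nearest the client's resolver. PoP table and distance metric are
# placeholders, not Verizon's actual topology.
import math

# Hypothetical PoP table: name -> ((lat, lon), virtual IP)
POPS = {
    "Amsterdam": ((52.37, 4.90), "203.0.113.10"),
    "Sao Paulo": ((-23.55, -46.63), "203.0.113.20"),
    "Jakarta": ((-6.21, 106.85), "203.0.113.30"),
}

def answer(client_lat: float, client_lon: float) -> str:
    """Return the virtual IP of the PoP nearest the querying resolver."""
    def distance(name: str) -> float:
        (lat, lon), _ = POPS[name]
        return math.hypot(lat - client_lat, lon - client_lon)
    nearest = min(POPS, key=distance)
    return POPS[nearest][1]

# A resolver in Warsaw (~52.2N, 21.0E) gets the Amsterdam PoP's address.
print(answer(52.2, 21.0))
```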
New cities with CDN network points of presence include Warsaw, Stockholm, Milan, Vienna, Melbourne, Helsinki, Kaohsiung, Batam, Jakarta and São Paulo. The company also expanded its presence with additional PoPs in many cities already served, including London, Madrid, Paris and Amsterdam.

8:30p |
Google’s MapReduce Divorce Does Not Mean End of Hadoop is Near
At the Google I/O conference in San Francisco last week, Urs Hölzle, senior vice president of technical infrastructure at Google, said MapReduce was no longer sufficient for the scale of data analytics the company needed and that the company had stopped using it years ago, having replaced it with a much more capable system its engineers had cooked up.
The programming model for distributing massive amounts of data across clusters of commodity servers and processing it in parallel came out of Google, which published a paper describing the system in 2004 and received a patent for MapReduce in 2010.
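The canonical illustration of the model is word count: a map function turns each piece of input into (key, value) pairs, and a reduce function aggregates all values sharing a key. Below is a minimal single-process sketch of the two phases; on a real cluster the framework runs map tasks in parallel over input splits and shuffles pairs to reduce tasks by key.

```python
# Minimal single-process sketch of the MapReduce programming model, using the
# canonical word-count example. On a real cluster, map tasks run in parallel
# across input splits and the framework shuffles pairs to reducers by key.
from collections import defaultdict

def map_phase(document: str):
    """Map: emit a (word, 1) pair for every word in the input split."""
    for word in document.split():
        yield word.lower(), 1

def reduce_phase(pairs):
    """Shuffle + reduce: group pairs by key, then sum each group's values."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["the quick brown fox", "the lazy dog", "the fox"]
pairs = (pair for doc in docs for pair in map_phase(doc))
print(reduce_phase(pairs))  # {'the': 3, 'quick': 1, 'fox': 2, ...}
```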
The model did, however, serve as the basis for Apache Hadoop, the open source implementation of MapReduce. As companies try to get value out of all the data they and their customers generate and store, Hadoop has become very popular, and a number of business models (and businesses) have sprung up to offer Hadoop distributions and services around it to enterprises.
Hadoop MapReduce linkage broken
Last week, however, Google said MapReduce was no longer cutting it. Once datasets the company was running queries across reached multi-petabyte scale, the system became too cumbersome, Hölzle said. The announcement naturally raised the question of whether it meant the beginning of the end of the Hadoop ecosystem.
While it is further proof that MapReduce is losing some of the steam it once had (it is a 10-year-old technology after all), the Hadoop ecosystem has grown into something much larger than MapReduce, and it is way too early to declare that the sun is setting on this ecosystem.
As John Fanelli, vice president of marketing at DataTorrent, notes, MapReduce and Hadoop have not been inseparable since the release of Hadoop 2, which is really more like an operating system that can run different kinds of data processing workloads, MapReduce being only one of them.
The second generation of Hadoop introduced YARN (Yet Another Resource Negotiator), which breaks the linkage between MapReduce and the Hadoop Distributed File System and makes it possible to apply other processing models to data stored on Hadoop clusters.
Batch processing demand on the decline
Arguably the biggest advantage of Hadoop 2 and YARN is real-time data processing, also referred to as stream processing. MapReduce is designed for batch processing, while users increasingly need stream processing.
Google’s replacement for MapReduce, its Cloud Dataflow system, combines batch and stream processing. Developers and customers (the company is offering it as a service on its cloud platform) can create pipelines using a unified programming model that includes both batch and streaming stages.
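Dataflow’s programming model was later open-sourced as Apache Beam, and a minimal Beam pipeline illustrates the unified idea (the 2014 service itself exposed a Java SDK; the file names here are placeholders): the same chain of transforms runs over a bounded source as a batch job, and swapping in an unbounded source such as a message stream turns the same logic into a streaming job.

```python
# Minimal sketch of the unified batch/stream model using Apache Beam, the
# open-source descendant of Cloud Dataflow's programming model. The same
# transform chain below works over a bounded (batch) source; an unbounded
# source would make it a streaming job. File names are placeholders.
import apache_beam as beam

with beam.Pipeline() as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("input.txt")   # bounded source
        | "Words" >> beam.FlatMap(lambda line: line.split())
        | "Count" >> beam.combiners.Count.PerElement()
        | "Format" >> beam.Map(lambda kv: f"{kv[0]}: {kv[1]}")
        | "Write" >> beam.io.WriteToText("counts")
    )
```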
Fanelli doubts Google itself thinks Cloud Dataflow means the end of Hadoop. “I don’t think Google views it as a Hadoop killer,” he says. “It’s an alternative. It actually continues to validate what we’re seeing from customers. They want real-time streaming analytics of their data.”
End of “one-size-fits-all data management”
Perhaps not coincidentally, MapR, one of the leading enterprise Hadoop distribution vendors, announced a $110 million funding round led by Google Ventures less than one week after Hölzle’s keynote at I/O. MapR offers its distro as a service that can be deployed on Google Compute Engine (the giant’s Infrastructure-as-a-Service offering). DataTorrent (Fanelli’s employer) has a Hadoop-based stream processing product also offered on top of Compute Engine.
Yes, Cloud Dataflow is now competing with DataTorrent, but it only adds to the variety of available offerings, each with its own advantages and disadvantages. You can only use Dataflow in the Google cloud, for example, while DataTorrent’s solution can be deployed in different public clouds as well as in a user’s own data center.
As Paul Brown, chief architect at Paradigm4, a company with a sophisticated structured-data analytics platform, puts it, if anything is coming to an end it is the era of “one-size-fits-all data management.” Instead of being the de facto Big Data platform, Hadoop will become one option within a group of platforms companies will choose from, depending on the specifics of their applications, he says.