Data Center Knowledge | News and analysis for the data center industry
Thursday, November 5th, 2015
1:00p
Why CenturyLink Doesn’t Want to Own Data Centers

CenturyLink’s colocation business, whose seeds were sown four years ago with the $2.5 billion acquisition of Savvis, is not doing well. Colocation revenue is not growing, and the telecommunications giant is looking for ways to avoid investing more capital in the segment.
While it plans to continue offering colocation services, Monroe, Louisiana-based CenturyLink is looking for “alternatives” to owning its nearly 60 data centers around the world that support colocation, managed hosting, and cloud services. The company’s leadership is under pressure to cut costs, and getting out of data center ownership is one way to do it.
To be clear, CenturyLink does not actually own most of the data center facilities in its portfolio. Aside from a handful of buildings – mostly ones that were part of its merger with Qwest in 2011 – the company leases them from wholesale data center providers, so when its CEO Glen Post announced the plans on the company’s earnings call Wednesday, he was probably talking primarily about infrastructure inside the facilities.
In most wholesale data center leases, tenants build out at least some of their own infrastructure, such as power and cooling equipment or racks and cabinets. Details on the plans are scarce, as the process has just started, and company executives are tight-lipped about it at the moment.
Colo Revenue Shrinks
A look at the financial results CenturyLink reported for the third quarter does offer some hints about the reasons behind the about-face. Just last year, Drew Leonard, VP of global colocation at the company, told us colocation remained core to the strategy, even as the company was aggressively expanding and promoting its cloud services.
Considered as a whole, revenue of the business unit that includes colocation, managed hosting, and cloud shrank in the third quarter, from $331 million in Q3 of last year to $324 million this year. Managed hosting and cloud revenue actually increased, from $145 million to $152 million, while colocation alone fell to $151 million from the $164 million reported for the same period last year.
Another part of the segment, hosting area network, contributed $1 million to the overall decline, but it’s clear that colocation is primarily responsible for dragging down the business segment’s overall results.
The alternatives to data center ownership CenturyLink is considering concern the colocation business alone, Post said on the call. The company is not willing to invest the necessary capital to continue expanding this business, choosing instead to focus on “investments that can drive higher returns, basically,” he said.
Plus, it’s not crucial to own data centers to be able to offer the full gamut of data center services. Possible alternatives to full ownership are continuing to own some of the assets, while selling others; selling all of them and leasing them back; entering into joint ventures where outside investors buy partial stakes in the assets. Retaining ownership of all the assets is also still on the table.
To put things in perspective, while not insubstantial, the $600 million colocation business represents a small portion of CenturyLink’s overall revenue mix. The company’s total Q3 revenue was $4.5 billion – nearly flat year over year.
Should Telcos Get Out of Data Center Business?
CenturyLink hiring an advisor to help examine the alternatives to data center ownership is a concession that the analyst who said last month that US telcos should sell their hosting businesses was at least partially right. Gregory Williams, an analyst at the investment banking firm Cowen and Company, issued a note saying CenturyLink’s hosting business, including colocation, was a “disappointment,” and that other big US telcos, such as AT&T and Verizon, also weren’t doing so well in this market.
“Since acquiring Savvis [CenturyLink’s] hosting product segment could be seen as a disappointment, as the segment experienced elevated [customer] churn, a $1.1 billion asset write-down in 2013 and grew just 4.5 percent in 2014,” he wrote.
Both Verizon and AT&T are also looking to sell many of their assets, including data centers, according to news reports. In October, Arkansas-based telco Windstream announced an agreement to sell its data center business to TierPoint, a data center roll-up focused on regional markets.
Commenting on the Windstream deal last month, Philbert Shih, managing director at Structure Research, a market research firm focused on the colocation space, warned that it was too early to assume that telcos getting out of the data center services business was going to continue as a trend.
While more such deals are likely to happen, they have to be looked at on an individual-company basis, he said. “It’s maybe a bit early to tell,” Shih said. “We probably will not see too many of them.”

5:34p
Microsoft and Red Hat Sign Unlikely Deal to Support Enterprise Hybrid Cloud
This article originally appeared at The WHIR
Microsoft and Red Hat announced a partnership that is a first for both companies on Wednesday, driven by customer demand and the rise of hybrid cloud. One of the key components of the agreement is Microsoft naming Red Hat Enterprise Linux as the preferred choice for enterprise Linux workloads on Microsoft Azure.
Also as part of the agreement, the companies will work together to provide cross-platform, cross-company support spanning Microsoft and Red Hat’s offerings, initially co-locating support teams in Redmond. The partnership also includes interoperability between Red Hat CloudForms, Microsoft System Center Virtual Machine Manager, and Microsoft Azure.
Developers will gain access to .NET technologies across Red Hat offerings, “giving developers the ability to build applications and include .NET services,” Paul Cormier, Red Hat executive vice president and president, Products and Technologies, said in a briefing. He called the partnership a “powerful win for the enterprise customer.”
“I think everyone knows that there is no doubt now that Linux is a key part of enterprise computing today,” Cormier said.
With “cloud at the center of Microsoft’s strategy going forward,” the company sees its capabilities around hybrid cloud as a differentiator in the market, said Scott Guthrie, EVP of the cloud and enterprise group at Microsoft.
Throughout the webcast, both Guthrie and Cormier emphasized the choice and flexibility that hybrid cloud provides. Hybrid cloud adoption is set to triple over the next few years, according to a recent report.
In a Red Hat blog post, Cormier said: “Both Red Hat and Microsoft are key players in this new, hybrid cloud reality. Today, it is incredibly likely that where you once found ‘Red Hat shops’ and ‘Microsoft shops,’ you’ll find heterogeneous environments that include solutions from both companies.”
“We heard from customers and partners that they wanted our solutions to work together – with consistent APIs, frameworks, management, and platforms. They not only wanted Red Hat offerings on Microsoft Azure, they wanted to be able to build .NET applications on infrastructure powered by Red Hat Enterprise Linux, including OpenShift, Red Hat Enterprise Linux Atomic Host, and Red Hat Enterprise Linux OpenStack Platform.”
“This partnership is a much more comprehensive partnership than we have with any other public cloud providers,” Cormier said in the webcast. “We never would have thought about this [partnership] 14 years ago when we started RHEL [Red Hat Enterprise Linux], but our customers wanted this. We realized we had a common goal to satisfy customers.”
In the coming weeks, Microsoft Azure will become a Red Hat Certified Cloud and Service Provider, which enables customers to run RHEL applications and workloads on Azure. In the coming months, Microsoft and Red Hat plan to provide pay-as-you-go RHEL images in the Azure Marketplace, supported by Red Hat.
Red Hat CloudForms customers will be able to manage RHEL on both Hyper-V and Microsoft Azure, with support for managing Azure workloads from Red Hat CloudForms to be added in the next few months.
Developers will have access to Red Hat offerings, including Red Hat OpenShift and RHEL, jointly backed by Microsoft and Red Hat. According to the announcement, RHEL will be the primary development and reference operating system for .NET Core on Linux.
This first ran at http://www.thewhir.com/web-hosting-news/microsoft-and-red-hat-sign-unlikely-deal-to-support-enterprise-hybrid-cloud

6:00p
How to Move Beyond Data Center Monitoring and Alerts

Per Bauer is the Director of Global Services at TeamQuest.
For businesses looking to get out in front of problems before they appear, smart analytics can take your system monitoring to a whole new level.
Today, proper capacity planning is crucial to an IT department’s ability to keep running smoothly. In addition to making both IT and business objectives achievable, it can help sustain your IT infrastructure when systemic problems crop up.
For those critical moments when you have to react, it’s imperative that you have a sound Data Center Infrastructure Management (DCIM) strategy that allows you to act quickly to prevent system downtime, which can have a deeply negative impact on your reputation and revenue stream. But how do you shift your approach from reactive to proactive, lessening the chance of costly errors and crashes before they have a chance to crop up? The solution requires a fundamental change in mindset and the right capacity planning tools.
Understanding the Building Blocks of Monitoring
If you want to improve your monitoring practices, it’s best to start with the basics. IT infrastructures are complex systems that require performance data measurements at a granular level, so you should expect a great deal from your performance monitoring platform. It should be data agnostic and able to collect information regardless of the source, allow high-frequency polling, and offer enough data retention to meet your requirements. In this age of large-scale data processing, your monitoring platform should also be able to manage the breadth of your data with room to spare. The last thing you want is to be forced to choose between monitoring one area over another; your platform should keep up with all of your data without breaking a sweat.
Finally, figure out what normal performance looks like in your environment; your management system should do this automatically as it collects data. This will provide a baseline for comparison as you move forward, which will help you react quickly to future changes in capacity requirements and keep your system optimized.
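To illustrate what such a baseline comparison might look like, here is a minimal Python sketch. The function names, the sample data, and the three-standard-deviation threshold are illustrative assumptions for this example, not TeamQuest’s method; real platforms compute baselines per metric and per time-of-day interval, but the principle is the same:

```python
import statistics

def build_baseline(samples):
    """Summarize historical utilization samples (e.g. hourly CPU readings
    collected by the monitoring platform) as a mean and standard deviation."""
    return statistics.mean(samples), statistics.pstdev(samples)

def is_anomalous(value, baseline, tolerance=3.0):
    """Flag readings more than `tolerance` standard deviations from normal."""
    mean, stdev = baseline
    if stdev == 0:
        return value != mean
    return abs(value - mean) / stdev > tolerance

# Hourly CPU utilization (%) over a quiet period.
history = [41, 44, 39, 45, 43, 40, 42, 44, 41, 43]
baseline = build_baseline(history)

print(is_anomalous(42, baseline))  # within the normal range
print(is_anomalous(90, baseline))  # far outside the baseline
```

Once normal behavior is captured this way, a shift in the baseline itself over time is exactly the kind of capacity-requirement change the article describes reacting to.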
Proactive Transitions
Once you master the fundamentals, opportunities to stay ahead of potential problems will begin to appear. Powerful capacity planning tools offer automated and insightful reports that can predict errors, help you manage your resources more effectively, and reduce the overall cost of your operation.
However, any proactive capacity planner will tell you that volume isn’t the only important consideration. In addition to determining your IT system’s normal activity levels, it’s important that you understand where the activity is coming from. Numbers are key, but without proper context, they’re largely meaningless. The ability to monitor data, extract meaningful information, and then apply it in a business-minded way is the essence of smart capacity planning.
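To make that concrete, one simple way to turn raw monitoring numbers into a forward-looking capacity signal is to fit a trend line to weekly peak utilization and project when it will cross the capacity ceiling. The sketch below is a hypothetical illustration of that idea (the function, data, and linear-growth assumption are this example’s, not a TeamQuest algorithm):

```python
def weeks_until_exhausted(utilization, capacity=100.0):
    """Fit a least-squares trend line to weekly peak utilization (%)
    and project how many weeks remain before it crosses `capacity`.
    Returns None when there is no growth trend to project."""
    n = len(utilization)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(utilization) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, utilization))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    if slope <= 0:
        return None
    # Weeks from the most recent sample until the trend hits the ceiling.
    return (capacity - intercept) / slope - (n - 1)

# Weekly peak storage utilization (%) over the last six weeks.
peaks = [52, 55, 59, 61, 66, 70]
print(round(weeks_until_exhausted(peaks)))  # projected weeks of headroom
```

The number alone is the “volume” part; deciding whether that growth comes from a seasonal campaign or a structural shift in demand is the business context the article argues for.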
A Better Future
Companies should be looking into better capacity management information systems and standardized optimization reports, which can help struggling IT departments improve their data measurement techniques and manage their resources more effectively. Comprehensive, real-time dashboards make it easy to react to issues quickly, before they have time to snowball and get out of hand.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

6:30p
Google, IBM, Microsoft Partner on Linux Foundation Open API Initiative
This post originally appeared at The Var Guy
The Linux Foundation, Google, IBM, Microsoft and other partners are promising easier compatibility between apps and platforms through the new Open API Initiative, which will extend the Swagger API framework.
Swagger is an open source platform that helps programmers create APIs for Internet-enabled applications. Those APIs let different applications share information and resources with one another.
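For illustration, a minimal Swagger 2.0 document describing a single REST endpoint might look like the fragment below. The API name and path are invented for this example; the point is that the machine-readable metadata (paths, operations, responses) is what lets tools and applications discover and consume the API:

```json
{
  "swagger": "2.0",
  "info": { "title": "Inventory API", "version": "1.0.0" },
  "paths": {
    "/servers": {
      "get": {
        "summary": "List servers",
        "responses": {
          "200": { "description": "An array of server records" }
        }
      }
    }
  }
}
```

It is this specification format, rather than the Swagger tooling itself, that the Open API Initiative set out to standardize.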
Swagger has proven popular among developers of open apps since it was launched in 2010. But the new Open API Initiative was founded to extend Swagger’s functionality further and provide more community collaboration in its development.
“Swagger is considered one of the most popular frameworks for building APIs. When an open source project reaches this level of maturity, it just can’t be managed by one company, organization or developer,” said Jim Zemlin, executive director at The Linux Foundation. “The Open API Initiative will extend this technology to advance connected application development through open standards.”
The Open API Initiative will focus on providing “a vendor neutral, portable and open specification for providing metadata for RESTful APIs” and produce a standard that developers can adopt to create open APIs, the Linux Foundation said in announcing the launch.
Founding members of the Open API Initiative, which is a Linux Foundation collaborative project, include 3Scale, Apigee, Capital One, Google, IBM, Intuit, Microsoft, PayPal and Restlet.
This first ran at http://thevarguy.com/open-source-application-software-companies/110515/embargo-nov-5-8-am-est-google-ibm-microsoft-partner-linux

7:06p
Amazon to Launch Cloud Data Center in Korea

Amazon’s cloud services unit is preparing to launch a cloud region in South Korea early next year, which will be its fifth region in Asia Pacific.
An Amazon Web Services region usually consists of multiple data centers linked by a wide area network. The company did not specify how many cloud data centers it was planning to bring online in Korea.
Providing public Infrastructure-as-a-Service has become a race among giants such as Amazon, Microsoft, IBM, and Google to enhance functionality, reduce price, and expand scale. These companies are spending billions of dollars every quarter on data center infrastructure as they compete for cloud market share.
Amazon’s announcement, which the company’s chief cloud evangelist Jeff Barr made in a blog post Wednesday, is yet another example of US-based cloud giants going aggressively after Asia’s rapidly growing cloud services market.
Other recent examples are IBM’s deal with Chinese data center provider 21Vianet to provide its Bluemix Platform-as-a-Service to customers in China and the launch of IBM’s first SoftLayer data center in India. Microsoft launched three cloud data centers in India earlier this year.
Asian cloud service providers are expanding data center capacity in the region too. Chinese e-commerce and cloud giant Alibaba recently brought online a new data center in the Zhejiang Province.
Barr listed existing Korean customers that will be able to take advantage of Amazon’s new cloud data center capacity when it comes online. They include startups, gaming companies, and enterprises, the latter category including the electronics giant Samsung.
“These customers (and many others) have asked us for a local region,” Barr wrote. “We are looking forward to making it available to them and to many other enterprises, startups, partners, government agencies, and educators in Korea.”

11:15p
Interxion in €170M Expansion in Four European Data Center Markets

European data center services giant Interxion announced plans to expand capacity in four key markets, expecting to invest about €170 million total.
Plans include new data center construction in Amsterdam, where the company is headquartered, Dublin, and Copenhagen, and an expanded build-out plan for a previously announced facility in Frankfurt.
In a statement, Interxion CEO David Ruberg said the provider saw “solid demand” across key European markets and was expanding in response.
The competitive landscape in Europe changed significantly after Interxion’s US-based rival Equinix snatched TelecityGroup, Interxion’s chief European competitor and one-time acquisition target, out from under it in a deal that, if approved by regulators, will make Equinix the biggest player in the European data center market.
In February, Interxion reached a $2.2 billion agreement to acquire London-based Telecity, but the merger was derailed when Redwood City, California-based Equinix outbid Interxion, offering to buy Telecity for $3.6 billion. Equinix leadership expects to close the deal in the first half of 2016, pending approval by the European Commission.
Meanwhile, Interxion is embarking on a massive expansion project across Europe.
The company said that in addition to the previously announced first two phases of its latest Frankfurt data center, due to come online in the first half of 2016, it is building out phases three and four, both of which it expects to complete in the fourth quarter of next year. Once fully built out, the FRA10 facility will have about 50,000 square feet of data center space and 10 MW of power capacity.
The first two phases of Interxion’s new six-phase Amsterdam data center will add close to 28,000 square feet total. The facility’s total capacity at full build-out will be 15 MW.
Interxion expects to bring online the first two phases of its new Dublin data center by the fourth quarter of 2016. Together, the two phases will provide about 13,000 square feet of data center space.
Finally, in Copenhagen, the company plans to bring online about 5,000 square feet of space in phase one of its new data center there, which it expects to complete in the third quarter of next year.
Interxion reported €98 million in revenue for the third quarter – up 13 percent year over year. Its net profit was €10.4 million – up from €9 million for the same period one year ago.