Data Center Knowledge | News and analysis for the data center industry
Monday, January 11th, 2016
1:00p |
Who May Buy Verizon’s Data Centers? While Verizon officially remains quiet about the reported auction of its massive data center portfolio, the report that it is looking to offload some $2.5 billion worth of data centers isn’t far-fetched.
Other telcos too have realized they aren’t prepared to spend as much as it takes to grow a data center business and stay competitive. This is generally considered a good time to sell, and at least some of the data centers in Verizon’s portfolio are highly valuable from a strategic point of view. Plenty of companies could benefit from taking them over, provided the price is right.
The Crown Jewel
The most valuable part of the portfolio consists of the Terremark data centers Verizon took over when it acquired the successful service provider in 2011 for $1.4 billion, and the big prize within that fleet, the crown jewel, is the NAP of the Americas in Miami. One of the world’s most important carrier hotels, it is the primary network interconnection hub between the US and Latin America.
“That’s the prize,” Kelly Morgan, research director at 451 Research who tracks the colocation market, said about the building. “A lot of people would want that.” Who are those people? “Any big interconnection provider should be at least interested in that asset,” she said.
The most obvious contender in that category is Equinix, the world’s largest colocation and interconnection company. Creating interconnection ecosystems within its data centers has always been at the core of its strategy, and controlling an interconnection asset as important as the carrier hotel in Miami would make a good fit.
Read more: What’s in Those Globes Atop the Miami NAP?
Another possible contender is Digital Realty Trust, which after a decade of focusing squarely on the wholesale data center business has recently changed gears and dived head-first into the retail colocation and interconnection market, acquiring Equinix’s major US rival Telx last year for $1.9 billion. Ownership of the NAP would help Digital make the case to the market that it can become a dominant force in interconnection.
Other obvious choices would be CoreSite, a long-time player in colocation and interconnection, as well as CyrusOne, the Texas-based data center provider that’s been emphasizing its interconnection play over the last several years.
Spokespeople from Equinix, Digital Realty, CoreSite, and CyrusOne declined to comment for this story.
The Other Prize: Culpeper
While the NAP of the Americas is the prize in Verizon’s Terremark data center bundle, it’s not the only prize. Another important asset is the former Terremark campus in Culpeper, Virginia, called the NAP of the Capital Region. The four-building campus is about 60 miles away from the big Northern Virginia data center cluster, but it has attracted a solid list of enterprise clients any provider would be happy to get its hands on.
Read more: Inside Terremark’s Culpeper Data Fortress
The NAP of the Capital Region in Culpeper, Virginia — one of the data center assets Verizon gained through its 2011 acquisition of Terremark.
Interestingly, one of the things that attracted Verizon to Terremark was the expectation that it would bring more government clients, and the Culpeper campus, being close to Washington, DC, was where those clients would presumably put their servers. In fact, Verizon was leasing a lot of space in the Culpeper campus to serve its government customers before the acquisition. “Verizon bought them to get a lot of federal business,” Morgan said. The campus did just OK with the feds – not as great as expected – but it did get a lot of traction with private enterprise customers, she said.
The Bad with the Good?
Another industry insider, who spoke on condition of anonymity, agreed that Terremark was an asset that would be of interest to a lot of companies. It’s too early in the process to try to predict what will happen, however, since it’s unclear whether Verizon wants to sell those key Terremark buildings at all. It’s also unclear whether it wants to sell its entire data center portfolio in one go or offload it piece by piece.
In all, Terremark had 13 data centers when Verizon bought it, including sites in Dallas, Silicon Valley, Sao Paulo, and Amsterdam – all major markets where providers generally don’t have trouble selling capacity. But there’s also the legacy Verizon data center portfolio, and if the telco insists on selling the whole package, things will get complicated.
It’s unclear how many legacy sites there are. At the time of the Terremark acquisition the company said it had “more than 220 data centers across 23 countries.” Today, it lists more than 40 data center locations around the world but doesn’t provide much detail about the number of facilities or their size. If the reported $2.5 billion value of the portfolio is close to the truth, those older non-Terremark sites can’t be big, given how many of them there are and how much of that value the Terremark portfolio accounts for.
We don’t know what Verizon’s actual plans are – the company isn’t sharing those publicly – but if it offloads just the Terremark crown jewels on their own, selling the rest of the portfolio will be difficult. If it does insist on selling all of them at once, whether it can find a buyer that will want all the smaller Verizon sites in addition to the Terremark assets will be a big question, Morgan said.
Telcos and Colo – a Difficult Marriage
What is clear is that data center services turned out not to work as well as many of the telcos that expanded into the market thought they would. Both telecom and colocation are extremely capital-intensive, and it’s possible that the telcos now looking for alternatives to owning their data center portfolios – CenturyLink, Verizon, and reportedly AT&T – underestimated how much they would have to invest to grow their colocation businesses.
The expectation was that data center services would create additional revenue for network services and increase customer “stickiness,” Morgan said. They did to a certain extent, but possibly not to the degree the telcos expected.
Telecom is also a very different business model from colocation, hosting, or cloud. “Hosting business and cloud business is dynamic, you need a lot of very qualified staff,” she said. “It’s a hard thing to sell, especially for telco sales people. It’s a very different sales process.”
As some telcos are either rethinking their strategy as it relates to data centers or have gotten out of the data center business (Windstream sold its data centers to TierPoint last year), others are forging full-speed ahead.
Examples of the latter are Japan’s NTT Communications, which has been buying data center companies in the US, Europe, and Asia, and Canada’s Shaw Communications, which bought the US data center provider ViaWest in 2014. The two telcos don’t appear to be reluctant to invest in expanding their new subsidiaries, and it will be interesting to see how these investments play out over the next several years.
It’s too early to pronounce that the telco sector’s data center ambitions have also been “f!@#ed by the cloud,” as Bloomberg’s Ashlee Vance put it when describing the struggles of IBM, HP, Dell, EMC, and Cisco. One step in the right direction would be to let the data center companies they acquired do what they do best without dragging them into the quagmire of telecom bureaucracy, Morgan said. Another would be to continue investing in their growth, since scale is important for success in the business. And yes, the cloud: making sure customers can connect to as many cloud providers as they want from your data centers is crucial. | 4:00p |
The Struggle to Extend Enterprise-grade Mobile Access to Files Andres Rodriguez is CEO of Nasuni.
Two seemingly incompatible forces have collided in the enterprise over the past few years. The standard approach to storing and protecting files has come into direct conflict with the employee’s demand for mobile access to data. Employees want their files no matter where they are or what device they happen to be using. And they have proven that they’ll do anything to get those files, even if it means circumventing IT departments and all their carefully constructed security and enterprise controls.
So, how should enterprises extend to employees the mobile access they demand without sacrificing control, security, and compliance? Most providers have approached this problem from one of two directions – consumer file sharing or enterprise storage.
The first set of solutions, file sync and share, has its roots in the user’s perspective. The easy interface, consumer-like features, and simple productivity all appeal to the employee. Plus, this group, which includes Box and Dropbox, benefits from a massive consumer customer base and broad name recognition. Yet popularity means little when you’re trying to satisfy enterprise IT requirements. As a result, the file sync and share providers have been building out features in an attempt to make their services enterprise-grade.
For example, file sync and share solutions encrypt user files, but the solution provider does the encrypting. In other words, the provider holds the encryption keys, which means a rogue employee at the provider could unlock your data. A government agency could even subpoena the provider and unlock your files without your knowledge. File sync and share providers have also added enterprise controls such as directory integration, but even with these changes, the fundamental flaw remains: they still own the encryption keys.
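The key-custody problem described above disappears when files are encrypted on the customer's side before upload, so the provider only ever stores ciphertext. Here is a minimal sketch of that idea in Python; the cipher is a toy keystream construction for illustration only (a real system would use a vetted AEAD cipher such as AES-GCM), and all names here are my own, not any vendor's API:

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream by hashing key + nonce + counter.
    # Toy construction for illustration only -- not production crypto.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # A fresh nonce per file; ciphertext = nonce || plaintext XOR keystream.
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce + ct

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# The key is generated and kept on the customer's side; the provider
# (or anyone who subpoenas the provider) sees only the opaque blob.
key = secrets.token_bytes(32)
blob = encrypt(key, b"quarterly-report.xlsx contents")
assert decrypt(key, blob) == b"quarterly-report.xlsx contents"
```

The point is architectural rather than cryptographic: whoever generates and stores `key` decides who can read the file, which is exactly the control the sync-and-share model gives away.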
Another problem lies in the way these solutions extend access. With file sync and share, you end up with a second copy of your data somewhere else. Not only do you pay extra for this copy, but you increase the likelihood of multiple employees working on different versions of the same file. In a larger sense, whether you’re talking about security or file conflicts, what you sacrifice with file sync and share is control. You give up control of security, user access and the data itself.
The other common approach to extending mobile access comes from the enterprise storage players. For these vendors, security, compliance and control are second nature. In one sense, they have already accomplished the most difficult feats: making storage work at enterprise scale; building systems that can keep up with high performance workloads; and importantly, addressing encryption and compliance. It’s the mobile access piece that presents a challenge.
The standard enterprise access protocols – NFS, CIFS, SMB – do not work on phones and tablets. These devices are not built to access corporate networks like a PC or laptop. So enterprises turn to a third-party software overlay like Syncplicity, which EMC bought in 2012 and jettisoned this summer. There are a number of these kinds of solutions, with mixed levels of functionality, but they face many of the same obstacles. Once they move corporate files to a mobile device, for example, they have to figure out how to secure them, which has not proven to be an easy task. Installing one of these third-party solutions also means adding yet another layer to an already complex and difficult-to-manage storage stack.
Ideally, a solution that extends mobile access to employees should combine the user-first perspective of file sync and share with the gravitas of the enterprise and all its associated security, compliance and control. These are difficult standards to meet, but an even harder task falls to the enterprise CIOs who must strike a balance between protecting corporate data and keeping employees happy and productive. | 6:37p |
Interxion Security Breach Exposes Customer Contacts European data center services giant Interxion suffered a security breach in December that exposed the contact information of 23,200 existing and prospective customers.
The breach of Interxion’s CRM system, which the company says has been fixed, did not affect its customers’ financial information, such as credit card details, or the infrastructure companies house in the provider’s data centers.
Interxion notified customers of the security problem via email, a copy of which was obtained by The Register.
“The business contact information that was accessed consisted of names, job titles, and (business) contact details such as (business) email addresses and phone numbers,” the email read.
“No financial or other sensitive customer data was accessed, or is stored within this system. We emphasize that this incident only affected Interxion’s CRM system and did not impact or involve any of the data centers or services that Interxion provides.”
The company says on its website that it has 40 data centers across 11 countries in Europe and serves more than 1,500 customers. It is the second-largest data center provider in the region.
Equinix is expected to become Europe’s largest provider if it successfully closes the acquisition of another European giant TelecityGroup. | 6:46p |
EMC Puts New Man in Charge of Converged Infrastructure Business VCE 
By Talkin’ Cloud
Following the announcement of the company’s acquisition by Dell, EMC has named company veteran Chad Sakac as president of its VCE division, effective immediately. Sakac replaces former VCE President Praveen Akkiraju, who will act in an advisory role to EMC Information Infrastructure President David Goulden.
Sakac will work to drive product innovation across the VCE converged infrastructure and solutions portfolio, in addition to fulfilling his previous duties as the head of EMC Global Systems Engineering. He will report to Goulden in his new role, according to the announcement.
The decision to name Sakac as president of VCE comes at the same time that EMC is looking to fully incorporate the converged infrastructure developer into its existing portfolio. In fact, VCE is scheduled to officially become the EMC Converged Platforms Division, giving EMC the ability to offer a wider swath of converged solutions to its customers and simplify technology deployment, according to the company.
“Going forward, having the VCE team more deeply integrated within EMC allows us to leverage synergies, simplify our go-to-market and improve time to market for our customers,” said Goulden, in a statement. “EMC has never been better positioned to capture the huge opportunity that exists for converged infrastructure and solutions.”
EMC announced its plan to take controlling interest in VCE in late 2014 after it purchased most of Cisco’s share in the company.
Back in 2009, EMC, Cisco and VMware agreed to share ownership of VCE as a joint venture. Since then, VCE has broadened its portfolio with a number of new converged infrastructure solutions, such as rack storage and other appliances, in addition to its Vblock hardware.
The division is currently valued at more than $2 billion, according to VCE’s latest fiscal year 2015 figures.
This first ran at http://talkincloud.com/cloud-computing-mergers-and-acquisitions/emc-names-chad-sakac-president-vce | 8:49p |
Panasonic Intros Cold Storage Tech Born at Facebook Data Centers Panasonic and Facebook have created a commercial cold-storage technology for cheaply storing rarely accessed data over prolonged periods.
The storage system, based on technology Facebook created to address the ballooning cost of storing all the photos its users upload to the social network, uses a new type of optical disc Panasonic calls the Archival Disc. It is a second-generation successor to the Blu-ray discs Facebook currently uses in its cold-storage data centers.
Unveiling its Freeze-Ray system, Panasonic emerged as a cold storage competitor to Sony, which last year acquired a Facebook spinoff that was working to commercialize Blu-ray-based cold storage technology.
Read more: First Look: Facebook’s Oregon Cold Storage Facility
That startup, called Optical Archive, was founded by Frank Frankovsky, a former Facebook infrastructure and supply-chain head who played a key role in shaping the way the company designs and procures its data center infrastructure. For several years, Frankovsky was also the human face of the Open Compute Project, Facebook’s open source data center and hardware design initiative.
Facebook has built separate facilities specifically for cold storage at its data center campuses in Prineville, Oregon, and Forest City, North Carolina. They have simpler, less redundant infrastructure than the primary data centers and cut the amount of energy needed to store old user data by 75 percent, the company’s engineers said last year.
Read more: How Facebook Cut 75 Percent of Power It Needs to Store Your #tbt Photos
A single Panasonic Archival Disc stores 300GB – three times the capacity of the Blu-ray discs that hold old Facebook photos in the social network’s cold-storage data centers, Panasonic said. Facebook will deploy the new systems later this year, according to the announcement.
“As Facebook continues to grow, we needed to address some of our fundamental engineering challenges with an efficient, low-cost and sustainable solution that matches our speed and exabyte-scale of data,” Jason Taylor, Facebook’s VP of infrastructure, said in a statement. “We’re seeing exponential growth in the number of photos and videos being uploaded to Facebook, and the work we’ve done with Panasonic is exciting because optical storage introduces a medium that is immutable, which helps ensure that people have long-term access to their digital memories.”
The companies said they will continue working together to create next-generation 500GB Archival Discs, followed by 1TB discs that will enable a data center operator like Facebook to deploy multi-petabyte cold storage systems.
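The capacity figures quoted above translate directly into disc counts at petabyte scale. The back-of-the-envelope arithmetic below is my own illustration, not from the announcement; it uses decimal units, as storage vendors typically do:

```python
# Discs needed per petabyte at each Archival Disc generation.
# Capacities from the article: 300 GB shipping; 500 GB and 1 TB planned.
PETABYTE_GB = 1_000_000  # decimal petabyte

discs_per_pb = {}
for capacity_gb in (300, 500, 1000):
    discs_per_pb[capacity_gb] = -(-PETABYTE_GB // capacity_gb)  # ceiling division
    print(f"{capacity_gb} GB discs per petabyte: {discs_per_pb[capacity_gb]:,}")
```

Roughly 3,334 of today's 300GB discs hold a petabyte, versus 1,000 of the planned 1TB discs – which is why each capacity generation matters so much for multi-petabyte, and eventually exabyte-scale, cold storage.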
Read more: Google Says Cold Storage Doesn’t Have to Be Cold All the Time | 9:02p |
Kaspersky Unveils Private Security Cloud for Enterprises 
By The Var Guy
Kaspersky Lab has introduced a new security cloud that provides real-time threat updates to customers without sending company data to the public cloud.
Kaspersky Private Security Network is a private cloud that includes an internal copy of Kaspersky Security Network (KSN), the company’s distributed cloud infrastructure with servers in various countries that processes on-the-fly requests from Kaspersky solutions on corporate and home user computers.
Using the cloud helps analyze new types of malicious programs or websites detected on client devices more quickly and accurately, according to Kaspersky. When security applications encounter an unknown threat, they contact remote servers for a resolution and receive an immediate answer. This differs from the conventional way of sending and receiving threat information, in which database updates typically take a few hours, the company said.
KSN sends information into the cloud to test in this instantaneous way whether a file or website is dangerous or innocuous, and it currently protects about 80 million users a year. However, some companies don’t want to expose internal data to the public cloud but still want this type of real-time security protection, according to Kaspersky.
Enter KPSN, which provides all the advantages of KSN without doing one thing the public security offering does—communicate with outside servers to receive data about program and website reputations and possible threats, according to Kaspersky.
Instead, KPSN uses databases installed on servers located within the corporate information infrastructure. The network uses regular one-way synchronization with KSN to receive real-time data about threats to protect a customer’s network so not a single shred of data is sent from the corporate network to the cloud, the company said.
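The one-way synchronization described here can be pictured as a pull-only update loop: the on-premises server fetches reputation records from the vendor's upstream feed, and nothing ever flows back out. The sketch below illustrates that data-flow pattern; all class names, record formats, and verdict strings are hypothetical, not Kaspersky's actual API:

```python
import json
from dataclasses import dataclass, field

@dataclass
class PrivateThreatDB:
    # On-premises reputation store: file hash -> verdict string.
    records: dict = field(default_factory=dict)

    def pull_updates(self, upstream_feed: str) -> int:
        """One-way sync: read records from the upstream feed.

        Nothing from the local store is ever written back upstream,
        so no corporate data leaves the network."""
        updates = json.loads(upstream_feed)
        self.records.update(updates)
        return len(updates)

    def lookup(self, file_hash: str) -> str:
        # Clients query the private copy; the query never leaves the LAN.
        return self.records.get(file_hash, "unknown")

# Simulated feed from the vendor's global cloud network.
feed = json.dumps({"ab12": "malicious", "cd34": "clean"})
db = PrivateThreatDB()
db.pull_updates(feed)
print(db.lookup("ab12"))  # -> malicious
```

The essential property is the asymmetry: `pull_updates` only reads from upstream, while `lookup` answers entirely from the local copy, which is the compliance guarantee a private security cloud is selling.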
“In large companies and in state organizations, there are typically very strict information security policies in place regulating inbound and outgoing data traffic,” said Kaspersky CTO Nikita Shvetsov in a press release. “However, in light of an ever-growing number of cyber-threats, security solutions work most efficiently only when they maintain a continuous data exchange with a cloud, which contains the most recent threat data.”
A private security cloud like KPSN allows a company to take advantage of the opportunities provided by the distributed KSN within the walls of its IT infrastructure, in full compliance with specific company requirements, policies and needs, he added.
KPSN is available now, joining Kaspersky Endpoint Security for Business, Kaspersky Security for Virtualization, Kaspersky DDoS Protection and other offerings in the company’s package of solutions designed to provide large businesses with information security.
This first ran at http://thevarguy.com/network-security-and-data-protection-software-solutions/kaspersky-unveils-private-security-cloud-boo | 9:59p |
Citrix Sells CloudStack Orchestration Platform to Accelerite 
By TheWHIR
Accelerite has acquired Citrix’s CloudStack-based cloud orchestration platform, CloudPlatform, as well as its unified cloud services delivery and business management platform CloudPortal Business Manager.
According to a Monday announcement from Citrix, the sale will let Citrix focus on its “core priorities around the secure delivery of apps and data.”
Customers using CloudPlatform and CloudPortal Business Manager product lines commercially will be managed by Accelerite.
CloudPortal Business Manager provides cloud service automation for provisioning, billing, metering, and user management. This lets service providers offer a broad array of cloud services while integrating with existing business, operations, and IT systems.
Accelerite is the products division of enterprise software developer Persistent Systems and a leading player in cloud management and enterprise mobility management software, as well as the Internet of Things.
The deal is just the latest in a string of Accelerite acquisitions in the cloud computing and virtualization software space, which has included product lines from the likes of HP, Intel, and Openwave. In June, it acquired Convirture, a provider of unified management software for virtualization and cloud environments. And in October, it launched its IoT business with the acquisition of Aepona from Intel.
“The addition of CloudPlatform and CloudPortal Business Manager product lines will help round out our portfolio and enables us to offer complete end-to-end life cycle management for public and private clouds,” Accelerite CEO Nara Rajagopalan said in a statement. “Although we see an increase in both the acceptance and adoption of containers in today’s clouds, large gaps still exist in an enterprise’s ability to deploy and manage containers in these environments. CloudPlatform, with its simplicity and ease-of-use along with its tremendous customer base, is a great platform for enterprises to use to address this emerging need to help evolve the cloud into its next phase of hyper-convergence.”
Citrix will work with Accelerite on CloudPlatform software integrations including integrations with XenServer, NetScaler and Citrix Workspace Cloud. Accelerite has committed to working with the Apache Foundation and contributing its CloudPlatform roadmap to the Apache CloudStack project.
Neither company disclosed the financial details of the transaction.
This first ran at http://www.thewhir.com/web-hosting-news/accelerite-to-acquire-citrixs-cloud-orchestration-platform-cloudplatform |