Data Center Knowledge | News and analysis for the data center industry
Thursday, June 11th, 2015
Why Telx Acquisition Could Be a Good Move for Digital Realty

Acquiring Telx would fit well with Digital Realty’s current strategic direction, while putting the company in direct competition with Equinix, the world’s largest data center services company. Such a transaction would substantially increase Digital’s retail colocation and interconnection services play in all major US markets and add to the list of cloud services available to customers through Digital.
Digital, a San Francisco-based publicly traded real estate investment trust, is considering a more than $2 billion bid to acquire Telx, according to a Wednesday report by Reuters that cited anonymous sources. A representative from Digital declined to comment, referring to the news as a “rumor,” while a Telx spokesman did not respond to a request for comment. Answering analysts’ questions on the company’s most recent earnings call, however, Digital execs did not rule out a potential acquisition of another data center provider.
A deal would have to fit a specific set of requirements, however. The target would have to fit tightly with Digital’s strategy and add to its overall earnings per share. “We obviously would look at efficiencies that could be achieved, both on the operating side and the capital side, but we do believe that any investment we make for our shareholders should be accretive,” Digital CEO William Stein said on the earnings call in May.
The data center provider industry has seen a lot of consolidation recently. Equinix announced a plan in May to acquire TelecityGroup in Europe, breaking up a merger between Telecity and another European provider, Interxion. Also in May, Digital’s rival QTS acquired Carpathia Hosting, a provider that does a lot of business with US government agencies. In April, CyrusOne, another rival, acquired Cervalis, expanding its presence in the New York market.
New York-based Telx is a private company, owned by private-equity firms ABRY Partners and Berkshire Partners, so information about its revenue or profitability is not publicly available. But it fits strategically with Digital’s recently renewed focus on services beyond its core wholesale data center business.
Digital has been in retail colocation for about five years, but that part of its business has not grown much. An acquisition of Telx would add 1.3 million square feet of data center space across 20 facilities to its colocation portfolio and a multitude of customers. A lot of that space is leased by Telx within Digital facilities.
A robust interconnection ecosystem is crucial to a successful colocation business. The more companies interconnect in a data center, the more attractive that data center becomes for others, and Telx has built exactly such an ecosystem in its facilities.
“Digital Realty on one hand has conceded that its colocation business has been underperforming, coupled with a relatively stagnant interconnection business,” Jabez Tan, senior data center infrastructure analyst at Structure Research, said. “If Digital Realty were to acquire Telx, it could meaningfully alter [Digital’s] market position and potentially make it a serious player in the retail colocation and interconnection segment.”
Besides an increased focus on retail colocation, Digital has also been working to build a network of partners who provide cloud services. The aim is to offer customers joint solutions that include Digital’s data center space and the partners’ cloud services. Like rival Equinix, Telx has been leveraging the rich interconnection ecosystem in its data centers to attract cloud providers and turn the data centers into hubs where customers can get direct access to those providers. Equinix is far ahead of both Telx and Digital in building out a marketplace for cloud services.
If Digital is in fact eyeing a Telx acquisition, it is reportedly not the only potential bidder. There are a number of other contenders, both domestic and international, according to the Reuters report.
If that’s the case, an acquisition by Digital would also have a defensive element. Telx specializes in running meet-me rooms and since 2006 has had an exclusive contract to operate them in 10 of Digital’s data centers, including some of the most important carrier hotels in the US. A winning bid by Digital would prevent a competitor from taking over control of those interconnection facilities.
“It is a logical thing for Digital to bid on Telx, since Telx runs the meet-me room in several Digital Realty facilities and the company would not be thrilled to have a rival take over that space,” Kelly Morgan, research director for North America data centers at 451 Research, said. “However, such a move would be taking Digital Realty (and its investors) somewhat out of its comfort zone, so there are probably a lot of other issues for the firm to think about internally besides the price.”
While that may be true, Digital has been showing willingness to get out of its comfort zone. The company has a new executive team in place that is pursuing a strategy of leaning on the traditional wholesale business and expanding it, while increasing the value of the portfolio by adding a wider variety of services through partnerships. A robust interconnection play, however, is more important today than ever, and an acquisition of a company like Telx would improve that aspect of Digital’s business in a big way.
IBM Launches OpenPower Developer Cloud

As part of an ongoing effort to build an ecosystem around OpenPower processors, IBM announced this week that it has launched a SuperVessel cloud service, through which developers can build applications hosted in a data center managed by IBM in Beijing.
Unveiled at the OpenPower Foundation Summit in Beijing, the SuperVessel cloud service provides developers with access to a series of web-based integrated development environments, through which they can build complex applications ranging from high-end graphics to machine learning software.
Alan Lee, vice president of OpenPower innovation for IBM Systems, says the goal is to make OpenPower processor technology more accessible to developers who would otherwise not have access to the capital required to build and test these types of applications at scale before deploying them in a production environment. To make it easier to provision SuperVessel resources, IBM has thus far exposed 13 APIs for the cloud service. At present, the service runs the Ubuntu distribution of Linux, but Lee says support for other Linux distributions will follow.
“SuperVessel gives developers access to servers,” says Lee. “We’re trying to accelerate OpenPower innovation and grow the ecosystem.”
The service not only provides access to OpenPower processors; developers are also exposed to field-programmable gate arrays (FPGAs) and GPUs running in a private cloud managed using OpenStack cloud management software.
The SuperVessel service itself is divided into online “labs” focused on Big Data and Internet of Things applications, along with acceleration and virtualization on OpenPower processors themselves. Examples of projects underway include the ProteinGoggle project from Tongji University, which examines protein sequences to better understand human health, and a project led by Chongqing University to optimize the city of Chongqing’s subway system.
The SuperVessel cloud service is currently hosted in a single data center managed by IBM’s research and development arm, which operates separately from IBM’s commercial SoftLayer cloud service. Eventually, IBM plans to extend the reach of SuperVessel to multiple data centers in other cities where IBM has strong ties to a local university.
As part of its effort to unseat Intel, IBM needs the next generation of advanced applications to be developed to run on OpenPower processors, which are now developed by a consortium of vendors led by IBM. While the outcome of that effort is currently unknown, it’s clear that IBM is committed to providing developers as much access to its processor technologies as the cloud will allow.
Leak Prevention in Data Centers: How Prepared Are You?

Jeremy Swanner is the Executive Vice President of RLE Technologies.
One of the biggest downtime threats for data center operators is leaks, and preventing them is one of the most fundamental areas of protection for data centers and critical facilities. The 2013 Cost of Data Center Outages study by the Ponemon Institute reported that, of the data centers surveyed, 24 percent of unplanned outages were caused by weather, water, heat, or CRAC failures. Clearly, moisture damage from weather-related fluid intrusion, faulty piping, and structural failures of roofs and windows is among the largest problems with which most data centers must contend. At the same time, leaks of water, caustic chemicals, or other corrosive fluids are easily detected and highly preventable. The following best practices for leak prevention can help you avoid moisture-related downtime.
Don’t Over Protect
Some facilities managers think they need to protect every square foot of space from water intrusions. While this is an attractive option for those with unlimited budgets, in reality the best use of resources is to fully protect critical areas around sensitive equipment with a variety of tools (fluid and chemical sensing cables, zone controllers, humidity sensors) and to use spot detectors to protect the rest of the facility (in some cases skipping nonessential areas altogether). Assess where the threats will most likely come from and only protect the necessary areas.
Don’t Assume You Know Where Water Will Run
In my experience, it is a bad idea to assume that water is going to run or pool in a certain way. When performing facility evaluations after a downtime event due to fluid incursion, I often see spot detectors in very logical places close to sensitive equipment that water has simply gone around. While well intentioned, in these instances I recommend the use of liquid detection cables. They are extremely versatile and can reach most of the areas where fluids discharge in a facility. This allows facilities managers to protect larger areas at a lower cost, and more importantly, removes the onerous task of trying to predict the direction liquid will flow if a fluid intrusion should occur.
When Retrofitting, Think Wire-Free
It’s not always feasible or cost effective to run more cables and wires all the way back to your home monitoring station or BMS/DCIM. Wire-free leak detection can be quickly and easily deployed to communicate wirelessly back to the base unit, saving you installation costs and time. Recent advances in wire-free monitoring technology have made it an attractive option for the modern facilities manager. Extended battery life, energy harvesting technology, ease of scalability, increased fault detection, and more robust security encryption are just a few of the reasons that wire-free sensors are gaining wider adoption in mission critical facilities.
Empower Your Staff
When an alarm goes off, you don’t want your staff waiting for someone else to show up to fix the problem or standing around and wondering what to do. Implement a reporting and documentation system that includes everyone who might be called upon to act in a crisis. Ongoing training for staff allows managers to make adjustments and improvements, analyze and rectify errors, and avoid making similar missteps in the future. When it comes to water and other fluid incursions, seconds truly count. Make sure everyone can act and knows what to do.
There is a wide variety of threats that facilities managers have to contend with on a daily basis: hardware failure, hackers, natural disasters, and, of course, human error. Thankfully, downtime due to fluid intrusions in the data center is one of the few threats that can be prevented, thanks to the advances made in modern facility monitoring technology. By employing these four best practices, you can ensure that small leaks are stopped before they cause big problems.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
Hortonworks Makes Enterprise Hadoop Platform Cloud-Agnostic

Hortonworks has updated its enterprise Hadoop solution called Hortonworks Data Platform, saying it can now be automatically provisioned in any cloud environment. The company said it is now also easier for enterprises to adopt, simpler to administer, and more secure.
The open source platform provider relies on value-add around Apache Hadoop, focusing on making Hadoop easy to deploy securely across distributed clusters. One of the other big new features in release 2.3 is proactive cluster monitoring.
Hortonworks has also integrated into the latest release several open source technologies beyond Hadoop itself, focused on security and data governance.
Earlier this year, the company teamed with others to create a common enterprise Hadoop data governance framework. One of the fruits of that labor is Apache Atlas, a new open source project to build governance services that now ships with Hortonworks. Atlas features a scalable metadata service, integration with the Hive metastore and SQL metrics, as well as a user interface for searching metadata and lineage.
Cloudbreak, the feature for automated provisioning of HDP clusters in any cloud environment, is a result of integration of technology gained through the company’s recent acquisition of SequenceIQ. Hortonworks also recently became an official service on Google’s cloud.
Hortonworks also introduced SmartSense, a proactive monitoring service for subscription customers. It provides insights and recommendations about cluster utilization and health. SmartSense is used for better long-range cluster resource utilization and capacity planning, as well as quick and easy log-file capture.
The release also reportedly brings better data protection, with transparent data encryption and an encryption key management store provided by Apache Ranger. Ranger manages authorization and audit policies, while Apache Knox takes care of authentication, with bi-directional SSL support and LDAP data caching for improved performance.
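To make the Knox piece concrete, a gateway like Knox typically sits in front of cluster REST APIs such as WebHDFS, validates the caller’s LDAP credentials over SSL, and proxies the request into the cluster. The snippet below is only a generic sketch of that access pattern, not something taken from the Hortonworks release; the hostname, port, topology name, credentials, and certificate path are all assumptions.

```python
# Generic sketch of client access through an Apache Knox gateway.
# Host, port, topology name ("default"), credentials, and cert path are assumptions.
import requests

KNOX_WEBHDFS = "https://knox.example.com:8443/gateway/default/webhdfs/v1"

resp = requests.get(
    KNOX_WEBHDFS + "/tmp",
    params={"op": "LISTSTATUS"},             # standard WebHDFS operation
    auth=("analyst", "ldap-password"),       # Knox checks these against LDAP
    verify="/etc/security/knox-gateway.pem", # trust the gateway's SSL certificate
)
resp.raise_for_status()
for entry in resp.json()["FileStatuses"]["FileStatus"]:
    print(entry["pathSuffix"], entry["type"])
```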
“EMC and Hortonworks have a shared vision of a Business Data Lake which provides the ability to bring together data, analytics, and applications to deliver meaningful business outcomes for companies,” Aidan O’Brien, senior director for Big Data solutions at Hortonworks partner EMC, said in a statement. “The new security and data governance capabilities and improved user interface in the latest release of the Hortonworks Data Platform [are] going to make it easier to achieve these outcomes in a more sustainable and secure way.”
An example of user experience improvement in the latest release of the enterprise Hadoop platform is a set of guided configurations of HDFS, YARN, Hive, and HBase, making each easier and more predictable. Hortonworks has also focused on making it easier to optimize your setup. A customizable dashboard shows a cluster’s key performance indicators.
New Insights About the Hadoop Infrastructure and Data

Since Hadoop 1.0.0 was released in December 2011, it has grown into a strong resource for storing and processing extremely large amounts of data. Now, Hadoop is becoming a cornerstone of IT environments and rapidly moving to full production.
With more and more high-value data being stored and processed within Hadoop, there are new security and compliance challenges facing the data center industry. In a new webinar next week, Hadoop experts from Centrify and MapR will share important insights to help you ensure your environments and data are secure and meet compliance requirements (SOX, PCI, HIPAA, etc.) before they are moved to production.
Register Now for the live event on Wednesday, June 17 at 2:00pm ET. Attendees will learn about:
- New security challenges of Big Data
- Extending Active Directory’s Kerberos and LDAP capabilities to Hadoop clusters
- Enabling MapR distribution to run in secure mode
- Enforcing least-privilege policies and eliminating the use of root privileges
- User activity auditing with logs and correlation of activity across the cluster
- Health Check services for security and compliance
Meet the Experts
David McNeely
VP, Product Strategy
Centrify
Dale Kim
Director of Industry Solutions
MapR
Can’t make the live event? No problem. All registrants will have access to the on-demand version.
Pivotal Buys Startup Quickstep to Accelerate Enterprise Analytics

To boost its enterprise analytics capabilities and drive a new query execution framework, EMC’s Pivotal has acquired Quickstep Technologies, a University of Wisconsin Big Data startup led by Jignesh Patel.
Key members of the Quickstep team, including Patel, will join Pivotal, as the technology is integrated into Pivotal HAWQ and Pivotal Greenplum Database. Earlier this year Pivotal made its entire suite of enterprise analytics tools open source.
Pivotal expects the improvement in SQL query execution performance in its Big Data suite to deliver an order-of-magnitude acceleration for business intelligence, ad-hoc queries, analytics, and data science workloads. In a Pivotal blog post, Patel notes that the data processing engine will run efficiently on modern hardware with multi-core processors, large main memories, and newer non-volatile memory technologies.
Funded in part by the National Science Foundation, Quickstep developed a relational data processing engine that incorporates what it calls Bitweaving, which it says allows software to exploit and be in lock-step with advances in hardware utilization and optimization.
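Scans in this style pack many fixed-width column codes into each processor word and evaluate a predicate on all of them with a handful of word-level operations, instead of one comparison per value. The toy Python sketch below illustrates only that general idea; it is not Quickstep’s code, and the field layout, function names, and 64-bit word size are illustrative assumptions.

```python
# Toy word-parallel scan over packed column codes (illustrative only).

WORD_BITS = 64

def pack(values, code_bits):
    """Pack fixed-width codes into 64-bit words, one spare delimiter bit per code."""
    field = code_bits + 1                        # k code bits + 1 delimiter bit
    per_word = WORD_BITS // field
    words = []
    for i in range(0, len(values), per_word):
        word = 0
        for j, v in enumerate(values[i:i + per_word]):
            assert 0 <= v < (1 << code_bits)
            word |= v << (j * field)             # delimiter bit of each field stays 0
        words.append(word)
    return words

def scan_le(words, constant, code_bits):
    """For each word, set the delimiter bit of field j iff code j <= constant."""
    field = code_bits + 1
    per_word = WORD_BITS // field
    delims = sum(1 << (j * field + code_bits) for j in range(per_word))
    const_rep = sum(constant << (j * field) for j in range(per_word))
    # One subtraction compares every packed code against the constant at once:
    # per field, (constant + 2^k) - code keeps bit k set exactly when code <= constant,
    # and no borrow crosses a field boundary because each field result stays >= 0.
    return [((const_rep | delims) - w) & delims for w in words]

# Example: which of these 3-bit codes are <= 4?
codes = [1, 6, 4, 0, 7, 3, 5, 2]
words = pack(codes, code_bits=3)
masks = scan_le(words, constant=4, code_bits=3)
per_word = WORD_BITS // 4                        # 16 codes per word at 4 bits per field
hits = [i for i in range(len(codes))
        if (masks[i // per_word] >> ((i % per_word) * 4 + 3)) & 1]
print(hits)  # [0, 2, 3, 5, 7] -> codes 1, 4, 0, 3, 2
```

A single subtraction here filters sixteen 3-bit codes at once; a production engine would do the same with SIMD registers and carefully tuned memory layouts rather than Python integers.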
Patel is a Professor in the Computer Sciences Department at the University of Wisconsin-Madison and has started and sold two other companies, including mobile data analytics provider Locomatix, which was acquired by Twitter.
Patel says that with the Quickstep project the company has “rethought from the ground up the algorithms that make up the DNA of data platforms so that the platform can deliver unprecedented speed for data analytics. It is time to move our ideas from research to actual products. There is no better home for this technology than at Pivotal given Pivotal’s formidable track record in delivering real value to their customers in Big Data.”
Pivotal says that along with this acquisition it has also licensed technology from the Wisconsin Alumni Research Foundation.
EMC’s Virtustream Gets Federal Cloud Certification

Virtustream, a cloud service provider going through the process of being acquired by EMC, has received authorization to provide services to US government agencies.
Called FedRAMP Provisional Authority to Operate, the certification confirms that Virtustream’s Infrastructure-as-a-Service offering meets security requirements of civilian agencies.
The FedRAMP certification system was devised to speed up the government’s transition to using commercial cloud services to lower its IT spending levels. Instead of each agency going through the process of verifying whether or not a service provider meets its security requirements, all agencies can simply pick from a list of pre-approved providers.
More than 30 cloud services are FedRAMP-compliant today. That count includes multiple services from the same provider, as well as services operated by government agencies themselves.
Virtustream will be another way for EMC to gain government cloud contracts. Its other subsidiary, VMware, received FedRAMP certification in February. EMC announced it would acquire Virtustream for $1.2 billion in May.
Having a provisional authorization puts Virtustream in the company of more than a dozen cloud service providers, including heavyweights such as VMware, Oracle, Microsoft, HP, Lockheed Martin, CGI, AT&T, and Akamai.
Providers gain another kind of FedRAMP authorization, called agency authorization, by working directly with customer agencies using their services. This group includes Amazon Web Services, Verizon, Microsoft, and Salesforce, as well as the US Department of Agriculture and Department of the Treasury, among others.
Federal cloud is a massive opportunity for IT service providers. One presentation by an Office of Management and Budget official pegged the total 2015 US government cloud budget at $3 billion.
But competition for those federal cloud dollars is fierce, with both giants like Amazon and Microsoft and smaller providers like Virtustream battling for market share.
Amazon and Microsoft have built data centers specifically to serve government cloud customers. Data center provider QTS recently bought Carpathia Hosting, a company with existing relationships with government clients.