Data Center Knowledge | News and analysis for the data center industry
Tuesday, January 12th, 2016
Digital Realty Leans on IBM, AT&T to Hook Enterprises on Hybrid Cloud

Digital Realty Trust is teaming up with some of its largest cloud-provider customers, starting with IBM and AT&T, to court enterprises, most of which it expects to eventually transition to a hybrid infrastructure strategy combining dedicated on-prem infrastructure with cloud services.
Put simply, the plan is to invest in supporting what the San Francisco-based real estate investment trust calls “Scale” customers — the cloud providers — while letting them take the lead in selling hybrid cloud to enterprises. It intends to spend up to $900 million to support its leasing pipeline, mainly for cloud providers.
Top Digital Realty execs outlined the strategy at last week’s Citi Internet, Media and Telecom conference in Las Vegas in a session moderated by Citi telecom analyst Mike Rollins.
As it has said previously, the REIT is looking to leverage its global wholesale data center fleet, combined with the colocation and interconnection expertise of Telx, which it acquired last year, to expand colocation offerings adjacent to existing large-footprint cloud deployments.
The company’s three strategic priorities for 2016 are supporting its cloud customer base around the world, integrating Telx, and expanding interconnection and colocation capabilities, initially in the US and then globally, Digital Realty CEO Bill Stein said.
Scaling the Cloud
Digital Realty CTO Chris Sharp, who joined last year after leaving the company’s rival Equinix — which also happens to be a major customer — provided much of the color on the operating strategy, and a lot of the discussion centered on a comment he made, saying, “Hybrid cloud is the end-state where every customer is going to end up.”
Read more: Equinix CEO Unveils Aggressive Plan to Court Enterprises
Sharp clarified that Digital’s cloud focus was to support its large-footprint, or Scale customers in a joint effort to cultivate enterprise business. It has the facilities expertise and owns the largest global data center network, and its team is happy to leave the “heavy lifting” of hybrid cloud architecture to experts like IBM and AT&T.
The REIT doesn’t plan to be completely hands-off in its cloud partnerships, however. It intends to “…align efforts with their demand cycles, and get a tighter funnel that is multiple years out,” Sharp said. Placing Telx colocation next to cloud nodes will help hybrid cloud customers address latency, security, cost, and control concerns, he added.
A big part of it is also leveraging the cloud providers’ market expertise and sales forces. “IBM and AT&T take foundational services out to market to penetrate thousands of enterprises,” he said.
Read more: IBM to Take Over AT&T’s Managed Hosting Business
Minding the Store
According to the company’s CFO Andy Power, Digital expects to begin realizing the $15 million in Telx cost synergies on January 1, as well as $148 million in EBITDA from the 20 Telx facilities for fiscal 2016. Revenue synergies will begin to show, but their impact will be felt primarily in 2017 and beyond.
Power expects Digital to continue its focus on same-store performance by aggressively trying to renew leases at higher rates, keeping operating expenses in check, and “…being mindful of our cost structure.” While growth can be expensive, Digital is clearly focused on delivering for shareholders in 2016.
Expanding Telx
Stein, the CEO, said expansion priorities for Telx include Northern Virginia (Ashburn), Dallas (Richardson), and Chicago (Franklin Park). Internationally, he expects to see a large Telx role in Digital’s expanding campus in Singapore, as well as its campuses outside of London and Dublin.
Digital is in the process of consolidating redundant office space in New York City, turning it into additional colocation space for Telx at the massive Google-owned carrier hotel at 111 8th Ave., one of the world’s most important network interconnection hubs. On the West Coast, the colocation building at 365 Main St. in San Francisco will be handed over to the Telx team.
 Digital Realty Trust’s 365 Main data center in San Francisco, the REIT’s home base (Photo: Digital Realty Trust)
Stein pointed out the importance to Digital of smaller retail installations of five to 10 cabinets, which can act as incubators for customers to grow into multiple facilities. He gave the example of a 25 kW customer in Sydney, which according to him has grown into one of Digital’s top five customers globally.
 The exterior of 111 8th Avenue, one of the premier carrier hotels in Manhattan.
US Market Healthy, Save for New Jersey
Commenting on supply-demand dynamics in the US, Stein said, “All markets were in good shape except New Jersey.”
Across the country, the supply equation remains far more rational than it was two years ago, while demand has been accelerating. New Jersey, meanwhile, remains challenged by power costs much higher than Northern Virginia’s and by a “challenged” financial services vertical, the state’s primary data center customer base.
Read more: DuPont Fabros Wants to Sell New Jersey Data Center, Exit Market
Stein observed that there was very little demand for “just shell,” or powered-base, data center product. Digital would only entertain building more of it at a customer’s request, he said; turn-key product is what Scale customers typically demand.
Business Update
The uptick in acquisitions in 2015 by smaller publicly traded data center REITs has not had any notable effect on Digital Realty, its execs said.
Power pointed out it was too early in the game to comment on Digital’s current valuation as a public company in light of the recent M&A activity, such as CyrusOne’s acquisition of Cervalis, the QTS acquisition of Carpathia, or its own $1.9 billion Telx deal. With many telco data centers now on the market, there should be more data points, which will give Digital “…a better read later this year,” he said.
DLR shares, which have traded in a 52-week range of $59.53 – $77.34 per share, hit a new intraday high of $77.67 on Friday, January 8, the first full trading day after the Citi conference.
The Hybrid IT Mash-up

Matt Gerber is CEO of Digital Fortress.
Love it or hate it, hybrid IT is here in force and it’s here to stay. The global market for hybrid cloud computing is estimated to grow from $25.28 billion in 2014 to $84.67 billion in 2019, according to a 2015 study published by Markets & Markets. Nearly half (48 percent) of enterprise respondents say they will adopt hybrid cloud systems and services in the near future.
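Those market figures imply a compound annual growth rate of roughly 27 percent. A quick back-of-the-envelope check, using only the numbers cited above:

```python
# CAGR implied by the hybrid cloud market figures cited above:
# $25.28B in 2014 growing to $84.67B in 2019, i.e. over 5 years.
start, end, years = 25.28, 84.67, 5

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 27% per year
```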
Public cloud purists don’t like the idea of companies taking a steppingstone approach to cloud adoption; yet the reality is, many large companies are not ready to make a wholesale change. Compliance and regulatory requirements may stand in the way, or they may have invested too much money in on-premises systems that are still business-critical and don’t transition easily to the cloud. Hybrid cloud is, for many companies, a blend of the old and the new, offering a practical and manageable approach to innovation. You can maintain your highly customized, workhorse ERP system inside your own data center while adding new, agile customer-facing apps in the cloud.
Integrating networks and deploying new management systems that provide visibility into both private and public infrastructure is a must, yet the technology platform for hybrid cloud is just one consideration. The static-IT and organic-IT worlds must come together. Developers, infrastructure managers, and architects from across the IT organization must work together to ensure seamless operations. Hybrid IT is first a human problem, requiring the resolution of conflict between internal teams, the division and sharing of responsibilities, and the creation of new roles.
Mixing old and new IT models is not for the faint of heart. Success depends upon a leader who can help these two worlds peacefully coexist. At the same time, CIOs should take the lead in moving staffers toward an agile, collaborative workflow which matches the requirements of deploying the modern IT stack encompassing cloud, mobile, Internet of Things and social media technology.
Start by Rethinking the IT Organization
How long have we been talking about getting rid of IT departmental silos? Too long. As companies adopt hybrid cloud infrastructure, IT teams will need to shift from silos working in isolation to cross-functional teams working on a specific project, such as a big data platform. This is the new reality of the DevOps world. That means combining developers, testers, integrators and infrastructure experts on project teams to execute toward a specific goal. New organizational models are imperative because of the interdependence of technology and the rapid pace of change and development in the public cloud. Silos don’t fit well in the world of software-defined everything and virtualization. Some individuals will be reluctant, if not outright resistant, to change. Creating a “tiger team” of well-liked leaders to help lead the charge for that first hybrid project is an effective way to start getting everyone on the same page.
Adopt New IT Metrics for Success
In a converged world, departmental metrics such as uptime for infrastructure or lines of code written for developers are less relevant than broader measures of how DevOps teams achieve and deliver business goals. At the highest level, those may include a team’s time to market, time to resolution, and end-user department satisfaction. Look to business managers for help in designing new metrics. Cloud and agile development also call for shared accountability. No longer is the CIO or CTO on the hook for every success or failure; team members down to entry-level engineers and testers should all have a stake. Introduce specific project-based goals and individual incentives (preferably financial) for your teams if goals are met.
Divide and Conquer
A company that is blending on-premises systems with private hosted and/or public cloud workloads has a lot to juggle. It makes sense to divert all day-to-day infrastructure management responsibilities that are not strategic to achieving the business objectives, no matter where the infrastructure resides, to one group, while moving end-user application and business-aligned IT projects to another group. A company can set up these teams internally, if it has the right skills, or use a third-party outsourcer to help fill the gaps. In some cases, a third party that can bridge the gap between the “old” and “new” worlds of IT may also be helpful in delivering strategic guidance, oversight, and conflict resolution among the teams. A third party may also be useful for offloading tasks that are not strategic, such as monitoring and management of the old and new IT infrastructures.
Lead for Speed
Organic IT, characterized by cloud computing and DevOps, means that the computing environment is always in motion. Infrastructure resources may change by the hour in the cloud; requirements from users or business leads can change by the week. Senior IT leaders will need to get their hands dirty by inserting themselves into day-to-day operations. This may mean more status meetings, or simply roaming the floor more to check in on teams and help with decisions. It also means empowering people to make changes independently when needed, while also deploying automated technologies to make the no-brainer changes to code or servers as conditions dictate.
Cross-Training
As discussed earlier, hybrid IT requires breaking down silos and developing cross-functional DevOps teams. Individuals with narrow job roles will need to learn the basics of other functions, along with the higher-level skill of working collaboratively. In companies with no plans to get rid of internal data centers or large legacy applications in the near future, some individuals will be retained to focus on those “static,” or traditional, IT responsibilities. Not everyone will be happy watching colleagues walk bravely into the future while they are forced to “keep the lights on” for the sake of business continuity.
Giving traditional IT staff ample opportunities to work in new areas, such as with the DevOps team, can boost morale while also developing skill sets needed for tomorrow. As well, DevOps and cloud-focused employees should occasionally rotate into projects on the internal IT team. Walking in the shoes of others helps all team members understand the bigger picture of hybrid IT, and how the pieces work together. This can lead to continual improvement, better integration and faster innovation, as well.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
QTS and Carpathia Become Single Company 
By The WHIR
QTS Realty Trust announced on Tuesday that it has completed the integration of Carpathia Hosting operations and that all of its data center, managed hosting, and cloud services are offered under the QTS name.
QTS acquired Carpathia Hosting in June 2015 for $326 million to grow its geographic footprint to international markets and enhance its portfolio with government cloud services. The completion of the integration of Carpathia gives QTS 24 data centers on three continents.
“With the substantial completion of the integration process, our over 700 employees have created a bigger, better, stronger QTS,” QTS CEO Chad Williams said in a statement. “QTS is uniquely poised to deliver world-class infrastructure with value added technology services to our over 1,000 customers.”
Read more: Why QTS Dished Out $326M on Carpathia Hosting
QTS chief product officer Peter Weber led the integration process, according to a report by REIT.com.
According to QTS, the company merged Carpathia’s “resources and cultures into its existing operational structure, identifying efficiencies and streamlining duplication.” The company is now well-positioned to serve enterprise and government clients internationally, according to the statement. In an interview with REIT.com, Williams said that more than 40 percent of QTS’s revenue comes from customers buying more than one of its products.
“Our current customers will continue to benefit from our expanded product and services offerings, while receiving the same premium customer experience they have come to expect from QTS and the former Carpathia,” Williams said. “We look forward to building on our legacy of success and positioning QTS to meet the ever changing and complex IT needs of businesses today and tomorrow.”
This first ran at http://www.thewhir.com/web-hosting-news/qts-realty-trust-completes-integration-of-carpathia-hosting
Tableau Launches First European Data Center for Location-Conscious

Tableau Software, based in Seattle, has launched a data center in Dublin, Ireland, its first in Europe, in response to growing interest from customers in choosing the physical location of their data.
The European Court of Justice’s invalidation of the Safe Harbor framework last year created a lot of uncertainty about the legality of hosting European citizens’ data in data centers located overseas, and hosting that data within Europe’s borders is one sure-fire way to ensure compliance.
“With the opening of our European data center, we are responding to a desire from customers to choose where they host their data,” James Eiloart, Tableau’s VP of operations in Europe, said in a statement. “From data solutions and deployment options to data discovery paths, we are now enabling customers to choose where they want their cloud analytics data stored.”
Existing customers can move data to the Dublin data center, and new ones can select it as the location for their files from the get-go.
Read more: Safe Harbor Ruling Leaves Data Center Operators in Ambiguity
Tableau did not specify which data center in Dublin it is using or whether it is leasing data center capacity (the more likely scenario) or operating its own facility.
The company, whose software offers companies advanced analytics capabilities to generate business intelligence, currently has more than 35,000 customers in more than 150 countries, it said. About 3,000 of them use its software as a cloud service, and half of those users are not in the US.
Competition, Innovation Drive Down Enterprise Cloud Costs: Report 
By The WHIR
The price of enterprise cloud computing services has dropped by two-thirds since 2013 but is starting to stabilize, according to a report released Monday by Tariff Consulting Ltd. The Pricing the Cloud 2 – 2016 to 2020 report puts the cost of an entry-level cloud instance with the Windows OS at about 12 cents an hour.
The report follows up a 2014 study, with research running through November 2015. It surveys published prices from more than 20 public cloud providers among the 45 companies considered worldwide, breaking the market down into public and private clouds, with pricing categories of pay-as-you-go, hybrid, and private.
Read more: Digital Realty Leans on IBM, AT&T to Hook Enterprises on Hybrid Cloud
Furious competition in public cloud pricing and “rapid product innovation” are driving cloud prices down, and TCL counts over 500 product features launched since 2008 by AWS, the largest provider with a quarter of the IaaS market. AWS also made its 51st price cut just a week ago.
TCL notes that competitors like Rackspace have narrowed the range of pricing over the two-year period. Reductions in private cloud prices are also encouraging hybrid adoption, and creating a market opportunity for cloud integration.
Read more: How Microsoft Plans to Win in Enterprise Hybrid Cloud
Cloud pricing has also become “more rational” in the past two years, with free tiers of service being replaced by one or three-month free trial periods.
As the price range narrows, service innovation is becoming the main differentiator, with compute instances offered for specialized and intensive needs and analytical services for cloud applications being introduced. This conclusion confirms what Faction CEO Luke Norris told the WHIR in 2014, when the cloud price cuts were fast and furious. Peer 1 found shortly after that customers were not entirely thrilled at the market’s rapid changes.
TCL forecasts cloud pricing will fall by a further 14 percent over the next four years, while public cloud revenues treble to $82 billion.
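Spelled out, those forecast figures imply modest annual price declines but steep revenue growth. A simple arithmetic check on the numbers above (this is my calculation, not part of TCL's report):

```python
# Annualized rates implied by TCL's four-year forecast cited above:
# prices fall 14% in total, while revenues treble to $82 billion.
years = 4

# A 14% total decline compounds to about 3.7% per year.
annual_price_decline = 1 - (1 - 0.14) ** (1 / years)

# Trebling to $82B implies starting revenue of roughly $27.3B
# and about 31.6% annual growth.
start_revenue = 82 / 3
annual_revenue_growth = 3 ** (1 / years) - 1

print(f"Annual price decline: {annual_price_decline:.1%}")
print(f"Implied starting revenue: ${start_revenue:.1f}B")
print(f"Annual revenue growth: {annual_revenue_growth:.1%}")
```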
This first ran at http://www.thewhir.com/web-hosting-news/competition-rapid-product-innovation-drives-down-enterprise-cloud-costs-report
Emerson Adds Data Center Cooling Management in 3D

Emerson Network Power has added 3D visualization capabilities for data center cooling management in the latest release of its DCIM software suite, Trellis.
The new environmental monitoring and management module is called Thermal System Manager. It tracks the data center’s thermal profile to the device level, the company said.
3D visualizations of the environment are generated in real time, and users can pan and rotate the models to understand airflow and temperature profiles in order to fine-tune their data center cooling systems.
Read more: Why CA Stopped Selling its DCIM Software Suite
Another major part of the update is integration with DSView management software, which enables remote access and inventory management simultaneously. Aimed at IT staff, the feature enables IP-based remote infrastructure management.

Example of 3D visualization of the thermal conditions on the data center floor by Trellis (Image: Emerson)
Other new features:
- Performance and scalability enhancements specific to event processing, which address complex environments and growing needs in data centers of all sizes.
- User interface improvements, including a user-friendly landing page and global header bar.
- Enhancements to the Trellis power calculations, which reduce the risk of downtime due to power phase imbalances, and also reduce stranded capacity by balancing loads across all three phases.
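The phase-balancing idea behind that last feature can be illustrated with a generic imbalance check. This sketch is not Trellis's actual calculation; the formula (maximum deviation from the average phase load, as a fraction of the average) and the example loads are common rules of thumb assumed here for illustration:

```python
# Generic three-phase load imbalance check (illustrative only, not
# Emerson's Trellis method). Imbalance is the maximum deviation from
# the average per-phase load, expressed as a fraction of the average.
def phase_imbalance(loads_kw):
    """Return the imbalance fraction for a list of per-phase loads (kW)."""
    avg = sum(loads_kw) / len(loads_kw)
    return max(abs(load - avg) for load in loads_kw) / avg

# A rack drawing 12 kW unevenly across phases A, B, and C:
loads = [5.0, 4.0, 3.0]
print(f"Imbalance: {phase_imbalance(loads):.0%}")  # 25%
```

The unused headroom on the lightly loaded phase is effectively stranded capacity until loads are redistributed, which is the downtime and capacity risk the Trellis enhancement is aimed at.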