Data Center Knowledge | News and analysis for the data center industry
 

Thursday, January 21st, 2016

    1:00p
    Exclusive: How QTS Plans to Keep Momentum After Two Years of High Growth

    QTS Realty Trust has been one of the fastest-growing publicly traded data center REITs since its 2013 IPO, and its shares have delivered more than 80 percent in price appreciation to shareholders over the last two years.

    Can the company maintain this momentum going into 2016? That’s the question we asked its CIO Jeff Berson and COO Dan Bennewitz in a recent interview.

    Last week, JP Morgan gave QTS an Overweight rating, making it one of only two data center REITs to earn one, along with sector peer CyrusOne.

    Read more: JP Morgan: Data Center REIT Stocks Undervalued

    Carpathia Integration

    QTS completed the integration of Carpathia, the colocation and managed hosting company it acquired last year, earlier this month. The deal expanded its geographic footprint both domestically and internationally, added a substantial managed hosting capability, and brought in a large number of government customers. The international markets where QTS now has a presence are Toronto, London, Amsterdam, Hong Kong, and Sydney. Carpathia added over 230 new logos to the QTS tenant roster, which has now grown to more than 1,000 customers worldwide.

    Read more: Why QTS Dished Out $326M on Carpathia Hosting

    Bennewitz said sales efforts have now been combined into one product team. He confirmed that QTS intends to take a “measured approach” to migrating former Carpathia customers in leased facilities to QTS-owned campuses.

    QTS will work with customers on a case-by-case basis to assess the pros and cons of moving over to a larger and potentially more efficient QTS facility. Bennewitz hopes that by “being a good partner to customers,” QTS will reduce potential churn, as many Carpathia tenant lease terms expire in 2016 and beyond.

    The Big Picture

    Businesses are showing increased interest in outsourcing in-house data center operations. According to Berson, QTS will reap the benefits as this slice of the market gets bigger.

    There is one consistent theme QTS hears from nearly every CTO during lease discussions: IT stacks are becoming more complex, and there is considerable uncertainty about how best to approach colocation and cloud for different applications.

    QTS is actually an acronym for Quality Technology Services, the company’s prior name. Since 2003, it has steadily built its capability to offer a “full hybrid IT offering in-house.” Berson emphasized that this capability is the reason QTS is now engaged in higher-level discussions with customers in 2016.

    In-House Solutions

    QTS offers customers a suite of solutions it calls its three Cs: C1 is wholesale data center space, C2 is retail colocation, and C3 is cloud and managed services. Notably, over 40 percent of its monthly recurring revenue comes from customers who utilize more than one product.

    Berson pointed out that while growing data center usage is a demand driver for the entire industry, QTS and “the other larger incumbents” have been successful in landing “more than their fair share” of the deals in the marketplace. The C3 cloud and managed service offerings are one of the main differentiators for QTS as it competes for data center customers.

    One-Stop Shop Allows Flexibility

    QTS customers are concerned about the future hybrid environment they might require as their needs evolve, spanning multiple vendors, managed services, hosting, direct connect, consulting services, compliance, and security.

    COO Bennewitz emphasized that customers don’t want to become locked into a one-size-fits-all solution. When they contract with QTS, there is flexibility to modify the contract as needs change. One example would be reducing a footprint while adding additional services.

    Berson contrasted the QTS one-stop approach with situations where customers have entered into a contract to lease space from a landlord, while critical services are being provided by third parties. Since the revenues are going to multiple entities, it isn’t a simple matter to modify the agreements and go in a different direction once the deals have been inked.

    Some data center companies offer colocation and hosting options for customers but lack a wholesale solution for larger-footprint data center deployments. Conversely, not all data center REITs that offer Infrastructure-as-a-Service have engineers on staff to help customers migrate their IT stack from company-owned facilities into a colo and cloud environment and offer assistance with hybrid-cloud solutions.

    Business Drivers and Updates

    Booked-Not-Billed – The record C1 wholesale leases signed in 2014 continue to provide visibility into future growth. QTS continues to build out large chunks of this space in phases. Moving forward, Bennewitz expects this should result in a “normalized” booked-not-billed backlog in the $30 million to $40 million range.

    Healthcare – Concerns regarding HIPAA compliance and security in the healthcare industry have resulted in some big wins for QTS in the Atlanta market. Hospitals faced with investing large amounts of capital to upgrade legacy data centers are attracted to solutions available at the QTS Atlanta-Metro campus. In addition to a secure solution for data, secondary gains include freeing up capital for high-priority projects and freeing up physical space on the hospital campus.

    Federal Government – While governmental agency business is “an exciting potential opportunity,” a lack of visibility regarding the timing of any lease signings makes it difficult to model future revenue contributions.

    Chicago – QTS continues to execute on its strategy of purchasing underutilized infrastructure-rich properties at a steep discount to repurpose into data center campuses. Former “brownfield sites” in Atlanta, Dallas, and Richmond, Virginia, are currently in operation as state-of-the-art QTS data center campuses.

    Bennewitz confirmed that the initial phase of the former Chicago Sun-Times Press facility should come online in the second half of 2016. Initially, 8MW of raised floor will be developed with an additional 47MW available for expansion.

    As with the first phase in Dallas last year, QTS has begun construction without an anchor tenant already signed. Property tours are being conducted, and Bennewitz is seeing considerable demand in Chicago for a brand-new data center with access to long-haul fiber networks, located “minutes away” from 350 East Cermak, the major downtown carrier hotel.

    New Jersey – QTS owns the $75 million McGraw Hill Financial data center located on a 194-acre campus near Princeton, New Jersey. In July 2014, the company entered into a 10-year lease and strategic alliance with global IT powerhouse Atos SE. QTS provides McGraw Hill with IaaS and critical facilities management services alongside Atos IT service offerings.

    McGraw Hill also leases space from QTS in Jersey City. QTS does not believe that the spin-off of the MHFI educational services unit will have any negative impact moving forward, though it is too early to say whether the change might actually turn out to be a net positive.

    Notably, the Atos strategic alliance has already paid dividends for QTS in Dallas. Atos is headquartered in Bezons, France, and has over 17,000 employees based in 52 countries around the globe.

    Bottom Line

    The QTS business model has delivered stabilized unlevered returns of at least 15 percent ROIC for the past nine quarters. This is a blended average across all QTS campuses, with the existing 12MW MHFI facility delivering north of 10 percent ROIC at the low end and the 36MW, 89-percent-built-out Atlanta-Suwanee site delivering a 27.4 percent ROIC at the high end.

    As each new phase is developed, the ROIC increases for QTS, which creates a virtuous cycle. In addition, the QTS campuses have over 235 acres of land for expansion.

    In Dallas, where it has access to 140MW, QTS opened a 54,000-square-foot phase within a 292,000-square-foot powered shell. The 110MW Richmond facility has 121,600 square feet built within a 557,000-square-foot powered shell and delivers a 12.8 percent ROIC with only about one-fifth currently built out. Atlanta-Metro (72MW) is 79 percent occupied and delivers an ROIC of 17.6 percent.
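
    As a quick sanity check on those build-out figures, the minimal Python sketch below recomputes the built-out fractions from the square footage quoted above; the numbers and campus names come from this article, and the output comments are simple arithmetic.

        # Recompute build-out fractions from the figures quoted above.
        campuses = {
            # name: (built sq ft, powered-shell sq ft)
            "Dallas":   (54_000, 292_000),
            "Richmond": (121_600, 557_000),
        }

        for name, (built, shell) in campuses.items():
            print(f"{name}: {100 * built / shell:.1f}% of the powered shell built out")

        # Dallas:   18.5% of the powered shell built out
        # Richmond: 21.8% of the powered shell built out (i.e., "about one-fifth")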

    The more mature campuses are supporting the growth of the newer QTS pins on the map while still delivering an overall 15 percent stabilized ROIC for shareholders. Berson quipped that he doesn’t worry that he will “wake up each morning” and discover that QTS has decided to change its business model or product.

    5:24p
    Going ROBO: Making the Most of Your Remote Locations

    Jesse St. Laurent is the VP of Product Strategy for SimpliVity.

    Think about the average company today: thanks to technology, most employees can work from anywhere. The evolution of the office space has made geographical limitations nearly nonexistent, and small and medium-sized businesses (SMBs) are finding it easier and easier to open remote office/branch office (ROBO) locations and truly go global. That’s not to say, though, that challenges do not exist.

    When not managed correctly, ROBO sites can be left on the sidelines: under-staffed, under-funded, and under-supported. Given limited staffing and financial resources, management often needs to consider how it will dedicate resources to a new remote site, specifically to its IT needs, and may find the site requires some level of technology support from headquarters. It is difficult enough to maintain, manage, protect, and secure a multi-vendor legacy IT infrastructure as it is – now try applying that to a remote site.

    ROBO environments must be dynamic and keep pace with shifting business requirements, though that’s easier said than done. Take, for example, a growing financial company with remote offices around the globe, where applications run on a highly transactional database. Applications cannot be centralized and instead must be run locally at each site to ensure performance is met. IT needs to guarantee that the application environment is free from issues that will cause downtime and that future growth can be accommodated (while at the same time keeping capital and operational expenses low).

    Now let’s get extreme: think of an oil and gas company transporting product via ship. Each ship serves as a ROBO with a full-time crew but no IT staff, and can even house its own small data center. One common problem these shipboard data centers run into is that they cannot back up data over the satellite link, which introduces over 500ms of latency and a significant amount of packet loss. These sites, therefore, can lose data in the blink of an eye. For that reason, disk and tape storage systems remain staples of ROBO protection strategies, and improving data backup and recovery tops IT’s priority list at these sites.
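
    To see why bulk backup over such a link breaks down, the hedged sketch below applies the Mathis et al. approximation for single-stream TCP throughput (throughput ≈ MSS / (RTT × √loss)); the 1 percent loss rate and the 100 GB backup size are assumptions for illustration, since the article only says the link has significant packet loss.

        from math import sqrt

        # Mathis et al. approximation for single-stream TCP throughput:
        #     throughput ≈ MSS / (RTT * sqrt(loss))
        mss_bytes = 1460   # typical Ethernet MSS
        rtt_s = 0.5        # >500 ms round trip over satellite (from the article)
        loss = 0.01        # assumed 1% packet loss (illustrative)

        throughput_bps = (mss_bytes * 8) / (rtt_s * sqrt(loss))
        print(f"~{throughput_bps / 1e6:.2f} Mbit/s per TCP stream")   # ~0.23 Mbit/s

        # At that rate, a hypothetical 100 GB backup would take roughly:
        hours = (100e9 * 8) / throughput_bps / 3600
        print(f"~{hours:.0f} hours")   # ~951 hours, far beyond any backup window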

    So how does a company address this type of IT problem while keeping pace with data growth and reducing storage costs?

    The solution lies in hyperconverged infrastructure: a single-vendor solution that enables cloud-like economics and scale with the performance, reliability, and availability you would expect in a data center. Hyperconvergence takes a “data center in a box” approach that combines all infrastructure below the hypervisor into a highly scalable data center building block, removing the need for separate software and storage products. As a result, IT is dramatically simplified and made more efficient.

    This approach also makes data handling more efficient in terms of capacity and performance by writing less data to and reading less data from disk, which makes it ideal for ROBO. Because of the centralized management capabilities of hyperconvergence, IT can manage and operate activities at remote sites from a central console, making data backups, transfers, and recovery simple – even from a ship in the Atlantic Ocean.

    Hyperconvergence also eliminates the need for additional backup software, hardware, deduplication, replication, and cloud solutions. IT can create policies that automatically control the frequency of backups, where data are stored (local, remote, or cloud), and the retention time.
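
    To make the policy idea concrete, here is a minimal, vendor-neutral sketch in Python of the kind of backup policy described above; the BackupPolicy fields and the apply_policy helper are hypothetical and do not represent any specific hyperconverged vendor’s API.

        from dataclasses import dataclass
        from enum import Enum

        class Destination(Enum):
            LOCAL = "local"
            REMOTE = "remote"
            CLOUD = "cloud"

        @dataclass
        class BackupPolicy:
            name: str
            frequency_minutes: int    # how often backups are taken
            destination: Destination  # where the copies are stored
            retention_days: int       # how long the copies are kept

        # A policy a central IT team might attach to VMs at a remote site.
        robo_policy = BackupPolicy(
            name="ship-site-hourly",
            frequency_minutes=60,
            destination=Destination.REMOTE,   # replicate back to the central data center
            retention_days=30,
        )

        def apply_policy(vm_names, policy):
            """Hypothetical helper: report what the policy does for each VM."""
            for vm in vm_names:
                print(f"{vm}: backup every {policy.frequency_minutes} min -> "
                      f"{policy.destination.value}, keep {policy.retention_days} days")

        apply_policy(["erp-db-01", "file-srv-02"], robo_policy)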

    Hyperconverged infrastructure, therefore, takes the remote out of ROBO. Management gains the ability to extend its IT infrastructure to a branch office, allowing it to manage resources wisely and ensure data protection while keeping costs low and supporting the site.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    5:39p
    Google Open Sources Dataflow Analytics Code through Apache Incubator


    By The VAR Guy

    Google is open-sourcing more code by contributing Cloud Dataflow to the Apache Software Foundation. The move, a first for Google, opens new cloud-based data analytics options and integration opportunities for big data companies.

    Cloud Dataflow is a platform for processing large amounts of data in the cloud. It features an open source, Java-based SDK, which makes it easy to integrate with other cloud-centric analytics and Big Data tools.

    Although the Dataflow SDK has been open source for more than a year, Google took the bigger step this week of proposing to turn the platform into an Apache Incubator project. That move paves the way for Dataflow’s codebase to eventually become a full-fledged Apache Software Foundation project.
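
    For a sense of the programming model being contributed, below is a minimal word-count pipeline written against the Apache Beam Python SDK, the project this Dataflow code seeded; the SDK Google open-sourced in 2016 was Java-based, so treat this as an illustrative sketch, and the gs:// paths are placeholders.

        # Minimal word-count pipeline in the Dataflow/Beam programming model
        # (pip install apache-beam). Paths are placeholders.
        import apache_beam as beam

        with beam.Pipeline() as p:
            (p
             | "Read"  >> beam.io.ReadFromText("gs://example-bucket/input-*.txt")
             | "Split" >> beam.FlatMap(lambda line: line.split())
             | "Count" >> beam.combiners.Count.PerElement()
             | "Fmt"   >> beam.MapTuple(lambda word, n: f"{word}: {n}")
             | "Write" >> beam.io.WriteToText("gs://example-bucket/wordcount"))

    The same pipeline can be handed to different runners, Dataflow among them, which is the portability point Talend makes below.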

    Google has partnered with Cloudera, data Artisans, Talend, Cask and PayPal in issuing the proposal. Those partners are already celebrating the move, which, if approved (as seems likely), will make it simpler to build Dataflow’s scalability and integration features into commercial Big Data platforms in an open source, vendor-neutral way.

    Talend, for instance, had this to say: “Developers leveraging the Dataflow framework won’t be ‘locked-in’ with a specific data processing runtime and will be able to leverage new data processing framework as they emerge without having to rewrite their Dataflow pipelines, making it Future-proof.”

    For the channel, Google’s proposal means the cloud and big data are set to grow closer together — and that it will be easier for open source big data companies to keep the future of data analytics open.

    This first ran at http://thevarguy.com/open-source-application-software-companies/google-open-sources-dataflow-analytics-code-through-apach

    6:00p
    Sensitive Data Leaks Cost Average Organizations $1.9M: Report


    By Talkin’ Cloud

    The ease of use of cloud-based collaboration and file-sharing applications may be putting organizations at risk: many are unaware that 26 percent of documents stored in cloud apps are broadly shared, meaning any employee can access them, and in some cases they are even discoverable via a Google search.

    This is according to the Q4 2015 Shadow Data Report released by Blue Coat’s Elastica Cloud Threat Labs team on Wednesday. The study is based on insights into 63 million enterprise documents within leading cloud applications including Office 365, Google Drive, and Salesforce.

    The report identifies shadow data as any sensitive information that is uploaded and shared in cloud apps without the knowledge of IT security teams. This isn’t the first time Elastica has explored the risks of shadow IT, having investigated the risks to the healthcare industry in particular in its Q2 2015 report.

    Indeed, the healthcare industry is at greater financial risk from the leakage of sensitive cloud data than the average organization, with a potential impact reaching up to $12 million compared to the $1.9 million cost to an average organization.

    “We’ve reached a point in the security lifecycle where shadow IT should no longer be the primary focus. By now, organizations should have a grip on cloud applications available and have enforceable policies in place with the ability to control which are in use,” Elastica founder Rehan Jalil, now of Blue Coat Systems, said in a statement. “It’s time to start focusing on the real problems, which are the need to know what types of information employees are sharing, who is able to access data and how to stop high-risk exposures that lead to data breaches.”

    According to the report, one in 10 documents shared broadly contains data that is subject to compliance regulations, such as Protected Health Information (14 percent) and Payment Card Industry data (5 percent).
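
    Combining that figure with the 26 percent broadly-shared rate cited above gives a rough sense of scale; the sketch below assumes the one-in-10 ratio applies uniformly across the 63 million documents analyzed, which is an assumption rather than something the report states.

        # Back-of-the-envelope exposure estimate from the figures quoted above.
        total_docs = 63_000_000
        broadly_shared = 0.26 * total_docs        # 26% broadly shared
        regulated_shared = 0.10 * broadly_shared  # 1 in 10 of those contain regulated data

        print(f"{broadly_shared:,.0f} broadly shared documents")     # ~16.4 million
        print(f"{regulated_shared:,.0f} containing regulated data")  # ~1.6 million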

    The Q4 2015 report can be downloaded from Elastica’s website.

    This first ran at http://talkincloud.com/cloud-computing-security/sensitive-data-leaks-cost-average-organizations-19-million-report

    8:21p
    Getting to a Digital State and Evolving Your Data Center

    We are entering a digital revolution in which more companies and users are consuming ever more data and applications. Cisco recently pointed out that annual global data center IP traffic will reach 10.4 zettabytes (863 exabytes per month) by the end of 2019, growing three-fold over the next five years. This growth makes the data center an absolutely critical component for IT and the modern business. The challenge, however, is updating and integrating everything with modern data center architecture. Most of all, organizations are looking at ways they can optimize the delivery of their resources and create true efficiency.
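
    For reference, the monthly figure and the implied growth rate follow directly from the numbers Cisco cites; the short sketch below is simple arithmetic, with the five-year window taken from the forecast quoted above.

        # Simple arithmetic on the Cisco forecast quoted above.
        zb_per_year = 10.4
        eb_per_month = zb_per_year * 1000 / 12   # 1 ZB = 1000 EB
        print(f"{eb_per_month:.0f} EB/month")    # ~867, in line with the ~863 cited

        growth_factor, years = 3.0, 5            # "grow three-fold over the next five years"
        cagr = growth_factor ** (1 / years) - 1
        print(f"implied compound annual growth: {cagr:.1%}")   # ~24.6%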

    In a new whitepaper sponsored by NTT, we learn about the next-generation systems that are impacting resource and environmental utilization within the modern data center. The paper outlines the critical points to consider when creating a data center architecture that can align with market demands:

    • Identifying the challenges around power density
    • Understanding IT equipment thermal management
    • Recognizing the limitations of some cooling designs
    • Creating system energy efficiency
    • Optimizing the TCO of your entire data center

    Download this whitepaper now to understand how you can create a data center environmental control design that is truly agile. Plus, learn how to align cost objectives with life cycle management and TCO optimization, and also see how you can build better high-density designs.

    9:31p
    Switch Data Center in Michigan to Run on Renewable Energy

    Switch has agreed to buy renewable energy for its future data center in Michigan.

    The Las Vegas data center provider, known for its massive high-security campus in Sin City, will start by procuring bundled energy and Renewable Energy Credits through the Green Generation program run by Consumers Energy, the utility that serves the area of Michigan where the Switch data center will be built, Adam Kramer, the company’s executive VP of strategy, said. Switch is also in negotiations with the utility about building a new utility-scale renewable generation project in the state.

    “We will have a new generation resource in the market,” Kramer said. “In all likelihood it will be wind.”

    The future renewable project will have enough capacity to offset the entire energy footprint of the future Switch data center, he said.

    Read more: Cleaning Up Data Center Power is Dirty Work

    Power for Consumers Energy’s Green Generation program is generated by a mix of wind and landfill gas. Whether energy that comes from reclaimed methane in gases generated by landfill can be considered renewable is controversial, and Switch will not be buying energy generated that way, according to Kramer.

    “We’re not going to be using landfill gas,” he said, and once the new renewable project is built, the facility will be using its energy output exclusively.

    The fate of the Switch data center project in Michigan was in question until late last year. The company had plans to convert the pyramid-shaped former office building outside of Grand Rapids into a data center campus, but said it would execute those plans only after the state changed its tax code to add new tax breaks for data center users and operators. A bill to create the tax breaks was rushed through the legislative process under deadline pressure from Switch, which said it wouldn’t pursue the Michigan data center project unless the tax breaks were secured before the end of the year.

    Rendering of the future Switch SuperNap campus outside of Grand Rapids, Michigan (Image: Switch)

    In mid-December, after the bill was passed by both houses of the legislature, the company’s CEO Rob Roy told us the Switch data center project in Michigan was a go.

    Switch, which is currently also building data centers in Italy and Thailand, has been actively focused on renewable energy for its data centers since last year. The company made a public commitment to power all of its data centers with 100-percent renewable energy, joined President Barack Obama’s American Business Act on Climate Pledge, and more recently became a signatory of Corporate Renewable Energy Buyers’ Principles.

    The Buyers’ Principles were developed by a group of corporations together with the World Resources Institute and World Wildlife Fund to make it easier for companies that consume a lot of energy to procure renewable energy, which remains a big challenge in many American states.

    Read more: Switch Joins Obama’s Business Climate Pledge, Plans 100MW Solar Project in Nevada

    The bulk of the signatories are major tech companies that operate many megawatts of data center capacity in the US and globally, including Salesforce, Intel, eBay, Cisco, Google, Amazon, Facebook, Yahoo, Microsoft, and HP. There are also some data center service providers on the list – Switch’s competitors IO, Digital Realty Trust, and Equinix.

    Switch has signed two utility-scale Power Purchase Agreements for energy that will be generated by two massive solar farms in Nevada. The contracts, 180MW total, will offset energy consumption of the entire huge Switch data center campus in Las Vegas and the future one the company is building near Reno, Nevada.
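
    As a rough illustration of how a PPA’s nameplate capacity maps to the energy it can offset, the sketch below multiplies capacity by an assumed capacity factor; the 25 percent solar capacity factor and the 40MW average campus load are assumptions for illustration and do not appear in this article.

        # Illustrative solar PPA arithmetic; capacity factor and load are assumed.
        nameplate_mw = 180        # combined size of the two PPAs (from the article)
        capacity_factor = 0.25    # assumed for utility-scale solar in Nevada
        hours_per_year = 8760

        annual_generation_mwh = nameplate_mw * capacity_factor * hours_per_year
        print(f"~{annual_generation_mwh:,.0f} MWh/year of solar generation")   # ~394,200

        avg_load_mw = 40          # hypothetical average data center load
        print(f"~{avg_load_mw * hours_per_year:,.0f} MWh/year of load")        # ~350,400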

