Data Center Knowledge | News and analysis for the data center industry
 

Monday, July 10th, 2017

    12:00p
    DCK Investor Edge: QTS’s Hybrid IT Strategy — Short-Term Pain, Long-Term Gain?

    All publicly traded companies must balance Wall Street expectations for quarterly performance with being good stewards of capital and executing on a strategic vision for long-term growth.

    QTS Realty (QTS) historically has done an excellent job for shareholders by acquiring large infrastructure-rich properties at steep discounts and re-purposing them into state-of-the-art data center campuses.

    A focus on both physical and logical security and compliance has been a differentiator for QTS, including multiple FedRAMP approvals for government agencies and contractors to operate. However, the government sector has a notoriously long and unpredictable sales cycle.

    The other component of the QTS strategy is providing a complete suite of customer solutions: wholesale, colocation, plus in-house cloud and managed services. This approach has been ideal for co-creating hybrid IT solutions with enterprise customers.

    However, QTS has noticed an “inflection point” this year: the complexity of the hybrid IT stack is delaying enterprise deployments.

    A Slow Start

    QTS has been a growth story since its 2013 IPO. However, a confluence of events has created a headwind that has recently caused its shares to underperform its publicly traded peer group. This included a one-off Q1 2017 churn event in which a government contractor vacated a 2MW sub-leased space in Northern Virginia while QTS still had two years remaining on its own property lease.

    Source: QTS Q2’17 investor presentation (this and all other slides included in this article)

    Ironically, some of QTS’ strengths may have contributed to a lack of visibility into when deals close and to delays in when lease revenues are recognized.

    The company recently announced it will be holding an Investor Day at its newly acquired Fort Worth data center in November to update “the company’s financial outlook and strategic plan to position QTS as the leader in hybrid IT data center solutions.”

    Here is a preview of some of the key areas that management will have the opportunity to address.

    Hybrid IT: Friend or Foe?

    Clearly, the enterprise paradigm shift to a hybrid cloud end-state can be a double-edged sword. The days of customers routinely adding servers and expanding colocation cages are increasingly in the rear-view mirror.

    As QTS CFO Jeff Berson told Data Center Knowledge in an interview, the company “has seen an ‘inflection point’ this year with enterprise leasing, as IT stacks are increasingly becoming more complex.”

    Berson and his colleagues are proud that QTS has been working for years to architect a complete solution set to offer customers. However, he also pointed out that distributed hybrid IT architecture tends to encompass a much larger decision matrix involving more stakeholders, which slows down the sales process.

    Overall, the QTS plan seems to be working. Berson mentioned that QTS has a sales funnel that is at least 30 percent larger than this time last year. However, hybrid IT does not lend itself to “easy wins,” and QTS often lacks visibility into when an individual customer requirement will turn into a signed lease.

    Backlog: Friend and Foe?

    Last week, QTS announced its “HyperBlock” program to underscore the upside embedded in leasing smaller C1 wholesale data halls to public cloud service providers and global SaaS firms.

    It is essentially a “land and expand” strategy. These firms can initially lease smaller chunks of space with either contractual provisions to expand or with options and rights of first refusal.

    The good news is that the contractual provisions to expand over time help give QTS visibility into its pipeline. The other side of the coin is that a strong quarter of leasing announcements does not translate into revenue growth within 12 to 16 weeks.

    The QTS leasing backlog cuts both ways: it doesn’t help next quarter’s financial metrics, but it increases earnings visibility and de-risks future years.

    New Campuses Are Dilutive

    In January, QTS announced a $50 million sale-leaseback deal in Fort Worth with insurance giant Health Care Service Corp. (HCSC). The deal also made HCSC the initial 1MW anchor tenant at what is now QTS’ second metro-Dallas data center campus.

    Read more: QTS Buys Large Dallas Data Center from Insurer HCSC

    The good news for investors was QTS’s ability to buy another high-quality corporate data center, plus adjacent land for expansion, for $6 million per MW. In the short run, however, QTS is paying $50 million for a facility with just 1MW leased. This is dilutive to QTS’s stated goal of a 15 percent average return on invested capital (ROIC) across its data center portfolio.
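
    To see why, here is a rough, back-of-the-envelope sketch of the ROIC arithmetic. The per-MW income figure and the stabilized capacity assumed below are hypothetical illustrations for this article, not QTS disclosures.

    ```python
    # Illustrative ROIC dilution math -- all figures below are hypothetical
    # assumptions for illustration, not QTS disclosures.

    def roic(annual_noi: float, invested_capital: float) -> float:
        """Return on invested capital: annual net operating income / capital invested."""
        return annual_noi / invested_capital

    noi_per_leased_mw = 1.2e6   # assumed annual NOI per leased megawatt ($)
    purchase_price = 50e6       # Fort Worth purchase price ($)

    # Day one: only the 1MW anchor lease is producing income.
    initial = roic(1 * noi_per_leased_mw, purchase_price)

    # Stabilized: assume roughly 8MW eventually leased on the same invested capital
    # (ignoring the incremental fit-out capital a real build-out would require).
    stabilized = roic(8 * noi_per_leased_mw, purchase_price)

    print(f"initial ROIC:    {initial:.1%}")    # ~2.4% -- dilutive vs. a 15% target
    print(f"stabilized ROIC: {stabilized:.1%}") # ~19% under these assumptions
    ```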

    The acquisition of the former DuPont Fabros Technology facility in Piscataway, New Jersey, and the opening of QTS Chicago a year ago (not shown above) were also initially dilutive. Over the next few years, the goal is to increase utilization and operating leverage, which will drive higher ROIC.

    The QTS Dallas (Irving, Texas) campus has only been in operation for two years and is already approaching 12% returns on capital. Long-term QTS investors will be rewarded down the line, as these facilities lease up and additional phases are developed.

    What Is HyperBlock?

    The re-branding of its custom wholesale offering to public cloud service and SaaS providers as “HyperBlock” is, in a sense, a way to re-frame some QTS custom wholesale, or “C1,” deployments for Wall Street in order to emphasize client wins in the public cloud and SaaS verticals.

    QTS provides timely space, power, and the desired level of services for an initial deployment in the 2MW range. The exact size can vary depending upon the customer requirements and the facility.

    Read more: QTS Pitches 2MW Data Center Product to Hyper-Scale Clients

    On the surface, there does not appear to be much to differentiate this turnkey delivery of space and power, along with a contractual guarantee of expansion space, from what other wholesale providers offer.

    However, in May QTS rolled out a new DCIM initiative, the Service Delivery Platform (SDP). SDP blends in aspects of AI and machine learning, which could become a difference maker for customers interested in hybrid IT solutions.

    QTS – Service Delivery Platform

    According to the company, “QTS has initially launched applications for SDP that provide customers real-time visibility of Power Analytics, Security, Hybrid Cloud Management, and Enterprise IT, among others. By providing programmatic access and control of the data, QTS SDP facilitates customer collaboration to unlock a myriad of opportunities that include:

    • Predictive modeling and analytics
    • Enabling more informed and effective business decisions
    • Improving operational efficiencies and cost savings
    • Acting on trends in real time
    • Applied machine learning and enhanced automation”

    Once again, this is a more in-depth conversation than simply responding to a consultant or broker RFP for space and power. Throughout 2017, QTS will continue to launch additional SDP applications and features in collaboration with vendors and customers.
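
    To make “programmatic access and control of the data” a little more concrete, here is a minimal, purely hypothetical customer-side sketch. QTS has not published SDP’s actual interface, so the endpoint, paths, field names, and authentication used below are invented for illustration only.

    ```python
    # Hypothetical sketch of polling power analytics from a DCIM-style API.
    # The host, paths, fields, and auth scheme are invented for illustration;
    # this is not QTS SDP's published interface.
    import requests

    BASE_URL = "https://sdp.example.com/api/v1"  # placeholder, not a real endpoint
    TOKEN = "customer-api-token"                 # placeholder credential

    def fetch_power_readings(cabinet_id: str) -> list[dict]:
        """Fetch recent per-cabinet power readings from the hypothetical endpoint."""
        resp = requests.get(
            f"{BASE_URL}/cabinets/{cabinet_id}/power",
            headers={"Authorization": f"Bearer {TOKEN}"},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()["readings"]

    def flag_overloaded(readings: list[dict], limit_kw: float = 5.0) -> list[dict]:
        """A simple "act on trends in real time" example: flag draws above a limit."""
        return [r for r in readings if r["kw"] > limit_kw]

    if __name__ == "__main__":
        for r in flag_overloaded(fetch_power_readings("DFW1-C-042")):
            print(f"{r['timestamp']}: cabinet draw {r['kw']} kW exceeds limit")
    ```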

    Connectivity

    QTS has acknowledged being a bit late to the dance when it comes to monetizing cross-connects at its data centers. However, once again this glass can be viewed as half-full by long-term investors.

    While today QTS interconnection services only represent 6 percent of revenues, they are growing at a 15 percent clip. New customer signings and existing lease renewals represent opportunities to boost monthly recurring revenue as QTS begins charging for this high-margin service.

    Additionally, subsequent to Q1 2017, QTS announced the deployment of PacketFabric’s software-defined networking platform to both simplify and amplify customer networking choices and connectivity at its carrier-neutral campuses. These recent connectivity initiatives should also help reduce churn while improving margins going forward.

    Bottom Line

    Mr. Market tends to myopically focus on short-term performance. Investor Day in November represents an opportunity for management to explain how these recently announced initiatives will combine to make it simpler for QTS customers to deploy hybrid cloud and IT solutions.

    3:00p
    US Ban on Its Data Center Switches a Setback for Arista – at Least a Temporary One

    Federal regulators barred Arista Networks from importing and selling its data center networking equipment in the United States earlier this month, giving rival Cisco a victory in a three-year-old patent infringement battle. But the ban could be short-lived as Arista appeals the decision and develops workarounds on the patents.

    The US International Trade Commission (ITC) ruled in early May that Arista infringed on two Cisco patents but delayed the ban pending a 60-day review, which ended last week. Arista, which builds network switches for data centers, announced in an SEC filing on July 5 that it is complying with the ITC’s ban.

    While it’s unclear how long the ban will stay in effect, it is at least a temporary setback for Arista, which has made inroads in the lucrative $10.4 billion data center networking equipment market. It’s been especially successful in selling to hyper-scale data center operators.

    According to IDC, Arista has captured 9.4 percent share of the worldwide market, ranking second behind leader Cisco, which has 58.2 percent of the market. Arista’s customers include Microsoft, Facebook, Netflix, and Yahoo.

    “Arista is a growing force in data center networking,” said Brad Casemore, IDC’s research director of data center networks. “With their switches and network operating system, they have gained significant market share in the last few years.”

    Emergency Motions

    Arista, based in Santa Clara, California, and founded by former Cisco employees, has filed two emergency motions asking the ITC to suspend the ban and is awaiting a response. Its argument: the Patent Trial and Appeal Board (PTAB) recently ruled that the two patents in question are invalid. The company wants the ban suspended until the appeals of PTAB’s decisions are completed.

    “While the… review period… has ended, we are still awaiting the International Trade Commission’s decision on our motion to suspend its remedial orders, which are based on patent claims that the Patent Trial and Appeal Board has found invalid,” said Marc Taxay, Arista’s senior vice president and general counsel, in a statement.

    In case the ITC rules against it, Arista is already busy reworking its products and creating patent workarounds. Once that work is complete, its executives plan to seek regulatory approval for the revised products.

    “Arista has been working on modifying its products to address the ITC orders,” the company said in the SEC filing. “If the ITC does not suspend the ITC orders, Arista intends to release these modified products as soon as practicable and work with customers on their qualification and deployment.”

    Cisco last week responded by pointing out that the ITC has ruled that the two patents are valid and infringed upon by Arista. Mark Chandler, Cisco’s senior vice president and general counsel, wrote in a blog post on July 5 that suspending the ban is unwarranted, but if a suspension is issued, Cisco would seek to reverse the decision in federal court.

    Ultimately, “the right solution, as we’ve emphasized from the beginning, is for Arista to stop using technology they copied from Cisco,” Chandler wrote.

    The product ban is just the latest twist in a three-year patent tussle between upstart Arista and networking giant Cisco, which first sued Arista for patent infringement in 2014. Last December, for example, a federal court judge denied Cisco’s demand for $335 million in damages and a jury ruled that Arista did not infringe a Cisco patent or Cisco’s copyright on its user manuals.

    Situation “Murky”

    IDC’s Casemore said the current situation is murky for Arista. It could win its appeal in the coming days, weeks, or months – or it could lose. If the ban remains in effect for a lengthy period of time, it has the potential to affect Arista’s gross margins, he said.

    “Obviously, it could cost them more to manufacture or assemble their products domestically, and it could potentially increase lead times,” he said.

    Casemore said decisions from federal regulators and potentially federal courts will either reinforce or invalidate Arista’s product ban – and the two networking rivals and the rest of the industry will know soon enough.

    “Arista would prefer not to have to go through a redesign. They want a ruling that says they are not infringing on any of the patents, so they can go back to business as usual. But in the interim, Arista has to abide by [ITC’s original] decision,” he said.

    3:30p
    Be Proactive in Data Center Earthquake Mitigation

    Gary Wong is Director of Applications Engineering at Instor Solutions.

    Of all the natural disasters that can affect data centers, earthquakes are among the most damaging. Given the data center industry’s continued growth and expansion throughout California, these potentially catastrophic events are always top of mind for data center owners and operators.

    With the passing of the 27th anniversary of the 6.9-magnitude Loma Prieta earthquake, centered within 10 miles of Santa Cruz, now is the time for data centers across California and other areas prone to seismic activity to reevaluate their earthquake disaster strategies and look at the availability of proactive protection plans.

    Across the world, there are an estimated 500,000 detectable earthquakes each year, roughly 10,000 of them in the Southern California area alone. These sobering facts lead to some important questions: If an earthquake like Loma Prieta were to strike again, how are data centers better protected now than 27 years ago? What would the projected loss be to your company and customers if a major earthquake hit? What is your company doing to protect the valuable data and physical assets in your facility?

    Earthquake damage can be particularly devastating to the data center industry for a variety of reasons. Beyond the health and safety of staff, the loss of uptime resulting from an earthquake can be financially devastating. If a seismic event occurs and the facility is unprotected, the physical damage to servers and IT equipment can be beyond repair. This combination of lost equipment and client downtime can result in the loss of the business in its entirety.

    While we have yet to develop the technology to accurately predict where and when an earthquake will take place, there are precautions that owners and operators can take to help protect data centers from substantial earthquake damage. Whether planning a new build or retrofitting an existing facility, these forms of seismic planning should be a priority.

    Cabinet Earthquake Mitigation Solutions: Rigid Bolting

    The most commonly used method of earthquake protection is rigid bolting of the data center equipment racks directly into the slab floor. If your data center is located in an area at risk of seismic activity, this is the minimum measure that should be used to protect your assets. Cabinet bolting physically protects personnel working within a data center during an earthquake, as bolted racks are less likely to fall over.

    While the protection of personnel should always be the top priority, this method of securing racks is not the best solution for protecting IT equipment. Because they are securely bolted to the floor, server racks will vibrate along with the ground. If the earthquake is relatively mild, the sensitive equipment may escape unscathed, but more likely the vibrations will rattle and shake equipment, causing costly damage. Rigid bolting is the absolute minimum level of protection a data center should employ, but there are other earthquake mitigation options that help protect IT equipment as well as data center personnel in the event of an earthquake.

    Base Isolation Technology

    A preferred method used for many years to protect buildings and bridges from earthquake damage, base isolation technology has been adapted to protect data center equipment and personnel as well. In brief, seismic base isolation systems work by decoupling strong seismic ground motions and vibrations from a structure, eliminating or drastically reducing the path through which damaging shock waves and vibrations can travel. The advantages of using this technology in data centers in earthquake-prone areas are clear. Its ability to channel shock waves away from sensitive IT equipment goes far to protect assets, helping to ensure uptime in the event of a natural disaster.

    Additionally, because the vibrations are mitigated, server racks are also protected, helping to ensure that they won’t fall onto data center personnel. The benefits of seismic isolation are so great that this technique is the primary method used by data centers to achieve Tier 4 ratings in areas with substantial seismic activity. The results of utilizing seismic isolation can be dramatic, as this video of a data center in Anchorage, Alaska in the midst of a 7.1-magnitude earthquake shows.
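
    As a rough illustration of the decoupling principle (not a description of any particular vendor’s isolator), the textbook transmissibility formula for a damped single-degree-of-freedom isolation system shows how little ground acceleration reaches an isolated rack once the shaking frequency is well above the isolator’s natural frequency. The frequencies and damping ratio used below are assumed values chosen only for illustration.

    ```python
    # Illustrative physics of base isolation: transmissibility of a damped
    # single-degree-of-freedom isolator. Parameter values are assumptions
    # chosen for illustration, not measurements of any real product.
    import math

    def transmissibility(f_ground: float, f_natural: float, zeta: float) -> float:
        """Fraction of ground acceleration transmitted to the isolated mass."""
        r = f_ground / f_natural                      # frequency ratio
        num = 1 + (2 * zeta * r) ** 2
        den = (1 - r ** 2) ** 2 + (2 * zeta * r) ** 2
        return math.sqrt(num / den)

    # Assume dominant shaking around 2-10 Hz and an isolator tuned to 0.5 Hz
    # with 10 percent damping.
    for f in (2.0, 5.0, 10.0):
        t = transmissibility(f, f_natural=0.5, zeta=0.10)
        print(f"{f:>4.1f} Hz ground motion -> {t:.0%} reaches the isolated rack")
    # Prints roughly 9%, 2%, and 1% -- most of the shaking never reaches the
    # equipment, whereas a rigidly bolted rack sees essentially all of it.
    ```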

    Disaster Recovery Plans

    The above technologies have been developed to prevent and mitigate earthquake damage, but every data center located in an area of seismic activity should also have a current business continuity and disaster recovery plan. Should an earthquake cause damage despite these defenses, each member of the data center’s staff needs to know his or her role in an emergency. In conjunction with deploying the above solutions, IT and facilities staff should work together to identify the critical aspects of their data center responsibilities and the risks that earthquakes pose to them.

    These risks should then be compiled, reviewed, and prioritized on an ongoing basis, with the end result being a business continuity and disaster recovery plan that staff are drilled and tested on. Seismic events require an immediate response to mitigate damage, followed by a plan for long-term continuity, to ensure protection of staff, equipment, and client assets. Ongoing training for data center staff allows administrators to make changes and refinements, analyze and correct problems, and ultimately avoid repeating the same mistakes. When it comes to a seismic event, no matter the size, seconds truly count. A well-written plan that staff have trained on can be the difference between maintaining uptime and going out of business.

    Summary

    There is a wide variety of crises, both human and environmental, that data center owners and operators need to protect against. While it is an imposing prospect, new safeguards that mitigate earthquake damage should be put in place, and existing ones upgraded, without delay. Proven solutions and methodologies are available to help protect personnel and prevent damage to sensitive equipment, and if your data center is located in an at-risk area, recovery plans should always be developed, revised, and trained against. The stakes are simply too high to do otherwise.

    Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
    8:46p
    Azure Stack to Ship in September; Hardware on Sale Starting Today

    Brought to you by IT Pro

    Azure Stack will finally be ready to ship in September. Microsoft announced that its partners Dell EMC, HPE and Lenovo are taking orders as of Monday.

    Microsoft made the announcement at its Microsoft Inspire 2017 conference taking place in Washington, D.C., this week. Microsoft said it has delivered the Azure Stack software to its partners, allowing them to begin the certification process for their integrated systems.

    Azure Stack allows customers to run Azure services on-premises, giving them control over where applications and workloads reside. Applications can be built and deployed in Azure Stack using the same approaches as if they were being deployed to the Azure public cloud. Because of this, Microsoft said it enables “a truly consistent hybrid cloud platform.”
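
    In practice, that consistency means the same Azure Resource Manager (ARM) deployment request can target either cloud simply by pointing at a different management endpoint. The sketch below is illustrative only: the subscription ID, resource group, empty template, and the Azure Stack URL are placeholder assumptions, and the accepted API versions can differ between Azure and a given Azure Stack build.

    ```python
    # Illustrative sketch: the same ARM deployment call targets Azure public cloud
    # or an Azure Stack instance by swapping the management endpoint. Names, the
    # template, and the Azure Stack URL below are placeholder assumptions.
    import requests

    SUBSCRIPTION = "00000000-0000-0000-0000-000000000000"   # placeholder
    RESOURCE_GROUP = "demo-rg"                               # placeholder
    ACCESS_TOKEN = "<bearer token from Azure AD / AD FS>"    # placeholder

    ENDPOINTS = {
        "azure_public": "https://management.azure.com",
        "azure_stack": "https://management.local.azurestack.external",  # typical pattern
    }

    TEMPLATE = {  # trivial ARM template that deploys nothing; shows the shape only
        "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
        "contentVersion": "1.0.0.0",
        "resources": [],
    }

    def deploy(endpoint: str, name: str = "demo-deployment") -> dict:
        """PUT the same ARM deployment against whichever cloud `endpoint` points at."""
        url = (f"{endpoint}/subscriptions/{SUBSCRIPTION}/resourcegroups/{RESOURCE_GROUP}"
               f"/providers/Microsoft.Resources/deployments/{name}?api-version=2017-05-10")
        body = {"properties": {"mode": "Incremental", "template": TEMPLATE}}
        resp = requests.put(url, json=body,
                            headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
        resp.raise_for_status()
        return resp.json()

    # deploy(ENDPOINTS["azure_public"])   # same template, same call...
    # deploy(ENDPOINTS["azure_stack"])    # ...different target cloud
    ```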

    At Microsoft’s partner conference last year, it delivered news about Azure Stack that irritated customers: Microsoft announced that customers would have to buy certified hardware from select partners instead of running the software on their own servers, as initially promised.

    According to a recent report by ITPro, “The goal of the [Azure Stack] appliance is you do not worry about the operating systems/software/configuration of the appliance, that is all managed for you and you[r] focus is utilizing its capabilities.”

    Microsoft also released details on pricing for the pay-as-you-use and capacity-based models. The Azure Stack Development Kit (ASDK), which can be used to build and validate applications for integrated systems deployments, is available for download today.

    10:04p
    China Is Said to Close Major Hole in its Great Internet Firewall

    (Bloomberg) — China’s government has told telecommunications carriers to block individuals’ access to virtual private networks by Feb. 1, people familiar with the matter said, thereby shutting a major window to the global internet.

    Beijing has ordered state-run telecommunications firms, which include China Mobile, China Unicom and China Telecom, to bar people from using VPNs, services that skirt censorship restrictions by routing web traffic abroad, the people said, asking not to be identified talking about private government directives.

    The clampdown will shutter one of the main ways in which people both local and foreign still manage to access the global, unfiltered web on a daily basis. China has one of the world’s most restrictive internet regimes, tightly policed by a coterie of government regulators intent on suppressing dissent to preserve social stability. In keeping with President Xi Jinping’s “cyber sovereignty” campaign, the government now appears to be cracking down on loopholes around the Great Firewall, a system that blocks information sources from Twitter and Facebook to news websites such as the New York Times and others.

    While VPNs are widely used by businesses and individuals to view banned websites, the technology operates in a legal gray area. The Ministry of Industry and Information Technology pledged in January to step up enforcement against unauthorized VPNs, and warned corporations to confine such services to internal use. At least one popular network operator said it had run afoul of the authorities: GreenVPN notified users it would halt service from July 1 after “receiving a notice from regulatory departments.” It didn’t elaborate on the notice.

    It’s unclear how the new directive may affect multinationals operating within the country, which already have to contend with a Cybersecurity Law that imposes stringent requirements on the transfer of data and may give Beijing unprecedented access to their technology. Companies operating on Chinese soil will be able to employ leased lines to access the international web but must register their usage of such services for the record, the people familiar with the matter said.

    “This seems to impact individuals” most immediately, said Jake Parker, Beijing-based vice president of the US-China Business Council. “VPNs are incredibly important for companies trying to access global services outside of China,” he said.

    “In the past, any effort to cut off internal corporate VPNs has been enough to make a company think about closing or reducing operations in China. It’s that big a deal,” he added.

    China Mobile Ltd., the Hong Kong-listed arm of the country’s biggest carrier, declined to comment. Representatives for publicly traded China Telecom Corp. and China Unicom (Hong Kong) Ltd. couldn’t immediately comment. The ministry didn’t immediately reply to an email seeking comment.
