Data Center Knowledge | News and analysis for the data center industry
 

Friday, June 24th, 2016

    12:00p
    Uptime: Colocation Firms are Building Fewer Data Centers

    If you look at recent earnings reports by the biggest data center providers, you’ll get the impression that the industry is booming.

    And it is. Enterprises are moving more workloads either to the cloud or to commercial colocation facilities, and data center providers are benefiting from both. As more companies use cloud services, cloud providers are racing to lease as much data center capacity as they can get their hands on, resulting in a boom for the big data center providers who can’t build new facilities fast enough to satisfy all the demand.

    Read more: How Long Will the Cloud Data Center Land Grab Last?

    The sound of champagne corks popping after earnings reports by the biggest players in the market, however, can mask the fact that, in general, the number of new data centers being built for lease to one or multiple tenants in the US has been declining.

    This doesn’t necessarily mean the amount of new data center capacity being brought to market is shrinking. This is the age of the mega data center: service providers may be building fewer facilities, but the size of each individual building is getting bigger and bigger.

    Market studies by IDC in 2014 and 2015 found that the trend in the data center provider industry was toward building fewer but larger buildings.

    In a more recent study, 24 percent of colocation providers Uptime Institute surveyed this past February said their company had built a new data center within the previous 12 months. That’s down from 29 percent in 2015 and 45 percent in 2014.

    Interestingly, the construction slowdown in the data center provider industry has been more drastic than the slowdown in enterprise data center construction. Fifteen percent of enterprise IT respondents said their company had built a new data center within the previous 12 months both this February and the year before. Eighteen percent said so in 2014.

    [Chart: colocation provider budget and construction trends]

    Source: Uptime Institute Data Center Industry Survey 2016

    Far from Perfect

    While overall colocation customer satisfaction levels are high – only 7 percent of respondents to Uptime’s survey said they were dissatisfied or very dissatisfied with their primary data center provider – colocation isn’t the perfect answer for everybody. According to Uptime, 40 percent of enterprise IT respondents were paying more for colo contracts than they expected to pay when they signed those contracts.

    See also: Slow Waning of the Enterprise Data Center, in Numbers

    Nearly one-third said they had experienced a data center outage at a colocation site, and the bulk of enterprise respondents said downtime compensation in their agreements with colo providers was insufficient. About 60 percent said the cost of data center outages overshadowed whatever downtime penalties were included in their Service Level Agreements.
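
    To see why downtime credits so often fall short, consider a back-of-the-envelope comparison. This is a minimal sketch with made-up figures, not numbers from the survey: a typical SLA credits a slice of the monthly fee per hour of downtime, while the business cost of an outage scales with lost revenue and productivity.

    ```python
    # Hypothetical comparison of an SLA downtime credit with the business cost
    # of the same outage. Every figure here is an assumption for illustration.

    def sla_credit(monthly_fee, outage_hours, credit_pct_per_hour, cap_pct=100):
        """Credit as a share of the monthly colo fee, capped at cap_pct of that fee."""
        credit = monthly_fee * (credit_pct_per_hour / 100) * outage_hours
        return min(credit, monthly_fee * cap_pct / 100)

    monthly_fee = 50_000              # monthly colocation contract, USD (assumed)
    outage_hours = 4                  # length of the outage (assumed)
    credit_pct_per_hour = 5           # 5% of the monthly fee credited per hour (assumed)
    business_cost_per_hour = 100_000  # revenue and productivity lost per hour (assumed)

    print(f"SLA credit:       ${sla_credit(monthly_fee, outage_hours, credit_pct_per_hour):,.0f}")
    print(f"Cost of downtime: ${business_cost_per_hour * outage_hours:,.0f}")
    # SLA credit:       $10,000
    # Cost of downtime: $400,000
    ```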

    Lots of Business Still on the Table

    While many of the biggest data center providers are chasing the multi-megawatt wholesale deals with cloud giants, there is a huge portion of the enterprise market that remains untapped, and companies like Equinix, QTS Realty, and CyrusOne, as well as the cloud giants themselves, are pursuing that opportunity.

    [Chart: enterprise IT asset share across cloud, colocation, and on-premises data centers]

    Source: Uptime Institute Data Center Industry Survey 2016

    Enterprise-owned data centers still host 71 percent of enterprise IT assets, according to Uptime. Data center providers have 20 percent of those assets, while the remaining 9 percent is in the cloud.

    The big question today is how much of that 71 percent will go to the cloud, and how much of it will end up in colocation data centers.

    Further reading: Why Keep the Enterprise Data Center?

    3:00p
    Arista Improperly Uses Cisco Technology, US Trade Agency Rules

    (Bloomberg) — Arista Networks uses technology owned by Cisco Systems in networking gear made in Asia, a US trade agency ruled in a patent case, setting the stage for an import ban.

    The ruling Thursday by the US International Trade Commission leaves Arista to ask the Obama administration to veto the import ban, or request that an appeals court throw out the case.

    Arista shares dropped 4.2 percent, to $70.73, in after-hours trading at 5:07 p.m.

    Cisco claims that Arista, which was founded by former Cisco executives, has built its business using copied technology. It has filed multiple actions: the complaint decided Thursday, another case at the ITC, and a copyright-infringement suit scheduled for a November trial in California.

    Arista has in turn accused Cisco of using unfair tactics to maintain its dominant position in the market.

    Read more: Arista CEO on Cisco’s Lawsuit: “It’s Not the Cisco I Knew”

    Trade Judge

    A trade judge in February said Arista infringed three of five patents held by Cisco, the world’s biggest networking-equipment maker.

    Arista maintained that it doesn’t infringe any Cisco patents in this or the other case. It has filed challenges to the validity of the patents at the US Patent and Trademark Office.

    Arista has been winning customers in one of Cisco’s most important businesses: the sale of machines called Ethernet switches used in data centers by internet companies, banks and other large companies that are shifting to more cloud computing.

    Arista’s switches are based on software that is designed to be easier to use and more cost effective than Cisco’s IOS software, which has been almost as widely used by network administrators as Microsoft Windows is for running computers or Google’s Android for smartphones. Arista said the claim against its Sysdb product does not involve core architecture, but “external management.”


    A different judge is scheduled to release findings in the second ITC case brought by Cisco on Aug. 26.

    The case is In the Matter of Certain Network Devices, Related Software and Components Thereof, 337-944, U.S. International Trade Commission (Washington).

    3:30p
    Conditioned Data Center Power as a Service?

    One of the oldest arguments for moving enterprise applications to the cloud sounds like something an accountant would like: Cloud services, paid for on a monthly basis, are an operational expense, which is better than the capital expense of building or expanding a data center – a big, expensive, and depreciating real estate asset.

    The other big argument is that cloud services let you pay only for what you use, which is better than investing in a data center whose capacity will probably be underutilized for the bulk of its useful life.

    But what if you could have both of those benefits in your own data center? A new partnership between data center UPS maker Active Power and Geneva-based Burland Energy was formed to do just that, though only for UPS systems.

    Put simply, Burland will buy a flywheel-based UPS system from Active Power and have Active Power install and manage it in your data center, while charging you a flat per-kWh rate based on the amount of conditioned electricity that UPS system is feeding to your data center floor. The companies call it UPS-as-a-Service, or UPSaaS, imitating the cloud industry’s X-as-a-Service nomenclature.

    The model, they say, will enable customers to utilize capital more effectively and reduce operating costs associated with electrical infrastructure in their data centers. “UPSaaS challenges the conventional approach to purchasing critical backup power equipment by offering conditioned power to customers based on a fixed rate per kilowatt hour (kWh) of electricity conditioned by the UPS,” the partners said in a statement.
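
    As a rough sketch of how the economics could work, the per-kWh model trades a large one-time purchase for a recurring fee that tracks the conditioned load actually drawn. The partners have not published pricing, so every number and helper below is an assumed, illustrative figure rather than their actual rates:

    ```python
    # Rough comparison of owning a UPS versus a per-kWh "UPSaaS" rate.
    # All prices, loads, and lifetimes below are assumptions for illustration;
    # the companies have not published their pricing.

    HOURS_PER_YEAR = 8760

    def upsaas_annual_cost(avg_it_load_kw, rate_per_kwh):
        """Annual fee when billed per kWh of conditioned power delivered."""
        return avg_it_load_kw * HOURS_PER_YEAR * rate_per_kwh

    def owned_ups_annual_cost(capex, lifetime_years, annual_maintenance):
        """Straight-line amortization of the purchase plus yearly maintenance."""
        return capex / lifetime_years + annual_maintenance

    avg_it_load_kw = 400        # average conditioned load (assumed)
    rate_per_kwh = 0.015        # UPSaaS adder per kWh (assumed)
    capex = 500_000             # flywheel UPS purchase and install (assumed)
    lifetime_years = 15         # useful life (assumed)
    annual_maintenance = 25_000 # service contract and parts (assumed)

    print(f"UPSaaS:    ${upsaas_annual_cost(avg_it_load_kw, rate_per_kwh):,.0f}/yr")
    print(f"Owned UPS: ${owned_ups_annual_cost(capex, lifetime_years, annual_maintenance):,.0f}/yr")
    # UPSaaS:    $52,560/yr
    # Owned UPS: $58,333/yr
    ```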

    See also: Modular Cooling System Enables On-Demand Data Center Capacity

    4:00p
    Elliott Targets Tech Buyouts With Evergreen Private Equity Arm

    (Bloomberg) — Sometimes-activist hedge fund Elliott Management is a familiar name in 13D filings. As of Monday, its new private equity arm began making a name for itself in technology buyouts.

    Jesse Cohn, who runs Elliott’s US activist investments, this week debuted the private equity affiliate of Paul Elliott Singer’s $28 billion investing empire — Evergreen Coast Capital — by agreeing to acquire Dell’s software unit in partnership with Francisco Partners Management.

    Read more: Dell Sells Software Unit to Francisco Partners, Elliott

    Cohn is infamous throughout technology boardrooms and Silicon Valley C-suites for amassing activist stakes of nearly 10 percent in public companies and aggressively pushing for strategic moves. He has agitated for changes at technology providers including Citrix Systems, NetApp, Juniper Networks, and Brocade Communications Systems.

    The next time Elliott discloses a stake in a public technology company, the hedge fund may have its own buyout in mind, rather than its usual activist playbook of agitating for a sale to someone else.

    The buyouts won’t necessarily be previous targets. Evergreen Coast’s first tech private equity deal — said to be valued at more than $2 billion — comes as Dell sells assorted assets to reduce debt amid its $67 billion takeover of EMC. Elliott holds an active stake in EMC.

    Opportunistic Investing

    Cohn, 36, for about a decade has split his time between Elliott’s New York headquarters and Silicon Valley. He will continue to run Elliott’s activist strategy, while also overseeing the private equity arm.

    “Elliott has always sought to develop our capabilities, through deeper research, true operational understanding and purpose-built teams to approach each situation,” Cohn said Wednesday. “These capabilities have served to both strengthen our public and activist approach, as well as enabled us to invest opportunistically into private equity situations.”

    Elliott recruited former Golden Gate Capital principal Isaac Kim as Evergreen Coast’s managing director, and under that unrelated and less threatening name has quietly built a private equity team at its new offices on Sand Hill Road in Menlo Park.

    Started by Singer in 1977, Elliott Management’s two funds invest across all its strategies, which include long-short hedge funds, distressed credit, arbitrage, real estate, shareholder activism — and now private equity. Evergreen Coast means Cohn can profit by agitating for an auction and potentially bid in it too, positioning Elliott to also benefit from any longer-term upside that follows a buyout.

    Argentina Default

    Aggressive even by hedge-fund standards, Elliott is probably best known for a battle with Argentina’s government over that nation’s 2001 bond default. Under Cohn, Elliott’s activist focus has usually been on lower-profile enterprise software and technology hardware companies which provide essential — but largely invisible — cogs in data networks.

    Cohn’s target companies have often ended up in the portfolios of a shortlist of tech-savvy PE firms — Thoma Bravo, Permira Advisers, Vista Equity Partners, Francisco Partners, Silver Lake Management and Bain Capital — who regularly repackage them for strategic buyers.

    Private Equity

    Elliott has already participated in some private equity deals. It partnered with Insight Venture Partners on the buyout of E2open, and rolled its stake in Metrologic Instruments when Francisco Partners acquired it. The investor also rolled its activist stakes into takeovers of BMC Software and MSC Software.

    Activist targets have sometimes rebuffed Cohn’s advances, only to be acquired eventually anyway. In 2010, Elliott’s bid for software maker Novell led to its sale eight months later to Attachmate, with the activist rolling its stake. Elliott also bid for Epicor Software, later acquired by Apax Partners, and for Compuware before it was sold to Thoma Bravo.

    At Packeteer, Elliott both bid for the company and then rolled its shares into its acquisition by Blue Coat Systems — which was subsequently acquired by Thoma Bravo, then Bain, and this month, Symantec, where Elliott already holds a stake.

    Tech Targets

    Riverbed Technology said in March 2014 it had no “credible” offers, after Elliott bid $21 a share for the company in a Cohn campaign that included a 78-page draft merger agreement that any interested suitor could adopt. Riverbed later sold itself to Thoma Bravo and Teachers’ Private Capital.

    The list of Elliott’s targeted technology companies that have subsequently sold is even longer, including Qlik Technologies, Informatica and Emulex. The firm also recently pushed Polycom and Mitel Networks to merge.

    On Monday, Elliott disclosed it has a 9.8 percent activist stake in cyber-security software maker Imperva, and last week revealed an 8.8 percent active holding in identity theft protection monitor LifeLock. The next move may be Evergreen Coast’s to make.

    4:30p
    Is Parcel Protocol Right for Your Data Transfer Needs?

    Bill Chellis is the RoundTrip seed drive manager at Datto.

    In a constantly connected world, we take for granted the methods for getting data from one place to another. It just happens so fast, seemingly in the blink of an eye. The data transfer protocols IT professionals use to make this magic possible are improved every day.

    It is safe to say that the HTTP/S duo does much of the heavy lifting for applications on the web. But there is truly an exhaustive menu of protocol choices available, each tailored to a specific use and benefit. Companies can pick whichever option works best for their circumstances and move forward with confidence.

    Frequently, though, the best option for file replication and off-site backup isn’t a matter of using the fastest Internet connection or consuming the most bandwidth. Bulk data bound for the data center can often find a cheaper, more practical ride. And the method takes advantage of a tried-and-true technology that’s been in use for centuries.

    It’s the parcel – which, even in a cloud-dominated world, remains perhaps the simplest, most common-sense answer for moving bulk data between a client and an off-site data center.

    Off-site backup strategies work best when you can achieve parity between local data sets and off-site servers as quickly as possible. When a local data set is compact enough to move quickly with an Internet-based transfer protocol, off-siting with such an option is ideal. But that isn’t always the case. Larger data sets call for careful planning. You need to answer the following conflicting questions:

    • How can you best move all of your local data to off-site storage quickly?
    • Can you do this transfer without overwhelming your local network resources?

    The transfer needs to happen quickly, but it can’t be allowed to slow down your local network; employees need to keep working, after all. Moreover, downtime is never acceptable, even if it happens in the name of creating an off-site backup. Using parcels to ship data off-site means companies can ship entire hard drives – data buckets measured by the terabyte – to wherever they need to go, quickly.

    It would, of course, be faster if entire servers could be replicated at the push of a button. The likelihood that this can be achieved safely and quickly without any issues or network delays is small, though. After all, how many companies have end-to-end fiber broadband connections capable of handling this workload at the desired speed?
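
    A quick back-of-the-envelope calculation shows why the wire often loses to the courier for bulk seeding. The dataset size, link speeds, utilization cap, and shipping time below are all assumptions chosen for illustration:

    ```python
    # Back-of-the-envelope: how long a bulk seed takes over a WAN link
    # versus shipping a drive. Dataset size, link speeds, and shipping
    # time are assumptions for illustration.

    def transfer_days(dataset_tb, link_mbps, utilization=0.5):
        """Days to push dataset_tb over a link, using only part of its
        capacity so the office network stays usable."""
        bits = dataset_tb * 8e12                       # terabytes -> bits
        seconds = bits / (link_mbps * 1e6 * utilization)
        return seconds / 86400

    dataset_tb = 20                 # local data set to seed off-site (assumed)
    for link_mbps in (100, 500, 1000):
        print(f"{link_mbps:5d} Mbps link: {transfer_days(dataset_tb, link_mbps):5.1f} days")
    print("  Parcel (courier plus sync at the data center): roughly 2-3 days (assumed)")

    #   100 Mbps link:  37.0 days
    #   500 Mbps link:   7.4 days
    #  1000 Mbps link:   3.7 days
    ```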

    Using traditional shipping methods means companies can get their off-site backups up and running in a matter of days. FedEx, DHL, and the like are not usually counted among the ranks of IT vendors, and they are not the names that come to mind when you talk about transferring data files. Yet they’ve helped countless organizations get their backup plans in place.

    Off-site replication is a core service for most backup providers. It makes sense to use standard shipping methods for data seeding when the amount of client data is large enough. After the initial replication has occurred, smaller follow-up incremental backups can typically be sustained through the plethora of data transfer protocols we usually associate with the Internet.
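
    Once the seed drive has been synced, only the daily change rate has to cross the wire. A simple feasibility check, again with assumed figures for the delta size, link speed, and backup window, shows why incrementals usually fit comfortably over the Internet:

    ```python
    # After the initial seed, only daily changes need to cross the wire.
    # Quick feasibility check for a nightly incremental window; the change
    # rate, link speed, and window length are assumptions.

    def fits_in_window(daily_delta_gb, link_mbps, window_hours, utilization=0.8):
        """Return (fits, hours_needed) for pushing the nightly delta."""
        bits = daily_delta_gb * 8e9
        needed_hours = bits / (link_mbps * 1e6 * utilization) / 3600
        return needed_hours <= window_hours, needed_hours

    ok, hours = fits_in_window(daily_delta_gb=200, link_mbps=100, window_hours=8)
    print(f"Nightly 200 GB delta over 100 Mbps: {hours:.1f} h "
          f"({'fits' if ok else 'does not fit'} in an 8-hour window)")
    # Nightly 200 GB delta over 100 Mbps: 5.6 h (fits in an 8-hour window)
    ```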

    Backup and disaster recovery vendors need to specialize in whichever cloud-based protocols their customers prefer, of course; simple backups and replications need to happen almost instantly. But vendors also must be realistic: sometimes shipping data devices to and from specific locations is the best answer. So as much as they need the expertise to move data with a mouse click, they also need logistics teams – entire groups dedicated to preparing, testing, stocking, shipping, receiving, syncing, wiping, billing, and managing these data devices. When this capability is baked into the way a backup vendor operates, customers can be certain their larger file transfers are handled with care and precision.

    Hybrid approaches to data storage and backup have emerged as the standard for many companies. Companies need a significant portion of their file transfers and replication to happen instantly. Backup and disaster recovery vendors have built-in cloud options to account for this market demand. However, taking the time to build out your own parcel protocol is just as significant. Vendors with a dedicated, defined standard for shipping larger data sets demonstrate a clear dedication to customer satisfaction and premium service.

    It’s funny to think about, really. Moving data from one location to another has been a hallmark of IT for decades, and the methods for doing so have become increasingly advanced. People can now send more data, in less time, further around the globe than ever before. Despite all of this, having a formalized process and standards for shipping data sets and devices in parcels can differentiate one backup vendor from another. Have you considered taking advantage of the benefits of the parcel protocol? It just might be the “whole package” answer you have been looking for.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

