Data Center Knowledge | News and analysis for the data center industry

Thursday, May 26th, 2016

    12:00p
    Why Equinix Data Center Deal is a Huge Win for Digital Realty

    The sale of eight European data centers to Digital Realty Trust wasn’t an ideal scenario for Equinix, but the company was under deadline pressure, and Digital was both willing and able. As a result, Digital scored a deal that will put the growth of its new interconnection-focused retail colocation business on the fast track in Europe.

    Data centers are a capital-intensive business, and speaking earlier this week at a JP Morgan conference, Digital’s CEO William Stein emphasized that the strength and flexibility of the company’s balance sheet is a huge advantage. The Equinix deal was a demonstration of that advantage.

    Last week, in conjunction with the $875 million acquisition of the eight-piece European portfolio consisting of Equinix and TelecityGroup data centers, Digital announced a $1.2 billion equity raise. According to Stein, “a clear use of funds opened a window,” which then allowed for the sale of even more shares than required to fund the Equinix deal.

    Digital is likely to have sold $525 million more in shares than it will need to fund the transaction. That figure takes into consideration about $200 million it will receive when Equinix acquires the Paris data center facilities it now leases from Digital, an option that was agreed to as part of the deal.
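
    For readers checking the math, the roughly $525 million in excess capital follows directly from the figures above. A quick back-of-the-envelope sketch, using only the deal numbers cited in this article:

# Back-of-the-envelope check on Digital's excess equity, using the deal figures above.
equity_raise = 1_200        # $M: announced equity raise
acquisition_price = 875     # $M: eight-facility Equinix/Telecity portfolio
paris_proceeds = 200        # $M: expected when Equinix exercises its Paris purchase option

excess = equity_raise - acquisition_price + paris_proceeds
print(f"Excess capital: ~${excess} million")   # -> Excess capital: ~$525 million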

    EC Deadline Helps Digital

    Equinix was under the gun to divest these data centers. It agreed to sell them as a condition for approval of its Telecity acquisition by European Commission regulators.

    Back in January, Equinix CEO Steve Smith said there was considerable demand for the facilities, and he felt that “…they could be disposed of in a manner which would avoid creating a competitor.” Meanwhile, time was running out to sign a definitive deal with an eligible buyer, and Digital was a company Equinix knew could close the deal.

    This was a huge win for Digital, which is looking to grow its interconnection and colocation business with the Telx portfolio it acquired last year at its core. The deal will significantly accelerate expansion of the Telx platform into Europe, where it has not had a presence before. It can take many years to create interconnected customer ecosystems from scratch, and it’s a completely different data center offering than wholesale, Digital’s traditional bread and butter.

    However, the deal further increases the complexity of the relationship between Digital and Equinix, one of its biggest customers.

    Stein was quick to point out during his presentation that Equinix remains a large client and partner. He described the Digital approach to the enterprise market as being “complementary” and said the firms rarely compete directly for the same deal.

    Equinix’s Smith presented at the JPM conference the next day and acknowledged that he “never had a premonition that all the assets would go to one provider…” However, he said he felt comfortable being in “coopetition” with Digital, describing it as a rational allocator of capital and a quality organization to transition his employees to.

    He also said he views the option for Equinix to purchase the Paris facility as the unique piece which finalized the deal.

    Read more: Interxion Expands as European Data Center Market Heats Up

    What Digital is Getting

    The Equinix and Telecity properties that Digital will be acquiring are “highly connected” hubs serving over 650 customers.

    [Slide: the eight European data center locations Digital is acquiring. Source: Digital Realty – May 2016 presentation]

    Equinix does not typically offer customers large footprint options, which presents an opportunity for Digital to cross-sell its wholesale “Scale” product offerings to customers in the European facilities it is buying.

    The acquisition will be immediately accretive to earnings, even though the facilities average only 71 percent utilization, which leaves significant leasing upside.

    [Slide: Digital Realty’s portfolio after the European acquisition. Source: Digital Realty – May 2016 presentation]

    The deal immediately adds 5 percent to Digital’s colocation revenues while barely moving the company’s overall vacancy needle.

    Perfect Timing Plus Flexibility

    After the entire data center REIT sector had posted strong Q1 results, Digital’s shares were trading at new highs, as bullish investors continued to bid up the sector.

    Read more: How Long Will the Cloud Data Center Land Grab Last?

    This momentum allowed Digital to negotiate a $96-per-share price for a forward sale of 13,225,000 shares (including the underwriter option). During the past 52 weeks, Digital shares have traded from just below $60 to $98.49 per share. The last time Digital conducted an equity raise, its common shares were trading in the high $60s.

    The company really pulled off a coup by structuring the equity raise as a “Forward Sale,” which gives it the flexibility to take down the $1.2 billion in new shares incrementally, up until May 19, 2017. This locks in the price without the need to rush to deploy capital.
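
    As a rough sanity check on the headline figure (an illustrative calculation, not taken from the company’s filings), the forward-sale terms cited above work out as follows:

# Rough check: gross proceeds implied by the forward-sale terms cited above.
shares = 13_225_000          # includes the underwriter option
price_per_share = 96         # negotiated forward-sale price in dollars

gross_proceeds = shares * price_per_share
print(f"Implied gross proceeds: ${gross_proceeds / 1e9:.2f} billion")
# -> about $1.27 billion, slightly above the roughly $1.2 billion headline figure
#    because the share count here includes the underwriter option.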

    Investor Takeaway

    Digital’s Equinix deal seems like a great way to get Telx up and running in Europe without taking any huge risks. This seems to fit Stein’s vision of conservative growth with an eye on margins while growing cash available for distribution.

    Digital already has phased expansions underway in Northern Virginia, Chicago, and Dallas. Stein mentioned that there will be additional expansions “not yet announced” in some US markets.

    The $500 million of equity dry powder should also allow for a wholesale-focused acquisition, perhaps in one of the West Coast markets between Silicon Valley and Seattle. An opportunistic telecom data center portfolio might also make sense at the right price.

    The bottom line is that after a period of austerity, Digital, the sole data center REIT with an investment-grade balance sheet, is once again flexing its muscles as the “800lb Gorilla” in the sector.

    4:00p
    The Top Three Barriers to Cloud Adoption and What to Do about Them

    Kong Yang is Head Geek for SolarWinds

    It’s been said before and it’ll be said again: The cloud is here to stay, but it’s not for every workload. The results of a recent SolarWinds survey of IT professionals show that integrating cloud services is easier said than done, with security concerns, the need to support legacy systems, and budget limitations creating significant adoption challenges for many organizations.

    Businesses of all sizes have something to gain from implementing a hybrid IT strategy, whether it’s greater agility or simply the relief of shifting a workload or application to the cloud where it can be managed by a team of experts—it’s just a matter of how. On that note, here are the top three barriers to cloud adoption and what to do about them.

    Barrier 1: Security and Compliance Concerns

    When it comes to the cloud, it’s probably no surprise that security and compliance concerns are considered the number one barrier to adoption. The biggest challenge is overcoming the persistent perception that the cloud is inherently less secure than infrastructure housed on-premises. The numbers consistently show that public cloud environments are actually safer than on-premises environments, with a majority of attacks originating inside the organization. Common threats include an employee falling victim to a phishing scheme that introduces malware onto the network, DDoS attacks, and accidental end-user errors that stem from an inadequate understanding of potential security threats.

    This is not to say that security and compliance concerns shouldn’t be key priorities. Many of the larger providers already run compliance programs for some of the most stringent policies, including HIPAA, PCI DSS, FedRAMP, SOX and many others. Every time a provider adds a new service or feature, those compliance certifications must be re-upped to ensure they meet the requirements of clients and specific SLAs. How many IT organizations have that level of compliance across the board for on-premises infrastructure?

    Half the battle is having an awareness of potential security threats and ensuring countermeasures are in place. You must do due diligence in terms of understanding what is covered with respect to security and compliance for each platform. By having a fundamental understanding of the provider’s approach to securing your data, you can create a solid “handshake” between the data stored on-premises and the data hosted in the cloud.

    A great place to start is the NIST Cybersecurity Framework, which encourages IT professionals to develop a framework—based on existing standards, guidelines and practices—for reducing cyber risk to critical infrastructure. In the hybrid IT era, controls like encryption in flight, encryption at rest, VPN tunnels, and monitored user access and accountability are critical to ensuring data remains secure when it’s traveling from your server closet to the cloud and back again.
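
    As one concrete illustration of handling encryption at rest on your side of that “handshake,” the sketch below encrypts a record on-premises before it would ever be uploaded, so the provider only stores ciphertext. It is a minimal example assuming the third-party Python cryptography package; the key handling is deliberately simplified.

# Minimal sketch: client-side encryption at rest before data leaves the premises.
# Assumes the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, keep this in your own key-management system
cipher = Fernet(key)

record = b"payroll,2016-05,123456.78"      # illustrative sensitive record
ciphertext = cipher.encrypt(record)        # this is what would be uploaded to the cloud

# The data is only recoverable where the key lives -- on-premises in this sketch.
assert cipher.decrypt(ciphertext) == record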

    Barrier 2: Support for Legacy Systems

    One of the most intimidating aspects of cloud adoption is the chaos it may introduce into your data center. The cloud is by no means a plug-and-play solution, but, in fact, requires a very strategic and thoughtful approach to implementation. This is further complicated when organizations have legacy systems to maintain. So, how can you redesign your data center to accommodate technologies that best serve the business overall?

    You must work together with management to lay out what IT support looks like over the next 3-5 years before deciding which legacy systems your organization should keep, migrate to the cloud, or dispose of altogether. Ask, “What platforms will you be supporting? Where is the biggest area of growth? If you choose to migrate any legacy systems to the cloud, what is the monetary impact of that temporary downtime? Do you have the people in place to do a lift and shift or a build-out from the ground up?”

    In many cases, the ability to quickly pivot and take advantage of new data center technology is hindered by a lack of skilled administrators to facilitate the change. You should begin cultivating critical, “next-generation” skill sets like hybrid IT management and monitoring, application migration, distributed architectures, automation and programming, vendor management and others in order to find success not only in implementing a hybrid IT strategy, but also in today’s overall IT landscape.

    Barrier 3: Budget Limitations

    What organization isn’t asking its IT department to do more with less these days? With IT budgets decreasing, a shift to the cloud may seem costly, especially when your organization has likely already invested a significant amount of capital and operational expense into not only your existing infrastructure but also the IT professionals who manage it. The benefit of hybrid IT is that, when implemented strategically, it allows you to run leaner and get the most out of the cloud provider’s services, your existing infrastructure and your budget.

    If cost is a significant concern for your organization, you should not only conduct a thorough review of the pricing structures in cloud provider SLAs but also leverage monitoring and management tools to track workload allocation and usage metrics for existing on-premises hardware to identify any wasted resources (a minimal sketch of this kind of check follows below). However, an equal focus should be placed on building administrator skill sets so your organization is able to maintain a hybrid environment once it’s implemented.
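
    Here is that sketch. The host names, utilization numbers, and threshold are hypothetical; in practice the figures would come from whatever monitoring tool you already run.

# Hypothetical sketch: flag on-premises hosts that look like candidates for
# consolidation, retirement, or migration based on average CPU utilization.
avg_cpu_percent = {
    "db-primary": 71.0,
    "legacy-erp": 6.5,
    "file-server-02": 3.2,
    "web-frontend": 38.0,
}
WASTE_THRESHOLD = 10.0   # percent; tune to your own environment

underutilized = sorted(h for h, cpu in avg_cpu_percent.items() if cpu < WASTE_THRESHOLD)
print("Review for consolidation or retirement:", underutilized)
# -> Review for consolidation or retirement: ['file-server-02', 'legacy-erp']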

    For example, it’s possible that your IT department might not have the skills required to stand up and successfully operate a hybrid IT environment—perhaps, as a whole, your team is familiar with Hyper-V and vSphere, but not Amazon Web Services, Microsoft Azure or the complexities of SLAs. This is a legitimate concern: Without the proper research and understanding of each cloud provider’s services, features and associated charges, your organization could be subject to a very large bill at the end of your first month.

    At some point, organizations that have thus far been wary of hybrid IT and cloud adoption will need to re-invest in their IT departments to avoid becoming bogged down by “tech inertia,” a scenario in which IT innovation is hampered by perceived budget limitations or, as discussed earlier, attempts to preserve legacy systems because “that’s just the way it’s done.”

    Your department should be prepared to make a case for hybrid IT to the organization’s leadership team so you’re able to get the funding required to train your teams, move all potential workloads into a test environment, and then monitor their performance closely. This will help you better understand potential limitations, opportunities for improvements to the end-user experience, cost savings, and so on before undergoing an entire overhaul. During this process, you should look to cultivate key skills that will help bridge the gap between traditional on-premises management and the skills needed to architect, design and operate cloud services in the most strategic—and lean—way possible.

    Conclusion

    Although hybrid IT is the foreseeable future of IT—and the near-term reality for most businesses, if not the current one—getting there is not always easy. By conducting thorough due diligence with respect to cloud provider security and compliance certifications, having conversations with management about future growth plans, and keeping an eye on critical skill development, you and your organization can get off to a solid start working toward a cloudy, or cloudier, environment.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    4:35p
    Ex-Cisco CEO Chambers’s Next Act: Grandpa Startup Investor

    (Bloomberg) — On his most recent annual retreat to Davos, John Chambers decided somewhere between cocktails and dinner that he was passionate about government-finance data.

    The Cisco Systems executive chairman spent the evening in January learning all about OpenGov from co-founder Zac Bookman. The Silicon Valley startup aims to make financial data from governments more accessible and, in turn, more transparent.

    Chambers said he has served as an informal adviser to OpenGov since the encounter at the World Economic Forum. He made things official on Thursday by joining the board of directors, alongside venture capitalist Marc Andreessen and Joe Lonsdale, an investor and OpenGov co-founder. Chambers said he invested a “substantial” sum into OpenGov but declined to say how much.

    Since stepping down last year after two decades as Cisco CEO, Chambers has been the subject of some curiosity over what he’ll do next. Never one to shy away from the spotlight, he has been bandied about as an eventual force in politics. The outspoken Republican said at the Bloomberg Breakaway Summit on Wednesday that he expects Donald Trump will win the 2016 presidential race due to the candidate’s momentum.

    “Transitions wait for no one,” Chambers said at the conference. “You’ve got to be willing to make these transitions and disrupt and take the risk and take the criticism as you disrupt.”

    At Cisco, Chambers ran the company during a period when its networking technology became the backbone of the internet. While his tenure delivered phenomenal growth at the equipment maker, briefly making it the world’s most valuable company, his last few years in the job saw that growth slow — and then in 2014, decline.

    Critics of Chambers said he was slow to embrace the types of trends that startups like OpenGov in some ways embody: the shift to cloud-based computing from proprietary software and hardware. Under his successor Chuck Robbins, Cisco is realigning itself to offer open hardware and software before its customers switch to alternatives. In an interview on Bloomberg TV, Chambers graded the new CEO’s performance an eight out of 10.

    In Chambers’s new role as startup investor and adviser, he must help effect the sorts of technological changes he fought against while running Cisco. He said he’s excited about companies that help digitize information, a huge potential market that could change industries over the next five to 10 years. He plans to invest in two to four such startups in what he called the “digitization” sector. He backed Airware, a video-drone company, and joined its board in March.

    “Like a grandparent, I get to give all the advice and coaching,” Chambers said in an interview. “This is what I will be doing in this next chapter of my life.”

    Since it was founded in 2012, OpenGov has raised $47 million from Andreessen Horowitz, 8VC, Intuit co-founder Scott Cook, actor Ashton Kutcher and other investors, the company said. It employs more than 100 people, and its CEO has ambitious goals for the startup.

    “This is a mission-driven company, but we are venture-backed,” said Bookman. “We obviously want to become a multi-billion dollar company.”

    OpenGov said it counts more than 1,000 government agencies in 46 U.S. states among those using its tools to extract and analyze payroll, accounting and other data. The Redwood City, California, company faces competition from other data-analytics upstarts, such as Tableau Software and Domo, as well as established software providers, including SAP and Oracle, which are still used widely throughout many governments.

    Annual contracts for OpenGov customers range from $5,000 to more than $1 million, said Bookman. The closely held company declined to discuss its revenue or other financials. Bookman said he’s looking to expand OpenGov to local and county governments, as well as schools. He said the various institutions could learn a lot from one another once they start sharing data.

    5:00p
    Billions Cut to Federal IT Leave Nation’s Nukes Controlled by Floppies


    Brought to You by IT Pro

    We all knew federal IT was bad, but a recent report found that it was way worse than almost anyone could imagine: The nation’s nuclear arsenal, for example, is coordinated with 8-inch floppy disks, while a number of important tax and benefits systems were written in COBOL and are becoming increasingly difficult to maintain.

    The Government Accountability Office found that, as with a lot of companies, agency IT budgets just haven’t found their way to the top of the list, particularly with cuts across the board. But that save-now, pay-later approach has a problem: It’s now later, and penny pinching over the past decade has left a crippled technology infrastructure.

    In fact, the GAO found a $7.3 billion decline in spending on modernization since 2010.

    [Chart: federal IT spending]

    What’s worse is that, while almost all the agencies say they plan to make the needed modernizations, they have no timetable (or budget) in which to make the changes.

    Instead, the increasing cost of maintenance eats up the IT budget, making it even harder to find money for needed upgrades.

    That leaves agencies falling further and further behind, as the Department of Defense recently discovered: A planned upgrade to Windows 10 ran into problems when the Marines found that their new computers were often too outdated to run the latest from Microsoft.

    “We purchase yesterday’s technology tomorrow,” Brig. Gen. Dennis Crall, the Marine Corps CIO, quipped.

    This first ran at http://windowsitpro.com/windows/billions-cut-federal-it-leave-nations-nukes-controlled-floppies

    5:30p
    Amazon to Battle Google With New Cloud Service for AI Software

    (Bloomberg) — Amazon is testing a new service that will make it easier for businesses to run artificial intelligence software on computers rented from the company, people familiar with the situation said, stepping up competition with Alphabet unit Google, Microsoft, and IBM.

    Amazon Web Services, the company’s cloud business, rolled out a limited offering in this area last year. The new service will let businesses run a wider range of AI software on Amazon’s computers, leading to more powerful applications capable of tasks like pattern recognition and speech transcription. Some customers are testing it, according to the people, who asked not to be identified because the offering hasn’t been announced yet.

    This is a rare example of Amazon chasing rather than leading in public cloud computing, the business of renting out computer power, storage and related services over the internet. In March, Google released a wide range of cloud services centered around a type of AI known as machine learning, which the internet giant has used internally for years. IBM, Microsoft and startups including Clarifai Inc. and MetaMind Inc. have built similar services. MetaMind was acquired by Salesforce.com Inc. in April.

    Read more: Google Using Machine Learning to Boost Data Center Efficiency

    An Amazon spokeswoman said the company is working on other machine learning capabilities for cloud customers.

    Google, Amazon, Microsoft and others have begun to release special programming software known as frameworks that let developers create and control neural networks — a powerful and increasingly popular form of AI which helps computers study data with the same kind of intuition as people.
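
    For readers unfamiliar with the term, the toy sketch below shows the kind of computation a neural network performs and that frameworks such as TensorFlow automate at scale. It is illustrative only and has nothing to do with any provider’s actual service or API.

# Illustrative toy example: one forward pass through a single-layer neural network
# in plain NumPy. Frameworks add training, GPUs, and distribution on top of this.
import numpy as np

rng = np.random.default_rng(0)
x = rng.random((1, 4))      # one input example with 4 features
W = rng.random((4, 3))      # weights connecting 4 inputs to 3 hidden units
b = np.zeros(3)             # biases

hidden = np.maximum(0.0, x @ W + b)   # ReLU activation: the layer's outputs
print(hidden)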

    Amazon’s new service will let businesses run different frameworks on its computers, including those created by rivals such as Google, which introduced one called TensorFlow last year, according to the people familiar with the situation.

    The service will use a new type of rentable computer from Amazon that has eight graphics processing units from Nvidia, up from the four available on Amazon computers today, the people said. GPUs are a common type of semiconductor widely used to run AI programs. The more GPUs, the faster Amazon’s computers can run AI software and crunch related data.

    See also: Google Has Built Its Own Custom Chip for AI Servers

    It’s unlikely to end with eight GPUs in one computer, though. Last year, an Amazon researcher showed how the company could run this kind of AI work on as many as 80 computers at the same time.

    Read more: What Cloud and AI Mean for Google’s Data Center Strategy

    7:16p
    Nlyte Virtualizes Data Center PDU Functions

    Nlyte Software launched a data center infrastructure management feature it describes as a virtual power distribution unit, or V-PDU. Available as part of its DCIM software suite, it provides many of the functions users normally get from intelligent data center PDU hardware but at 10 percent of the cost, according to the company.

    Those functions include tracking historical power and energy usage, chargeback capabilities, and power-draw aggregation, among others; a brief illustrative sketch of the aggregation and chargeback math follows the feature list below.

    The feature is another step toward virtualization of functions traditionally performed by hardware systems in data centers, according to Nlyte.

    “For far too long IT and Data Center Managers have been held hostage by PDU hardware vendors,” Nlyte president and CEO, Doug Sabella, said in a statement. “I am pleased to say that those days are over. With our new V-PDU capabilities we are enabling enterprises to access vital data about energy consumption at a fraction of the price.”

    Virtual PDU features, according to Nlyte:

    • Automatic calculations of historical power and energy usage
    • Ability to charge back within departments
    • Automatic aggregation of power draw on Virtual PDUs
    • 2 network ports and provisioned addresses per panel, versus 2 per cabinet
    • Support for more than 10 cabinets per panel (assuming a 42-breaker panel)
    • Insight into risk at the rack level
    • Access to Nlyte Energy Optimizer (NEO) – Branch Circuit Analytics (BCA)
    • Automatic support for data center monitoring & alarming requirements
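
    To make two of the listed functions concrete, here is a brief, hypothetical sketch of power-draw aggregation and departmental chargeback. The readings, rate, and data layout are illustrative and are not Nlyte’s implementation.

# Hypothetical sketch: aggregate power draw across racks and charge energy
# costs back to departments. Figures are illustrative, not from Nlyte.
readings_kw = [                       # (department, rack, average draw in kW over one hour)
    ("finance", "rack-a1", 3.2),
    ("finance", "rack-a2", 2.9),
    ("engineering", "rack-b1", 5.4),
]
RATE_PER_KWH = 0.11                   # illustrative energy price in dollars

total_kw = sum(kw for _, _, kw in readings_kw)

chargeback = {}
for dept, _, kw in readings_kw:
    chargeback[dept] = chargeback.get(dept, 0.0) + kw * 1.0 * RATE_PER_KWH  # kW x 1 h = kWh

print(f"Aggregate draw: {total_kw:.1f} kW")
print("Hourly chargeback by department:", chargeback)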

    See also: Who is Winning in the DCIM Software Market?

