Data Center Knowledge | News and analysis for the data center industry
 

Wednesday, October 14th, 2015

    12:00p
    Telx Acquisition Closed, Here’s Digital Realty’s Plan

    While some of the world’s biggest cloud data centers were designed and built by the cloud providers themselves – companies like Amazon, Microsoft, and Google – they represent only a part of the picture. A lot of the world’s cloud capacity lives in facilities cloud companies lease in big chunks from wholesale data center providers, such as CoreSite, DuPont Fabros, and Digital Realty Trust.

    Now that its $1.9 billion acquisition of Telx is closed, Digital hopes it can make its facilities more attractive for companies like that. San Francisco-based Digital already provides space to the likes of IBM SoftLayer, CenturyLink, Amazon, and Oracle, but the acquisition gave it control of Telx meet-me rooms, where lots of players in the cloud and network services ecosystem interconnect. Interconnecting those meet-me rooms with its big wholesale facilities over private network links is Digital’s new value proposition.

    Combining access to rich interconnection environments with large chunks of data center capacity is something the world’s largest data center interconnection broker, Equinix, has not been interested in providing, according to Chris Sharp, who recently joined Digital as CTO after two years at the helm of Equinix’s cloud interconnection business.

    “It’s not in Equinix’s interest to go after that larger footprint,” Sharp said in an interview with Data Center Knowledge. Equinix has a different business model, centered on high-margin retail colocation and interconnection services. “There’s only a few select providers, such as Digital Realty, that are able to fulfill [wholesale data center capacity requirements].”

    In developing its interconnection play, Digital has a delicate balance to uphold, since Equinix happens to be one of its biggest customers. Scott Peterson, chief investment officer at Digital who architected the Telx acquisition, told us in an interview earlier this year that the company had no plans to compete head-to-head with Equinix, which remained “a very important customer.”

    Combining wholesale capacity with interconnection is a model that more closely resembles services offered by CoreSite, one of Digital’s biggest competitors. CoreSite has traditionally offered both retail colocation and private suites, as well as cross-connects and access to internet exchanges in its facilities.

    The value of a wholesale offering is the ability to deploy data center capacity at scale – leveraging cost advantages that come with scale – and to ensure there is room to expand capacity at the same sites in the future. The value of being in a retail colocation facility with robust interconnection capabilities for its tenants is access to a bigger group of players. Digital’s new pitch is enabling customers to “leverage the best of both worlds,” according to Sharp.

    A company can connect infrastructure deployed in a wholesale suite to a Telx meet-me room over a private network link in the same building, or to a meet-me room in a different Digital data center in the same metro over a Digital-operated wide area network.

    IBM SoftLayer is one example of a major cloud provider that already takes a substantial amount of data center capacity at Digital’s facilities and stands to benefit from access to Telx’s “internet gateways,” Sharp said. The cloud provider occupies the wholesaler’s facilities in about a dozen markets in the US, Europe, and Asia Pacific.

    The focus on interconnection is a new phase in the development of Digital as a company. Operating as a real estate investment trust, it has traditionally focused on providing wholesale data center space, with some retail colocation as a small part of its footprint. The Telx deal doubled the size of its retail colo business and was the biggest step it has taken yet in the process of transforming its business model.

    The company currently has more than 100 data centers in the US, more than 20 in Europe, and five in Asia Pacific. Of the 20 Telx data centers around the US that Digital gained control of through the acquisition, 11 were in Digital’s own buildings. Some of the most important meet-me rooms under Telx’s control include one at the Google-owned carrier hotel at 111 8th Avenue in New York and another at 350 East Cermak in Chicago, but the company offers interconnection in all of its facilities across the US.

    For the time being, Telx will operate as a line of business, reporting to Digital’s COO Jarrett Appleby, Sharp said, but will eventually be integrated more tightly into Digital.

    3:00p
    Harvest Your Data with DCIM Software

    Shekhar Dasgupta is the Founder of Greenfield Software.

    While data centers manage petabytes of business and customer data, the data generated by most data center operations is either trashed or simply ignored. Add to that the fact that some data centers have no monitoring systems in place, and therefore produce little operational data at all, and therein lies the irony.

    While DCIM software helps improve availability, detailed analysis of the data it captures also yields significant insight into the operating conditions of the data center. This matters because DCIM analytics can produce substantial savings in both the operating and capital costs of the data center.

    Let us look at what kind of data data center operations generate, and at the lifecycle of this data. The first set comes from the IT systems: servers, storage, and networks. The second comes from the physical infrastructure: power systems, cooling systems, and smoke detection and fire prevention systems.

    IT systems generate data such as server CPU and memory utilization, storage input/output operations per second (IOPS), and network bandwidth and latency. Data centers with more advanced operations also capture power consumption, temperature, and airflow from these devices. Physical infrastructure generates data such as power consumption and, if the right systems are in place, temperature, humidity, and air quality. Generating such data is not enough, however; it needs to be monitored, and DCIM is one of the tools used for data center monitoring.

    The most common but rudimentary reason for data center monitoring is to provide alerts. It is like rainfall monitoring. Just as trending rainfall data over time, along with other weather-related data, allows predictions about the quality of the monsoon in coming years, data center monitoring allows us to predict an imminent failure and take action to prevent a catastrophe. If we know that a UPS failure can cause a fire, we can isolate that UPS, or shut down the IT equipment connected to it, when real-time monitoring of the UPS gives us danger signals in advance. DCIM’s earliest adoptions came from data center managers who required a single monitoring system for their physical and IT infrastructure.
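
    At its core, this kind of alerting is threshold checking over device telemetry. The short Python sketch below illustrates the idea; the UpsReading fields, threshold values, and device names are illustrative assumptions, not taken from any particular DCIM product.

        # Minimal sketch of threshold-based alerting on UPS telemetry.
        # Fields, thresholds, and names are illustrative assumptions,
        # not vendor values.
        from dataclasses import dataclass

        @dataclass
        class UpsReading:
            ups_id: str
            battery_temp_c: float  # battery temperature, Celsius
            load_pct: float        # percentage of rated load

        MAX_BATTERY_TEMP_C = 40.0  # assumed danger threshold
        MAX_LOAD_PCT = 90.0        # assumed danger threshold

        def check_ups(reading):
            """Return alert messages for any reading past its threshold."""
            alerts = []
            if reading.battery_temp_c > MAX_BATTERY_TEMP_C:
                alerts.append(f"{reading.ups_id}: battery at "
                              f"{reading.battery_temp_c}C, risk of thermal "
                              "failure; consider isolating this UPS")
            if reading.load_pct > MAX_LOAD_PCT:
                alerts.append(f"{reading.ups_id}: load at "
                              f"{reading.load_pct}% of rating")
            return alerts

        for alert in check_ups(UpsReading("UPS-01", 43.5, 72.0)):
            print(alert)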

    Where the purpose of monitoring is only to get alerts, the value of the data is transient, and it is mostly purged within a month. Such purging is rare among DCIM users, however, because they realize their DCIM is a gold mine.

    DCIM software users have taken the leap forward to retain and analyze the massive amounts of data captured in real time. As a repository of data captured from every single device in the data center – from both IT and physical infrastructure – DCIM lends itself to deep analytics that can help data center managers make major cost-saving decisions. A couple of examples: removing ghost servers and increasing rack density can improve space utilization, which can defer the need to build or rent more data center capacity; and raising the temperature in zones of the data center where servers have lower utilization during certain periods can reduce cooling costs.
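
    The first of those examples is easy to make concrete. The toy Python sketch below flags “ghost” servers by their sustained near-zero utilization in retained DCIM data; the data layout and the 2 percent cutoff are assumptions for illustration, not values from any real product.

        # Toy sketch: flag "ghost" servers whose CPU utilization stays
        # near zero across a long retention window. Data layout and
        # cutoff are illustrative assumptions.
        from statistics import mean

        # server -> daily average CPU utilization (%) over 90 days
        utilization_history = {
            "web-01":   [35.0, 41.2, 38.7] * 30,
            "batch-07": [0.4, 0.1, 0.6] * 30,  # likely ghost server
        }

        GHOST_CUTOFF_PCT = 2.0  # sustained utilization below this is idle

        def find_ghost_servers(history):
            """Return servers that are idle on average and at peak."""
            return [
                server for server, samples in history.items()
                if mean(samples) < GHOST_CUTOFF_PCT
                and max(samples) < 2 * GHOST_CUTOFF_PCT
            ]

        print(find_ghost_servers(utilization_history))  # ['batch-07']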

    DCIM analytics is to monitoring what rainwater harvesting is to rainfall measurement: it puts the data to work. Just as rainwater harvesting is a vital way to conserve water, DCIM has become a de facto platform for reducing data center operating costs, as well as for improving availability through correlation of data coming from multiple devices.

    5:41p
    Mirantis and NetApp Announce OpenStack Cloud Storage Partnership


    This post originally appeared at The Var Guy

    Mirantis and NetApp partnered this week to simplify cloud storage on OpenStack, the open source operating system for the cloud.

    Through the deal, Mirantis, which pitches its OpenStack distribution as the only “pure-play” version of the open source platform, and NetApp, which develops cloud storage products, will offer a validated reference architecture for storage on OpenStack.

    More specifically, the partnership entails the following (a configuration sketch follows the list):

    • Validation of NetApp’s block storage drivers on Mirantis OpenStack version 7.
    • Validation of the Fuel plugin, which lets Mirantis OpenStack connect to storage infrastructure, for use with NetApp’s products.
    • Support in the next Mirantis OpenStack release for NetApp’s clustered ONTAP storage solution.
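
    For context, OpenStack’s block storage service, Cinder, wires in a vendor backend such as NetApp’s through configuration. A cinder.conf stanza for NetApp’s unified driver looks roughly like the sketch below – the hostname, credentials, and SVM name are placeholder assumptions – and generating this sort of wiring automatically is the kind of work the Fuel plugin mentioned above is meant to take off the operator’s hands.

        # cinder.conf -- illustrative NetApp backend stanza
        # (hostname, credentials, and SVM name are placeholders)
        [DEFAULT]
        enabled_backends = netapp-cdot

        [netapp-cdot]
        volume_backend_name = netapp-cdot
        volume_driver = cinder.volume.drivers.netapp.common.NetAppDriver
        # clustered Data ONTAP, serving block storage over iSCSI
        netapp_storage_family = ontap_cluster
        netapp_storage_protocol = iscsi
        netapp_server_hostname = cluster-mgmt.example.com
        netapp_login = admin
        netapp_password = secret
        netapp_vserver = svm-cinder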

    “NetApp is a vital contributor to OpenStack, a charter member of the OpenStack Foundation, and a visionary in bringing mission-critical data to the cloud,” said Kamesh Pemmaraju, vice president of Product Marketing at Mirantis. “The partnership will help put our joint expertise into the hands of the user. It leverages the latest release of Mirantis OpenStack to address high-availability use cases for both cloud-native and traditional applications targeted for high-SLA (service level agreement) environments.”

    “Enterprises rely on NetApp for their enterprise data management requirements, including protection and storage efficiency of their mission-critical data,” said Jeff O’Neal, senior director of OpenStack at NetApp. “Now enterprises can trust that NetApp’s Data Fabric ready portfolio is interoperable with Mirantis OpenStack for production deployments, bringing mission-critical data management and protection to the cloud.”

    The partnership is just the latest in what has been a period of steady growth for Mirantis. In August, the company announced its second $100 million round of funding. Its first $100 million funding round was announced in October 2014.

    This first ran at http://thevarguy.com/open-source-application-software-companies/101415/mirantis-and-netapp-announce-openstack-cloud-storage-part

    6:46p
    IBM Taps Microsoft’s Cloud Data Center Provider in China to Host Bluemix PaaS

    Expanding its Platform-as-a-Service offering to China, IBM has tapped Chinese data center provider 21Vianet to host and operate the solution in its data centers, the Armonk, New York-based giant announced today.

    IBM claims its Bluemix PaaS is the world’s largest deployment of the open source PaaS software called Cloud Foundry, which came out of VMware but was later spun off as an independently managed open source project.

    21Vianet, one of China’s largest data center providers, has had a lot of success helping US cloud services giants take their offerings to the Chinese market. It is the operator of Microsoft’s cloud services in the country, including Azure and Office 365 – a business arrangement that was recently extended for an additional four years – and has been providing managed cloud services on IBM’s behalf since last year.

    The data center provider raised about $300 million in funding from some of China’s biggest technology companies in late 2014. Investors included software giant Kingsoft Corp. and mobile phone maker Xiaomi.

    21Vianet also recently entered the US data center market, taking space at Server Farm Realty’s Silicon Valley data center to serve Chinese customers that need infrastructure in the US.

    IBM Bluemix provides a software-building environment for developers with many tools, including IBM’s flagship big data analytics and “cognitive computing” service, Watson. The point is to give developers an environment with the tools they need, while automating infrastructure management for their applications on the backend so they don’t have to worry about it.
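
    Because Bluemix is built on Cloud Foundry, the developer workflow it automates follows the Cloud Foundry pattern: describe the application in a short manifest and push it, leaving provisioning, routing, and scaling to the platform. The sketch below is a minimal, generic Cloud Foundry manifest – the application name and sizing values are illustrative assumptions.

        # manifest.yml -- minimal Cloud Foundry application manifest
        applications:
        - name: demo-app      # illustrative name
          memory: 256M        # memory per instance
          instances: 2        # run two instances behind the router

        # Deploying is then a single CLI command:
        #   cf push demo-app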

    This is the second announcement IBM has made this week of a major move to go after the developer market in Asia Pacific. Earlier this week it announced the launch of the first cloud region in India for SoftLayer, its Infrastructure-as-a-Service cloud, citing exploding growth in the country’s developer population.

    China is the other big tech market in Asia Pacific. The country is home to 10 percent of all of the world’s developers, IBM said, citing statistics by the market research firm IDC.

    6:55p
    Cisco Researchers Dismantle Key Distributor of Ransomware


    This post originally appeared at The Var Guy

    By Elizabeth Montalbano

    Cisco Systems has done its part to help rid the world of ransomware by striking a major blow to one of the largest exploit kits on the market for this type of security threat.

    The company’s Talos Security Intelligence and Research Group has disrupted a significant international revenue stream generated by the Angler Exploit Kit, which rakes in as much as $60 million a year, the company reported recently on its blog.

    “This is a significant blow to the emerging hacker economy where ransomware and the black market sale of stolen IP, credit card info and personally identifiable information are generating hundreds of millions of dollars annually,” Cisco threat researcher Nick Biasini said in the blog post.

    Angler is one of the largest exploit kits available on the market for creating and spreading malvertising or ransomware campaigns, the company said. This type of malware prevents users from accessing their systems, or from getting back data that was taken, until they pay a ransom through an online payment method.

    Angler is so dangerous because it’s designed to bypass security devices and attack the largest number of devices possible, making it the most advanced and concerning exploit kit on the market for ransomware, according to Cisco.

    Talos was able to take action against the kit by first determining that a very large number of proxy servers used by Angler were hosted on servers of the service provider Limestone Networks, Biasini said in the post. The primary threat actor in the scenario was responsible for up to 50 percent of Angler Exploit Kit activity, targeting up to 90,000 victims a day and generating more than $30 million annually.

    Key partners also helped Talos garner more intelligence about Angler activity on Limestone’s servers. Working with Level 3 Threat Research Labs, Talos also gained additional visibility into the global activity of the network, while a collaboration with OpenDNS provided a view into the domain activity associated with the adversaries, according to the post.

    Once it identified the malicious activity, Cisco took action against Angler through a number of steps. The company shut down access for customers by updating its products to stop redirects to the Angler proxy servers, Biasini said. It also wrote Snort rules to detect and block the kit’s health checks, rules that are being released to the community, he said.
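
    Snort rules describe traffic patterns to alert on or block. As a purely generic illustration of the format – not one of the actual rules Talos released; the URI string and rule ID are placeholders – such a rule might look like this:

        # Generic illustrative Snort rule; the content string and sid
        # are placeholders, NOT from the actual Talos release.
        alert tcp $HOME_NET any -> $EXTERNAL_NET $HTTP_PORTS \
            (msg:"Possible Angler proxy health check"; \
            flow:to_server,established; \
            content:"/health"; http_uri; \
            sid:1000001; rev:1;)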

    Other steps Cisco Talos took to thwart Angler activity included publishing the kit’s communications mechanisms, including protocols, so others can protect themselves and their customers. The company is also publishing indicators of compromise (IoCs) so that defenders can analyze their own network activity and block access to remaining servers used by Angler, according to Biasini’s post.

    This first ran at http://thevarguy.com/network-security-and-data-protection-software-solutions/101415/cisco-researchers-dismantle-key-distributor-

    8:10p
    Analyst: US Telcos Should Consider Selling Hosting Units


    This article originally appeared at The WHIR

    Investment banking firm Cowen and Company initiated coverage of CenturyLink with a call for the telco to sell its cloud-based web hosting business. The analysis is part of a broad examination of CenturyLink’s revenue streams and is representative of the telecom cloud provider segment more generally, according to Cowen analyst Gregory Williams.

    “The investor community has pressured the company to sell the hosting business,” wrote Williams. “However, management remains adamant that the colocation/hosting business is a key differentiator and integral growth driver of the story.”

    In giving CenturyLink a neutral rating and a target price of $28 per share, Williams notes that the company’s S&P-best 8.4 percent dividend is balanced against the need to stabilize revenue amid uneven performance across its business divisions. CenturyLink shares were up slightly in Wednesday afternoon trading at just above $26.

    “Since acquiring Savvis … the hosting product segment could be seen as a disappointment as the segment experienced elevated (customer) churn, a $1.1 billion asset write-down in 2013 and grew just 4.5% in 2014,” Williams said.

    He also noted that telecom companies have generally struggled in the cloud computing field, and that the two larger US telecoms, Verizon and AT&T, could also sell their “colo/hosting business assets.” Verizon jumped into the cloud business with the 2011 acquisition of Terremark for $1.4 billion.

    Williams also points out that Congress could renew a tax provision that would allow CenturyLink and similar companies to deduct capital equipment purchases immediately instead of depreciating them gradually, which would reduce their tax burden.

    Time will tell whether CenturyLink management is correct about the importance of its cloud-based hosting services, or whether the investor community is right that it should leave hosting to the more IT- and networking-focused companies that seem to be having more success. For now, CenturyLink appears fully committed to the hosting business.

    Savvis was merged into CenturyLink in a $2.5 billion deal completed in 2011, and the Savvis brand was retired in early 2014. Since then, CenturyLink has grown its hosting and cloud services portfolio with several product launches and acquisitions, including the April purchase of NoSQL DBaaS company Orchestrate. CenturyLink was even rumored to be a possible buyer for Rackspace last year.

    This first ran at http://www.thewhir.com/web-hosting-news/us-telecoms-could-consider-selling-hosting-units-analyst

