Data Center Knowledge | News and analysis for the data center industry - Industr's Journal
 

Wednesday, December 9th, 2015

    Time Event
    1:00p
    What’s Driving Change in the Colocation Data Center Industry?

    LAS VEGAS – The business of providing colocation data center services is changing in numerous ways and for different reasons. Customers are getting smarter about what they want from their data center providers. Enterprises are using more and more cloud services, and the role of colocation data centers as hubs for cloud access is growing quickly as a result. Technology trends like the Internet of Things and DCIM are affecting the industry, each in its own way.

    Some of these trends are having a profound effect on the competitive makeup of the market, where even some of the largest players are making big strategic changes and spending heavily on acquisitions to adapt to the new environment in which they do business.

    Bob Gill, a research director at Gartner, outlined eight of the most consequential current trends in the colocation industry at the research and consulting giant’s annual data center operations summit here this week:

    Bifurcation

    While some providers are branching out, adding higher-level services, such as hosting, cloud, interconnection, or managed services, others are sticking to what’s often referred to as “pure-play colo,” providing space, power, and cooling, and doing it well and at a low cost to the customer.

    Examples of the former would be ByteGrid, which went from pure wholesale to hosting, cloud, and managed services, or Digital Realty Trust, which after the acquisition of Telx has moved into the retail colocation and interconnection space in a big way. An example of the latter is DuPont Fabros Technology, which after dabbling in retail colocation last year has changed plans and refocused squarely on wholesale space and power.

    Flexible Design

    As colocation customers’ understanding of their workloads and data center needs gets more sophisticated, they want more capacity and density options from colo providers. Companies often want multiple power densities in the same space – something providers like Vantage Data Centers and DuPont Fabros have designed for.

    Customers also want more flexibility in capacity commitments. Aligned Data Centers, for example, launched this year offering colocation capacity customers can scale up or down and pay only for what they use.

    Growing Role of DCIM

    Data center infrastructure management software is having an impact on colo providers both internally and externally.

    Internally, providers use it to improve efficiency and resiliency, making more informed decisions about their infrastructure using empirical data. Externally, many use DCIM to provide customers with insights into their power usage.

    While a full-suite approach to DCIM is useful internally, colocation customers don’t need all the bells and whistles that come in a typical suite. An average customer doesn’t need 3D visualization of their environment, for example, but will find visibility into their power consumption very useful.
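    The customer-facing slice of DCIM described above boils down to aggregating circuit-level readings into per-customer totals. A minimal sketch of that idea follows; the customer names, circuit IDs, readings, and committed-capacity figures are all illustrative assumptions, not any vendor's data model or API:

```python
from collections import defaultdict

# Hypothetical per-circuit power readings (kW) a DCIM system might expose
# to colocation customers: (customer, circuit_id, kw).
readings = [
    ("acme", "pdu-1a", 3.2),
    ("acme", "pdu-1b", 2.9),
    ("globex", "pdu-2a", 4.5),
]

def power_by_customer(readings):
    """Aggregate circuit-level draw into a per-customer total (kW)."""
    totals = defaultdict(float)
    for customer, _circuit, kw in readings:
        totals[customer] += kw
    return dict(totals)

def over_commit(totals, committed_kw):
    """Flag customers drawing more than their committed capacity."""
    return {c: kw for c, kw in totals.items()
            if kw > committed_kw.get(c, float("inf"))}

totals = power_by_customer(readings)
alerts = over_commit(totals, {"acme": 5.0, "globex": 6.0})
```

    A real deployment would pull these readings from metered PDUs on a polling interval, but the aggregation-and-threshold logic is the core of the power visibility an average colo customer actually uses.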

    Mergers and Acquisitions

    The colo industry has never been a stranger to M&A, but this has been an especially acquisitive year. Some blockbuster deals went down, including Equinix’s purchase of TelecityGroup, and Digital Realty’s Telx deal. Smaller examples include the acquisition of Carpathia Hosting by QTS Realty Trust, and CyrusOne’s entry into the New York market by buying Cervalis.

    Acquisitions have both positive and negative effects, according to Gill. On one hand, they create more opportunity for providers, expanding their reach both geographically and in terms of product selection.

    At the same time, big acquisitions sometimes create gorillas in certain markets. Equinix, for example, will become the biggest player in Europe by far if the Telecity deal is completed successfully, and it will be nearly impossible for any company to challenge its dominance.

    In some cases, companies make acquisitions to capitalize on a single strength of the acquired company, discounting the rest. And, of course, acquisitions always bring about changes in procedures and priorities.

    Changing Face of the Buyer

    Colocation providers increasingly find themselves talking to people in different roles within customer organizations. Instead of facilities staff, a data center provider rep may have to pitch to a cloud architect, for example.

    Since colocation is becoming a major access point to cloud services, people within customer organizations who oversee cloud strategy are involved in colocation decisions.

    Edge Computing

    Cloud service providers and digital media companies need to store more and more data in population centers that haven’t traditionally been major data center markets. This is driving growth in the edge data center space.

    Providing services in edge markets, however, is a complex business model, since it requires extensive connectivity to and from the data center: internal interconnection as well as WAN capacity.

    Edge data center providers also need to attract a critical mass of last-mile ISPs, long-haul carriers, and content providers to make the model work.

    Internet of Things and Big Data

    What do you call all the IP-connected temperature and humidity sensors, power meters, and CPUs that transmit operational server data if not the Internet of Things? Internally, IoT for data center operators means instrumenting the facilities and analyzing data from that instrumentation to make more informed operational decisions.

    Externally, colocation facilities can act as aggregation points for clients’ IoT data, as well as places where processing of that data takes place.

    Cloud

    “Cloud and colo are natural-born allies,” Gill said. “The cloud has to live somewhere.” While companies like Amazon and Microsoft build their own data centers to house much of the infrastructure that supports their cloud services, they are also massive colocation customers, using data center providers to expand in places where for one reason or another they don’t build their own facilities.

    Colocation data centers are also becoming hubs where enterprises access a multitude of cloud service providers. Enterprises seldom use a single cloud provider; most use different providers for different things, and instead of connecting to those providers from their on-premises data centers, it’s a lot easier to access them from a single colocation site where their infrastructure already sits.

    4:00p
    Third-Party Data Center Market Heating Up in Nordic Region

    The third-party data center market in the cool Nordic region is about to heat up, according to a new report by BroadGroup, which covers Iceland, Norway, Sweden, Finland, and Denmark.

    Over the next three years, an estimated €3.3 billion will flow into the region with nearly half coming from overseas internet players such as Google, Apple, Yandex, and Facebook. The influx of new business will also increase space by two and a half times and triple power requirements from current levels by the end of 2017.

    Denmark specifically is benefiting hugely from the construction of Apple’s new data center in Viborg, central Jutland.

    According to the report, reduced taxes, an abundance of low-cost renewable energy, investor incentives, and a highly educated workforce combine to attract business to the region. While the majority of companies delivering colocation, hosting, and cloud services in the Nordic region are telcos, that will likely change as new players come onto the scene through 2017.

     

    5:28p
    Where Will Your Data Center Be in 20 Years?

    John M. Hawkins is VP of Corporate Marketing and Communications for vXchnge.

    Twenty years in technology may seem more like 100 years when compared to other industries. In just one year a company’s landscape can change significantly. Think about how businesses scale and operate on a functional level, then add in changing technologies along with the exponential increase of data and dynamic content needed to drive business.

    Will your data center strategy survive 10, or even 20, years? Will the company grow as your key stakeholders expect? If so, you may need multiple, strategically located data centers just to handle your requirements. On the other hand, your CFO, who is responsible for how much is actually spent on data centers, might have a more conservative estimate.

    In addition to size, you have to consider whether your data center(s) might become obsolete in 5, 10, 15, or 20 years.

    According to Pitt Turner, executive director with the Uptime Institute, there is no set lifespan to a data center. “A data center that is designed with flexibility really doesn’t have a life expectancy,” says Turner. “Over the life of the data center, you need to replace the capacity components just like you replace the tires on your car.” Turner further states, “Chillers, UPSes, that sort of stuff, needs to be replaced and you have to have a data center infrastructure that will allow you to do that.”

    Software-Defined Data Center Platforms

    The Internet of Things and the cloud will no doubt have a big impact on the data center of the future. We will see this throughout the layers of the data center stack, from the carriers, up through the facilities, and on to the data and business layers.

    In the future we will see a tighter cooperation and coupling between the logical layers in a data center. So rather than each of the layers (i.e., carrier, facilities, hardware, software) working in silos, we will see more integration and ultimately the need for more orchestration and automation.

    Network Optimization

    Data centers are the physical locations the cloud (IaaS, PaaS, and SaaS) calls home. There is a technology gap that will need to be closed to support the needs of tomorrow’s businesses.

    The current challenge is that a network is like a chain: the weakest link can cause performance issues even in the most finely tuned network. The networks of tomorrow will need to leverage multiple carriers to provide network redundancy.

    These backbones will most likely be stretched Layer 2 networks optimized for speed and security, providing low-latency, high-speed connectivity for applications. Looking down the road, these technologies will become mainstream.

    Automation is the Key

    Many of the manual tasks we see in today’s data centers will be completed through automation. This could be assumed under the software-defined data center paradigm; however, the type of automation we will see in data centers has more to do with automating business-related tasks and activities. Process engines will continue to get smarter, to the extent that they will be able to handle complete decision trees – almost like an artificial intelligence for the data center.

    These automation process engines, combined with newer optimized networks, will deliver integrated redundancy and failover capabilities unseen today. This will provide more and more flexibility as companies build their N+1 data center “edge” strategies.
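    The automated failover Hawkins anticipates amounts to a decision tree over path health. The sketch below illustrates that idea in the simplest possible form; the carrier names, latency figures, and threshold are illustrative assumptions, not any provider's actual routing logic:

```python
# Minimal failover decision sketch: route over the primary carrier while
# it is healthy, otherwise pick the lowest-latency healthy backup.
# Carriers, latencies, and the threshold below are illustrative only.
CARRIERS = [
    {"name": "carrier-a", "role": "primary", "latency_ms": 12.0, "up": True},
    {"name": "carrier-b", "role": "backup",  "latency_ms": 18.0, "up": True},
    {"name": "carrier-c", "role": "backup",  "latency_ms": 15.0, "up": True},
]

def choose_path(carriers, max_latency_ms=50.0):
    """Return the carrier to route over."""
    # Discard paths that are down or too slow to be usable.
    healthy = [c for c in carriers
               if c["up"] and c["latency_ms"] <= max_latency_ms]
    if not healthy:
        raise RuntimeError("no healthy path available")
    # Prefer the designated primary when it survives the health check.
    for c in healthy:
        if c["role"] == "primary":
            return c
    # Otherwise fail over to the fastest remaining backup.
    return min(healthy, key=lambda c: c["latency_ms"])
```

    In practice a process engine would feed this decision from live probes rather than static numbers, and would also handle fail-back once the primary recovers, but the decision-tree shape is the same.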

    Consider Location

    Application developers are leveraging federated design models to build applications, and that is having an impact on data center design and selection. For example, instead of storing massive amounts of data in a few select data centers, application providers are moving their applications to “the edge” – locations where they can serve customers locally and reach more businesses and consumers in more markets – to reduce latency and achieve higher data transfer rates.

    Having a data center partner that has reliable and resilient infrastructures strategically situated in multiple areas around the country will help keep data safe and processing properly.

    Planning For the Future: Data Center as a Service

    Another way of looking at the future of your data center is to consider your data center as a service. Here, you don’t have to allocate capital to buying real estate. You simply use a colocation provider that has already made the investment – and has a plan to maintain the data center now and many years from now, offering flexibility to support future growth.

    Data center as a service also offers plenty of added value via in-house services, as well as access to partner services your company can easily plug into. In addition, data centers with a diverse and abundant carrier ecosystem will offer more options than facilities that lack one. Today, more cloud, hybrid, and even on-premises applications rely on a dual-carrier network strategy.

    Conclusion

    You really have two options when considering your long-term data center strategy. First, you can use your own research to estimate the needs of your company, taking on a huge capital expense by building your own data center.

    The second option is to consider a colocation service provider that has already done the research, is building and innovating, offers a variety of locations, services, and products to leverage, and is optimized and ready for growth. This makes the provider responsible for upgrading expensive equipment as needed and for keeping the data center future-proof.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    5:42p
    Emerson: Half of Operators to Upgrade Data Center Cooling Systems Next Year

    The need for greater energy efficiency and more capacity has put cooling systems high on the list of priorities in 2016 for IT, facilities, and data center managers in North America, according to new research from Emerson Network Power. Results show that before the end of next year, more than half of all data center cooling systems will be upgraded, according to Emerson.

    That’s on top of the 40 percent of respondents who already did so in the past five years and another 20 percent in the process of doing so. While many are upgrading voluntarily, a combined 39 percent said the need to meet state energy codes or Energy Star/LEED requirements was the catalyst.

    The size of the data center seems to matter: 62 percent of the upgrades will occur in data centers under 10,000 square feet and 18 percent in those larger than 50,000 square feet. Inefficient cooling systems are an especially widespread problem in smaller data centers.

    “Reliable performance and efficiency have always been critical to large data center performance. As the edge and cloud computing become ubiquitous, ensuring the health of cooling systems at smaller, localized data centers and computer rooms is crucial. Thermal upgrades are allowing companies to improve protection, efficiency and visibility within all these spaces,” said John Peter Valiulis, VP of thermal management marketing for Emerson in North America, in a press statement.

    6:00p
    IBM Boosts Video Capabilities With Clearleap Acquisition


    This article originally ran at Talkin’ Cloud

    IBM announced on Tuesday that it has acquired cloud-based video services provider Clearleap. Financial terms of the deal were not disclosed.

    IBM will integrate Clearleap into its IBM Cloud platform in order to offer enterprises, educational institutions and government organizations a way to “manage, monetize and grow user video experiences” while delivering them securely across devices, according to a press release. Organizations can build video into applications with Clearleap’s open API framework.

    The Clearleap video platform is optimized for massive scalability, allowing clients to support millions of concurrent users within seconds, IBM said, which is critical when supporting special events. A recent report found that the cloud-based video conferencing services market could be worth $2.9 billion by 2020, which marks a clear opportunity for IBM.

    “Clearleap joins IBM at a tipping point in the industry when visual information and visual communication are not just important to consumers, but are exploding across every industry,” Robert LeBlanc, Senior Vice President, IBM Cloud said in a statement. “This comes together for a client when any content can be delivered quickly and economically to any device in the most natural way.”

    Clearleap’s video services will be offered through IBM Cloud data centers around the world. IBM will offer the Clearleap APIs on IBM Bluemix in 2016 for clients to easily build new video offerings.

    “With consumer demand for video growing exponentially, the business of creating compelling and personalized video experiences is booming. This makes the acquisition by IBM, a global leader in technology, a perfect fit,” Braxton Jarratt, CEO of Clearleap said in a statement. “As a part of IBM, we can extend the capabilities and global reach of the Clearleap innovations to grow and scale like never before.”

    This first ran at http://talkincloud.com/cloud-computing-mergers-and-acquisitions/ibm-boosts-video-capabilities-clearleap-acquisition

