Data Center Knowledge | News and analysis for the data center industry
 

Tuesday, June 2nd, 2015

    11:00a
    Telco Consortium Buys Cloud Provider Codero

    A consortium of 32 regional telecom and broadband providers called BLM Acquisition Corp. has acquired cloud service provider Codero.

    Financial terms of the deal have not been disclosed, but Codero CEO Emil Sayegh said it was at an above-average multiple. However, this latest cloud acquisition isn’t just about equity.

    “These entities in their own right are technology companies with presence across the United States,” said Sayegh. “They will be a channel partner, putting Codero services closer in terms of transit, and in terms of data center footprint.”

    Codero not only gains the equity needed to expand, it also gains several established customer pools in underserved markets to expand into. A network of strategic investors with “boots-on-the-ground” knowledge of communications and technology needs will help guide the company going forward.

    Codero customers will have enhanced connectivity options, and Codero will gain access to one of the largest combined fiber networks in the U.S., all facilitating the delivery of latency-sensitive applications.

    Long-Time Investor Catalyst Partners Exits

    Catalyst Partners has been the company’s backer for nine years – a long time, considering most private equity firms look to exit with a profit within five years.

    “They’ve been an excellent partner, but it came to a point where the investment has been one of their oldest funds, and it needed to be closed,” Sayegh said. “We were looking for another investment partner, not necessarily to get acquired. We were looking to take some of Catalyst’s equity out. PE funds are limited and regulated [in their ability] to fund a current investment from another fund. As we wanted to expand to Europe, we needed another source of capital.”

    Cloud Transforming Hosting Industry

    Codero’s roots are as a dedicated and managed hosting company. Sayegh took the helm in 2012 and helped the company transform and turn around as the dedicated-hosting industry was slowing. Automation transformed what used to be a time-consuming rack-and-stack physical server provisioning process into cloud provisioning. Dedicated hosting is the father of bare-metal cloud.

    Amidst a sea change in the hosting industry, as it evolves toward cloud, there has been a lot of “rollup” and consolidation. Telecoms and cable companies have made big cloud acquisitions, such as Peer 1’s acquisition by Cogeco, for example, to form the basis for a cloud play. There have also been several consolidation deals where two companies joined forces, such as Datapipe’s acquisition of GoGrid.

    The Codero deal is unique in that many of the stakeholders have services businesses themselves, and will essentially “franchise” Codero services in their data centers.

    The consortium consists of regional telecoms serving various audiences across the country that are often overlooked by large service providers. Individually, these are small regional plays, but Codero is a unifying front. Combined, the consortium’s subscriber numbers are immense. One consortium member is itself a consortium of eight or so telecoms.

    The acquisition makes Codero a major cloud play in cities that aren’t core markets and in smaller and rural towns. Codero is in a good position to provide edge cloud services to local businesses in towns and regions you wouldn’t necessarily think of.

    Codero will be putting hardware in consortium members’ facilities. The company can expand in a variety of ways, from having a relay at a local central office to putting a large footprint at a customer location. Some members may choose to offer certain Codero services at a given location, and some will house mini Codero data centers inside their own data centers. Sayegh said the company has designed a very repeatable footprint.

    Sayegh will be chairman of the board and retain his position as CEO and president of Codero.

    12:00p
    Equinix: We Didn’t Want to Stay Number-Two in Europe Forever

    Had it allowed the Interxion-TelecityGroup merger to go through, Equinix could have ended up forever relegated to a number-two position in the European data center market, which wasn’t a future the company’s leadership wanted.

    For one, had the Redwood City, California-based data center services giant watched the planned merger between two of its biggest European competitors close, antitrust regulations would have made it nearly impossible to buy the combined entity, Equinix CFO Keith Taylor said in an interview with Data Center Knowledge. It would also have opened up the possibility of another competitor snapping up the combined giant, likewise making a leadership position in the region nearly unattainable.

    Playing Both Offense and Defense

    Equinix’s offer to buy London-based Telecity for $3.6 billion in cash and stock, which was announced last week, and which blocked Telecity’s previously announced merger with Interxion, was Equinix playing both defense and offense, Taylor explained. It was defensive in the sense that it served to block creation of an unchallengeable giant in the European data center market. It was offensive because it would add a huge number of strategically valuable assets to the U.S. colocation and interconnection company’s portfolio.

    If completed, the deal will take Equinix into seven new countries and greatly expand its presence in some key European markets where it already operates. It will broaden the provider’s ability to go after software developers and cloud service providers in Europe, which will in turn increase activity on its worldwide interconnection platform – a cornerstone of its business strategy.

    “That’s just perfect for our acquisition,” Taylor said. “It’s intrinsically value-positive. That was the right deal for us to do.”

    Building Out European Data Center Empire

    The seven new countries are Ireland, Italy, Sweden, Finland, Poland, Bulgaria, and Turkey. Equinix views Telecity’s data centers in Dublin, Milan, Stockholm, and Helsinki as an opportunity to expand into key European data center markets, and considers the Telecity data centers in Warsaw, Sofia, and Istanbul “launch pads for future growth.”

    Another new market where the U.S. provider will gain instant presence is Manchester. Equinix has had lots of capacity in the London market for years, but never in the north of England.

    Still, London is central to the acquisition. Equinix’s assets in the London market are concentrated in Slough, a suburb west of the city, while Telecity has a massive data center presence in the Docklands, an east-London area. The five-data-center Docklands campus is a crown jewel in its portfolio, housing a big chunk of the infrastructure for the London Internet Exchange and a plethora of financial services companies, cloud providers, and customers in other business verticals, such as healthcare, energy, and mobile.

    “Having access to the Docklands, which is a major network point in the London market, is going to be very attractive for a company like Equinix,” Taylor said.

    Amsterdam, Frankfurt, and Paris are other core markets where the deal would substantially increase Equinix’s existing footprint.

    The deal is subject to regulatory review and approval. Taylor said the company expects to close the transaction in the second half of fiscal 2016.

    By the numbers:

    [Chart: Equinix-Telecity combined, by the numbers. Chart courtesy of Equinix.]

    Interxion Deal Likely on Horizon

    All eyes are now on Interxion, the Netherlands-based service provider with 34 data centers across 11 European countries that’s ripe for takeover by a rival. But that rival won’t be Equinix. “I don’t think you get to buy both,” Taylor said. “I don’t think, from a regulatory perspective, acquiring that entity is something that would be easily done.”

    By agreeing to acquire Telecity, Equinix has put an Interxion merger with another company into play. “It is our expectation that it would merge with somebody else,” he said.

    3:30p
    Debunking the Myths of the Open Source Community

    Rafael Laguna is CEO and a co-founder of Open-Xchange.

    Once a particular belief or habit has been hammered into the mind, it’s difficult to shake it. That’s especially the case in the open source community, where there are ingrained perceptions around open source software development and its leaders.

    The Linux operating system is the most popular open source software in the world and has been ported to more computer hardware platforms than any other operating system. Readers will know the story of the underdog that rose to become the world’s leading server operating system. Android, a Linux derivative, has caused a particular stir in recent years, running on two out of three tablets and 75 percent of all smartphones.

    The Linux Foundation has published an interesting document about Linux kernel development. Since 2005, 11,800 developers from 1,200 different companies have worked on the Linux kernel. More and more paid professionals are working on developing the Linux platform; at least 88 percent of the recent improvements have come from full-time developers. Among the companies that contribute most to the Linux kernel are hardware manufacturers such as Intel, IBM, Samsung, AMD, and Nvidia, as well as software companies like Red Hat, Oracle, and SUSE. These companies have earned good money for years with Linux systems and invest accordingly in its development.

    It’s time we debunk these myths around the open source community.

    Open Source is More Than Linux

    Open source software is far more than just Linux. The internet is principally based – fortunately for us all – on open source software. This includes the Apache HTTP server project, the e-mail server Dovecot, the DNS servers BIND and PowerDNS, and the MySQL/MariaDB databases.

    Although many of these are crucial components of the modern internet, their development is sometimes dependent on the work of a few. This is most obvious in the encryption software GnuPG, which was principally authored by renowned German developer Werner Koch. The project’s continued funding seemed uncertain before February of this year, but now happily seems assured.

    Other software projects are organized under charitable foundations, such as the Apache Foundation or the Document Foundation, which focus on generating public donations; this ultimately results in continuous funding from the IT industry.

    Open Source as a Business Model

    A number of companies in recent years have managed to build a sustainable business around open source software. Admittedly, the obvious example, Red Hat, has long shone the brightest of all the stars in the open source community.

    However, the recent displacement of client-server architectures by internet-based services has offered tremendous opportunities for new business. Open source software guarantees interoperability through compliance with open standards, and that adds huge value for the user in the form of cost advantages, competition, innovation, speed, vendor independence, and investment security. Cloudera is the first of this new generation of software providers to generate more than $100 million in annual turnover. Other success stories that have built a business model on the back of open source software include the American powerhouses Hortonworks, MongoDB, and Docker.

    We in Europe and Germany also have a few hidden gems in the open source business: the MariaDB database, based in Scandinavia, has a good chance to repeat the successes of MySQL. And under the umbrella of Open-Xchange, after the recent mergers with Dovecot and PowerDNS, a heavyweight in e-mail, collaboration, and office software is growing.

    The Future of Open Source

    When it comes to open source software, it is clear that the community is worth a closer look. It has been a long time since open source was nothing but some “free software” on a few thousand workstations in Munich. It’s clear that the time of open source software in the cloud has arrived – and the open source community is much broader than we think!

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    4:33p
    Second Google Data Center Coming to Singapore

    Google is building a second Singapore data center for an estimated $380 million. The facility will round out the company’s data center investment in the country to about $500 million.

    It will be constructed on a plot of land next to the current site. The second, multi-level Google data center is expected to open in mid-2017, with the company still in the design stage. Google has provided an early, but not final, rendering of the future facility.

    Google’s Joe Kava, vice president of data centers, wrote in a blog post that the new data center will be built on the same principles as the first facility.

    “It will be one of the most efficient and environmentally friendly sites in Asia, using 100 percent recycled water for its critical operations (i.e. just about everything other than drinking water),” wrote Kava.

    Google will continue to work with the community through programs and bi-annual grants.

    The first Google data center in Singapore was launched in 2013 at a cost of $120 million. In the last year and a half, more than 400,000 Singaporeans went online for the first time, wrote Kava. Smartphone penetration is up, and so are internet access speeds. The same trends are occurring throughout Asia Pacific.

    The other Google data center in Asia is located in Changhua, Taiwan. The land was purchased in 2011 and the $600 million data center opened in 2013. Google made another $66 million investment in the site earlier this year.

    Singapore and Taiwan are both high-tech hubs with reliable infrastructure. Their governments support innovation and foreign investment.

    Google initially had plans for a Hong Kong data center as well, but the project was scrapped.

    5:00p
    Nokia Launches Data Center IT Package for Telco Cloud

    Nokia has launched a cloud infrastructure package for telcos called Nokia AirFrame Data Center Solution. It is meant to enable telcos to set up localized, or “edge” data centers running on Intel servers to provide cloud infrastructure that addresses rising demand for mobile traffic.

    The first wave of telco clouds was about taking core applications to a virtualized environment in centralized data centers. The second iteration is more distributed, delivering services in closer proximity to users while addressing demand for higher processing power.

    IT and telco domains are merging with a new breed of telco-focused clouds. Nokia said its cloud employs advanced telco cloud security practices, which have been tested and approved at the Nokia Security Center in Berlin. As telcos move from centralized to distributed models of serving customers, there are security, orchestration, automation, and lifecycle management issues.

    Nokia has built pre-integrated racks, called AirFrame Cloud Servers and Switches, consisting of ultra-dense servers, high-performance switches, and software-defined storage. These modules include Nokia Networks-specific enhancements to make them more efficient at running Virtualized Network Functions. Ideally, they can be installed next to base station gear to connect local callers to wider voice and data networks.

    AirFrame is complemented with a suite of professional services geared for implementing, monitoring, and operating a telco cloud data center.

    The more flexible and distributed cloud architecture enables mobile operators to serve locally, reducing latency. The ultra-dense servers can also handle demand for higher processing needs. The offering is 5G-ready, according to the company.

    Nokia believes it is merging the IT and telco domains with the offering. The distributed cloud is architected to run data-demanding telco applications like mobile network Virtual Network Functions (VNFs) but is also fully compliant with IT standards, so common IT applications can run in parallel with telco cloud workloads.

    “We are taking on the IT-telco convergence with a new solution to challenge the traditional IT approach of the data center,” said Marc Rouanne, executive vice president, Mobile Broadband, Nokia Networks, in a press release. “This newest solution brings telcos carrier-grade high availability, security-focused reliability as well as low latency, while leveraging the company’s deep networks expertise and strong business with operators to address an increasingly cloud-focused market valued in the tens of billions of euros.”

    The move puts Nokia in competition with other network-focused cloud offerings being developed to help telcos better serve customers locally.

    Mobile competitor Ericsson is developing similar data center services. It has a cloud for telcos in the works, built atop OpenStack. The company has also invested in SDN providers like startup Pluribus and purchased data center management software firm Sentilla last year.

    Behind Nokia’s cloud is a lot of research and development. Nokia also announced it opened an R&D facility for data center technology development.

    5:30p
    HP Lowers Cost of Flash for Data Center Storage

    Pursuing an all-flash data center storage future, HP announced at its Discover conference in Las Vegas this week innovations in its 3PAR StoreServ storage line, including a new class of massively scalable flash arrays and flash-optimized data services.

    After a banner 2014 for HP in the solid-state arrays market, the company is hoping to push more enterprises into flash and realize an all-flash data center by lowering the cost of the medium to $1.50 per gigabyte.

    This week the company launched the 3PAR StoreServ Storage 20000 enterprise flash family of Tier I all-flash arrays for data center storage with increased speed and decreased footprint requirements. All of the StoreServ models are built on a single architecture with one operating system and interface, and offer a common set of enterprise data services.

    Noting that $1.50/GB is just the beginning, HP says that new 3.84-terabyte SSDs are further enhanced with its flash optimization software and hardware-accelerated data compaction that increase usable capacity by 75 percent. The cost reduction is due in part to the falling price of new cMLC SSDs. HP says the new 20850 model boosts performance with over 3.2 million IOPS at sub-millisecond latency and over 75 GBps of sustained throughput, and that the new 20800 converged flash array is able to scale up to 15 petabytes of usable capacity.
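    The quoted numbers imply a simple effective-cost calculation: if compaction lets each raw gigabyte hold 75 percent more usable data, the cost per usable gigabyte falls proportionally. A minimal sketch of that arithmetic (the function is ours; only the $1.50/GB and 75 percent figures come from HP):

    ```python
    def cost_per_usable_gb(raw_cost_per_gb, capacity_gain_pct):
        """Effective $/GB once data compaction stretches raw capacity.

        A 75 percent gain means each raw GB holds 1.75 GB of usable data,
        so the effective cost is the raw cost divided by 1.75.
        """
        return raw_cost_per_gb / (1 + capacity_gain_pct / 100)

    # HP's quoted $1.50/GB raw, with the claimed 75 percent capacity gain:
    print(f"${cost_per_usable_gb(1.50, 75):.2f} per usable GB")  # prints "$0.86 per usable GB"
    ```

    In other words, the headline $1.50/GB works out to well under a dollar per gigabyte of data actually stored, before any further SSD price declines.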

    The new 3PAR line supports unified block and file workloads as well as object access, and is powered by the new Gen5 Thin Express ASIC for data validation. For environments that require extreme low-latency solutions, HP says it now provides the Alcatel-Lucent 1830 Photonic Services Switch, which, combined with 3PAR remote copy, will deliver synchronous replication over fibre infrastructure up to 130 kilometers with in-flight encryption.

    HP says it also enhanced native data center storage federation capabilities with new 3PAR Peer Motion support for non-disruptive, bi-directional data movement. This allows the creation of a storage federation across multiple storage arrays with up to 60 petabytes of aggregate usable capacity, according to HP. With 3PAR software exclusive to the StoreServ 20000 line, HP says that workloads can move between members of a federation to dynamically rebalance storage resources.

    The systems will ship in August 2015, with retail prices starting at $75,000.

    7:13p
    ClearDATA Gets $25M in Funding to Support Healthcare Cloud Growth Trend

    logo-WHIR

    This article originally appeared at The WHIR

    The abysmal state of healthcare data security is a huge opportunity for cloud providers focused on providing secure services to healthcare providers. On Thursday ClearDATA announced $25 million in Series C funding to support its continued growth. The company is in its fourth year of 100 percent year-over-year subscription revenue growth.

    ClearDATA intends to use the funding to maximize growth in the health information technology cloud computing sector. According to Frost & Sullivan estimates from October 2014, total US healthcare cloud market revenue stands at $900 million. MarketsandMarkets estimates the US healthcare cloud market will reach $6.5 billion by 2018. ClearDATA projects, “IT cloud penetration in the US healthcare industry will grow to 50 percent in the next four to five years, representing approximately $6.5 billion of a $13 billion market for multi-cloud computing and managed services.”

    Under this estimate, healthcare cloud would represent half of the overall cloud market.
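    The half-of-the-market figure follows directly from the projections quoted above; a quick sketch of the arithmetic (variable names are ours, the dollar figures are from the article):

    ```python
    frost_sullivan_2014 = 900e6   # US healthcare cloud revenue, Oct 2014 estimate
    projected_2018      = 6.5e9   # MarketsandMarkets / ClearDATA projection
    total_market        = 13e9    # quoted market for multi-cloud computing and managed services

    share  = projected_2018 / total_market         # healthcare's slice of the overall market
    growth = projected_2018 / frost_sullivan_2014  # implied growth from the 2014 baseline

    print(f"{share:.0%} of the market, roughly {growth:.1f}x the 2014 estimate")
    # prints "50% of the market, roughly 7.2x the 2014 estimate"
    ```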

    The difference in these numbers represents a huge opportunity for cloud providers choosing to focus on the healthcare industry. ClearDATA is clearly betting on this growth trend, which is driven both by increasing regulation and by security breaches. Just this year, Anthem and Premera were breached by hackers.

    “Security breaches, HIPAA and compliance regulations, and data itself are amassing at levels that quickly outgrow the average existing health IT infrastructure,” said Darin Brannan, CEO, ClearDATA. “Healthcare organizations recognize the need for a purpose-built healthcare exclusive provider to properly address these issues—and so do investors. We are excited to continue our momentum in the healthcare cloud market, which this new funding further accelerates.”

    The release says ClearDATA solves key market problems such as managed infrastructure with HealthDATA™ cloud and SaaS HIT cloud management. “ClearDATA secures and protects patient health information in a HIPAA-compliant environment for data interoperability, aggregation and access…providing a single interface to manage and monitor resources in real-time across multiple platforms.”

    This funding round was led by Heritage Group, HLM Venture Partners, and Flare Capital Partners.

    This first ran at http://www.thewhir.com/web-hosting-news/cleardata-gets-25m-in-funding-to-support-healthcare-cloud-growth-trend

    8:00p
    Google to Spend $300M on Second Atlanta Metro Data Center

    Google announced at a press conference today that it is undertaking a $300 million data center expansion in Douglas County, Georgia. It will be adjacent to the existing 500,000-square-foot Google data center in Lithia Springs, a town just outside of Atlanta. Construction will begin this summer, with estimated completion by 2016.

    This is the second Google data center construction project announced today, following an earlier announcement of a data center expansion in Singapore.

    There were initial rumors and reports of a potential expansion there in March. The facility will create an additional 25 data center jobs, with that figure expected to grow. Google currently employs over 350 people at the Douglas County facility.

    “Data centers are the engines of the internet, and as the internet grows, our data centers are growing too,” Jason Wellman, Google data center operations manager, said in a press release. “Douglas County and the state of Georgia have been excellent partners, enabling us to grow our presence in the state. This expansion will allow us to continue to provide fast and reliable service to millions of people around the clock.”

    Douglas County officials recently approved a package of tax breaks for the project to entice the Mountain View, California-based internet giant to commit to the build, according to a local news report. State and local government officials use tax breaks to attract data center construction projects, which are seen as strong drivers of local economic development.

    Governor Nathan Deal was present for the event, calling Google “one of the magic ingredients” that helps lure firms to Georgia, according to a quote in the Atlanta Journal-Constitution. Deal said that economic development prospects often cite the availability of high-speed internet in the metro Atlanta region.

    “Much like Google, Douglas County continues to rapidly grow — evolving from a rural area to the economic hub of west Georgia,” said Tom Worthan, chairman, Douglas County Board of Commissioners, in a statement. “The expansion of Google’s data center demonstrates our commitment to growth and innovation. We are excited that the Google team has chosen to continue to invest in our community by expanding its home here with us.”

    Google has continuously expanded its data centers, which form the backbone of current and future services. The company has 13 data centers worldwide.

    The first Atlanta metro facility is one of a handful of Google data center sites that use recycled water for all of their cooling needs.

    Google said it helped provide $2.76 billion of economic activity for 62,000 Georgia businesses in 2014, according to Atlanta Business Chronicle.

    There are 100 billion Google searches every month, according to the company. In 2008, Google knew of one trillion web addresses; now it knows of 60 trillion.

