Data Center Knowledge | News and analysis for the data center industry
Wednesday, December 3rd, 2014
1:00p | Can CoreOS Win the Container Fight It Has Picked?
To say that CoreOS CEO Alex Polvi’s blog post Monday about a new application container standard the company had in the works caused a stir would be an understatement.
The co-founder of one of San Francisco’s current IT startup darlings said Docker, the app container technology his company had so devotedly supported all along, was fundamentally flawed. Docker, itself a San Francisco startup darling, has enjoyed broad support throughout the industry.
In an interview, Polvi played down the assault-on-Docker aspect of his post, saying his team was simply addressing a technological problem. But that’s not how Docker devotees (and there are now a lot of them) took it.
“I’m not saying [CoreOS] is wrong, but that blog post is aggressively worded and seems to be intended to create uncertainty about Docker,” Mark Imbriaco, vice president of technical operations at DigitalOcean, wrote in a tweet.
DigitalOcean is a major cloud service provider that is very popular with developers. It has been a big supporter of CoreOS, adding native support for CoreOS images on its cloud in September.
“Worse: it’s a copy-paste of our own explicit design roadmap,” Solomon Hykes, Docker founder and CTO, tweeted back. Hykes was commenting on the design principles behind App Container and Rocket – the container standard and container runtime CoreOS is proposing.
In an interview, Joyent CTO Bryan Cantrill was direct, calling Polvi’s post “chancy,” “foolish,” “needlessly caustic,” and not “technically accurate.” Joyent is a San Francisco-based cloud service provider that has been integrating Docker into its own portfolio of services.
How “Broken” is Docker’s Security?
Docker, according to Polvi, has a “broken security model” because the entire platform is a daemon that runs as root. “Everything is in that one Docker daemon,” he said.
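For context on the architecture Polvi is criticizing: every Docker client operation is relayed to a single daemon process, reached through a Unix socket owned by root. A minimal sketch of that interaction, assuming a local Docker install with the default socket path:

```python
# A minimal sketch (default socket path assumed) of how every Docker client
# operation reaches the single root-owned daemon over its Unix socket.
import http.client
import socket

class UnixHTTPConnection(http.client.HTTPConnection):
    """An HTTPConnection that dials a Unix domain socket instead of TCP."""

    def __init__(self, socket_path: str):
        super().__init__("localhost")  # host is unused; the socket is the endpoint
        self.socket_path = socket_path

    def connect(self):
        sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        sock.connect(self.socket_path)
        self.sock = sock

# /var/run/docker.sock is owned by root; listing containers, creating them,
# pulling images: all of it funnels through this one daemon endpoint.
conn = UnixHTTPConnection("/var/run/docker.sock")
conn.request("GET", "/containers/json")  # Docker remote API: list running containers
response = conn.getresponse()
print(response.status, response.read()[:200])
```

The point CoreOS makes is that container creation, image handling, and networking all funnel through this one root process, so a compromise of the daemon is a compromise of everything it manages.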
His other problem with Docker is the company’s platform approach, its aim to build all kinds of tools into its Docker runtime – things like tools for launching cloud servers and clustering systems – instead of focusing on a simple and composable Docker container.
Docker’s management has consistently been upfront about its plans to build tools for the entire application lifecycle, but Polvi said his team was not aware of those plans until recently.
Cantrill dismissed the daemon issue as a fairly trivial problem to solve. The bigger security issue is not a Docker issue but a Linux issue, he said: Linux containers (distinct from Docker containers) were not designed for multi-tenancy, which makes them insecure.
“The OS-level problem, the kernel-level problem, is much more acute (than the daemon problem),” Cantrill said. Writing daemons that execute securely “is a solvable problem.”
It’s worth noting that Cantrill and Joyent have a horse in this race. The company’s cloud runs on its own operating system, and it has used the Docker API to build a tool that enables users to run Docker containers on that OS rather than on a Linux substrate, which Cantrill says is insecure in a multi-tenant situation.
Pivotal, Mesosphere Voice Support for Docker Alternative
CoreOS isn’t alone. Pivotal, the EMC-controlled software startup led by former VMware CEO Paul Maritz, has expressed support for App Container and Rocket. Pivotal engineers reviewed the App Container spec before CoreOS published it and responded positively, Polvi said.
Mesosphere, another well-known startup whose software pools disparate compute resources and presents them to applications as uniform clusters, has also expressed support for the App Container initiative. Mesosphere has been a big Docker supporter.
DigitalOcean’s Imbriaco didn’t necessarily dispute the points Polvi made, taking issue more with the tone of Polvi’s post.
We reached out to Docker for comment, but the company’s representatives could not respond in time for publication. Docker CEO Ben Golub posted his initial comments on Polvi’s post Monday.
Steep Climb Ahead of CoreOS
The attempt by CoreOS to introduce an alternative standard to Docker is doubtless a long shot, considering how widespread support for Docker already is, and how quickly the company and the open source technology gained that support (Docker has been around for less than two years). People like Cantrill don’t think Rocket or App Container stand a chance.
Yet CoreOS has also enjoyed a lot of support, often from the same people who support Docker. In fact, a lot of its success can be attributed to the success of Docker, and it has billed its Linux-based operating system as the best OS to run Docker on. But support for Polvi and his team’s proposal from top engineers at companies like Mesosphere and Pivotal is not something to ignore.
4:30p | Merging Old and New: Embracing the Hybrid Cloud
Moe Abdula is the Vice President of Cloud Software Product Management at IBM, where he provides direction for worldwide product development and go-to-market initiatives across the cloud portfolio.
These days, most CIOs have their head in the cloud. But if they’re leading substantial, established enterprises, they also have their feet in a well-designed, carefully secured data center.
CIOs across the board are feeling pressure – whether it’s from their CEO, their employees, customers, partners or the general shift in the market – to implement the latest cloud-based IT technology for efficiency, agility and economy. But they also need to take advantage of their existing IT infrastructure to maximize return on investment.
Enter the Hybrid Cloud
Leaders have found that adopting a hybrid cloud architecture can create the best of both worlds. They can cut costs by storing and sharing some data and applications internally in a private cloud, and they can nimbly develop new applications and store voluminous amounts of unstructured information for big data analytics in public clouds. They lease that capacity from cloud hosting companies that specialize in data management, while integrating these capabilities into their existing on-premises infrastructure. The key is to figure out which data and which applications fit best in which place, and then figure out where they need to interact.
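As a loose illustration of that sorting exercise, consider the following sketch of a placement rule. The categories and decisions are assumptions made for this illustration, not IBM’s methodology:

```python
# A coarse, hypothetical placement rule for hybrid cloud triage. The categories
# and decisions are illustrative assumptions, not a prescribed IBM methodology.

def place_workload(legacy: bool, regulated: bool, bursty: bool) -> str:
    """Suggest where a workload best fits in a hybrid architecture."""
    if legacy:
        return "dedicated servers"  # fragile, long-tuned apps stay on isolated hardware
    if regulated:
        return "private cloud"      # data under legal constraints stays in-house
    if bursty:
        return "public cloud"       # dev/test and big data bursts lease outside capacity
    return "private cloud"          # default: pooled internal capacity

print(place_workload(legacy=False, regulated=False, bursty=True))  # -> public cloud
```

Real placement decisions weigh far more factors (latency, integration points, cost), but the triage above captures the core point: sort the workloads first, then decide where the pieces need to interact.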
It requires careful planning to manage a private cloud and a third-party public cloud host. But for companies that want to get the benefit of new technology while still needing to provide bullet-proof continuity of operations, the old and the new need to work together.
Many established companies with significant IT infrastructure are making the decision to develop a hybrid cloud. For example, NiSource Inc., one of the largest natural-gas transmission companies in the U.S., recently said that it plans to move to a hybrid cloud.
Mixing the Old with the New
Companies today typically use three types of computing: dedicated servers in the corporate data center that run key applications; pooled resources in a private cloud in the corporate data center; and resources that are run by public cloud providers and accessed over the Internet. Mixing and blending any of these deployment models creates a hybrid cloud.
Most established enterprises can’t simply close their data centers and move IT to an outside provider. In some cases they have handcrafted, 20-year-old applications that are vital to services provided to a few key customers. Some ERP systems have been tuned over 20 years, and downtime would threaten corporate viability. Such applications may simply be too fragile to move. In those cases it will often be more cost-effective to keep legacy applications running on isolated, specialized servers rather than moving them to virtualized private cloud servers.
In other cases, there will be specialized data that is subject to regulatory or legal constraints. For example, some health care providers are reluctant to handle breast-cancer screening information outside their own walls. They don’t want to run afoul of the strict disclosure requirements for mammogram results that Congress mandated in a 1992 law, the Mammography Quality Standards Act.
When Two (or More) Become One
The data center has long been the symbolic heart of the CIO’s domain. For many years, corporate data centers continuously required more space and more electric power. But the virtualization revolution that started in 1998 meant many applications could run on a single server, and most companies have been consolidating workloads on servers ever since.
Building a private cloud in the corporate data center will cut costs and increase flexibility. In a private cloud, a single physical server can host virtual servers handling hundreds of workloads. Data center architects don’t need to dedicate storage devices to a single application. While some jobs still run on dedicated computers, CIOs treat most of the computing capacity in their data centers as a pooled resource – in effect, a private cloud that is allocated on demand.
Most companies are using some public cloud infrastructure as well. Sometimes IT departments have made a decision to use the cloud for testing or development of new applications. In other cases a marketing executive or a researcher has bypassed IT to use a new application, expensing the cost on a corporate credit card. In most companies, it’s important for the IT department to have a handle on all corporate data and to know what might be leaving company servers.
Embracing the Cloud Pays Off
There are ample positive reasons for CIOs to embrace the cloud. Last year IBM commissioned a survey of top executives at 800 enterprises around the world about a range of issues. That study revealed that the majority are using cloud to integrate and apply mobile, social and big data technologies. The cloud is paying off for those companies. Those with high cloud adoption are reporting almost double the revenue growth and nearly 2.5 times higher gross profit growth than peer companies.
Embracing the cloud makes it easier for the IT department to be seen as a partner in new initiatives for other departments. Historically, IT has often been seen as a barrier. Budget constraints and the need to buy and install hardware meant that any new project required months of activity by IT before operating groups could get access. When using a public cloud, a few days or weeks of coding by IT developers can be enough to launch a new product. Our survey found that 66 percent of organizations are using cloud to strengthen the relationship between IT and lines of business.
Adopting a cloud architecture isn’t an all-or-nothing proposition. The hybrid cloud is the route to cloud computing that will be most comfortable and cost-efficient for large enterprises with long-established IT departments and data centers.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
6:23p | DataBank Expands Kansas Data Center
DataBank continues to expand, adding a third pod, consisting of 4,500 square feet, in Lenexa, Kansas. The second pod in its Pine Ridge data center has been filling up, leading to an accelerated completion schedule for the third.
The data center was part of DataBank’s April 2014 acquisition of Arsalon Technologies. The site now has over 12,000 square feet of raised floor, and the expansion doubles the on-site utility power; an additional 1.5-megawatt generator was installed.
In the last few months, the company has also expanded data centers in downtown Dallas and north of Dallas, and a data center is on the way in Minnesota. According to the company, all recent expansions were on an accelerated schedule due to customer wins; the downtown Dallas site pre-sold half of its 22,000-square-foot expansion ahead of commissioning.
“Since expanding into the region this year, we have seen strong growth and interest in Kansas City,” said Tim Moore, DataBank’s CEO. “This investment reflects DataBank’s commitment to grow in this market. We believe Kansas City holds tremendous potential for us, and we are prepared to invest additional capital in the future.”
DataBank has two data centers in Kansas City from its Arsalon acquisition. Arsalon co-founder Bryan Porter was recently named DataBank’s CTO.
6:59p | Ciena Intros On-Demand Network Function Virtualization for Service Providers
Network technology company Ciena introduced a marketplace of on-demand network function virtualization (NFV) offerings from several virtual appliance vendors that lets customers try the functions on demand, without upfront payment. The software platform, called Agility Matrix, is a way for service providers to offer NFV without paying upfront for a block of NFV licenses.
Compute and storage have become flexible thanks to virtualization, and now the network is getting the same treatment. Networks remain rigid, but Ciena aims to change the business and consumption model of enterprise network connectivity. The model enables a more flexible experience for both the enterprise customer and the service provider, and gives the provider a network offering it can upsell over time.
Vendors are pushing to bring NFV to service providers. HP recently partnered with Wind River to sell NFV solutions to carriers. Intel (which owns Wind River) has partnered with Telefonica and Red Hat to build NFV solutions. Both partnerships are based on open source technology.
A service provider typically needs to pay upfront for a block of NFV licenses, which means expense before revenue. In the Agility Matrix model, service providers pay for licenses only as the enterprise customer consumes them, which lowers some of the financial risk of offering virtual functions.
Agility Matrix acts as a repository, or library, of offerings from several vendors, and makes physical appliances and different virtual functions all work within the on-demand billing model. The company is in trials with some service providers and said the response has been good.
One example of a virtual function is performance testing, which requires a large upfront investment and doesn’t need to run 24/7; Ciena’s marketplace makes it available on an hourly basis. Wide area network optimization, by contrast, will likely be charged on a monthly or yearly basis, since it’s used constantly.
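To make the consumption model concrete, here is a back-of-the-envelope comparison of an upfront license block against metered billing. All prices and usage figures are invented for illustration and are not Ciena’s actual pricing:

```python
# Illustrative arithmetic only: contrasts an upfront block of NFV licenses
# with usage-based billing. All prices and hours are invented assumptions.

UPFRONT_BLOCK_COST = 100 * 1_000.0  # e.g. 100 licenses at $1,000 each, paid before any revenue

def metered_cost(hours_used: float, hourly_rate: float) -> float:
    """Pay-as-used model: cost tracks actual consumption by the enterprise customer."""
    return hours_used * hourly_rate

# Performance testing runs occasionally, so hourly metering suits it...
perf_testing = metered_cost(hours_used=40, hourly_rate=25.0)  # $1,000

# ...while an always-on function like WAN optimization fits a flat monthly rate.
wan_optimization_yearly = 12 * 500.0  # $6,000 per year

print(f"Upfront license block: ${UPFRONT_BLOCK_COST:,.0f}")
print(f"Metered performance testing: ${perf_testing:,.0f}")
print(f"Flat-rate WAN optimization: ${wan_optimization_yearly:,.0f}/year")
```

The shape of the billing simply follows the shape of the usage, which is the financial-risk point Ciena is making.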
“The demand is to move away from the physical and to the virtual,” said Kevin Sheehan, vice president and general manager at Ciena Agility, a new division the company has also announced. “This shift is definitely happening.”
The new division will focus on software innovation and encompass Ciena’s existing Software Defined Networking solutions, the Agility Matrix solution announced today, as well as all future SDN and NFV development.
“We are transforming raw capacity into capability that delivers on-demand network-based services in a manner that mimics the ease and instantaneous nature of an online shopping experience,” said Sheehan. “The new Ciena Agility division is organized to quickly respond to the market demand for these customer-driven software solutions.”
Appliance vendors initially available on Agility Matrix are BlackRidge Technology, Brocade, Certes Networks, Silver Peak, and Spirent Communications. The partnerships are not exclusive, as the company is looking to create an ecosystem.
“NFV promises attractive benefits for providers and operators, but with considerable adoption risk,” said Eric Hanselman, chief analyst at 451 Research. “An approach like Ciena’s Agility Matrix could lower that risk while providing a direct path to revenue for both operators and Ciena’s VNF Market partners. Progress in NFV has been rapid and it’s operational implementations that will deliver greater value.”
Ciena expects the Agility Matrix solution to be generally available in the second calendar quarter of 2015.
8:35p | Healthcare.gov Contracts Extra 100 TB of Cloud Storage from Terremark to Handle Demand
This article originally appeared at The WHIR
Healthcare.gov is adding 100 terabytes of cloud storage to handle the increased workload during the current open enrollment period that runs until Feb. 28, 2015, according to a report on Tuesday by FCW.
According to contracting documents posted Monday on the FedBizOpps website, Verizon Terremark, the service provider for the Healthcare.gov website, is getting a $1.8 million modification to its existing cloud hosting deal. The modification includes storage space and accompanying licenses, according to FCW.
“The goal of this effort is to support the addition of 100TB of storage, three Layer 7 licenses, six 5GB Virtual F5 licenses, and two 3GB Virtual F5 licenses,” the contract said. “The items that are being added are in preparation for Open Enrollment 2015.”
The services, valued at $1.8 million, will run through March 31, 2015.
The storage shortfall was detected during readiness testing conducted in October 2014, which revealed that the “current storage is not sufficient to sustain operations through open enrollment.”
“Based on the program’s requirements to maintain data and to have it available upon request for successful enrollments, tracking, and auditing purposes, the storage needs to be increased to support notices and increased applicants, and additional servers.”
The Centers for Medicare and Medicaid Services said it would not have been able to predict the storage shortfall.
“The program is in its infancy and CMS has no historical evidence upon which to base systems performance parameters,” the contract said.
“Not having the additional storage in place and re-balanced to receive and store the data will cause major losses in the insurance enrollment information as well as program application capabilities.”
Healthcare.gov recently extended its agreement with Terremark in order to ensure a smooth transition to HP, the new service provider for the marketplace.
This article originally appeared at: http://www.thewhir.com/web-hosting-news/healthcare-gov-contracts-extra-100-tb-cloud-storage-terremark-handle-demand
9:05p | HP Launches Big Data Cloud Called Haven OnDemand
HP announced HP Haven OnDemand, which provides users cloud-based access to key components of HP’s Haven Enterprise analytics platform. Haven OnDemand runs on HP’s Helion cloud.
Haven is a big part of HP’s big data strategy. It was originally a bundle of HP analytics, hardware, and services targeted as an enterprise platform. Now the company is making key components of that platform available on-demand. HP is also leveraging Haven’s capabilities in its own software portfolio. Several services that use Haven as a cornerstone were also announced.
A cloud-based analytics-as-a-service offering built on HP’s Haven big data platform was first announced in 2013. The initial service provided an entry path for customers and focused on combining hardware and software in one package. The new OnDemand offering lets users tap into components à la carte, within minutes. The focus is less on an easy-to-use bundle for enterprises and more on easy-to-use components that can be set up almost instantly.
The biggest trend in big data is that the cloud is making it possible for a much wider swath of people to properly leverage and analyze massive amounts of data. Big data used to be the territory of only the biggest enterprises; now, several tech providers like HP are building big data clouds, opening analytics to anyone. Big data’s promise is not just insight into a business; it also opens the door to building unique functionality into applications that tap into that data and analytics.
Enterprise technology players have shifted cloud focus towards big data, which has long been a buzz term. Now IBM, HP, SAP, and others are trying to move analytics to the cloud, offering cloud big data tools.
Hosting and cloud providers are also getting in on the act. French hosting provider OVH is launching a big data cloud based on IBM systems. Rackspace recently launched a big data cloud as well.
Haven OnDemand analyzes all forms of data – business data, machine data, and unstructured, human information. Developers can use it to create “next-generation” applications and services that tap into data and analytics to provide unique functions that weren’t possible without that data.
There are two major components to Haven, both of which are now available on demand:
- HP Vertica OnDemand provides an extensive set of built-in analytic capabilities that the company says is simple to use. Customers can get it running in minutes, and it is also available as a managed virtual private cloud. It will be available next quarter.
- HP IDOL OnDemand is available now. IDOL OnDemand is a set of big data web services that developers, partners, and customers can use to build “next-generation” applications that can analyze a broad spectrum of data types, including images, social media, text, video, and more. Developers can use it to enable a broad range of functions, such as contextual search and face detection. The service has over 5,000 registered users to date.
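Since IDOL OnDemand is delivered as web services, calling a function such as face detection amounts to an ordinary HTTP request. The sketch below shows the general shape of such a call; the base URL, endpoint path, and parameter names are assumptions for illustration, not confirmed API details:

```python
# Sketch of calling an IDOL OnDemand-style face detection web service.
# The base URL, endpoint path, and parameter names below are assumptions
# for illustration; check the service's real documentation before relying on them.
import json
import urllib.parse
import urllib.request

API_KEY = "your-api-key"  # hypothetical credential
BASE_URL = "https://api.idolondemand.com/1/api/sync"  # assumed endpoint shape

def detect_faces(image_url: str) -> dict:
    """Ask the (assumed) face detection endpoint to analyze a public image URL."""
    query = urllib.parse.urlencode({"url": image_url, "apikey": API_KEY})
    with urllib.request.urlopen(f"{BASE_URL}/detectfaces/v1?{query}") as response:
        return json.load(response)

# Example (requires a valid API key):
# print(detect_faces("http://example.com/group-photo.jpg"))
```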
HP also announced that it will embed its Haven assets deeper into the HP Software application portfolio. The company released several new offerings that show how HP Software is leveraging Haven big data analytics:
IT Operations Management is a set of automation solutions that uses big data analytics to help organizations automate and optimize their applications and infrastructure technology. Haven helps automate and optimize IT application and system management for faster time to market, cost efficiencies, and improved customer experience.
HP’s new Intelligent Retention and Content Management combines HP hardware and software and allows global organizations to intelligently manage data throughout its lifecycle. Developed with Haven at its core, the solution allows customers to understand and analyze vast amounts of enterprise information. It brings together HP StoreAll, HP ControlPoint, and HP Records Manager, in addition to HP Haven analytics.
HP Application Defender, a cloud-managed application self-protection service, uses HP Haven analytics to provide visibility into production applications and actively defend them.
“To succeed in today’s marketplace, businesses must be able to leverage all forms of data, at high speed and in context, in order to capitalize on emerging opportunities and manage risk and costs,” Robert Youngjohns, general manager of HP Software, said in a statement. “With today’s announcement, we are making our unique Big Data platform more accessible and adaptable than ever before, giving customers, partners, and developers an unmatched set of assets that can help them create winning, data-driven businesses.”
9:36p | Report: Colo Business Thrives as Enterprises Move to Cloud
The colocation data center market is well insulated against the huge shift of enterprise IT workloads to the cloud, according to Synergy Research Group. The colocation industry is in fact thriving, the analysts said.
There have been quite a few reports suggesting that cloud is threatening or even killing the data center. These articles may be guilty of taking a contrarian position to gain attention, presenting near-dystopian images of massive Amazon Web Services and Google data centers containing all human data. Such reports surface every year.
Mad Money’s Jim Cramer famously advised getting out of data center stocks in 2009 and again in 2011 because a new technology (cloud) meant doom and gloom.
“Get out of the data center stocks,” Cramer told viewers. “I think the data center industry is in decline. I see an industry that’s about to be brought low by new technology, so I think you should sell, sell, sell.”
To paraphrase Mark Twain, reports of the data center’s death have been greatly exaggerated.
Quarterly earnings of data center providers suggest the wholesale move hasn’t happened; rather, cloud is prompting a general move toward, and comfort with, multi-tenancy (from software to facility).
Colo Revenues Rise Unabated
Worldwide retail colocation revenues continue to grow at around 10 percent per year, with the Netherlands, Germany, and the U.K. growing at above-average rates. China’s growth rate is more than double the average, according to Synergy Research.
The U.S., Japan, the U.K., and China lead in retail colocation revenues, and the seven largest country markets account for close to 70 percent of worldwide retail colocation revenue. The U.S. alone accounts for over a third of the retail colocation market, about equal to the next six largest country markets combined.
Equinix is the dominant player in three of those markets, and Interxion has a top-three ranking in three of the markets, notably in Europe, where it does well in several countries.
Each market has its own unique mix of provider types, said Synergy Research Group Chief Analyst John Dinsdale. However, Equinix has a sizable lead.
“Is anyone close to Equinix on a worldwide level? Not even remotely,” he said. “It is more than twice the size of the number-two ranked player (NTT) and more than three times the size of the following chasing pack (Verizon, CenturyLink, TelecityGroup, China Telecom, etc.).”
[Chart: The colocation market is growing 10 percent worldwide on average, with the U.S. making up a third of the market (Source: Synergy)]
The Porridge Goldilocks Chose
“The relative spend on (and prospects for) colocation, enterprise data centers, and cloud are all intertwined,” said Dinsdale. “Clearly enterprises are pushing more and more IT workloads onto the cloud, which diminishes their potential spend on their own data centers. Colocation is in an interesting middle ground. The growth of cloud is a big driver for colocation growth while trends in the enterprise are inhibiting growth in enterprise spend on colocation.”
Most of the spend on retail colocation doesn’t come directly from enterprises, but from various types of service providers such as cloud, IT, telcos, and content providers. In the U.S. the current split is just over 60 percent from service providers and just under 40 percent from enterprise, according to Dinsdale.
Service Provider Consolidation
There has been and continues to be a steady stream of blockbuster deals in the colocation data center market. The driving force behind acquisitions is to expand footprint and get into complementary cloud and service offerings. “There has been a little activity in the opposite direction (e.g. Equinix selling off a few non-strategic data centers to 365 Main, which has been renamed into 365 Data Centers), but the main traffic has undoubtedly been towards consolidation,” said Dinsdale.
Multi-tenant space is growing, but visible consolidation is occurring among providers, marked by a string of notable deals.
China is Hot, but Chinese Providers’ Grip on Market is Tight
Worldwide retail colocation revenues continue to grow at around 10 percent per year, but China’s growth rate is more than double the worldwide average, according to Synergy. The Chinese market has garnered significant attention lately due to its potential.
Dinsdale believes the market is dominated by Chinese telcos (China Telecom and China Unicom) and some large Chinese data center operators (21Vianet, GDS, Dr Peng Data). “[While] some global or regional players have established a presence (e.g. Equinix, PCCW, Telehouse, Pacnet), there is no doubt that the market will continue to be dominated by the big Chinese operators,” he said.
Outsourcing is Good for Colo, Even With Cloud in the Picture
Outsourcing in general has become a preferred model: cloud has made businesses comfortable with it because it takes out the capex guessing game and lets them shed non-core operations. The outsourcing is going to both global cloud giants and local providers.
Many businesses want to keep their data nearby, meaning growth is radiating out to secondary markets rather than consolidating onto the Google, Microsoft, and AWS clouds. The big clouds are growing, but not at the expense of private cloud and local cloud delivered through systems integrators, VARs, and service providers.
Consider Cisco’s recent Global Cloud Index: not only is data center traffic expected to triple, but private cloud is projected to grow at a 21 percent CAGR, even as it loses overall market share to public cloud (from 78 percent to 69 percent).
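A quick sanity check on those figures (assuming the index’s customary five-year forecast window, which is an assumption here) shows why losing market share is not the same as shrinking:

```python
# Back-of-the-envelope check on the Cisco figures quoted above, assuming the
# Global Cloud Index's customary five-year forecast window (an assumption here).

YEARS = 5

# Traffic "expected to triple" implies roughly this compound annual growth rate:
traffic_cagr = 3 ** (1 / YEARS) - 1
print(f"Tripling over {YEARS} years ~ {traffic_cagr:.1%} CAGR")  # ~24.6%

# A 21 percent CAGR still multiplies private cloud substantially in absolute terms,
private_cloud_multiple = (1 + 0.21) ** YEARS
print(f"21% CAGR over {YEARS} years ~ {private_cloud_multiple:.2f}x growth")  # ~2.59x
# even while its share of the total slips from 78 percent to 69 percent.
```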
A recent IDC report predicts that the number of the world’s data centers will begin to decline in 2017; however, this is due to workloads moving into mega data centers (be they cloud or multi-tenant facilities). Colocation industry trends suggest that, while many are moving workloads to cloud, colocation players are thriving because many of those clouds are housed within their walls. While Microsoft, Amazon, and Google build their own mega data centers, not all workloads are moving wholesale into public clouds, and colocation is positioned nicely for those seeking hybrid infrastructure and multi-cloud usage.