Data Center Knowledge | News and analysis for the data center industry
Monday, November 17th, 2014
1:00p |
AWS Docker Support is Major Leap Forward for Containers
If you’ve been following Docker, you’ve probably heard the company’s CEO, Ben Golub, repeat over and over that Docker’s success depends on the ecosystem the 18-month-old startup grows around itself.
He is not exaggerating. Docker depends on standardization across the industry. Using the company’s own analogy, shipping containers wouldn’t be much more than metal boxes had the shipping industry not standardized on boxes with uniform dimensions and with holes, locks, and tags in the same places.
Docker is a way to free a software developer from worrying about all the different parts of the infrastructure stack their application will be running on. With Docker, the developer doesn’t have to worry whether the app will run on somebody’s laptop, a VM, a cloud, or a dedicated server, or whether some part of the stack will have a different version of software the app wasn’t built to gel with.
A Docker container is a standard way for an application to describe its infrastructure requirements. It draws a clear line between the developer and IT: the developer is responsible for everything inside the container, and whoever manages the infrastructure is responsible for delivering the resources the app needs.
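In practice, that description takes the form of a short build file that pins the app’s entire runtime stack. A minimal, hypothetical Dockerfile might look like this (the base image, package, and file names are illustrative placeholders, not from the article):

```dockerfile
# Start from a pinned base image so the stack is identical everywhere
FROM ubuntu:14.04

# The developer declares every dependency the app needs...
RUN apt-get update && apt-get install -y python

# ...bundles the application code into the image...
COPY app.py /opt/app/app.py

# ...and states how to run it. The resulting container then runs
# unchanged on a laptop, a VM, or any cloud that supports Docker.
CMD ["python", "/opt/app/app.py"]
```

Everything above the line drawn by the container format is the developer’s problem; everything below it belongs to whoever runs the infrastructure.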
For this to work, however, those on the infrastructure side have to accept the container as the way to communicate resource requirements. And for containers to be truly portable, operators of many different kinds of infrastructure have to accept it.
AWS Docker Support Pushes Standardization Forward
A huge piece of this standardization puzzle fell into place last week. Amazon Web Services, operator of the world’s largest cloud, launched a service designed to make it easy to deploy large numbers of Docker containers on its EC2 infrastructure.
The other two big names in cloud, Google and Microsoft, have also been enthusiastic supporters. Google launched a cluster manager for Docker containers on its cloud earlier this month. Microsoft’s Azure has been supporting Docker containers since June, and the software giant has been busy integrating Docker with the next release of Windows Server.
Each cloud provider is leveraging their individual strength in the way they address Docker, Scott Johnston, the San Francisco-based startup’s senior vice president of product, said. Amazon, with AWS Docker support, is highlighting the massive scale of its cloud infrastructure; Google is underlining a massive ecosystem of developers it has cultivated around its cloud services; and Microsoft is leveraging both its cloud and its control over one of the world’s most popular server operating systems.
“Amazon obviously is taking advantage of their scale, their availability zones, and their infrastructure,” Johnston said. “Docker thinks that’s great. Microsoft Azure already has Docker installed on the Azure Linux node, and so a lot of the cloud work that you see in Amazon and Google you can also perform in Azure.”
AWS and Docker are a Good Fit
Amazon is in a good place to provide a set of tools to make Docker easy on its cloud, he said. With the launch of the free container service, AWS wins because it makes it easy for users to consume a lot of EC2 capacity (which isn’t free), and users win because they don’t have to retrofit the Amazon tools they are already using to manage Docker.
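The new service is EC2 Container Service (ECS), where an application is described to the cluster as a task definition. As a rough sketch (the family name, image, and resource numbers below are illustrative, not from the article), a task definition pairs a Docker image with the resources it needs:

```json
{
  "family": "web-app",
  "containerDefinitions": [
    {
      "name": "web",
      "image": "nginx:latest",
      "cpu": 256,
      "memory": 512,
      "portMappings": [
        { "containerPort": 80, "hostPort": 80 }
      ],
      "essential": true
    }
  ]
}
```

A file like this would be registered with the service (for example via `aws ecs register-task-definition`) and then scheduled across the EC2 cluster, which is how the same container description travels from a developer’s laptop to tens of thousands of instances.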
Amazon has done a good job respecting Docker’s native APIs, Johnston said. The Docker team is also happy with the way scaling is done, with capability to deploy applications across tens of thousands of containers. AWS is also going to support Docker Hub (a repository for popular application images) for its customers and contribute additional images to it. Observing which applications become popular among Docker and AWS users over time and creating images fine-tuned for the AWS platform will be an example of that standardization process Docker is hoping will continue. | 4:31p |
Infographic: Everything You Need to Know About IPv4 vs IPv6
Leigh-Ann Carroll is a Marketing Executive with Irish Telecom.
The worldwide launch of IPv6 ushered in a new era for the Internet. However, many regular online users do not quite understand what this new protocol means and how it will revolutionize the way in which the Internet will work.
Our infographic examines the differences between IPv4 and IPv6, explains how IPv6 works and outlines why IPv6 will be the Internet protocol of the future.
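To make the size difference between the two protocols concrete, Python’s standard `ipaddress` module can put them side by side (a sketch; the addresses are standard documentation examples):

```python
import ipaddress

# IPv4: 32-bit addresses, dotted-decimal notation
v4 = ipaddress.ip_address("192.0.2.1")
print(v4.version)   # 4
print(2 ** 32)      # about 4.3 billion possible addresses

# IPv6: 128-bit addresses, hexadecimal colon notation
v6 = ipaddress.ip_address("2001:db8::1")
print(v6.version)   # 6
print(2 ** 128)     # about 3.4 x 10^38 possible addresses
```

The jump from 32-bit to 128-bit addresses is the core change: IPv4’s roughly 4.3 billion addresses have effectively run out, while IPv6’s address space is for practical purposes inexhaustible.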
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library. | 5:00p |
Data Center Jobs: ViaWest
At the Data Center Jobs Board, we have a new job listing from ViaWest, which is seeking a Data Center Manager in Hillsboro, Oregon.
The Data Center Manager is responsible for managing facilities staff to deliver expected service levels to the client within the prescribed budget and for managing the team schedule to ensure customer support and facility coverage. The role also serves as operational leader for the region, coordinates work assignments among building technicians, vendors, and contractors, and reviews work orders to ensure that assignments are completed. To view full details and apply, see job listing details.
Are you hiring for your data center? You can list your company’s job openings on the Data Center Jobs Board, and also track new openings via our jobs RSS feed. | 7:24p |
NTT Facilities Buys US Data Center Management Firm ECC
NTT Facilities, the data center management arm of the Japanese giant NTT Communications, has acquired a majority stake in Marlborough, Massachusetts-based Electronic Environments Corp., expanding the business into the U.S. The company has until now operated only in Asia.
Facilities management, which can mean everything from design and construction to day-to-day management of mechanical and electrical systems, is one of several areas of data center infrastructure outsourcing. As companies increasingly prefer not to stretch their resources far beyond their core business, various flavors of outsourced data center management have become popular.
Two U.S. companies got into the data center facilities management business this year. T5 Data Centers launched its T5FM offerings in July, and QTS announced its CFM program in October.
Giants like HP and Schneider have been in the space for years. HP’s big move here was its acquisition of EYP Mission Critical Facilities in 2007, and Schneider expanded its presence by buying Lee Technologies in 2011.
As it does in other business segments, NTT has been growing in data center facilities management primarily via acquisition. NTTF stepped outside Japan for the first time when it acquired Unitrio Technology in Thailand in 2013 and Pro-Matrix in Singapore earlier this year.
EEC is a one-stop shop of sorts. It offers everything from design and construction to maintenance of facilities and infrastructure equipment. NTT said it will invest in expanding EEC’s business in the U.S. using its already successful formula.
In a statement, Ken Rapoport, EEC founder and CEO, mentioned know-how in some forward-looking data center infrastructure technology areas NTT was bringing to the table. “The NTTF Group can offer a multitude of benefits to our organization, bringing valuable insights into technologies that can improve data center performance, such as 380V high-voltage DC infrastructure, solar power, and fuel cell infrastructure,” he said. | 8:03p |
Google Launches Autoscaling Beta on Compute Engine
Intelligent horizontal automatic scaling is now available to everyone on Google Cloud Platform. The company briefly touched on the autoscaling features during the last GCP event. Today the service is officially in beta. Google frequently keeps features in beta despite their being production ready.
Autoscaling makes customer applications more cost effective and resilient. By scaling up or down, a customer doesn’t pay for capacity they don’t use but still maintains the ability to expand infrastructure during spikes in usage. The service looks at how far the current state is from the desired target and responds intelligently: save money during lulls, keep up during high usage.
The Compute Engine Autoscaler dynamically scales the number of instances in response to load conditions, based on a defined ideal utilization level for a group of instances, Google software engineer Filip Balejko wrote in a blog post. The service detects changes and adjusts the number of running instances to match. It can respond to CPU load, queries per second on an HTTP load balancer, and metrics defined through the Cloud Monitoring service.
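The target-utilization logic can be sketched in a few lines of Python. This is not Google’s code, just the general proportional rule the article describes: resize the group so that average utilization lands back on the target.

```python
import math

def recommended_size(current_size, current_utilization, target_utilization):
    """Return the instance count that would bring average utilization
    back to the target, given the group's current state.

    Both utilization arguments are fractions between 0.0 and 1.0.
    """
    # Total load currently being handled, in "instance units"
    load = current_size * current_utilization
    # Instances needed so that load / size == target; round up so the
    # group never ends up above the target after the resize
    return max(1, math.ceil(load / target_utilization))

# A group of 10 instances running at 85% CPU with a 75% target
# needs to grow; the same group at 30% CPU can shrink.
print(recommended_size(10, 0.85, 0.75))  # 12
print(recommended_size(10, 0.30, 0.75))  # 4
```

Run continuously, a rule like this is what lets a managed group track load in both directions, which is the "save money during lulls, keep up during high usage" behavior described above.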
At the recent Google Cloud Platform Live, Autoscale was demonstrated scaling up to 1 million queries per second.
 The Autoscaler watches the serving capacity of the managed instance group, which is defined in the backend service, and scales based on the target utilization. (Source: Example in GCP Documentation)
Customer Wix.com, a popular do-it-yourself website-building service, is one of the first users. The service is “[r]educing [our] expenses, while giving us confidence that Google will manage the appropriate number of machines, even when a spike occurs,” said infrastructure team lead Golan Parashi.
Cloud providers continue to build out features and tools to help customers use cloud resources. Autoscaling isn’t new to the cloud world or to Google. The news is that it is now widely available on GCP.
Amazon Web Services also has autoscaling capabilities, which were first introduced in 2011. Autoscaling has been built into Microsoft’s Azure since 2013.
Google is the newcomer of the trio to the cloud compute scene, its compute service having entered general availability in late 2013. In addition to innovating, Google is also playing catch-up when it comes to features like autoscaling.
Its Platform-as-a-Service offering, App Engine, has been around much longer, initially released in 2008.
The recent Google event also saw the release of Google’s hosted Kubernetes offering for Docker container management. Amazon Web Services responded at re:Invent with several services of its own, including native Docker support.
Microsoft hasn’t been quiet with Azure, also focusing on bringing in features and support for containers. | 9:18p |
Excool Launches Data Center Modules With Evaporative Cooling
U.K. cooling vendor Excool has launched a prefabricated modular data center product called Excool Space.
Prefabricated data center modules offer a way to expand data center capacity quickly. The vendor usually builds them in a factory and ships them wherever its customers need them, where they are hooked up to power, chilled water, and network and brought online.
Both infrastructure equipment vendors and IT suppliers sell data center modules. Schneider Electric has an extensive product line in the category, and so does Emerson Network Power.
HP’s EcoPOD is a well-known product in the market, and so is the Dell Modular Data Center. eBay is one of the big household-name customers that buys modules from both vendors and stuffs them with its servers. Microsoft is known to use Dell’s modules to power some of the infrastructure that supports its Bing search engine.
Excool is leveraging its core products, indirect adiabatic and evaporative cooling systems, as a differentiator for its modules. The company claims its cooling systems are more energy efficient than the industry average.
“It (Excool Space) builds on Excool’s highly efficient and resilient cooling technology and brings together many years of experience in delivering modular buildings,” Excool International Business Director Paul Inett said in a statement.
German Subsidiary Launched
Excool also launched a German subsidiary called Excool GmbH to serve customers in Germany, Austria, and Switzerland. This is its first subsidiary outside of the U.K. | 10:00p |
Hackers Target US State Department in Email Breach 
This article originally appeared at The WHIR
Unclassified email systems belonging to the US State Department were recently hacked and then taken offline on Friday, according to a senior US official who talked to Reuters about the security incident. The State Department breach took place around the same time the White House networks were attacked in late October.
This announcement follows a data breach at the USPS last Monday where federal employee social security numbers were compromised in a hack suspected to have been initiated by the Chinese government. The White House was hacked in late October by Russian suspects, while the National Oceanic and Atmospheric Administration (NOAA) also reported a cyberattack by China last week.
When asked whether Russia was involved in the White House attack, Dmitry Peskov, a spokesman for Russian President Vladimir Putin, said, “We’ve been hearing a series of groundless allegations against Russia recently. So we can’t take them seriously any longer unless there’s proof.”
The State Department official told Reuters on Monday that none of its classified systems were compromised. Portions of the system remain shut down as officials investigate the attack.
“The department is implementing improvements to the security of its main unclassified network during a scheduled outage of some internet linked systems,” the official told Reuters. “This has impacted some of our unclassified email traffic and our access to public websites from our main unclassified system. We expect our systems to be up and running soon.”
These government attacks join the long list of cybersecurity breaches this year. Banking giant JP Morgan, Target, Neiman Marcus, Home Depot, Kmart and Dairy Queen have all had issues with cybersecurity attacks or malware.
At press time there have been no statements issued by Secretary Kerry or other officials on the State Department website regarding the incident.
This article originally appeared at: http://www.thewhir.com/web-hosting-news/hackers-target-us-state-department-email-breach | 10:30p |
Employees Bypass IT as Department Too Slow to Approve Cloud Apps: Report 
This article originally appeared at The WHIR
As cloud adoption grows, keeping track of all the cloud applications in use at a business becomes a challenge. According to a report released on Monday by GigaOM and CipherCloud, though the cloud market is expected to grow 126.5 percent this year, having visibility into cloud applications continues to be difficult.
According to the report, Shadow IT: Data Protection and Cloud Security, SaaS is expected to grow 199 percent and IaaS 126 percent this year, but the lengthy approval process for new cloud applications is frustrating employees at organizations.
Eighty-one percent of survey respondents said that they use unauthorized SaaS applications, and 38 percent of employees deliberately bypassed IT in adopting applications because of the slow IT approval process.
In an interview with the WHIR earlier this year, Cisco Services senior director Robert Dimicco said that there are typically 5-10 times more cloud services being used in an organization than are known by IT.
“Organizations are moving beyond curiosity about the cloud to actual deployment,” Gigaom Research analyst George Crump said. “SaaS is growing at 199 percent and is the typical home for shadow IT. It is growing because end-users are impatient with IT and looking for alternatives.”
Security continues to be the top cloud adoption concern of 62 percent of respondents. Forty-four percent of respondents selected cloud performance as a concern, and 41 percent said the time needed to develop cloud-related skills was holding them back from adopting cloud.
The most commonly requested cloud applications are related to communications, file sync-and-share and disaster recovery. Other applications commonly requested included CRM and other sales automation tools.
“The research underscores the rapid adoption of the cloud by businesses both by IT and line of business employees,” said Paige Leidig, Chief Marketing Officer, CipherCloud. “It also reinforces security as a key concern and the importance of encryption to ensure data is fully protected and controlled.”
This article originally appeared at: http://www.thewhir.com/web-hosting-news/employees-bypass-department-slow-approve-cloud-apps-report | 11:17p |
Comcast Building Data Center in Oregon’s Tech Hub
Comcast is building a data center in Hillsboro, Oregon, a city within the Portland metro that together with nearby Beaverton has formed the state’s high-tech hub and a busy data center market.
The U.S. telecommunications and media giant did not provide any details about the upcoming facility, saying only that it was being built by Portland’s Fortis Construction.
Intel has a huge presence in the area, along with a long list of other well-known names in high tech, including HP, IBM, Salesforce, Oracle, Autodesk, and Apple.
The latest newcomer to the local data center market is T5 Data Centers, which kicked off construction there in September. NetApp and Telx are leasing data center space from Digital Realty Trust in Hillsboro. ViaWest is another provider serving the market, listing Intel as one of its customers there.
Comcast announced the project alongside the opening of a new store in Hillsboro. The company already employs more than 1,200 customer service reps and technicians in Beaverton.
“We appreciate Comcast opening a new store in Hillsboro which will provide more convenient customer service to our residents,” Hillsboro Mayor Jerry Willey said in a statement. “We welcome all business investments into our city.”
Ethernet On-Ramps to Cloud Launched
Besides its residential Internet, phone, and cable businesses, Comcast also serves the enterprise IT market with connectivity solutions. This week, the company announced a new service that provides customers connectivity from colocation data centers to cloud providers via Ethernet.
The company is offering private cloud “on-ramps” through the Equinix Cloud Exchange, selling 10 Gbps, 1 Gbps, and sub-gigabit connections.