Data Center Knowledge | News and analysis for the data center industry
Thursday, December 4th, 2014
2:00a |
With Open Commodity Switch, Juniper Goes After Web-Scale Data Center Market
Juniper Networks and Taiwanese hardware manufacturer Alpha Networks have designed a “white box” network switch for web-scale data centers using principles of Facebook’s Open Compute Project and submitted the design to OCP for review and adoption.
Juniper plans to start shipping the switch in the first quarter of 2015. It will come with the vendor’s Linux-based Junos OS, but users will be able to run any network operating system they want on it.
For Juniper, the move represents a step out of the pack of “incumbent” network vendors that have jealously guarded their inseparable, proprietary (and expensive) hardware-software bundles. Companies like Cisco, HP, and Brocade have not brought commodity low-cost hardware switches to market.
Commodity Switches With Non-Commodity Software and Support
White box switches and other commodity data center hardware have been the way web-scale data center operators – companies like Facebook, Google, or Amazon – have been able to drive down the cost and drive up efficiency and flexibility of their IT infrastructure. Because these companies have extensive internal engineering resources, they have relied primarily on self-designed hardware and software to operate their data centers.
However, vendors like Juniper, Hyve Solutions, and Quanta, among others, say there is now also a growing group of companies (primarily network carriers and cloud service providers) that want the same level of efficiency web-scale operators have achieved.
Juniper’s white box switch, called OCX1100, comes with the OS and traditional enterprise-level support from the company. The vendor hopes it will appeal to those large cloud providers that want web-scale infrastructure but don’t necessarily have the internal engineering resources of a Facebook or a Google.
Jonathan Davidson, Juniper senior vice president and general manager of its Security, Switching and Solutions business, said the product fills a “functional gap between white box and traditional switching.”
There isn’t a white box switch on the market that comes with a “carrier-grade” operating system, he said. Junos OS is what Juniper hopes will be the main competitive strength of the product, since the company has extensive experience in building and deploying robust network operating systems.
But, true to Open Compute principles, the vendor is letting users decide whether they want to use Junos or another OS. There are a number of network operating systems for open switches on the market, the most prominent being a Linux variant by a startup called Cumulus Networks.
Adopting Facebook Wedge Principles
The Open Compute Project, Facebook’s initiative aimed at creating an open source hardware and data center design community similar to the open source software one, currently has switch design specs by Accton, Alpha Networks, Broadcom, Mellanox, and Intel.
Facebook has designed its own switch, called Wedge, and previewed it earlier this year. But it has yet to contribute the design or the spec to the open source project.
Juniper’s OCX1100 design is consistent with Wedge principles by disaggregating an x86 control plane and a Broadcom forwarding plane, Davidson said. Unlike Wedge, however, Juniper’s switch supports ONIE (Open Network Install Environment), which is what enables installation of any operating system on it, he said. | 3:59p |
Docker Previews First Commercial Product
Docker kicked off DockerCon in Amsterdam (its first conference in Europe) with the announcement of its first commercial product. The company also announced Docker-native orchestration capabilities for distributed multi-container and multi-host applications.
Docker Hub Enterprise is a Docker image repository companies can deploy in their own data centers, or in hosted private or public cloud environments. The initial aim of the product is to give enterprises a way to use Docker in a way that satisfies their internal security and compliance policies and fits into their application development processes. But eventually, Docker hopes to make Docker Hub Enterprise the way its customers interface with everything Docker, David Messina, the company’s vice president of marketing, said.
The company’s Docker-native orchestration tools became a controversial subject earlier this week, when a company called CoreOS, which has always been a major supporter of Docker, said Docker had steered away from what CoreOS expected to be a continued focus on simple, composable application containers. CoreOS CEO Alex Polvi said building native orchestration tools into the Docker daemon diluted its purpose.
San Francisco-based Docker is both a company and an open source project the company leads and builds its business around. Through use of standardized application containers, the technology aims to enable developers to build, test, and deploy a single application on any kind of infrastructure, be it the developer’s laptop, the company’s test environment, a production cluster in its data center, or a public cloud.
Until now, users have been able to host their container images in a Docker-operated public Docker Hub registry. They can also set up private registries of their own in various public clouds or in their own data centers using open source code. The hub’s new enterprise version, however, is a Docker-designed and supported product.
The company is previewing DHE at DockerCon EU and plans to start an early-access program in February 2015.
IBM, AWS, Microsoft to Sell DHE
A number of companies will be bringing the product to market besides Docker. They include IBM, Microsoft, and Amazon Web Services.
While IBM has been involved with Docker for some time, this will be the first time the company will have official products around it. IBM will sell DHE as an on-premise solution and also as a cloud service through its Bluemix Platform-as-a-Service.
Microsoft and AWS will offer DHE as part of their public cloud services.
Docker-Native Multi-Container, Multi-Host Orchestration Tools
Docker also announced three container orchestration services to make it easier to build applications that consist of multiple containers and run across multiple servers, VM hosts, or clouds. While all three services are designed to work together to achieve the same goal, users can deploy them separately with other solutions.
The first one, called Docker Machine, simplifies portability of Docker containers across different hosts. While they have been portable before, moving a container would involve a lot of manual configuration, which Docker Machine automates.
The second service is called Swarm. It is a Docker-native clustering capability, similar to Mesosphere, Google’s Kubernetes or Container Engine, or AWS’s EC2 Container Service.
Through Swarm’s API, developers will be able to use it together with any other clustering engine. They can use Swarm in their development or testing environment, for example, but deploy the application in Amazon’s cloud using Amazon’s container service.
The third component is Compose. It defines which containers an application is composed of in a single file. Given the dynamic nature of modern applications, it helps make sure all changes are accounted for, keeping updates consistent in that single configuration file. Adding a service takes a simple change in the Compose file. | 4:30p |
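As a hedged illustration of that single-file approach (the service names and settings below are hypothetical, and this reflects the early Compose file format, which has since evolved), a two-container application might be described as:

```yaml
# docker-compose.yml -- hypothetical two-service application
web:
  build: .          # build the app image from the local Dockerfile
  ports:
    - "8000:8000"   # expose the web service
  links:
    - redis         # connect the app container to the redis container
redis:
  image: redis      # pull the stock Redis image
```

Running `docker-compose up` against such a file would start both containers and wire the web service to Redis; adding a service means adding one more top-level entry to the file.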
Lessons Learned: IT Infrastructure and Network Security
Patrick Quirk is the Vice President and General Manager of Converged Systems at Emerson Network Power.
For more than two weeks at the height of the 2013 holiday shopping season, hackers stole the personal information of some 70 million shoppers from a major U.S. retailer. Forty million credit and debit card numbers were compromised in a massive IT security breach that cost the company and consumer banks and credit unions hundreds of millions of dollars and nearly 500 individuals—including the retail giant’s president and CEO—their jobs.
Precise estimates of the financial impact are impossible to make. We know the company’s profits dipped precipitously in the fourth quarter of 2013 compared to the year before, but how much of that was driven by consumer reaction to the mid-December crisis is uncertain. When you factor in lingering lack of trust and the real costs associated with covering fraud charges and pending litigation, the final price tag almost certainly tops $1 billion—and it’s not hard to imagine it at twice that.
And that’s just one example. Major breaches at other retailers, banks and businesses make it clear that data security remains a serious problem.
The Unexpected Loophole
In the original example, the hackers gained access through an HVAC vendor. That revelation was chilling for CIOs who already spend sleepless nights worrying about organizational data security and “traditional” cyber attacks. Intrusion through the HVAC system brought to mind Mission Impossible-style assaults with black-clad criminals slithering through air ducts. The reality was far less cinematic—and far more dangerous.
It all started with a malware-laced email phishing attack sent to employees at that HVAC vendor. The vendor had access to the retailer’s network login credentials in order to remotely monitor energy consumption and temperatures at various stores where their HVAC systems were deployed. The phishing attack turned up those credentials, and the hackers used them to access the retailer’s corporate network and, specifically, the company’s payment systems.
Secure From Top to Bottom, and Then Some
It’s a reminder that security starts with access control—both physical and virtual—and includes access granted to those inside and outside your organization. Think about secure facilities with gates at the parking entrance, bollards closer to the buildings, key card access at entry doors, security officers in the lobby, and so on. Effective security is layered, and IT security is no different.
A truly secure system requires vigilance. Best practices dictate layers of IT security commonly referred to as “defense in depth,” with protocols in place managing network access. Organizations must manage and rotate credentials and establish auditing systems so they can know who should be accessing the network and when. Any activity outside of the norm should trigger an alert.
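To make that concrete, here is a minimal sketch of the "alert on activity outside the norm" idea. The account names, allowed hours, and event format are all hypothetical; a real deployment would feed this from the organization's actual access logs and identity systems.

```python
from datetime import datetime

# Illustrative policy: which accounts may log in, and during which hours.
# Both values are hypothetical assumptions for this sketch.
ALLOWED_ACCOUNTS = {"hvac-vendor", "noc-admin"}
ALLOWED_HOURS = range(7, 19)  # vendor access expected 07:00-18:59 only

def audit(events):
    """Return an alert string for each login event that breaks policy."""
    alerts = []
    for account, timestamp in events:
        when = datetime.fromisoformat(timestamp)
        if account not in ALLOWED_ACCOUNTS:
            alerts.append(f"unknown account: {account}")
        elif when.hour not in ALLOWED_HOURS:
            alerts.append(f"off-hours login by {account} at {timestamp}")
    return alerts

logins = [
    ("hvac-vendor", "2014-12-04T09:15:00"),   # normal daytime access
    ("hvac-vendor", "2014-12-04T02:30:00"),   # off-hours: trigger an alert
    ("unknown-user", "2014-12-04T10:00:00"),  # not on the list: alert
]
print(audit(logins))
```

The point is not the specific rules but that every access event is checked against an explicit, auditable policy, so the anomalous 2 a.m. vendor login surfaces immediately instead of weeks later.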
Even Business System Security audits, required by law for financial institutions, don’t protect companies from all the potential risks. If critical infrastructure systems are not properly configured—and if infrastructure providers are not vigilant in their own security practices—they can provide an unexpected open door. In many cases, these systems are flying below the radar of the CIOs or CISOs of the world. But even if their own security protocols are current and robust, are they sure the same can be said about their vendors with access to their network?
It Can Happen to You, Buckle Down on Security
Despite the rash of breaches, there remains an industry-wide naïveté when it comes to these types of security concerns. Companies with remote access to physical infrastructure systems often fail to realize those systems can be used as gateways to business-sensitive corporate networks. As a result, their security protocols can be lax.
Of course, it’s easier said than done. Virtually everything in the data center—not just servers, but critical infrastructure systems including power, cooling and monitoring—has Web interfaces and therefore IP addresses. Administrators and operators often network them so they can be accessed remotely, but too often security is overlooked altogether in this process. That leaves hundreds of thousands of embedded and low-level management systems vulnerable to exploitation by anyone in the world with the right skills and motivation. And there are plenty with just the right combination of both to be dangerous.
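A simple inventory check captures the idea: given a list of infrastructure devices and their addresses, flag any management interface that answers outside the dedicated management subnet. The subnet, device names, and addresses below are hypothetical assumptions for this sketch.

```python
import ipaddress

# Assumed dedicated management segment for this illustration.
MGMT_NET = ipaddress.ip_network("10.20.0.0/24")

def exposed_interfaces(inventory):
    """Return (name, ip) pairs for devices reachable outside MGMT_NET."""
    return [
        (name, ip)
        for name, ip in inventory
        if ipaddress.ip_address(ip) not in MGMT_NET
    ]

devices = [
    ("crac-unit-1", "10.20.0.15"),    # on the management network: fine
    ("pdu-rack-7", "192.168.1.40"),   # on the general LAN: flag it
    ("ups-monitor", "203.0.113.9"),   # publicly routable: definitely flag
]
print(exposed_interfaces(devices))
```

Even a periodic audit this crude would reveal the cooling unit or power strip that an installer quietly attached to the corporate LAN instead of the management segment.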
Case in point: Undetected for several weeks, hackers penetrated 90 servers in one banking giant’s computer network. They accessed personal customer data along with a list of every application and program the bank used to protect its servers, creating a nightmare scenario where hackers could exploit potential security flaws in those programs to execute similar attacks in the future.
Stay Away From Risky Business, Separate Network Segments
There has been a belief in the past that you only need a logical or virtual separation between networks in order to keep them fully secure, but that’s risky at best and unrealistic at worst. If you’re granting access to the same physical networks, all it takes is one unseen open door and the entire network is vulnerable. The best security is when you have physically separated and truly independent network segments, and we recommend keeping management infrastructure on a separate physical network.
Responsible technology partners should not only understand these threats themselves, they should be aware of current best practices in data security and work with their customers to activate the appropriate levels of access control, auditing and alerting when installing any new systems. It’s a complex problem, but the right partners can help cut through that complexity and ensure your network—and your business—do not become the next victims.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library. | 5:57p |
Brocade Certifies Virtual Network Tool for Mirantis OpenStack
Networking solutions provider Brocade has deepened its integration and relationship with Mirantis and its OpenStack distribution. The company has certified its open source virtual network automation tool Fuel for use with Mirantis and has been ensuring its network capabilities play nice with Mirantis-based clouds.
Brocade wants to enable virtual network automation atop OpenStack, so it is certifying its offerings on major distributions of the popular open source cloud architecture. It provides the software-driven, automated and scalable network piece to Mirantis OpenStack.
Fuel was earlier certified for Red Hat’s OpenStack distro. Enterprise interest and adoption of Mirantis made it an attractive second partner. The certification was based on a configuration of OpenStack based on Mirantis’ reference architecture.
Brocade is seeing a lot of enterprises kick the tires on OpenStack, according to Kenneth Ross, director of product management at Brocade.
“We’ve been very involved with OpenStack, with an umbrella strategy of enabling all our products,” he said. “When we get a customer deploying Ethernet fabric that wants to use OpenStack, we now have two partners in place.”
The network has been a focus in the OpenStack project for several years. Brocade and other vendors are looking to improve and virtualize the network within OpenStack and make sure their products gel with the world’s most widely used open source cloud technology.
Ongoing efforts by legacy IT vendors to build OpenStack plug-ins for their products have made some in the OpenStack community concerned that the project may be steered away from what they believe is its intended goal. Jim Morrisroe, CEO of major Mirantis competitor Piston Cloud Computing, told us earlier he thought all the plug-in efforts by legacy vendors were detrimental to the industry’s progress toward a simplified cloud infrastructure built on low-cost commodity hardware.
Many companies are building network functions atop OpenStack. Two examples are Avaya, with its SDN offering, and Siaras, which lets users carve out wide area networks between clouds.
Brocade is also an example of a traditional tech company increasing its “open” strategy. In addition to making sure its offerings work with OpenStack, the company is active with the OpenDaylight SDN project and HP’s OpenNFV. Cloud is being driven by open source, so these companies need to provide “open” options to capture these users. Networking giant Cisco has also focused on OpenStack.
There’s been a significant change in terms of Brocade embracing open source. Ross said that the company sees a market much more receptive to using open source, looking for open architecture with no vendor lock-in. So the company is stepping in.
The company also has plug-ins for its Vyatta virtual router and virtual load balancing. They will be certified with both partners over the coming months, according to the company.
Juniper, one of Brocade’s biggest competitors, took its involvement with open source a step beyond other legacy network vendors on Wednesday. The company announced it had designed a white box data center switch that would be open to use with any network operating system. Juniper also submitted the design for evaluation to the Facebook-led Open Compute Project, planning to contribute it to the open source hardware design effort.
Mirantis Launches Free Hosted OpenStack Tier for New Cloud Developers 
This article originally appeared at The WHIR
Mirantis, a provider of OpenStack software, training and support, has released a new Developer Edition of Mirantis OpenStack Express, which offers a starting point for new OpenStack cloud developers.
According to the Thursday announcement, Mirantis is also offering a dozen new OpenStack tutorials for free.
Mirantis launched Mirantis OpenStack Express 2.0 in September, which provides an on-demand, hosted development environment for Mirantis OpenStack with developer support. It is used by RealStatus, Tata Consultancy Services, Avi Networks, and others for application development.
The Developer Edition of MOX is specifically designed for solo developers and solution providers looking to try OpenStack, and is available at no cost for the first 12 months, then $39.99 per month thereafter. It includes an OpenStack tenant with four virtual CPUs, 4GB RAM, 100GB of storage, and two floating IP addresses, as well as access to OpenStack APIs.
Developers can stack on additional resources, starting with quotas of two vCPUs, 2GB RAM, and 50GB of storage at $19.99 per month.
The MOX Developer Edition is significant in that it helps develop new cloud talent. It has traditionally been very difficult for individuals to learn cloud computing software, not only because of the steep learning curve but also because of the costs involved.
When asked what alternative options developers have had for getting hands-on training with a cloud platform, Mirantis co-founder and CMO Boris Renski told the WHIR:
“A developer would either have to turn himself into a systems administrator for a day or so to download and install a mini OpenStack instance on premises (typically on a laptop) and then read through thousands of pages of upstream documentation.
“Another alternative is to spend $2,500 on a 3-day class, which can also be a tall order for somebody who just wants to try things out. With MOX developer edition and the tutorials we released, a developer can get his hands dirty with OpenStack in a matter of minutes.”
The new tutorials rolled out with the launch of MOX Developer Edition are based on Mirantis’ OpenStack training, and cover common use cases such as adding images, launching VMs, and using the Murano OpenStack application catalogue, which is available in the developer edition.
This article originally appeared at: http://www.thewhir.com/web-hosting-news/mirantis-launches-free-hosted-openstack-tier-new-cloud-developers | 9:30p |
Half of IT Networks Will Feel the Stranglehold of IoT Devices: IDC Report 
This article originally appeared at The WHIR
Within just three years, IT networks are expected to go from having excess capacity to being overloaded by the stress of Internet of Things (IoT) connected devices. The IDC FutureScape: Worldwide Internet of Things 2015 Predictions report, released on Wednesday, gives the top 10 decision imperatives for business IT departments to use in directing future planning.
IDC has been watching the IoT market for a while and feels it has matured to a point where IT departments need to be concerned with upgrading infrastructure to deal with the devices and leverage the business opportunities created. Companies have already been betting heavily that IoT will be a huge growth and revenue opportunity. In March, Cisco announced a $1 billion investment in IoT. Microsoft is also investing big with a specialized IoT Azure cloud.
“Between the possible consumer and business applications, analysts have been tripping over each other to make the most grandiose predictions: 1.9 trillion from Gartner, 7.1 trillion from IDC, 19 trillion from Cisco,” according to a Forbes article published on Thursday. “Are they referring to devices or dollars? What’s the difference? It’ll be huge. (For the record, they’re talking about dollars.)”
Cloud computing will be key to the success of the IoT. IDC predicts that within five years over 90 percent of IoT-generated data will be hosted in the cloud. This should make access to data, and thus interaction among various IoT devices, much easier, facilitating growth. However, cloud-based data storage will also increase the chance of cyber attacks, with 90 percent of IT networks expected to experience IoT-related breaches. The amount of data generated by IoT will make it an attractive target.
“The Internet of Things will give IT managers a lot to think about,” says Vernon Turner, SVP Enterprise Infrastructure, Consumer, Network, Telecom and Sustainability Research. “Enterprises will have to address every IT discipline to effectively balance the deluge of data from devices that are connected to the corporate network. In addition, IoT will drive tough organizational structure changes in companies to allow innovation to be transparent to everyone while creating new competitive business models and products.”
Most IoT activity is currently concentrated in a few industries, but within five years all industries are expected to have IoT initiatives. Local governments will be one of the areas that realize the value of IoT to create smarter cities. IDC says more than 25 percent of all government spending will be centered on IoT deployment and management by 2018.
This article originally appeared at: http://www.thewhir.com/web-hosting-news/half-networks-will-feel-stranglehold-iot-devices-idc-report | 9:35p |
Finland Building Direct Submarine Cable to Germany
Finland’s government has contracted Alcatel-Lucent to build a submarine cable at the bottom of the Baltic Sea that will connect the country directly to Germany and increase available bandwidth for Finland’s Internet users.
Over the past several years, Finland and other Nordic countries have been competing to attract large data center construction projects as a way to boost their economies. The Finnish government hopes the increase in available bandwidth between the country and the rest of the European continent will make it more attractive for data center operators.
Finland’s abundant low-cost electrical power and cool climate have already brought in several major data centers. Google is estimated to have invested more than $1 billion in its massive data center in Hamina – a repurposed paper mill.
Microsoft is building a $250 million data center in Finland. Russian Internet giant Yandex announced it would build a massive server farm in the country as well.
Finland’s neighbor Sweden also scored a blockbuster data center deal in 2011 when Facebook decided to build its first data center outside of the U.S. in Luleå.
While there is already a multitude of submarine cable links from Finland to Russia, Sweden, and the Baltic states, the upcoming cable will be the first to connect the country to Europe’s largest economy.
The project, called Sea Lion, is being funded by Cinia Group, Finland’s government-owned telco. It will cost about $74 million, and is expected to be completed in 2016.
The cable will stretch about 680 miles from Helsinki to a landing station near Rostock, a major city on Germany’s Baltic coast. Finland’s national fiber-optic network will carry data traffic from across the country to and from the Sea Lion system. The terrestrial network runs along the national railroad system.
“Broadband connectivity is a major growth opportunity for the foreseeable future and the development of a robust telecommunication infrastructure is vital,” Jukka-Pekka Joensuu, Cinia’s executive vice president, said in a statement. | 10:00p |
LeaseWeb Opens Data Center in Pacnet’s Hong Kong Facility
Hosting and Infrastructure-as-a-Service provider LeaseWeb has opened a new Hong Kong data center in a continued expansion into the Asia Pacific market. The company is taking space in a Pacnet facility. Pacnet is also the provider of LeaseWeb’s Singapore data center.
This is the third new data center location for the Amsterdam-based provider this year, following launches in Singapore and San Jose, California. LeaseWeb is in CloudSpace2, Pacnet’s newest data center in Hong Kong.
Pacnet’s 54,000 square foot two-story facility has around 22,000 square feet of raised floor space. Located at Tseung Kwan O Industrial Estate, it is central to Pacnet’s massive submarine cable system.
One of the world’s largest hosting providers, LeaseWeb has 65,000 physical servers under management. Its customer base of 17,500 has historically been predominantly European, but the company has been expanding globally.
“Our data center operating agreement with Pacnet enables us to quickly expand our data center presence across Asia Pacific,” Bas Winkel, managing director, LeaseWeb Asia Pacific, said in a statement. Sydney and Tokyo are the company’s two other expansion possibilities going forward.
“The opening of the Hong Kong data center is in response to strong demand from customers across the Asia Pacific region,” said Winkel.
LeaseWeb has a slew of infrastructure services, and is targeting hybrid needs in the region. The company offers virtual and bare-metal, private and public cloud services, dedicated servers, shared hosting, as well as some retail colocation. It also provides managed services, such as backup and service level guarantees, and offers its own proprietary CDN.
Based in Amsterdam, LeaseWeb is part of the Ocom Group. It has several sister companies, including colo provider EvoSwitch, global network provider FiberRing and modular data center player DataXenter.