Data Center Knowledge | News and analysis for the data center industry
Thursday, March 31st, 2016
12:00p |
Top Five Apps and Services That Can Benefit from SDN
Today, managers are looking at next-generation solutions that will change the way cloud and data center resources are controlled. More than 80 percent of respondents to the latest AFCOM State of the Data Center survey said they have already deployed, or will deploy between now and 2016, software-defined networking or some form of network function virtualization. Furthermore, 44 percent have deployed or will be deploying OpenStack over the course of the next year.
You read that correctly; most are already in some way deploying or looking at next-generation networking technologies. It’s important to understand the reasoning behind such big trends. Organizations are seeing the direct benefits around deploying SDN and the types of services and applications it can optimize.
With that, let’s define SDN and look at the top five applications and services SDN can directly impact and optimize.
Defining Modern SDN
With SDN, at a very high level, administrators are able to control and manage the entire network through the abstraction of higher-level functionality.
This is accomplished by abstracting the layer that manages how traffic is distributed and where it’s being sent: the control plane. The underlying system that actually forwards traffic to its destination is the data plane. To make SDN work, there has to be some kind of communication between the two planes, even though management is abstracted.
It may sound complicated, but it really isn’t. The goal of SDN is to create a very dynamic and highly-programmable network infrastructure that’s capable of controlling underlying infrastructure components while still being abstracted from applications and network services. This allows for better programmability across all networking layers, better agility, central management, and an open-standards architecture.
This means that SDN can drastically simplify network design by allowing administrators to aggregate physical resources, point them to an abstracted management layer (SDN), and create intelligent, programmatically configured controls around the entire network. Network resources can then be presented directly to applications and other resources.
The administrator has visibility into the entire network flow architecture. Applications or resources using that network simply see a logical switch.
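The logical-switch idea above can be sketched in a few lines. This is a hedged illustration only: the `LogicalSwitch` and `FlowRule` names are invented for this example and do not correspond to any real SDN controller's API. The point is the split the article describes, with policy held centrally (control plane) and pushed down to every forwarding device (data plane).

```python
# Minimal sketch of the control/data plane split described above.
# All class and field names are illustrative, not a real SDN API.

class FlowRule:
    """A control-plane policy: match some traffic, choose an action."""
    def __init__(self, match, action):
        self.match = match      # e.g. {"dst_ip": "10.0.0.5"}
        self.action = action    # e.g. "forward:port2"

class LogicalSwitch:
    """What applications see: one switch, however many devices exist."""
    def __init__(self, physical_switches):
        self.physical = physical_switches   # the data plane devices
        self.rules = []                     # the central policy store

    def install(self, rule):
        # The controller (control plane) distributes the rule to every
        # underlying device; actual forwarding stays local and fast.
        self.rules.append(rule)
        for switch in self.physical:
            switch.setdefault("tables", []).append(rule)

fabric = LogicalSwitch([{"name": "tor-1"}, {"name": "tor-2"}])
fabric.install(FlowRule({"dst_ip": "10.0.0.5"}, "forward:port2"))
# One install call programmed both physical devices at once.
```

In a real deployment the `install` step would be a protocol exchange (OpenFlow is the best-known example), but the shape is the same: applications address one logical switch while the controller fans the policy out.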
SDN’s abstraction concept fundamentally simplifies some of today’s most complicated and fragmented networking ecosystems. This is why we’re seeing so much adoption in the data center space. Organizations use SDN to deal with complexity, improve policy control, improve scalability, and remove vendor dependencies. Most of all, SDN helps with new concepts, such as the Internet of Things, cloud integration and cloud services, Big Data, and even improving IT consumerization and mobility.
Top Five Applications and Services SDN Supports and Optimizes:
- Security Services. The modern virtualization ecosystem supports specific virtual services running within the network layer. This means incorporating functions like NFV into SDN platforms. This type of network security creates a truly proactive environment capable of reducing risk and responding to incidents much more quickly. When a breach occurs, every second is critical in stopping the attack. Also important is the capability to identify the attack and ensure that other network components are safe. As the network layer becomes even more critical, and as the modern organization becomes even more digitized, we’ll see more attacks and more sophisticated advanced persistent threats. By integrating powerful security services into the SDN layer, you help create a more proactive environment that’s capable of responding to change.
- Network Intelligence and Monitoring. Modern SDN technologies are helping abstract one of the most critical layers within the data center: the network. Network architectures are much more complex and have to handle more data than ever before. This means knowing what’s flowing through your environment is critical. Do you have latency issues on a port? Are you running a heterogeneous network architecture? Or are you heavily virtualized and passing a lot of traffic through the network layer? All of these challenges are alleviated when you have a solid network intelligence and monitoring layer. However, you gain true insight and benefit by integrating these technologies into your SDN architecture. Traffic flow, port configurations, hypervisor integration, alerting, and even optimization can be integrated into network intelligence and monitoring technologies. Most of all, these types of agile systems will further help you monitor network traffic between your data center and your cloud ecosystem.
- Compliance and Regulation-Bound Applications. Major cloud vendors are now offering the capability to store and work with compliance-bound workloads. Now, organizations have the option of extending architectures which were originally very limited because of regulations into distributed environments and the cloud. But how do you segment the traffic? How do you ensure that compliance and regulation workloads are persistently secured and monitored? This is where SDN can help. Network traffic traveling between switches, network points, and even hypervisors can all be controlled in an SDN architecture. Remember, this layer abstracts virtual functions and hardware controls. This powerful layer can then span various locations, virtualization points, and even cloud locations.
- High-Performance Applications. We’re seeing a boom in new types of application technologies. Virtualization has allowed the delivery of rich apps like GIS, CAD, engineering, and graphics design software. Traditionally, these workloads needed bare-metal architectures with their own connections. With virtualization, however, applications are streamed and VDI can help create powerful desktop experiences. At the network layer we also see the integration of SDN into application control: creating powerful QoS policies, securing confidential data, segmenting heavy traffic, and even creating threshold alerts around bottlenecks. All of these functions within SDN help support the high-performance, rich applications being delivered via virtualization.
- Distributed Application Control and Cloud Integration. One of the biggest benefits of SDN is its capability to extend across the entire data center. This type of agility integrates distributed locations, cloud, and the entire organization. SDN allows for critical network traffic to pass between various locations, regardless of the type of underlying network architecture. By abstracting critical network controls you allow for easier movement of data between data center and cloud locations. Because SDN is a form of network virtualization, you can use powerful APIs to not only integrate with a cloud provider; you can control specific network services as well. This allows you to granularly manage your workloads while keeping your business agile.
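The QoS idea from the high-performance-applications item above can be made concrete with a small sketch. Everything here is invented for illustration: the class names, port numbers, and bandwidth thresholds are assumptions, not any vendor's policy model. The pattern, though, is the one SDN enables: classify flows once, centrally, and apply a priority programmatically rather than configuring each device by hand.

```python
# Hedged sketch of centralized QoS classification; the classes,
# ports, and thresholds below are illustrative assumptions.

QOS_CLASSES = {
    "vdi":     {"priority": 1, "min_mbps": 50},  # interactive desktops first
    "backup":  {"priority": 3, "min_mbps": 5},   # bulk traffic can wait
    "default": {"priority": 2, "min_mbps": 10},
}

def classify(flow):
    """Map a flow to a QoS class by its destination port (simplified)."""
    if flow.get("dst_port") == 3389:   # RDP-style VDI traffic
        return "vdi"
    if flow.get("dst_port") == 873:    # rsync-style backup traffic
        return "backup"
    return "default"

def policy_for(flow):
    """Return the QoS parameters the controller would push for a flow."""
    return QOS_CLASSES[classify(flow)]

print(policy_for({"dst_port": 3389})["priority"])  # → 1
```

Because the classification lives in one place, changing a threshold or adding a traffic class is a single edit that the controller can propagate everywhere, which is exactly the agility argument the list above makes.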
Now that you have a clearer picture, know that your organization may very well have use cases for other SDN functions as well. The key, however, is understanding how SDN can positively impact your data center and your business. SDN fundamentally simplifies the entire networking layer and gives you granular control around applications, services, and your distributed data center ecosystem. Most of all, SDN helps you design a business capable of adjusting to market shifts and changes in the industry. This allows your organization to be truly agile and productive. | 2:32p |
CenturyLink Expands Security Services with netAura Acquisition
 By The WHIR
Managed hosting and cloud service provider CenturyLink has bought managed security technology company netAura, ramping up its security capabilities and scale, especially for public-sector organizations.
Founded in 2011, netAura is based in the Washington, D.C. area and has worked extensively with US government agencies and corporations on cybersecurity, security information and event management (SIEM), analytics, and vulnerability management.
On the AWS Marketplace, netAura provides customer service management software that provides enterprise-grade ticketing and IT service tracking.
CenturyLink has recently been focusing on developing its security capabilities. Last month, CenturyLink launched its enhanced Managed Security Services Suite, which includes preventing, mitigating, and responding to cyberattacks and is designed to replace in-house security at organizations. In 2014, CenturyLink acquired Cognilytics, a provider of advanced predictive analytics and big data solutions.
netAura’s implementation services are a particular strength that it will add, according to Girish Varma, CenturyLink’s president of global IT services and new market development. “This acquisition helps us continue to deliver comprehensive security architectures to existing and future customers,” he said.
CenturyLink has recently been aiming to divest itself of costly data center space, and managed services like security can be delivered on infrastructure that it doesn’t necessarily own or lease. Given the competition in the colocation space driving prices down, it makes sense to provide more differentiated services.
This article first ran at http://www.thewhir.com/web-hosting-news/centurylink-expands-security-services-with-netaura-acquisition
| 2:35p |
Cloud May Make iPhone-like Impact on Storage
Barry Philips is CMO and Director of Product Management at Panzura.
At risk of dating myself, I recently found a travel alarm clock in the back of my closet. My 15-year-old daughter saw it and just could not wrap her head around the concept when she has an iPhone.
While not as antiquated as a travel alarm, point-and-shoot cameras are also going the way of the dodo. iPhones now have great resolution and a simple workflow for storing pictures with any number of cloud providers, eliminating the need to carry a second device to take photos. I still see stopwatches when ESPN rebroadcasts NFL shows, but everyone in gyms across the country just uses their smartphone. The iPhone has subsumed the functionality of these and many more devices because it is a disruptive advancement, as opposed to each of these devices having separate, incremental evolutions.
We’re now seeing the cloud do the same to all forms of on-premises enterprise storage including primary storage, back-up and tape.
Analysts agree. 451 Research recently published its 2016 Enterprise Storage Outlook, which projects that both Amazon Web Services (AWS) and Microsoft Azure will become top-five enterprise storage vendors by 2017. Further, AWS surges from sixth to second place while NetApp plummets from second to sixth. Spending on public cloud storage more than doubles between 2015 and 2017, while spending on on-premises storage falls over 17 percent.
It’s difficult for on-premises storage to compete with the cloud. Like the iPhone, the cloud is a disruptive technology that makes legacy infrastructure seem like an unwieldy point-and-shoot camera. The cloud wars have dramatically dropped the cost of storage, and features like AWS Infrequent Access reduce it further. Global deduplication can reduce the total storage footprint and cost even further. It’s hard to match the cloud in data protection: AWS S3 offers 11 nines of durability, with objects redundantly stored on multiple devices across multiple facilities. This means that if you store 10,000 objects on S3, you can on average expect to lose a single object once every 10,000,000 years. To top it all off, S3 is designed to sustain the concurrent loss of data in two facilities. Very few companies can match that in their own data centers.
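The durability arithmetic above is easy to verify. With 11 nines of durability, the annual loss probability per object is 1 − 0.99999999999 = 10⁻¹¹, and across 10,000 objects the expected losses per year multiply out as follows:

```python
# Reproducing the "one object per 10,000,000 years" claim above.

loss_prob_per_object = 1e-11   # 1 - 0.99999999999 (11 nines, per year)
objects = 10_000

expected_losses_per_year = objects * loss_prob_per_object   # 1e-7
years_per_single_loss = 1 / expected_losses_per_year

print(f"{years_per_single_loss:,.0f}")  # → 10,000,000
```

Note this is expected value under an independence assumption per object-year; it says nothing about correlated failures, which is why the multi-facility redundancy mentioned above matters separately.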
The scale and economics of the cloud are great if everything else is in the cloud, but how does it work for users and applications in offices? Users don’t want to wait for files to be downloaded from the cloud; they want their files locally, with blazing-fast speed, all the time.
This is accomplished by putting a simple caching appliance on-premises with as much flash storage as possible. By caching appliance, I don’t mean an all-flash array. An all-flash array certainly provides the blazing speed that users need, but still requires software, hardware, and processes for DR, back-up and archiving at every site. A caching appliance relies on the cloud for durability and redundancy, but provides the local performance and enterprise features that are expected from traditional enterprise storage. Since the cloud is the “back-end” to all the sites, features like dedup can now be global in nature instead of only providing that benefit on a site-by-site basis. Of course, the caching appliance also has to be the translator between a cloud’s RESTful API and a file system interface that is used by most enterprise applications and users.
Of course, this caching device doesn’t always have to be on-premises. The caching device can also run just as easily inside the cloud. This makes a few things possible. First, applications that were developed for the corporate data center can now run without a single change in the cloud. Additionally, if an office is close enough to a cloud data center with respect to latency, that office could access a cloud-based caching appliance over a network connection. This eliminates any on-premises storage infrastructure, but still provides enterprise-grade storage performance and features to that office.
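The read-through behavior described above can be sketched briefly. This is a toy model under stated assumptions: `ObjectStore` stands in for a cloud object store's put/get API, the eviction policy is deliberately naive, and none of the names correspond to a real appliance's implementation. It shows the core trade the article describes: the cloud holds the durable copy, the local cache holds the hot set.

```python
# Hedged sketch of a read-through cache in front of object storage.
# ObjectStore is a stand-in for a cloud back end, not a real API.

class ObjectStore:
    """Stand-in for a cloud object store (the durable back end)."""
    def __init__(self):
        self.objects = {}
    def put(self, key, data):
        self.objects[key] = data
    def get(self, key):
        return self.objects[key]

class CachingGateway:
    """Presents file-style reads, backed by the object store."""
    def __init__(self, store, capacity=2):
        self.store = store
        self.capacity = capacity
        self.cache = {}          # hot set, e.g. held on local flash
        self.misses = 0

    def read(self, path):
        if path not in self.cache:
            self.misses += 1     # cold read: fetch from the cloud
            if len(self.cache) >= self.capacity:
                # Naive eviction; a real appliance would use LRU or better.
                self.cache.pop(next(iter(self.cache)))
            self.cache[path] = self.store.get(path)
        return self.cache[path]  # hot read: served at local speed

store = ObjectStore()
store.put("/projects/site.dwg", b"cad-data")
gw = CachingGateway(store)
gw.read("/projects/site.dwg")   # miss: goes to the object store
gw.read("/projects/site.dwg")   # hit: served from cache
print(gw.misses)  # → 1
```

The same object runs unchanged whether it sits in a branch office or in a cloud VM near the store, which is the portability point the paragraph above makes.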
Data storage is expected to grow 800 percent over the next five years, and 80 percent of new data will be unstructured. The trouble is, 70 percent of unstructured data has not been accessed in 60 days, and 90 percent has not been accessed in six months. This requires a more modern approach to storage architecture, and the cloud is the key to this new architecture.
The cloud doesn’t require the overprovisioning of storage capacity to achieve performance, overspending on expensive storage media for inactive data, or the overbuilding of data centers to house increasing amounts of storage infrastructure – yet still provides the performance and features that users require.
There is no need to worry about data growth and capacity/forecasting ahead of that growth. Spending transforms from a model where you pay upfront for capacity to grow into to a model where you pay as you grow. There’s no separate software, hardware, or processes for DR, backup, or archiving as these are just part of the solution. Most importantly, IT can significantly reduce the time to manage storage as storage now becomes a centrally managed service.
Technology inflection points come and go, but there are few that are as transformative as the cloud. The cloud has already disrupted the way applications are deployed and developed with a combination of SaaS and PaaS. The next step of cloud disruption will completely change the way IT infrastructure, like enterprise storage, is procured, consumed, and managed. In just a few short years, new hire college graduates in IT will have the same reaction to tape as my daughter did to my archaic travel alarm. | 8:34p |
Equinix Tops in Market Share, Digital Realty in Space
It comes as no surprise that out of 1,200 firms, Equinix led all retail and wholesale colocation providers in fourth-quarter 2015 revenue, finishing the year with 8.1 percent of the $27 billion market total, according to a new report from 451 Research.
You can surely expect those numbers to grow this year now that Equinix closed its $3.8 billion acquisition of the European data center services giant TelecityGroup in January, establishing itself as the largest data center provider in Europe for some time to come.
The merger boosted the Redwood City, California-based provider’s number of data centers by roughly 30 percent, from 111 to 145. Equinix also added eight European facilities to claim the top spot in that region. The additions essentially doubled Equinix’s capacity in both areas.
The 451 Research report also revealed that wholesale colocation provider Digital Realty came in second to Equinix in terms of revenue, with 5.1 percent of the total market, but first in total data center space, with 132.4 million square feet.
Digital Realty nearly doubled its colocation footprint in July 2015 when it acquired Telx for $1.9 billion. Telx manages over 1.3 million square feet of data center space across 20 facilities, two of which it owns. The company already leases around a dozen of its facilities from Digital Realty, with six facilities leased from third parties.
With the cloud as the main driver behind the growth of colocation, consolidation has become commonplace in the data center space.
Last year, Digital’s rival QTS acquired Carpathia Hosting, a provider that does a lot of business with US government agencies. In April, CyrusOne, another rival, acquired Cervalis for $400 million, expanding its presence in the New York market as well as its financial services customer base.
With more and more company-owned data centers reaching out to colocation providers in some way, shape or form, it’s no wonder that 451 Research projects the market will reach $33 billion worldwide by the end of 2018.
“Colocation is quickly becoming the nexus of both cloud and enterprise IT,” said Katie Broderick, research director for 451 Research. “The colocation market is serving as data center arms dealer to both enterprises and the cloud. In this process, colocation is often becoming the strategic connection point between the two.”