Data Center Knowledge | News and analysis for the data center industry

Monday, May 8th, 2017

    Time Event
    12:00p
    Cyxtera Puts a Fresh Spin on CenturyLink’s Former Data Center Empire

    After the recent wave of consolidation in the data center services industry, the handful of giants have gotten so much bigger that they’ve become incredibly difficult to compete with, especially for market newcomers. The two main strategies for this latter group have been rolling up regional data center providers in smaller, secondary markets into nationwide platforms and doubling down on specialized services beyond raw space, power, cooling, and connectivity – things like managed cloud, compliance, and other managed services.

    Cyxtera Technologies, the new company former Terremark CEO Manuel Medina is building on the platform of the former CenturyLink data center portfolio, is going the second route, but it’s chosen to differentiate with a very specific focus: cybersecurity. While not unheard of, data center providers with such a focus are uncommon. Also uncommon is the level of investment Cyxtera’s backers have put behind its security capabilities, buying four security and analytics companies in addition to the nearly 60 data centers worldwide from CenturyLink.

    Philbert Shih, managing director at Structure Research, commented via email:

    Cyxtera “intends to use this proprietary technology to target security- and compliance-minded verticals like government and financial services as part of a strategy that is focused on driving more value on top of raw infrastructure – increasingly crucial in a tightening market.”

    Private equity firms Medina Capital and BC Partners closed the $2.15 billion acquisition of CenturyLink data centers last week and in parallel announced the launch of Cyxtera, transferring control of the data center portfolio and the four Medina-owned security and analytics companies to the new firm. Total value of the assets is $2.8 billion.

    The four companies are:

    • Brainspace, whose analytics platform helps government customers around the world conduct digital investigations;
    • Cryptzone, which creates a secure perimeter around corporate networks, including in the public cloud, and sells solutions tailored to specific industries, such as healthcare, financial services, and retail;
    • Catbird, whose solutions secure internal corporate networks, enabling companies to implement and enforce security policy;
    • Easy Solutions, which specializes in electronic fraud detection and prevention.

    In a statement, Medina said:

    “The last two decades have brought seismic changes to enterprise IT availability, agility and scalability, and the next era must be underpinned by a similar revolution in infrastructure security.”

    Medina is a well-respected leader in the data center and IT services space, his reputation stemming to a large extent from the successful sale of Terremark to Verizon in 2011 for $1.4 billion. Coincidentally, Verizon closed the sale of a data center portfolio that includes those former Terremark assets to Equinix the same week Cyxtera’s backers closed the acquisition of CenturyLink data centers.

    The CenturyLink deal made the Monroe, Louisiana-based telco a big Cyxtera customer that also holds a minority stake in the company. CenturyLink now owns 10 percent of Cyxtera, which it received in addition to $1.86 billion in cash for its data centers. The telco plans to continue providing colocation and a variety of managed services out of those facilities, and retaining some equity in their operator gives it a degree of control, or “governance opportunities,” as Dean Douglas, CenturyLink president of sales and marketing, put it in an interview with Data Center Knowledge.

    “We think [Cyxtera] is a good business going forward,” he said, explaining that the telco also expected its investment to pay off financially. “We think it’s a good investment, and it’s a very significant investment. [Medina] really knows this business extraordinarily well.”

    Jabez Tan, research director at Structure, said via email that retaining a stake in Cyxtera is an important strategic move for CenturyLink:

    “CenturyLink retaining a minority stake in Cyxtera is necessary to maintain its current market positioning as a colocation provider even after the sale of its data center assets and essentially cements the relationship with Cyxtera moving forward as its primary partner for colocation services.”

    The deal helps CenturyLink fund its blockbuster $34 billion acquisition of Level 3.

    Commenting on Cyxtera, Tan said the real work begins now for Medina’s team as it starts to execute on its vision:

    “Execution and the integration of the services from those five firms is perhaps the more important aspect in providing a seamless end-to-end customer experience, as Cyxtera appears to have an attractive set of building blocks in place to carve out a meaningful niche in an increasingly competitive infrastructure services market.”

    3:00p
    Video: Inside the New Facebook Data Center in Texas

    Fort Worth Data Center Grand Opening

    We are thrilled to be online and serving traffic. From groundbreaking to grand opening, it has been an incredible journey in Fort Worth — and we shared that journey in this video.

    Posted by Fort Worth Data Center on Friday, May 5, 2017


    The newest $1 billion Facebook data center came online last week in Fort Worth, Texas. It is only the first building of what will eventually become a sprawling data center campus.

    Construction of the second and third buildings has already started, with a fourth and fifth to come in the future, Fort Worth data center site manager KC Timmons wrote in a blog post.

    Just last week the company filed for permits for a 500,000-square-foot addition to the campus.

    The facility is powered by a 200MW wind project Facebook brought online together with Citigroup Energy, Alterra Power Corporation, and Starwood Energy Group.

    Everything about Facebook data centers here: The Facebook Data Center FAQ

    3:30p
    Zayo Scores Two More California Data Centers

    Zayo Group Holdings continued to expand its presence in California with a $12 million acquisition of Mexico-based KIO Networks’ San Diego data centers, according to a press release.

    The two facilities, located in the city’s metropolitan area, total more than 100,000 square feet and 2 MW, with additional power available. These additions to Zayo’s portfolio will go a long way toward meeting the growing needs of customers in IT, healthcare and professional services in San Diego, the company says.

    “California is an important hub of the global economy, and this acquisition further strengthens Zayo’s position as a leading infrastructure provider,” said T.J. Karklins, senior VP of Zayo’s zColo business segment, in a statement. “Customers increasingly require solutions that require network connectivity, colocation, and cloud infrastructure. Zayo offers all three at scale.”

    Customers will be able to connect to Zayo’s expansive 8,000-mile fiber network, which now includes a subsea fiber route up the coast of California, recently acquired from Electric Lightwave.

    Zayo’s data centers span the West Coast, from Seattle, Washington, to Santa Clara in Northern California, to Los Angeles and Irvine in Orange County, and south to San Diego. The company owns 40 data centers across North America and Europe.

    Last month, the Boulder, Colorado-based company announced a significant expansion to its Los Angeles data center presence with a new location at the One Wilshire Building, according to a press release. That building is a key point of connectivity between North America and the Asia-Pacific region, making it one of the most significant carrier hotels in the world.

    The facility adds 24,215 total square feet and 2 MW to meet strong customer demand in the city. The company also announced a new agreement with a major webscale company, whose name it did not disclose, as an anchor tenant in the Los Angeles facility.

    4:00p
    Effective Risk Management in the Data Center

    Data center managers are fighting a constant battle with risk. Their jobs, aside from cramming computing resources into constrained space using limited power and cooling capacity, involve ensuring that those resources are available all of the time. That means identifying and managing risks from various sources.

    A standards-based risk management methodology can help with that challenge. It can help data center managers to prioritize their risks, and to prepare for a data center or critical environments audit. Where to start?

    Understanding Different Types of Risk

    Before a data center can manage risk, it has to understand the different categories of threat to operations. Kevin Read, GIO UK senior delivery center manager at French multinational IT consulting company Capgemini, is responsible for managing data center risk in his organization, which runs its own facilities to serve clients. He identifies several categories for data center managers to worry about.

    “The first risk category in a mission-critical data center is loss of power,” he warns. This risk is existential for a data center, but there are frameworks incorporating the management of that risk. Like many other data center operators, Capgemini uses tier ratings, which help classify a facility’s exposure to disruptive risks such as these.

    “Capgemini designs and implements Tier 3 facilities to provide the resilience for its clients with N+1 and N+N UPS-backed power routes to the racks and cooling systems,” said Read. “Also, connecting dual power into the site protects against local sub-station power failure, with backup generators as a last resort.”
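    The arithmetic behind that kind of redundancy is straightforward to sketch. The figures below are invented for illustration, not Capgemini's or the Uptime Institute's, and the model assumes the power paths fail independently; even so, it shows why parallel routes shrink downtime so dramatically:

    ```python
    # Illustrative only: the availability math behind redundant power paths.
    # Component availabilities are invented, and paths are assumed independent.
    ups_route = 0.999        # one UPS-backed power route
    generator = 0.995        # backup generator path

    # Two independent routes are both down only if each fails at the same time.
    dual_routes = 1 - (1 - ups_route) ** 2
    with_generator = 1 - (1 - dual_routes) * (1 - generator)

    print(f"single route:   {ups_route:.5f}")       # 0.99900
    print(f"dual routes:    {dual_routes:.7f}")     # 0.9999990
    print(f"plus generator: {with_generator:.9f}")  # 0.999999995
    ```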

    The second risk involves service disruption due to fires from malfunctioning plant and IT equipment, he said, adding that the company uses inert gas suppression systems in all IT rooms, including plant rooms, to douse fires before they spread.

    “The third risk category is flooding (rivers and extreme weather), aircraft, pandemics and air contamination from other properties,” he continued. “Sites on flight paths, close to flood risk areas and close to factories that pollute or could contain explosive chemicals should never be selected.”

    Finally, Read points to security as risk category number four. This includes both physical security, and the risk of logical security breaches (hacks). The firm even lumps terrorist threats into this risk category.

    Like the other categories of risk, security naturally breaks down into many subcategories, and those can be divided still further. Within logical security, for example, managers may look at employee access to applications as a particular risk area, and mobile and device access as another.

    Some risks emerge as new technologies become mainstream. For example, Paul Ferron, director of security solutions at CA Technologies, warns about virtualization sprawl as a particular security risk. This phenomenon, more often described as a management and resource risk, can have consequences for data security too, he warned.

    “Virtual machines can easily be copied without the appropriate security privileges,” he warned. “When users have finished with them, they may not be shut down.”

    In this case, as with many others, designing secure processes for certain operations helps to standardize them and reduce the risk of vulnerabilities slipping through the net. The use of, say, IT service management tools to codify and automate those processes reduces it still further.
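    As a minimal sketch of what codifying such a process might look like — the inventory format, field names, and 90-day staleness window below are hypothetical, not the API of any particular ITSM product:

    ```python
    from datetime import datetime, timedelta

    # Hypothetical CMDB/ITSM inventory export; real records would come from
    # whatever service management tool the data center already runs.
    vm_inventory = [
        {"name": "vm-app-01",  "approved": True,  "last_seen_active": "2017-04-30"},
        {"name": "vm-copy-17", "approved": False, "last_seen_active": "2017-05-01"},
        {"name": "vm-test-08", "approved": True,  "last_seen_active": "2016-12-15"},
    ]

    STALE_AFTER = timedelta(days=90)  # assumed policy threshold
    TODAY = datetime(2017, 5, 8)

    def flag_sprawl(inventory):
        """Flag unapproved copies and VMs idle beyond the staleness window."""
        for vm in inventory:
            idle = TODAY - datetime.strptime(vm["last_seen_active"], "%Y-%m-%d")
            if not vm["approved"]:
                yield vm["name"], "no recorded approval"
            elif idle > STALE_AFTER:
                yield vm["name"], f"idle for {idle.days} days"

    for name, reason in flag_sprawl(vm_inventory):
        print(f"review {name}: {reason}")
    ```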

    Matt Lovell, CTO at cloud hosting company Pulsant, adds health and safety risks to the mix.

    These are multi-faceted, he warned, ranging from electrical best practice and mechanical operational safety through to environmental and noise controls, and the challenges of working in restricted space areas.

    “This requires a significant degree of compliance and safety of work measurements to ensure all personnel who work in the environment do so with the minimum of risk to themselves and others,” he said.

    Risk Management Methodologies

    These risks won’t all be equal, though. Some will be more likely than others, while some will have a bigger potential impact. Juggling them all and understanding which ones to prioritize from a budgetary perspective is an important part of the process.

    Ferron advises managers to use variations on the traditional risk management matrix, with the probability of risk along one side, and the potential business impact along the other. “This can be a 3-D graph,” he added, suggesting that a third dimension could highlight the projected expenditure to mitigate the risk in question.
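    A toy version of that matrix is easy to sketch. The risk names and 1-to-5 scores below are invented for illustration; dividing by mitigation cost folds in the third dimension Ferron suggests, so cheap fixes to likely, damaging risks float to the top:

    ```python
    # Invented risk register entries, each scored 1 (low) to 5 (high).
    risks = [
        {"name": "utility power loss",    "probability": 2, "impact": 5, "mitigation_cost": 4},
        {"name": "plant-room fire",       "probability": 1, "impact": 5, "mitigation_cost": 3},
        {"name": "virtualization sprawl", "probability": 4, "impact": 2, "mitigation_cost": 1},
        {"name": "unpatched hypervisor",  "probability": 3, "impact": 4, "mitigation_cost": 2},
    ]

    def priority(risk):
        """Classic matrix score (likelihood x impact), discounted by the
        projected cost of mitigating the risk."""
        return risk["probability"] * risk["impact"] / risk["mitigation_cost"]

    for risk in sorted(risks, key=priority, reverse=True):
        print(f"{risk['name']:<22} priority={priority(risk):.1f}")
    ```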

    Read’s operation has a similar approach, designed to identify and quantify risks and their potential mitigation cost. Significantly, his risk management system is designed to be a living, breathing document that changes over time.

    “At Capgemini, we have put in place a monthly risk management system that logs all risks and issues with containment and action plans,” he said. “An investment budget is made available if changes are required.”

    While data centers face their own unique kinds of risks, the methods used for managing them aren’t specific to that environment. More generic risk management methodologies are as suitable for describing and handling data center risk as they are in other domains.

    One commonly understood risk management standard is ISO 31000:2009, said Lovell. This standard sets out generic principles and guidelines for risk management, and is designed to be tailored to the risk types that each user sees fit. It is more a framework for risk management than an accreditation, but Lovell said that it can also be used to audit risk preparedness within a data center.

    “The audit program must seek to identify that the correct response procedures are in place and that these are rehearsed and understood by staff, which will change over time, so they must be continually updated,” he said.

    Data centers don’t function alone, though. They exist on a broader continuum that marries technology with business objectives. Risk management in technology will be part of a broader risk management story. Competent companies will be exploring all kinds of risk, from financial through to regulatory and organizational.

    How the data center’s risk fits into this will vary between companies. In Capgemini’s case, the data center manager is responsible for the facility and will manage the monthly risks and issues process. That manager, along with the head of UK data centers, has monthly meetings with the chief financial officer’s team to forecast any major risk expenditures.

    Data center compliance teams will typically report to the board in some form, said Pulsant’s Lovell.

    “There are director responsibilities which must be managed and reported as legal obligations. This may differ from other IT governance programs which may report through various project or organizational structures,” he said.

    Ideally, there should be some separation of duties when managing risk and reporting on the results, Lovell added. “The recommendation is always to manage risk appropriately, and this should involve a level of independent management and verification of compliance outside of the operational teams which monitor and deliver data center services. This can be an independent internal or external governance team.”

    Choosing an Audit Methodology

    The key word here is verification. Quantifying, prioritizing and mitigating risk is one part of the risk management challenge, but measuring a data center’s performance in these areas is an important part of the process. An audit for risk will help internal staff—and potentially clients, if necessary—to see how well a data center has controlled the various sources of risk in the operation.

    Before choosing an audit to cover risk in the data center, managers must understand what they want to achieve from it. Is the risk audit customer-driven? If so, are there any specific standards that the customer is looking for? Are there any risk management metrics that a client particularly wants the data center to hit?

    Audits may also be driven by suppliers of risk mitigation services to the data center. For example, Capgemini’s data centers are audited regularly by its own group, and by government clients, but also by Capgemini’s insurers, Read said.

    Audit Standards

    One of the biggest challenges for a risk audit is the diversity of risk categories involved. It is difficult to audit all of these under one standard, meaning that data center managers may have to apply a variety of standards when conducting an audit.

    When looking at security, ISO 27002 covers the code of practice for information security management. It explores a variety of different aspects, including human resource security, physical and environmental security, and access control.

    The Payment Card Industry Data Security Standard (PCI-DSS) also covers information security, and is a highly prescriptive standard focusing on the organization and retention of credit card data in the data center. It covers the building and maintenance of a secure network, the management of vulnerabilities, and network and system monitoring among other things.

    For commercial operators handling government information, other audits may be necessary. In the UK, List X is a commonly understood security clearance system for contractors handling government data, while in the U.S., Facility Clearance Levels are the equivalent.

    “From a health and safety perspective, many data center operators are working toward, or at least to, the principles of OHSAS 18001, which is an internationally recognized standard for health and safety management and associated systems,” added Lovell.

    Environmental protection audits will often fall under ISO 14001. Data centers may wish to consider this auditing standard, and environmental risks in general, given the tendency to store diesel onsite in bulk to handle generator requirements.

    Stakeholders

    There are often multiple stakeholders involved when it comes to defining and mitigating risks, said Gavin Millard, technical director of Tenable Network Security, which sells software designed to scan networks for security threats. He divides them into three main groups: the security team, the operations team and the business.

    The problem is that not all of them have the same agendas, he warned: “As many organizations have discovered, the goals and needs of each are often conflicting, causing issues with prioritizing the actions needed to reduce each specific group’s definition of risk,” he said.

    What do these conflicts look like? One example involves software patching. This is one of the most effective ways to reduce security risks in an organization. In July 2013, the Australian Signals Directorate published a set of strategies to mitigate cyber-intrusions. Patching operating systems was one of these measures, and patching applications was another. Doing both, along with application whitelisting and minimizing administrative privileges, would mitigate at least 85 percent of intrusions, the agency said.

    The problem is that the IT security group’s priority is to focus on eliminating holes in the system through which an attacker might creep, so that it can reduce the risk of data breaches. That requires it to patch critical vulnerabilities quickly. Conversely, the IT operations team needs to minimize the risk of downtime, meaning that any changes to the system must be structured, planned, and controlled. This can often lead operations teams to ask for less frequent patching schedules to reduce availability risk.

    Business managers have their own, separate agenda: maintaining the bottom line and hitting their performance targets. So they will only want patches deployed if the benefit to the bottom line outweighs the cost of completing the work.

    “Conflicting goals can be hard to address, but one of the most effective methods of doing so is to have a highly efficient process for continuously identifying where a risk resides,” said Millard. “You also need a predictable, reliable method of updating systems without impact to the overarching business goals of the organization.”
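    One way to express that reconciliation, sketched below with invented findings rather than output from any real scanner: score each vulnerability by both the security team's measure (severity) and the business's (asset criticality), then let operations schedule the ordered list into controlled change windows.

    ```python
    # Invented vulnerability findings; the fields are hypothetical, not a
    # real scanner's output format.
    findings = [
        {"host": "db-prod-01",  "cvss": 9.8, "criticality": 5, "window": "emergency"},
        {"host": "web-edge-03", "cvss": 7.5, "criticality": 3, "window": "next planned"},
        {"host": "dev-test-09", "cvss": 9.8, "criticality": 1, "window": "next planned"},
    ]

    def urgency(finding):
        """Blend the security view (CVSS severity) with the business view
        (asset criticality) so both definitions of risk are weighed."""
        return finding["cvss"] * finding["criticality"]

    # Security gets an ordered worklist; operations applies it through
    # structured change windows instead of ad hoc patching.
    for f in sorted(findings, key=urgency, reverse=True):
        print(f"{f['host']}: urgency={urgency(f):.1f}, schedule={f['window']}")
    ```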

    Managing risk effectively, then, involves not only an assessment of threats to the data center, but a willingness among team members to work together cooperatively so that all agendas can be happily accommodated. In some cases, this may create opportunities for new working practices.

    The introduction of DevOps (development/operations) disciplines to streamline the workflow between development, test, and deployment might help to offset tensions such as the one that Millard describes.

    As with most things in IT, effective risk management is as much a people-centric process as a technology-focused one. The use of standardized methodologies and audits can help to quantify just how much risk a data center faces, and how this may affect future budgets. It always helps to measure what must be managed.

    Danny Bradbury has 20 years of experience as a technology journalist. He writes regularly about enterprise technology issues including data center management, security, software development and networking. 

    5:00p
    How to Lead in the Age of Analytics

    Andrew Roman Wells is the CEO of Aspirent.

    Kathy Williams Chiang is VP of Business Insight at Wunderman Data Management.

    Data and analytics have redefined the way we compete. Data is a critical corporate asset that organizations are starting to monetize in new ways to get ahead of their competition. The bottom line? Companies that leverage data to drive better organizational decisions are winning at a faster rate than their competition.

    One alarming trend for large corporations is that organizational size is no longer a competitive barrier to producing world-class analytics. The advantage that large companies have had in the use of analytics is disappearing as the cost of accessing, processing, and storing data plummets. Large teams of data scientists and millions of dollars are no longer required to drive insights from a company’s data assets. Analytical methods and tools are becoming more ubiquitous and less costly, leveling the playing field for companies large and small.

    Executives who know how to lead in this new era of data analytics will outpace their competition. Doing so requires a shift in how you view analytics and in the importance the organization places on building analytical capabilities. There are five keys to leading in the age of analytics:

    Analytics as a Corporate Strategy

    Embed analytical capabilities and strategies into your corporate objectives. Having a clear vision of winning through analytics is essential to provide direction and organizational energy for the development of these needed capabilities. It is through these new methods, tools, and techniques that you will develop new products, services, markets, and opportunities.

    Monetization Strategy

    Treat monetization strategies as valuable corporate assets. A monetization strategy is a plan to achieve one or more business goals through tactics or actions that improve the bottom line, either by increasing revenue or by reducing costs. In the same way an organization might develop KPIs to help manage and understand business performance, monetization strategies that drive a competitive advantage should be developed continuously and shared throughout the organization.

    Develop Scalable Insights and Capabilities

    Building one-off analytical solutions has been the norm in corporate America. Hours are poured into solving difficult problems to capture a revenue opportunity, only to have the analytics, once developed to support the plan, lie dormant or go unused. Leaders should instead develop monetization strategies and analytics that are automated, repeatable, and scalable, so that other departments can leverage them rather than building their own siloed solutions.

    Big Data is More Than Just Big Hype

    If your organization has not started on the path toward building out a big data environment, you are behind the curve. Big data is here to stay, and it provides real benefits and new capabilities. One of the primary drivers behind the first wave of implementations is to lower the cost of storing the ocean of data that organizations are swimming in. Traditional data platforms are costly and do not provide an economical solution for storing massive amounts of information. By leveraging low-cost commodity hardware, companies can store petabytes of information at very reasonable cost.

    Once organizations have brought together a large number of disparate datasets, they are able to drive new insights that were previously too difficult or expensive to produce. This includes more granular data, social media information, search data, images, and a richer history of information. One example of this type of analysis, sketched below, is car dealerships using search data to set stocking levels for various products based on customer search patterns. If consumers within a certain radius are searching more for trucks, the dealer can optimize inventory levels to match anticipated consumer demand.
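    A back-of-the-envelope version of that dealership calculation, using invented local search counts, simply allocates a fixed inventory budget in proportion to search share:

    ```python
    # Invented search counts within a dealership's trade radius.
    search_counts = {"trucks": 4200, "sedans": 2100, "SUVs": 2700}
    inventory_budget = 120  # vehicles the dealer can stock this month

    total_searches = sum(search_counts.values())
    stocking_plan = {
        segment: round(inventory_budget * count / total_searches)
        for segment, count in search_counts.items()
    }
    print(stocking_plan)  # {'trucks': 56, 'sedans': 28, 'SUVs': 36}
    ```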

    Artificial Intelligence (AI)

    As a leader, understanding the current capabilities of AI, what it can bring to bear for your organization, and where to start your journey are key questions to wrap your head around. There is a lot of hype around AI and how fast it will automate jobs. The truth is that we are a long way from the masses losing their jobs to AI, but there are several ways you can begin to tap into this emerging technology. An important point to consider is that AI is not new: it has been used on retail companies’ websites since the late ’90s to recommend purchases, cross-sell products, and resolve consumer issues. The question for an analytical leader is how to leverage AI and where to start.

    Outside the online retail world, industry is beginning to use advances in AI to automate research, especially in the medical and legal fields. Instead of doctors poring over hundreds of articles and case files to find the latest protocols and treatment plans, hospitals are taking advantage of AI to speed up information collection and assimilation, freeing doctors to spend more time with their patients. Standardized, repetitive tasks and areas requiring diagnostic research are a great place to start.

    As a leader, you must adopt analytics to help your organization stay competitive. Having a clear vision and concrete objectives, and identifying the analytical competencies to develop in your organization, will help your company win in the marketplace.

    5:26p
    Dell Venture Arm Has Been Quietly Funding Data Center Startups

    Brian Womack (Bloomberg) — Last month, Scott Darling, president of Dell Technologies’ venture arm, sent four emails to his new chief executive officer over the course of an hour. He didn’t wait long for responses. After each message, Michael Dell responded within about five minutes, and by about 11:30 p.m. the last communication on new investments was wrapped up.

    “Speed matters,” said Darling, who led venture capital at EMC’s venture arm before its acquisition by Dell in September. “He completely gets entrepreneurship.”

    Now, Dell wants to publicize its investments. The venture arm — which kept a very low profile when it was part of EMC — is coming out of “stealth,” including disclosing portfolio companies as part of a new push to highlight young businesses. The company is spending about $100 million annually on the funding of startups, he said.

    Michael Dell — after closing the merger with EMC — is investing in new ways to bolster his company’s lineup of gear and modern software for data centers. Looking for an edge against rivals such as Hewlett Packard Enterprise, the venture unit, called Dell Technologies Capital, is likely to invest in 20 new startups, or perhaps more, this year, Darling said.

    Dell CEO Michael Dell speaking at a conference in 2013 (Photo by Justin Sullivan/Getty Images)

    “He is the founder of Dell,” said Jeremy Burton, chief marketing officer at Dell Technologies, who also joined the company from EMC. “I think he sees that companies we invest in can get a lot of value from the Dell brand.”

    All told, Dell Technologies Capital has invested in more than 70 startups. It puts in around $3 million to $10 million initially and usually in the A or B rounds, Darling said. The company provides more money in later rounds.

    The investments cover a broad swath of technology — as long as it falls under the category “infrastructure,” Darling said. That includes data storage, artificial intelligence, cloud computing and analytics. He said it’s important to stay focused on these areas where his team has know-how and experience.

    “We’re trying to be very active in the places we’re domain experts — not get distracted,” Darling said.

    Darling, who had been at EMC since about 2012, made many of these investments while keeping the venture arm out of the spotlight, avoiding, for example, using its name in press releases for funding rounds. That secrecy helped ensure EMC would know about key technologies without broadcasting it to competitors.

    But while that might have made sense in the past, the deal created Dell Technologies, a behemoth in the tech world in terms of sales and employees, so it’s now time to highlight investments to others, the executives said. In addition, while Michael Dell wants to showcase these entrepreneurs, there have been requests by the startups themselves to tout the name behind the funding, Darling said.

    Dell also had a venture arm prior to the acquisition, and the two groups have merged, Burton said. Some of those investments will be in the new portfolio — but the majority will be from the EMC side, along with the philosophy and structure as well, he said.

    The venture arm offers funding and expertise — and the opportunity to sell to businesses that are customers of Dell Technologies. This can particularly help young companies looking to land larger accounts.

    Some of the investments at the unit include Graphcore, which provides processors that help speed up machine learning technology. Another is Edico Genome, creator of a processor that helps with the massive workloads associated with DNA sequencing.

    BlueData Software Inc. is another startup that received funding, in 2015, from what was then EMC’s venture unit, according to Kumar Sreekanti, the company’s co-founder and CEO. His company provides software that aims to make big data analytics more affordable and easier to manage. The investment by EMC has been a good move for the company, he said.

    “I’d describe the relationship as highly consultative — not just transactional,” Sreekanti said in an email. “Their recommendations have been very constructive, providing both strategic and tactical advice for our management team.”

    6:00p
    Rackspace and Dell EMC Partner on Private OpenStack Cloud

    Brought to You by Talkin’ Cloud

    Rackspace is working with Dell EMC to offer a private cloud-as-a-service solution, according to an announcement on Monday at the OpenStack Summit in Boston. Rackspace said that it is the first step in an expanded relationship with Dell EMC where the partners will work to help make it easier to set up private clouds.

    Customers will be able to combine Rackspace OpenStack Private Cloud with Dell EMC compute and storage solutions, according to the companies. Support for VMAX Hybrid, VMAX All Flash and Dell Servers will be provided initially, with support for Dell EMC ScaleIO forthcoming.

    Even as some industry experts and researchers note that private cloud is losing steam as enterprises opt for workloads in the public cloud, the offering with Dell EMC will allow Rackspace to provide more options to its enterprise customers looking for a multi-cloud approach and tap demand for OpenStack.

    See also: Rackspace CEO Taylor Rhodes Leaving Company

    “Dell EMC is committed to providing customers with best in class solutions to simplify their OpenStack deployments while taking advantage of new innovations,”  Jay Snyder, SVP, Global Alliances, Industries and Service Providers at Dell EMC said in a statement. “One example of this commitment to OpenStack is our latest offering with Rackspace, one of a handful of our global Titanium partners which has unmatched experience in operating OpenStack clouds at scale. The ability to consume Rackspace® OpenStack Private Cloud as-a-Service coupled with Dell EMC compute and storage solutions brings enhanced best-in-class capabilities for customers looking to take advantage of the opportunities offered by private cloud.”

    The news comes as Rackspace has announced that its CEO, Taylor Rhodes, will be leaving and Rackspace president Jeff Cotten will be stepping in as interim CEO. Last month, Rackspace launched Global Solutions and Services (GSS) to provide enterprises and mid-market firms with professional services.

    According to the announcement, the partners intend to lower the barrier to entry for private cloud and to provide the elasticity of public cloud along with security and performance benefits.

    Rackspace’s private cloud can be deployed in a Rackspace data center, customer data center or third-party location.

    “Rackspace has a unique track record of enabling customer success with private clouds by delivering OpenStack as-a-service and leveraging our operational expertise gained from more than one billion server hours managing OpenStack,” Scott Crenshaw, SVP and GM of OpenStack Private Cloud at Rackspace said. “This is why 451 Research has said that ‘Rackspace is the world’s leading OpenStack service provider.'”

    “As a co-founder of OpenStack, Rackspace is committed to innovating OpenStack in a way that makes it easier for customers to consume and benefit from,” Crenshaw continued. “Our collaboration with Dell EMC will do just this, combining the best of Rackspace expertise and Dell EMC technologies to allow customers to consume an OpenStack private cloud in a way that is more flexible and removes barriers to entry. Our organizations look forward to sharing more updates later this year.”

    This article originally appeared on Talkin’ Cloud.

