Data Center Knowledge | News and analysis for the data center industry
Thursday, December 22nd, 2016
4:06p
Billionaire Reubens Sell Data Center Stake to Chinese Investors (Bloomberg)
British billionaire brothers David and Simon Reuben sold a 49 percent stake in data center operator Global Switch Holdings Ltd to Chinese investors, despite security concerns surrounding the deal.
A consortium of private-sector Chinese investors, assembled by Daily-Tech founder Li Qiang and led by Jiangsu Sha Steel Group, will pay a cash consideration of 2.4 billion pounds ($2.96 billion), Global Switch said in a statement.
The company houses web servers for the financial and telecommunications industries as well as governments. Its data centers are located across Europe, as well as in Hong Kong, Singapore and Sydney.
Concerns over potential security risks were voiced in September, when speculation of a deal with China first arose. Former U.K. Foreign Secretary Malcolm Rifkind urged the government to scrutinize the transaction. Global Switch sought to ease such concerns upon announcing the deal.
“All Global Switch data centers will continue to comply with the guidance issued by the UK Centre for the Protection of National Infrastructure as part of the UK government’s national security strategy,” it said in the statement.

8:42p
Inside Schneider’s Big Bet on Tailored Yet Standard Data Center Blueprints
Seeing an opportunity in the data center design field’s gravitational pull toward standardization and in new, emerging workload categories, Schneider Electric recently redefined its EcoStruxure brand to include tools for data center design rather than simply software for data center management. It is now a platform of patterns and templates for data center designs based on the class of workloads they’ll be hosting and the customers they’ll be serving.
Simply put, the data center is becoming more of a mass-producible machine, and Schneider wants to play a role in this progression.
EcoStruxure is now a platform for developing enterprise data centers tailored to their respective industry domains, as Steven Carlini, Schneider’s senior director for data center global solutions, put it in an interview with Data Center Knowledge. “You’re using the same architecture, but you’re doing separate instances that can be customized for each domain,” he explained.
One of those domains is the Internet of Things.
“We’re breaking it out into three levels from a platform perspective,” Carlini continued. First, “we have our connected device level, which is all of the things in the data center you may be monitoring.” (This is about IoT in the data center, not the broader understanding of IoT as connected appliances, cars, CCTV cameras, etc.) That includes individual outlets IoT gear plugs into, in-row cooling units, UPS units, chillers, heat exchangers, switchgear, and so on.
These components on the connected product level are coupled with what EcoStruxure calls the edge control level, which is the on-site software that’s traditionally been given the EcoStruxure brand. In an IoT use case this layer would contain the aggregated data polled from all the connected devices on the network.
“That information. . . could be ported up into cloud level,” he continued, “the top layer, which is our apps, analytics, and services layer. You can run predictive analytics on all the equipment, look for trends, combine this data with data from other data centers. The bigger your data pool, the more accurate your predictive analytics are going to be.”
At that level, operators can devise specific rule-based alerts built on the data compiled from those aggregated services. Say, for example, that a five-year-old battery has been operating at a specific temperature, has crossed a threshold for number of discharges, and may need to be replaced within a three-month window or face a 90 percent chance of failure. An operator could define an automation rule triggered by that condition and deploy it within a cloud-based service that combines the events from all facilities in a portfolio.
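To make the idea concrete, here is a minimal sketch in Python of such a portfolio-wide, rule-based alert. It is not EcoStruxure code; the record fields, thresholds, and site names are hypothetical stand-ins, and the only point is to show a single rule evaluated over battery data aggregated from every facility.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class BatteryReading:
    """One aggregated record per UPS battery, as an edge-control layer might report it."""
    site: str
    battery_id: str
    age_years: float
    discharge_count: int
    avg_temp_c: float

# Hypothetical thresholds; real values would come from vendor data and site policy.
MAX_AGE_YEARS = 5.0
MAX_DISCHARGES = 200
MAX_AVG_TEMP_C = 30.0

def replacement_alerts(fleet: List[BatteryReading]) -> List[str]:
    """Apply one portfolio-wide rule to readings aggregated from every facility."""
    alerts = []
    for r in fleet:
        if (r.age_years >= MAX_AGE_YEARS
                and r.discharge_count >= MAX_DISCHARGES
                and r.avg_temp_c >= MAX_AVG_TEMP_C):
            alerts.append(
                f"{r.site}/{r.battery_id}: schedule replacement within 3 months "
                f"(age {r.age_years:.1f} y, {r.discharge_count} discharges, "
                f"avg {r.avg_temp_c:.0f} C)"
            )
    return alerts

if __name__ == "__main__":
    fleet = [
        BatteryReading("LON-1", "ups-battery-07", 5.2, 240, 31.0),
        BatteryReading("SYD-2", "ups-battery-03", 2.1, 40, 24.0),
    ]
    for alert in replacement_alerts(fleet):
        print(alert)
```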
The OT Department
If this sounds like the IT model for managing virtual machines and application workloads applied to the realm of data center operations, it’s not a coincidence. Carlini calls this elevated level “OT” — the operations counterpart to IT. And if Schneider has its way, IT and OT may meld into the same “T” in enterprises everywhere.
“For data centers, it’s an integration of what we’re calling IT and OT. With the EcoStruxure platform you can bring all the monitoring, management, and control of those systems — technical cooling, electrical room, facility level — together under one single platform.”
Organizationally speaking, there may continue to be role divisions between IT and OT personnel for the foreseeable future, he conceded. Technically, however, there should be nothing artificial that prevents the monitoring of physical data center resources and virtual ones from being converged into one cloud service. This way a single automation framework could be constructed that could manage the active deployment of workloads, such as applications and databases, based on physical conditions.
“We see data centers at Schneider as single entities, even though we understand that there’s different silos within data center operations that we have to deal with,” he said. “Schneider is one of the few companies that does the whole IT room, from the outlet all the way up to the building entrance and the medium-voltage switchgear.”
Hyperscale Reconsidered
Schneider is not against the mindset that an enterprise’s collective data center facilities scattered throughout the planet are effectively a single machine that may be automated on a single platform.
In practice today, however, the world’s enterprise workloads are not taking on a singular profile. Classic client/server applications still abound. Carlini invoked enterprise resource planning (ERP) as one example of a workload class that has not evolved much in the past few decades but to which so much of a business’ internal operations are still bound. Analytics, meanwhile, has been transforming database operations from a warehouse-driven mindset back into a data science operation, one that culturally resembles the computer science of the 1970s more than that of the 1990s, though it is significantly faster and more efficient.
Then there is the new class of containerized, hyperscale, microservices-oriented workloads, often developed by a new class of software engineers using versatile cloud-native languages like Go (or Golang), Clojure, and Scala (which runs on the Java virtual machine). Meanwhile, there are still web applications, the newest of which are composed with another emerging set of languages and runtimes, including JavaScript (via Node.js), Python, and Ruby.
You may think none of this would matter all that much to a data center operations manager — someone in Carlini’s OT department. However, there are resource consumption profiles emerging for all these application classes. They utilize infrastructure in dissimilar ways.
Now, a typical cloud data center operator might assume that these resource profiles all wash out in the end if their respective workloads cohabit on a multi-tenant server cluster whose infrastructure is adaptable to wildly varying conditions. But what if that’s not the best idea? What if in the end data centers should be tailored to specific classes of workloads?
In other words, what if instead of a grand, unified cloud, a vast stretch of hyperconverged pipelines and a colossal data lake, each class of workload can best be addressed by a data center whose every component — from the make and model of server to the assembly of the switchgear — is custom-tailored to fit?
“That’s exactly what [EcoStruxure] is designed to do,” responded Carlini. “So you may have some of your internet-giant data centers running these bare metal Open Compute-style servers and storage. And on those, you may want to monitor utilization or temperature, because those data centers run a lot hotter than legacy-type data centers. There’s different types of controls you would use for those, as opposed to an edge data center that’s deploying maybe a few hyperconverged systems. Those may be in a closed box, so you may want to monitor any kind of door alarm or humidity alarm or water sensor.
“You may be taking a completely different approach,” he continued, “but using the same architecture.”
Automating Virtual and Physical Together
The hyperscale, microservices-oriented model that has been pioneered by Google, Facebook, and Netflix represents only a tiny fraction, quantitatively, of the world’s data centers. That won’t be the case for too long — there are good arguments that it cannot be. But Carlini noted that individual racks in these hyperscale models have much more variable temperatures, even when they’re seated right next to one another.
So with the same adept responsiveness with which an IT manager using a load balancer like NGINX or an orchestrator like Mesosphere can re-situate workloads across server nodes, an OT manager in the EcoStruxure realm could re-partition cooling levels among racks, optimizing them as necessary for varying levels of heat output.
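As a toy illustration of that kind of OT-side rebalancing (not Schneider’s algorithm), the sketch below simply divides a fixed cooling budget among racks in proportion to their reported heat output; the rack IDs, readings, and budget are invented for the example.

```python
from typing import Dict

def partition_cooling(heat_kw: Dict[str, float], total_cooling_kw: float) -> Dict[str, float]:
    """Split a fixed cooling budget among racks in proportion to reported heat output.

    heat_kw maps rack IDs to their current heat load in kW; the return value maps
    the same rack IDs to the cooling capacity (kW) assigned to each.
    """
    total_heat = sum(heat_kw.values())
    if total_heat == 0:
        # No load reported: spread the budget evenly as a safe default.
        share = total_cooling_kw / len(heat_kw)
        return {rack: share for rack in heat_kw}
    return {rack: total_cooling_kw * load / total_heat for rack, load in heat_kw.items()}

if __name__ == "__main__":
    # Hypothetical per-rack heat readings (kW) polled from the edge-control layer.
    readings = {"rack-01": 18.0, "rack-02": 6.5, "rack-03": 11.5}
    for rack, kw in partition_cooling(readings, total_cooling_kw=40.0).items():
        print(f"{rack}: {kw:.1f} kW of cooling")
```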
“Hyperconverged is going to be the next game-changing technology,” said Carlini. “Once they write the software applications to be ported from a more traditional [realm] to hyperconverged systems, you’re going to start seeing more of those as the standard deployment. But you’re still going to have them running different applications at different times with different criticality levels, even though they’re standardized boxes.”
See also: Incumbents are Nervous about Hyperconverged Infrastructure, and They Should Be
New Workloads Drive Compute to the Edge
Server makers such as Hewlett Packard Enterprise — and many others — have argued that IoT and the increasing volume of multimedia are conspiring to bring compute power out of centralized facilities and more toward the edge. But there are two edges one has to think about: one for the data center, the other for the internet and the broader network. Schneider foresees a concept of “edge” where data center units — greater in number, smaller in size — are drawn toward the point in the network that connects customers to resources.
“We’re seeing a trend of closer-to-the-edge data centers in more urban areas and more mid-sized data centers — 1 to 2 MW — because of the [low] latency that’s required for a lot of these applications. The data that’s being transmitted doesn’t have to be stored forever; only the fresh data, the critical data, needs to be stored.”
From the opposite side of the scale, the shift in network resources to the edge of the network is being driven by the surge of IoT applications, Carlini said. Here, higher-bandwidth content and faster compute cycles need to be delivered closer to the user.
The result is an emerging concept of the edge data center. If this concept catches on, at the very least the colocation industry faces the prospect of tremendous competition from a new source: custom-made, rapidly deployed, remotely managed server centers. If such a new industry does take shape, manageability will become a key criterion in determining its viability.
Which may be why Schneider is getting a jump on this evolutionary shift now.

9:00p
Ten Tips for Harnessing the Power of Data Analytics
Carey Moretti is Vice President of Data Intelligence Consulting for Trace3.
The explosive growth of business data has created a need for companies to adopt data analytics platforms that can make sense of so much information. By the year 2020, about 1.7 megabytes of new information will be created every second for every human being on the planet, according to a citation in Forbes. By then, our accumulated digital universe will grow tenfold from 4.4 zettabytes today to 44 zettabytes, or 44 trillion gigabytes.
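For readers who want to sanity-check those units, the conversion uses decimal SI prefixes (1 ZB = 10^21 bytes, 1 GB = 10^9 bytes); a two-line check in Python:

```python
ZB = 10**21  # bytes in a zettabyte (decimal SI prefix)
GB = 10**9   # bytes in a gigabyte

print(44 * ZB / GB)           # 4.4e13, i.e. 44 trillion gigabytes
print(44 * ZB / (4.4 * ZB))   # 10.0, i.e. a tenfold increase from 4.4 ZB
```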
You can’t fix what you can’t measure, and you can’t measure what you don’t understand. Data analytics can help by addressing current strategic needs while preparing for new innovations on the horizon. So, here are 10 reasons why companies should incorporate data analytics into their core business strategies.
- Businesses in every industry should aspire to be top performers. To be one of the best, a company needs to make decisions based on all available data, rather than relying on “gut instincts.” Data analytics software can ensure that the right information gets analyzed to generate predictions that will put the decision-making process on the right track.
- The ability to create a true competitive advantage for a business. Deep analysis of both private and public data can enable innovations that improve customer intimacy, leading to greater customer satisfaction and loyalty. The ability to leverage advanced analytics and act on this intelligence can produce disruptive new products and services that differentiate a business and drive its market value over competitors.
- Data is growing exponentially, but most companies lack the technology to harness that data for informed strategic business decisions. The learning curve for implementing advanced analytics systems and dashboards is steep. The longer companies wait to get started, the further behind most will fall, potentially jeopardizing their market position.
- Rededicate staff time and resources to higher value efforts. With the introduction of advanced analytics, companies can empower their people to drive more value into the organization through increased automation and efficiency. This enables the team to reduce its manual tasks associated with data preparation and management, and thus focus on higher value activities.
- Cascade the power of analysis throughout the organization. The adoption of advanced analytics tools and methodologies enables a business to extend the value of analytics dashboards beyond traditional data analysts to line-of-business owners, marketing managers, developers and others.
- Organize the intelligence that’s available from disparate data. Most organizations today are not interpreting the full breadth of relevant data sources from outside their companies, which should be organized and analyzed in addition to a company’s internal data. This approach can help managers make real-time decisions that have a material impact on business outcomes.
- Root out inefficiencies across the company. Data analysis can drive greater productivity across product manufacturing, operations, distribution, sales, and marketing functions. In addition, in-depth analysis of service engagements can help streamline new services to make them less costly and more customer-friendly.
- Harness the power of analytics to develop your precious human capital. Hiring and retaining the best people is a critical element of any business success. Applying the latest tools and technologies can give the HR team a real lift in onboarding, training and developing employees.
- Gain a better understanding of customer issues and concerns. Double-digit annual growth is expected to continue for e-commerce sites through 2020, when eMarketer forecasts that e-commerce sales will reach $4 trillion, up from $1.6 trillion in 2015. Customer service can be greatly improved by analyzing customer sentiments across consumer review sites, for instance, or by measuring customer satisfaction surveys after transactions are compiled.
- Analyze your company’s standing with customers and partners on social media platforms. Social networks such as Facebook, Pinterest, Twitter, LinkedIn and Instagram have a powerful ability to influence a company’s public reputation, both positively and negatively. Advanced analytics capabilities allow organizations to stay in front of the content posted on social media sites, and to respond appropriately in a time-sensitive manner.
Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

9:30p
Here’s What Experts Say about Cloud in 2017
Brought to You by Talkin’ Cloud
Experts across cloud, security, web hosting and more offer their predictions for what the future holds. Here are the trends in containers, security, compliance and cloud they are keeping an eye on in 2017 and beyond.
Containers
“Container adoption will move from early adopters to early majority. This shift will be marked by an increase in software vendors adding container capabilities to their existing products and introducing new products. We will also see new solutions that improve manageability of containers at scale. For example, this year we saw Docker get serious about enabling customers to run containers in production. The launch of Docker Datacenter, with manageability and security features, moved the needle a long way. For many, though, Kubernetes is the choice for container orchestration. One start-up aiming to simplify the adoption of Kubernetes is Heptio, founded by the guys from Google who kick-started the project; it came out of stealth mode late this year. The container snowball of innovation is overwhelming many of the tools that have traditionally been associated with DevOps. Generally, we are past the unreasonable hype phase of DevOps and are now starting to look at real results. What we see is that DevOps is difficult, even wrenching, to put into place, but the benefits are so compelling there really is no choice for most organizations. How can they not do something that makes them faster and more stable at the same time?” – Dan Jones, director of product management, Skytap
“OpenStack and containers will move beyond proof of concept to deploying solutions that solve real-world business problems, providing a comprehensive strategy that allows enterprises and operators to bring new services that are secure, efficient, elastic, and scalable.” – Anand Krishnan, EVP and GM of cloud at Canonical, the company behind Ubuntu
“Technology comes in waves. When containers exploded a couple of years ago, it was all containers, all the time. But containers only serve to make applications easier to build, ship and run, to borrow a phrase. People will think more about applications overall in 2017, rather than just the components that make them up.” – Serge Pashenkov, CTO, ClusterHQ
Compliance
“It used to be that security concerns were the biggest impediments to public cloud adoption. But, in 2017, that will no longer be the case. It is widely accepted that security in public clouds is strong, shifting the top concern to compliance. Organizations moving to the cloud need to be able to demonstrate and provide assurance that they are doing things in a secure and compliant manner. So, whether it is PCI, HIPAA, NIST 800-53 or internal compliance standards, organizations need to be able to demonstrate that they can maintain compliance throughout the fast pace of change that takes place in the cloud. To solve this, they will have to turn to security and compliance automation solutions that will help them measure and report with ease.” – Tim Prendergast, CEO at Evident.io
“Laws protecting consumer privacy should serve as a deterrent to cybersecurity negligence leading to data breaches, but so far regulatory bodies have earned a reputation for doling out slaps on the wrist. Data protection authorities, spearheaded by the EU’s new GDPR, are increasing their vigilance – along with the cost of fines. Major fines for HIPAA and EU privacy violations in late 2016 have set the tone for next year. Expect to see global companies scrambling to implement additional privacy controls to prepare for GDPR’s enforcement in 2018.” – Skyhigh Networks CEO Rajiv Gupta
Security
“Machine learning and industry collaboration will transform security: In the New Year, the “bad guys” will become increasingly sophisticated, and the attacks they are conducting — both monetarily and in terms of impact — will be more harmful than ever. To combat this, innovators are working to move from the old-school anti-virus approach to a more transformative approach informed by machine learning. This approach learns what normal behavior within an IT infrastructure looks like and then calls out any abnormal behavior. Organizations are also moving away from working in silos and are now developing security technologies that work together to aggregate information, so users can monitor and address all vulnerabilities.” – NaviSite CTO, David Grimes
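As a rough sketch of the “learn normal, flag abnormal” idea Grimes describes (a generic statistical baseline, not NaviSite’s approach), the example below flags a metric reading that drifts several standard deviations from its recent history; the metric, window, and threshold are invented for illustration.

```python
from statistics import mean, stdev
from typing import List

def is_anomalous(history: List[float], current: float, threshold: float = 3.0) -> bool:
    """Flag a reading more than `threshold` standard deviations away from
    the mean of recent, presumed-normal samples."""
    if len(history) < 2:
        return False  # not enough data to establish a baseline
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > threshold

if __name__ == "__main__":
    # Hypothetical login attempts per minute observed on one host.
    baseline = [4, 6, 5, 7, 5, 6, 4, 5]
    print(is_anomalous(baseline, 6))    # False: within the normal range
    print(is_anomalous(baseline, 120))  # True: likely brute-force activity
```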
“ISPs will find themselves at an important crossroads next year. By working together with governments and the international community, ISPs can strengthen the underpinning infrastructure of the Internet and significantly reduce the volume of malicious traffic flowing across their networks. These methods aren’t a quick fix, and they certainly can’t protect against the full spectrum of DDoS attacks, but they would be a vital first step in speeding up our global response to attacks. I’m hopeful that the future of volumetric DDoS attacks in two or three years’ time will be significantly reduced by the combined efforts of ISPs, device manufacturers, security vendors and even government entities. As this community rallies together to better protect the integrity of the Internet we may see ourselves in a very different place down the line.” – Dave Larson, CTO, Corero
“DDoS attacks like the one perpetrated against Dyn will force many providers to start outsourcing security rather than keeping it in-house, due to a lack of top talent.” – Chris Crosby, CEO of Compass Datacenters
Public Cloud, Private Cloud and Hybrid Cloud
“Over the last few years we’ve seen previous predictions around increased public cloud adoption come to fruition, and we predict 2017 will be the year hybrid cloud asserts itself as the dominant cloud environment. Cloud spending will continue to be on the incline, and we believe a majority of that spend will go toward hybrid cloud infrastructures; this is proving to be the sweet spot for the enterprise. Organizations that have spent a lot of time and resources on their own data center are not likely to do away with it all overnight. Adopting a hybrid cloud environment allows for a transition to cloud in a way that feels most comfortable: a gradual approach that can provide both immense cost savings as well as recovery benefits. Hybrid cloud allows for a variety of recovery options should the need arise, on-premises, public cloud or a little of both, which helps companies be better prepared for a variety of disaster scenarios. Additionally, the perceived complication and expense of transitioning to cloud, which has previously held many IT organizations back, is now starting to wither. More and more companies are realizing that adopting a hybrid cloud approach, with the right partners in place, can actually be quite simple and affordable.” – Paul Zeiter, President, Zerto
“2016 saw an increase in the adoption of cloud services, but now users are going to raise expectations. More and more organizations are noticing that you either go completely to the cloud or you stay on premises since it’s just not an ideal situation to use a hybrid ‘half-and-half’ approach. Amazon and Azure make it seem easy to migrate, but businesses are realizing that even though they have a cloud environment, they have to manage it and update it continuously. In 2017, they will begin to wonder if they’re getting what they need in terms of performance and security. The cloud will not solve all of their problems and IT departments will need to have an exit strategy to avoid vendor lock-in. The end goal should not be getting all of the desktops in the cloud, but to be able to access data from anywhere.” – Mark Plettenberg, Product Manager, Login VSI
“In 2017, cloud will continue to drive radical change across enterprise IT. Businesses will make even greater investments outside of their own data centers, particularly in ‘as-a-service’ computing. 2017 will be the last year we spend money in our own data center as we move applications to the public cloud. The cloud will also bring about significant change in the role of IT professionals – IT leaders with more general experience will create teams of people with specialized knowledge of key elements of IT infrastructure, such as storage and security.” – Ruben Spruijt, CTO of Atlantis Computing
“We have all witnessed the ‘cloud rush’ of recent years where organizations have been encouraged to move their workloads to the cloud. However, there is a growing recognition among organizations that cloud services are not the be-all and end-all – and certainly not always the most cost effective way to deliver all their IT workloads. In several cases, the promised cost savings that customers thought they would receive haven’t materialized – in fact it can be quite the opposite when they first see their bill, after failing to be properly advised on their long-term costs. Many IT service providers have been caught up in the vendor hype surrounding cloud services, and encouraged cloud migration despite the end user often not fully understanding the long-term cost implications. Next year we will see more businesses move workloads out of the cloud and back on-premises, and in the future, we will see a more educated, sensible approach where cloud isn’t the default option for hosting all workloads.” – Karl Roe, VP Services and Cloud Solutions, Nuvias Group
“I do think that we are coming out of the love affair with cloud. I think it’s really moving into a different sort of paradigm from just ‘everything gets hosted.’ I think companies are starting to go into that mode of almost like we saw, I want to say, 15 years ago where they would outsource everything and then bring it back in house. I think companies are starting to feel the same way–like they don’t have control of their own destinies and that really more of their brand, their brand equity and how they expand their global go-to-market strategies is different. So now they’re looking for what we’re seeing as ‘services partners’ to help them in more of a vertical play, less than a commodity around cloud.” – Kim King, vice president of Global Partners and Channels for Progress Software (via The VAR Guy)
“Today, multi-cloud is a full-fledged reality, as organizations seek to match each workload to the cloud platform where it will achieve the best performance and cost-efficiency. Many enterprises today find themselves managing multiple clouds inadvertently and sometimes haphazardly, because teams across the business are independently choosing different cloud platforms and providers to best suit their individual needs. I predict that next year we’ll see more companies formulate explicit multi-cloud strategies to best leverage and coordinate multiple cloud providers. However one gets there, a multi-cloud world comes with a unique set of challenges, including the need for expertise across a larger range of cloud technologies. It also requires managing multiple vendor relationships and more complicated cost tracking. One of the most critical needs is multi-cloud security — and my predictions for the near term are a mixed bag.” – John Engates, CTO, Rackspace
This article first ran here, on Talkin’ Cloud.

10:30p
Watts to Bits: Your Daily Data Center News Briefing
Here are the enterprise technology and data center news stories you need to know about today.
Reuben Brothers Sell 49 Percent of Global Switch to Chinese Investors
British billionaires David and Simon Reuben have agreed to sell a 49 percent stake in the international data center provider Global Switch to a consortium of Chinese investors. When rumors of a potential deal with Chinese firms first emerged in September, they generated controversy as some officials were concerned about its security implications. In a statement, Global Switch representatives said the company would continue complying with applicable UK national security laws. Details here.
Facebook Taps Into Top Universities for Hardware Design Ideas
Facebook has struck an agreement with top US universities meant to fast-track the company’s collaboration with academia on hardware innovation. There’s usually a long bureaucratic process involved in setting up relationships between academic institutions and private companies, and the agreement is meant to eliminate that process. The products of Facebook’s hardware design efforts so far include everything from servers and network switches to virtual-reality headsets and solar-powered drones. Details here.
Dell’s Pivotal Gets into Serverless Computing
Pivotal Software, majority owned by Dell Technologies, has launched a serverless computing feature, which enables users to execute pieces of code in the cloud for a brief duration, paying the cloud provider only for the amount of time the code runs. Serverless computing is an emerging area where providers like Amazon Web Services, Microsoft Azure, and Google Cloud Platform have introduced offerings. Details here.
Container Pioneer ClusterHQ Shuts Down
ClusterHQ, one of the earliest companies built around application container technology, is shutting down. The ecosystem of companies competing to enable customers to build and deploy applications as collections of containerized micro-services, rather than monolithic blocks of code, has grown rapidly over the last several years, following the emergence of Docker. The demise of ClusterHQ may be an early sign of an inevitable shakeout. Details here.
Oracle Says It’s Not Ramping Up Software User Audits
Oracle has responded to reports that it was expanding its efforts to find customers using paid Java features without paying, saying it was doing no such thing. It’s common for big enterprise software vendors to make it easy for customers to deploy paid features along with free ones and charge them later, but reports emerged earlier that Oracle had been hiring more people to audit Java users. Details here.