Data Center Knowledge | News and analysis for the data center industry
Thursday, July 2nd, 2015
12:00p
Survey: Enterprises Plan to Spend More on Data Centers

Despite the popular belief that cloud services are well on their way to replacing enterprise data centers, most mid-size and large businesses are planning to increase spending on their mission-critical facilities in the near future.
That’s according to a recent report by 451 Research, which said nearly 90 percent of data center operators surveyed in North America and Europe had plans to increase data center facility spending. About one-quarter of them said they will increase spending over the next 90 days.
Enterprises are consolidating smaller data centers into larger centralized ones. Because they are generally reluctant to build new data centers, those larger facilities tend to be older ones that must be upgraded to support the consolidated capacity.
Companies generally start looking for options to expand capacity when they reach about 75 percent utilization of their existing data center footprint, according to 451. When they do, they have the choice to consolidate into large existing facilities, buy or build new data centers, use cloud services, or lease colocation space.
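As a rough illustration of that rule of thumb, here is a minimal Python sketch, using entirely hypothetical facility data and field names, that flags sites crossing the 75 percent planning threshold:

```python
# Illustrative sketch of the ~75% utilization trigger 451 Research cites.
# Facility data and field names here are hypothetical.

FACILITIES = [
    {"name": "DC-East", "used_kw": 820, "capacity_kw": 1000},
    {"name": "DC-West", "used_kw": 410, "capacity_kw": 900},
]

EXPANSION_THRESHOLD = 0.75  # per the 451 rule of thumb

def needs_expansion_review(facility: dict) -> bool:
    """Return True when utilization meets or exceeds the planning threshold."""
    return facility["used_kw"] / facility["capacity_kw"] >= EXPANSION_THRESHOLD

for dc in FACILITIES:
    if needs_expansion_review(dc):
        pct = 100 * dc["used_kw"] / dc["capacity_kw"]
        print(f"{dc['name']}: {pct:.0f}% utilized -- evaluate consolidation, "
              "colocation, cloud, or a build/buy option")
```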
Consolidation is the preferred option, the researchers found. Cloud was the second most popular option, and colocation was third. About one-quarter of respondents said they would consider building new data centers, and only 6 percent said they would look into buying one.
Here’s a 451 chart showing the colo and cloud providers enterprise users favor most:

[Chart: preferred colocation and cloud providers among enterprise users, per 451 Research]
There is pressure on IT organizations to support growing business demands, Dan Harrington, research director at 451, explained. To address those needs, companies are allocating more budget to modernization of older data center facilities.
Driven primarily by the healthcare and finance industries, the bulk of new enterprise data center spending is going to racks, cabling, electrical gear, and data center infrastructure management (DCIM) software.
Vendors that stand to benefit the most from this influx of capital are Schneider Electric, Emerson, Trane, Carrier, Caterpillar, and Cummins.
According to 451, most survey respondents said they preferred Schneider’s power distribution units, uninterruptible power supplies, DCIM software, and racks and cabling. Emerson was the top preferred vendor of computer room air conditioning and air handling gear, closely following Schneider in the four other categories.
Most data center operators considered Trane and Carrier top chiller vendors, while Caterpillar and Cummins enjoyed widespread popularity as backup-generator suppliers.
One company surveyed, a $5-billion-plus IT business with more than 1,000 employees, said it was planning to increase spending to replace aging equipment, such as PDUs and CRAC units.
Another company with a similar profile is planning to spend on Emerson’s DCIM platform Trellis.
Emerson announced a plan earlier this week to spin off Network Power, its data center and telecom infrastructure business unit, as a stand-alone entity. The company said the unit would benefit from more agility in responding to changing market needs as an independent business.
Still, while enterprises are planning to spend more on data center infrastructure, there will be fewer actual data centers.
Companies are actively consolidating smaller data centers and server rooms, replacing them with larger centralized facilities, supplemented by colocation and cloud services. As a result, there will be fewer facilities, but the total overall footprint will remain about the same, the analysts concluded.

1:00p
New HPC Center in France to Push OpenPOWER Adoption

IBM, NVIDIA and Mellanox are opening a center focused on OpenPOWER in Montpellier, France. OpenPOWER is a foundation meant to boost IBM’s open and licensable POWER architecture.
OpenPOWER is positioning itself as an alternative to the x86-based products that currently dominate data centers, with messaging centered on the architecture’s applicability to high-performance computing workloads. The new center will promote development of HPC applications as the consortium looks to extend the OpenPOWER ecosystem.
IBM and its partners are using centers like this to promote the POWER architecture: they get more scientists and engineers working with POWER, with the goal of not only developing new applications but also training people to work with the architecture.
The centers are equipped with the latest HPC technologies that will be used to tackle problems in fields like energy and healthcare. Last month, IBM launched SuperVessel, an OpenPOWER developer cloud service meant for universities to develop new uses and apps for free.
IBM, NVIDIA, and Mellanox are contributing experts who will help developers take advantage of GPU acceleration on OpenPOWER-compatible systems.
These are the same types of systems that will be used by the US Department of Energy for the next-generation Sierra and Summit supercomputers and by the UK’s Science and Technology Facilities Council for Big Data research. The latter project involved a $475 million commitment by IBM.
The POWER Acceleration and Design Center is the second such center in Europe, a sister institution to the one IBM and NVIDIA unveiled at the Jülich Supercomputing Center in Germany last November.
“Our launch of this new center reinforces IBM’s commitment to open source collaboration and is a next step in expanding the software and solution ecosystem around OpenPOWER,” said Dave Turek, IBM’s vice president of HPC Market Engagement, in a press release. “Teaming with NVIDIA and Mellanox, the center will allow us to leverage the strengths of each of our companies to extend innovation and bring higher value to our customers around the world.”

3:00p
Dispelling Data Security Myths

John Joseph is the President and Co-founder of DataGravity.
You know that go-to story you tell at parties? It gets laughs with any crowd, and you don’t even realize how often you fall back on it. Then one day, someone asks an innocent question that exposes a flaw in your memory of that story, and its foundation crumbles. You never think of that anecdote the same way again, so you mentally file it away.
We don’t knowingly rely on incorrect information in our personal lives; it doesn’t make sense to do it at work, either. Still, many of us are prone to looking the other way when it comes to the big, flaw-exposing questions about our business practices that we don’t know how to resolve. For example, data security is a top concern for most technology professionals and one of the most heavily funded areas of IT, with Gartner predicting global information security spending will reach $76.9 billion in 2015 alone. Yet, some myths about data protection, retention and data awareness remain. The scary part is that falling for one of these myths isn’t as simple as getting drawn into an inaccurate story at a party. Instead, the mistake could lead to a damaging corporate security breach and data theft.
Here are some of the top myths IT pros mistake for truths, how to uncover the realities behind them, and what it all means for the data center industry.
Myth #1: We’ve Taken Enough Precautions to Keep Outsiders at Bay
You’re keenly aware of the risks that can wreak havoc on your system. Hackers are aggressively trying to obtain private information from corporate data centers, while innocent, unintentional actions, such as misplacing files or compromising access credentials, can also expose sensitive data on public shares. You may be working to build a fence around your data center to protect proprietary information and shield employees from harm. Although securing the perimeter will always contribute to the overall protection of your data, these efforts will leave you unprepared for threats coming from inside your network unless you complement them with strategies to secure data where it is created and resides.
It’s no longer a matter of whether your company is going to get breached. It’s become a matter of when, and one important question to consider is how you’re going to react. Once a threat has penetrated your system, you need to block it from causing additional harm while you identify and deal with it. In these situations, it’s crucial that your customer, employee, business-critical and personal data are locked down in secure locations out of the attacker’s reach.
Myth #2: Storage and Security Should Be Managed Separately
Data storage systems have always protected against – and recovered from – catastrophic loss at a foundational level through disaster recovery technologies. In the event of a component failure, natural disaster or human error, your storage probably has you covered. But storage is getting smarter. Modern technologies no longer wait to save you from a disaster after it occurs; they can identify the threats posed by vulnerable, sensitive data within your system ahead of time. Just as security is no longer anchored at the fences of your infrastructure, intelligent storage and networking systems no longer blindly transfer data without automatic sensing capabilities that identify anomalies and potential risks. You can guard against intruders and the impact of a misplaced file in one integrated process with a single, seamless experience; tasks that once required four separate software tools can now be handled with one.
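To make the idea of automatic anomaly sensing concrete, here is a toy Python sketch – not DataGravity’s product or any specific vendor’s method – that flags a user whose daily file-access count deviates sharply from their own baseline. The data and thresholds are hypothetical; real data-aware storage uses far richer behavioral models.

```python
# Toy anomaly check: flag a day whose file-access count sits far above
# the user's historical baseline. Data and threshold are hypothetical.

from statistics import mean, stdev

def is_anomalous(access_counts, todays_count, z_threshold=3.0):
    """Return True if today's count is more than z_threshold standard
    deviations above the historical mean."""
    baseline = mean(access_counts)
    spread = stdev(access_counts)
    if spread == 0:
        return todays_count > baseline
    return (todays_count - baseline) / spread > z_threshold

history = [42, 38, 51, 45, 40, 47, 44]   # files touched per day, last week
print(is_anomalous(history, 46))   # False: within the normal range
print(is_anomalous(history, 400))  # True: possible exfiltration or ransomware
```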
Myth #3: It’s Not Your Job to Make Major Security Changes
Even with new resources at hand, you might not know what steps to take toward improving security. But in the face of malicious attacks and data breaches, all members of the IT community – with their innovations, resources and success stories – are working toward the same goals. If you want to overcome outdated myths and take action, try some of these initial steps:
- Use resources like Gartner and Forrester analyst perspectives to get educated on a granular level about the solutions, ideas and capabilities the IT industry has to offer.
- Attend security-focused events orchestrated by RSA, the International Association of Privacy Professionals (IAPP) and the Information Systems Security Association (ISSA) to talk to your peers about the challenges they face and the strategies that help get results.
- Identify role-model organizations for your business. Ask yourself what technology decisions helped those companies get to a place where the team knows what’s in its data and can confidently protect it.
- Network with your peers if they are willing to discuss strategies for management and containment. I’ve attended many industry social events where just about every IT person knew six to 10 people at neighboring companies within the metro area. It’s a small, tight-knit community filled with great people.
If you remain educated on new trends, share your results, and avoid relying on unconfirmed notions about security and risks, you’ll be able to protect your business and each individual it affects.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

5:20p
Red Hat Rolls Out Linux for SAP Hana Cloud on AWS

Red Hat Enterprise Linux certified for SAP Hana is now available on Amazon Web Services, Red Hat announced this week.
While SAP provides its own SAP Hana cloud on IBM SoftLayer and a development environment on AWS, an ecosystem of cloud service providers is starting to emerge around the core Hana platform.
Jane Circle, manager for certified cloud provider and cloud access programs at Red Hat, said the idea is to make it simpler for AWS customers to spin up Hana on RHEL in a production environment.
“We’re starting to see customers move SAP Hana in production,” she said. “We want to make sure customers have options should they decide to deploy those workloads in the cloud.”
Given the Big Data workloads that generally get deployed on Hana, most customers are expected to run the in-memory database management system on the newest M4 instances of virtual servers that AWS created using the latest generation of Intel Xeon processors, Circle said.
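For illustration, here is a minimal sketch of launching one such instance with the AWS SDK for Python (boto3). The AMI ID, key pair, region, and instance size below are placeholders, not details from Red Hat’s announcement, and a real SAP Hana deployment requires certified sizing, storage, and network configuration well beyond this.

```python
# Minimal boto3 sketch: launch an M4 instance as a RHEL-based Hana node.
# ImageId and KeyName are placeholders; substitute real values.

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-xxxxxxxx",       # placeholder: a RHEL-for-SAP-Hana AMI
    InstanceType="m4.10xlarge",   # memory-heavy M4 size for in-memory workloads
    KeyName="my-keypair",         # placeholder key pair name
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched Hana candidate instance: {instance_id}")
```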
Ultimately, Red Hat expects customers to deploy federated instances of Hana spanning both on-premises and cloud infrastructure. As SAP certifies other service providers to run SAP Hana cloud, Red Hat plans to provide similar support for those additional third-party service providers.
The degree to which customers will customize SAP applications in those cloud environments is a subject of some debate.
While the German enterprise software giant is encouraging organizations to create custom applications that run on top of Hana, when it comes to SAP applications themselves the company contends most customers are better off using the cloud service managed by SAP. At the core of that argument is the contention that IT organizations do not need to customize SAP applications that already address 95 percent or more of their business process requirements.
Conversely, cloud service providers note that IT organizations have a long history of customizing SAP applications, a requirement that they contend is not going to go away simply because SAP is now making available instances of its applications on a cloud it manages.
While that battle remains to be fought in the proverbial IT trenches, cloud service providers of all types are clearly eager to win an emerging class of Big Data workloads that make their overall IT environments more cost-efficient, savings that invariably get passed along to customers as additional price cuts.

5:54p
JDM Buys State Farm’s Phoenix Data Center in Sale-Leaseback Transaction

Phoenix-based real estate investment company JDM Partners has acquired two commercial properties, including a Phoenix data center, from State Farm Insurance for $38 million. State Farm will continue occupying the properties, which sit on over 25 acres of land on the Phoenix-Tempe border, as a tenant.
JDM is one of the largest owners of entitled land in the Phoenix metro. Former Phoenix Suns and Arizona Diamondbacks owner Jerry Colangelo is a principal.
The company invests heavily in Arizona because of its growth potential: according to the company, the state’s population is expected to double to 9 million people over the next 30 years, accompanied by solid job growth.
The sale-leaseback transaction gives JDM a sizable data center fully leased to a single tenant. Sale-leaseback data center transactions are popular with real estate investors, as they provide long-term recurring revenue without having to look for tenants.
Griffin Capital recently made a similar sale-leaseback transaction with an American Express data center in the market.
The Phoenix data center is over 250,000 square feet, Rose Law Group reported. The seller, State Farm Mutual Automobile Insurance of Bloomington, Illinois, will continue to occupy the single-level building, which was developed in 1998 and covers two-thirds of the site.
JDM has acquired several properties from State Farm over the years, including an 11-state operations-center portfolio consisting of over 3.4 million square feet in 2014 and five buildings in a Tempe, Arizona, corporate park in 2013 for $73 million.
Other data centers in the market include an eBay data center, a data center under construction by Apple, and a massive project from colocation provider CyrusOne in nearby Mesa. Microsoft is reportedly eyeing a data center build in the area. Other colocation providers in the market include IO, Digital Realty, CenturyLink, and Telx.
A 2014 report by commercial real estate services firm CBRE attributed growth in the Phoenix market primarily to low power costs (compared with other hubs), as well as recently passed data center tax breaks.

6:07p
Level 3 Acquires DDoS Protection Firm Black Lotus to Boost Security Capabilities
This article originally appeared at The WHIR
Level 3 Communications has acquired DDoS protection company Black Lotus to boost its security product capabilities, the company announced Wednesday. Terms of the cash-only deal were not disclosed.
Level 3 launched its own DDoS service earlier in 2015, and will integrate Black Lotus’ technology and solutions into its security product portfolio. The new Level 3 DDoS Mitigation service provides enhanced network-based detection with a mitigation scrubbing solution and network routing, rate limiting, and IP filtering capabilities.
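Rate limiting of the kind described is often implemented as a token bucket: each source may send at a sustained rate, with brief bursts allowed. The following is a conceptual Python sketch of that technique, not Level 3’s implementation; the rate and burst values are arbitrary.

```python
# Conceptual token-bucket rate limiter, one bucket per source IP.
# RATE and BURST values are arbitrary illustration, not production tuning.

import time
from collections import defaultdict

RATE = 100.0   # sustained packets per second allowed per source IP
BURST = 200.0  # bucket capacity: brief bursts above the sustained rate

buckets = defaultdict(lambda: {"tokens": BURST, "last": time.monotonic()})

def allow_packet(src_ip):
    """Refill the source's bucket, then spend one token if available."""
    bucket = buckets[src_ip]
    now = time.monotonic()
    bucket["tokens"] = min(BURST, bucket["tokens"] + (now - bucket["last"]) * RATE)
    bucket["last"] = now
    if bucket["tokens"] >= 1:
        bucket["tokens"] -= 1
        return True
    return False  # over the limit: drop, or divert to scrubbing

# A flood source exhausts its bucket quickly; a normal sender never notices.
print(sum(allow_packet("203.0.113.9") for _ in range(1000)))  # ~200 allowed
```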
Black Lotus also brings advanced behavioral analytics technology, and its proxy-based mitigation service adds protection at the application layer, a popular target for effective DDoS attacks. In a report last month, Incapsula noted that bots were assuming a greater variety of identities in application-layer attacks.
“At Level 3, we value security and are committed to protecting our customers and our network,” said Chris Richter, senior vice president of managed security services at Level 3. “Black Lotus’ proxy and behavioral technologies, combined with their experienced team of DDoS experts, perfectly complements Level 3’s DDoS mitigation and threat intelligence capabilities. With this acquisition, Level 3 continues its commitment of investing in a comprehensive portfolio of services that enhance the growth, efficiency and security of our customers’ operations, helping enterprises combat the cybersecurity challenges they face every day.”
Black Lotus had been privately held, and the company raised $6 million in financing in April 2014 to support its international ambitions.
Level 3 noted increases in DDoS threat frequency, volume, and complexity in a June report on botnets (PDF). The report pointed to a trend toward rogue virtual machines, and to botnets-for-hire priced at $190 a month for 1,000 servers.
Level 3 shares on the NYSE (LVLT) edged up from $52.65 to $53.16 on Wednesday and have since remained in the $52-53 range.
This first ran at http://www.thewhir.com/web-hosting-news/level-3-acquires-ddos-protection-firm-black-lotus-to-boost-security-capabilities

6:34p
How Next-Gen Data Centers Will Differ From Today’s

As the digital economy continues to mature, it’s clear that data centers are the engine around which just about every business revolves today. As such, it’s incumbent upon the IT organizations responsible for fine-tuning those engines to understand where the next advances in IT are likely to take them.
At the Data Center World conference in National Harbor, Maryland, this September, Bill Kleyman, vice president of strategy and innovation for MTM Technologies and a frequent contributor to Data Center Knowledge, will outline how data centers will evolve in the months and years ahead.
“Every organization today is becoming a digital entity,” Kleyman said. “That has a lot of implications for how IT is actually managed. There will be no more silos inside the data center.”
In addition to the rise of cloud computing and open APIs, Kleyman said, emerging technologies, such as software-defined data centers, multi-layered data center control, data center operating systems, and data center automation and robotics, will transform almost every aspect of data center management.
In addition, advances in energy and cooling technologies will make it more cost effective to build and deploy new data centers, he said.
Put it all together, and it’s already clear that the way most data centers are run today will bear little to no resemblance to how data centers will be built, deployed, and managed just a few years from now.
For more information, sign up for Data Center World National Harbor, which will convene in National Harbor, Maryland, on September 20-23, 2015, and attend Bill Kleyman’s session titled “Five Ways Next-Gen Data Centers Will Be Different from Today’s.”