Data Center Knowledge | News and analysis for the data center industry
Friday, May 20th, 2016
12:00p
What’s Driving (and Inhibiting) DCIM Software Adoption?
Enterprises are building fewer and fewer data centers of their own, increasingly choosing to outsource operation of the infrastructure for their application workloads to cloud, colocation, and other types of service providers, who in turn are accelerating data center construction. More data center construction is taking place in edge markets, while corporate IT infrastructure is becoming increasingly distributed and orchestrated via software.
Together, these trends are continuing to drive adoption of DCIM software products, a market that continues to grow, albeit not at the pace vendors predicted several years ago, when the concept of a stand-alone Data Center Infrastructure Management software market came into being. In some ways, however, the same trends are also inhibiting market growth.
Jennifer Cooke, research director at IDC who focuses on data center management, highlighted these trends in an update on the DCIM software market during a webinar produced by the research firm Thursday.
Read more: Who is Winning in the DCIM Software Market
IDC expects the year-over-year market growth rate to climb between 2015 and this year but then start declining. That doesn’t mean the market will shrink; it will keep growing, but at a progressively slower rate through 2020, according to the analysts:

[Chart: IDC forecast of DCIM software market growth through 2020. Source: IDC]
As enterprise data center construction slows, IT organizations are under growing pressure to get more out of their existing infrastructure, often looking to DCIM as a way to get there. Meanwhile, as more IT infrastructure shifts to outsourced facilities, users need tools that give them visibility into those resources, which is also something DCIM can help with.
Remote visibility is also a factor driving DCIM adoption by companies operating points of presence in edge markets, where construction is on the rise. Qualified staff aren’t as readily available in many of those remote areas as they are in the big metros, so edge data center operators increasingly opt for “lights-out” data centers, managed remotely, Cooke explained.
One of the biggest drivers for DCIM software adoption in the near future, however, will be the transition to software-defined infrastructure. “Data centers will increasingly be viewed not as physical business but as pools of resources that can be drawn on when needed,” Cooke said.
Hardware abstraction and unified infrastructure and service orchestration require a software abstraction layer for the underlying physical data center resources: power, cooling, and space. DCIM is that abstraction layer and a critical building block for software-defined infrastructure, according to IDC:

[Diagram: DCIM as the abstraction layer between physical facility resources and software-defined infrastructure. Source: IDC]
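To make the idea of such an abstraction layer more concrete, here is a minimal sketch of a DCIM-style interface that exposes power, cooling, and space as resource pools an orchestrator could query before placing a workload. The class and method names, capacities, and thresholds are hypothetical and do not correspond to any specific DCIM product.

```python
from dataclasses import dataclass


@dataclass
class ResourcePool:
    """A physical facility resource tracked by the DCIM layer."""
    name: str        # e.g. "power", "cooling", "space"
    capacity: float  # total capacity (kW, kW of heat rejection, or rack units)
    used: float      # amount currently allocated

    @property
    def headroom(self) -> float:
        return self.capacity - self.used


class DcimAbstraction:
    """Minimal sketch of a DCIM layer exposing facility resources to orchestration software."""

    def __init__(self, pools):
        self._pools = {p.name: p for p in pools}

    def headroom(self, resource: str) -> float:
        """Remaining capacity for a resource, as an orchestrator would query it."""
        return self._pools[resource].headroom

    def can_place(self, power_kw: float, rack_units: float) -> bool:
        """Check whether a new workload fits within facility power and space limits."""
        return (self._pools["power"].headroom >= power_kw
                and self._pools["space"].headroom >= rack_units)


# Example: an orchestrator asking the facility layer whether a deployment fits.
dcim = DcimAbstraction([
    ResourcePool("power", capacity=500.0, used=420.0),    # kW
    ResourcePool("cooling", capacity=550.0, used=430.0),  # kW of heat rejection
    ResourcePool("space", capacity=200, used=150),        # rack units
])
print(dcim.can_place(power_kw=60.0, rack_units=20))  # True: 80 kW and 50 rack units remain
```

The point of the sketch is only that facility-level constraints become queryable data, which is what lets orchestration software treat the data center as a pool of resources rather than a physical building.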
Many DCIM tools on the market today, however, lack key functionality that would let them plug into and enable the digital transformation of data centers, and this is one of the factors inhibiting the market’s growth, Cooke said.
The shift of resources from on-premises facilities to outsourced IT infrastructure is another factor, and it cuts both ways for the DCIM market: while use of DCIM tools by colocation providers and their customers is on the rise, there will be fewer and fewer end user-operated facilities that need these management tools.
The third growth inhibitor Cooke pointed out is the high number of existing legacy management systems in data centers. Surveying data center operators, IDC found that many run between seven and 12 management platforms today, often with overlapping functionality. This is a result of the legacy silo approach to data center management, with separate systems for power, cooling, building, IT, and so on, which makes operators reluctant to add yet another management system to the mix.
4:57p
Aging Hardware Could Corrupt Evidence in Megaupload Case
By The WHIR
Hardware degradation could cause some headaches for investigators in the Megaupload piracy suit.
Megaupload, founded by the infamous Kim Dotcom, was shut down by the US Department of Justice in January 2012, and maintenance of the servers the evidence is stored on became an issue almost immediately. Last week, Megaupload filed a response (PDF via TorrentFreak) in its civil suit confirming that the plaintiffs’ evidence is potentially at risk of being permanently lost due to the degradation of servers stored by web host Cogent.
“Recently, the parties have each been advised by Cogent that it has been unable to read eight of the sixteen computer hard drives on which the Megaupload cached data have been stored,” Megaupload says in a response to a notice filed by the Recording Industry Association of America and the Motion Picture Association of America. “Without the assistance of a computer forensic expert, however, Cogent cannot confirm that the data remains extant and uncorrupted.”
SEE ALSO: Contents of Megaupload’s Canadian Servers Still Unknown as Ontario Court Determines Next Steps
The filing by the RIAA and MPAA was in response to a request for a six-month pause in the ongoing lawsuit between the industry groups and the alleged piracy enabler. The RIAA and MPAA objected to the delay, complaining of the risk of data loss.
Cogent says its inability to read the data could be caused by “drive heads” which are “frozen,” but the web host is unwilling to perform free diagnostics and maintenance or repairs on machines it is essentially holding for the DOJ.
Another company that used to provide services to Megaupload is Carpathia Hosting, which was last year acquired by data center provider QTS Realty. In the criminal case against Megaupload, QTS recently moved to be relieved of its obligations relating to the storage of over 1,100 servers once leased by Megaupload. A decision by the court on that matter is pending.
“Having seized control of the Carpathia servers in order to obtain ‘selected’ portions of the data, the government has triggered its duty to preserve the remaining data because the entire data-set ‘might be significant’ to the defense of the Criminal Action,” Megaupload argues, and that entire data-set includes the Cogent servers.
The RIAA and MPAA filed a motion (PDF) on the same day as Megaupload to have the Cogent data subpoenaed, or copied and preserved by a court-appointed third party.
A judge told the parties to the criminal action to figure out the Carpathia server maintenance issue in April of 2012, and has been attempting to arbitrate some kind of arrangement ever since.
This first ran at http://www.thewhir.com/web-hosting-news/aging-hardware-could-corrupt-evidence-in-megaupload-case
7:43p
Microsoft Expands Green Data Center Ambitions
Microsoft data centers, and the rest of its operations, have been 100 percent carbon-neutral since 2012, the company claims. The problem is that to be considered carbon-neutral, you don’t actually have to use renewable energy, and Microsoft wants to address that problem in an expanded green data center push.
About 44 percent of electricity used by Microsoft data centers today comes from renewable sources, including solar, wind, and hydro. On Thursday, the company announced new goals to turn that 44 percent into 50 percent by the end of 2018 and 60 percent sometime “early in the next decade.”
Like most other companies with corporate sustainability goals, which usually include carbon-neutrality commitments, Microsoft buys Renewable Energy Credits (RECs) to make up for the renewable energy it cannot source directly toward those goals.
RECs can be decoupled from the renewable energy that was generated to produce them, so they do nothing to clean up the fuel mix on a utility grid supplying a particular data center if they were produced elsewhere.
Microsoft and other big cloud providers, such as Amazon and Google, are in rapid data center expansion mode to support growth of their cloud services. In the blog post announcing the new green data center goals, Microsoft president and chief legal officer Brad Smith acknowledged that “data centers will rank by the middle of the next decade among the large users of electrical power on the planet.”
The job of cleaning up energy supply of these cloud data centers is made more complicated by the fact that the cloud providers don’t own many of the facilities they use to host their infrastructure. Expanding quickly around the world means having to lease capacity from data center providers, whose sustainability ambitions are often not aligned with their clients’.
See also: What Cloud and AI Do and Don’t Mean for Google’s Data Center Strategy
While there are signs that some major data center providers have made renewable energy a bigger priority than in the past, the industry as a whole still has a long way to go.
Akamai, one of the world’s largest Content Delivery Network providers, is currently wrestling with this issue. The company announced earlier this month a goal to source renewable energy for at least half of its 200,000-server infrastructure, which is highly distributed, consisting primarily of small deployments in colocation data centers across 126 countries.
Akamai acknowledged that the goal will be hard to reach and said it would start by testing a financial instrument called a Contract for Difference, under which the energy user agrees to pay a renewable energy producer whose solar plant or wind farm is on the same grid the difference between the cost of regular grid power and the cost of generating the renewable energy, in exchange for RECs.
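As a rough illustration of how such a contract settles (the prices and volume below are invented for the example, not Akamai’s actual terms), the arithmetic looks like this:

```python
# Hypothetical Contract for Difference (CfD) settlement for one period.
# All figures are invented for illustration.

strike_price = 45.0   # $/MWh the renewable producer needs to cover its generation cost
market_price = 32.0   # $/MWh cost of regular grid power in the period
volume_mwh = 10_000   # energy volume covered by the contract

# The buyer pays the producer the gap between the agreed strike price and the
# market price (if the market price rises above the strike, the payment reverses),
# and receives the associated RECs in exchange.
settlement = (strike_price - market_price) * volume_mwh
print(f"Buyer pays producer ${settlement:,.0f} for {volume_mwh:,} MWh worth of RECs")
# -> Buyer pays producer $130,000 for 10,000 MWh worth of RECs
```

The appeal for a buyer like Akamai is that the payment is tied to renewable generation on the same grid as its deployments, rather than to RECs produced somewhere else entirely.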
Microsoft recently hired Jim Hanna, former head of environmental affairs at Starbucks, to lead its green data center strategy.
This month, the company announced it had joined an alliance with environmental groups and Facebook, which will promote renewable energy development. The Renewable Energy Buyers Alliance’s goal is to push for development of 60 gigawatts of renewable energy by 2025, which is enough to replace all US coal-fired power plants that are slated for retirement in the next four years.
See also: Cleaning Up Data Center Power is Dirty Work
8:34p
Arizona Governor Vetoes Data Center Consolidation and Cloud Bill
Arizona Governor Doug Ducey has vetoed a bill that would have required state agencies to consolidate data centers, migrate as many of their application workloads as possible to cloud services, and review their IT infrastructure decisions every two years.
The bill, SB 1434, passed the legislature and was placed in front of the governor earlier this month. Ducey vetoed it this past Tuesday.
Its text didn’t include any specific requirements for data center consolidation, saying only that departments would have to “identify opportunities for information technology consolidation and shared services, including consolidating servers and data centers.”
The proposed legislation was a lot more specific about cloud. It would have required each state agency or department with its own budget to “evaluate and progressively migrate” existing workloads to cloud services, to evaluate hardware and software decisions every two years with the goal of moving to the cloud, and to report regularly on its progress to the state CIO and the chair of the legislative budget committee.
The agencies would have to consider cloud services before making any new IT or telecommunications investments. They would also have to make sure the services they select comply with federal security and privacy policies such as FedRAMP and HIPAA.
See also: White House Orders Federal Data Center Construction Freeze
8:47p
IT Innovators: Private Cloud — What to Focus on and Why
Brought to you by IT Pro
The value proposition of the public cloud is pretty clear. Indeed, there are few companies today that aren’t taking advantage of it in some way. The benefits of a private cloud can be a bit more challenging to define.
Jim Rapoza, editorial director and senior analyst at the Aberdeen Group, has seen the innovative ways in which many companies have effectively implemented a private cloud. Here, he shares some of its use cases, and recommends what companies should focus on when building one.
According to Rapoza, one of the main reasons to implement a private cloud is to gain better control over your virtualized infrastructure and to better deliver services to end users and the business.
“As businesses have used server virtualization over the years, they’ve run into a lot of problems, such as virtual machine sprawl and orphaned virtual machines and applications,” said Rapoza. “Private cloud makes it easier to bring these under control and ensure that only applications and services that are needed and being used are provisioned.”
The other main reason for implementing private cloud, according to Rapoza, is to “add that kind of simple application request and provisioning that you get in the public cloud to internal IT services. So, just as [a web services company] makes it simple to quickly start up a new application or service in the cloud, a good business private cloud lets users and business departments quickly request and activate applications and services with limited demand on IT resources.”
Once companies have determined that it makes sense for them to implement a private cloud, there are two key areas to focus on: hardware infrastructure and management systems, said Rapoza.
“You need to ensure that the hardware infrastructure is optimized for virtualization and private cloud,” he said. “The last thing you need in a private cloud is significant downtime, so organizations should implement hardware systems that have high availability, good fault tolerance and a lot of flexibility. From a software standpoint, ensure that your private cloud management systems have good application portal and provisioning capabilities, and make it possible for IT to track application usage and even leverage features like chargeback or showback to track usage of resources.”
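To make the chargeback/showback idea concrete, below is a minimal sketch of how metered usage might be turned into a showback report. The rate card, metrics, and usage figures are hypothetical and not tied to any particular private cloud management product.

```python
# Minimal showback sketch: metered resource usage per business unit is multiplied
# by an internal rate card to show (or charge back) the cost of private cloud use.
# Rates, metrics, and usage numbers are hypothetical.

RATE_CARD = {"vcpu_hours": 0.03, "ram_gb_hours": 0.01, "storage_gb_month": 0.05}

usage_by_department = {
    "marketing":   {"vcpu_hours": 12_000, "ram_gb_hours": 48_000,  "storage_gb_month": 2_000},
    "engineering": {"vcpu_hours": 90_000, "ram_gb_hours": 360_000, "storage_gb_month": 15_000},
}

def showback(usage: dict) -> float:
    """Cost of one department's metered usage at internal rates."""
    return sum(RATE_CARD[metric] * amount for metric, amount in usage.items())

for dept, usage in usage_by_department.items():
    print(f"{dept}: ${showback(usage):,.2f}")
# -> marketing: $940.00
# -> engineering: $7,050.00
```

Whether the resulting figures are merely reported (showback) or actually billed to the department (chargeback) is a policy choice; the metering and rate card work the same either way.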
When it comes to the public and private cloud, it’s not an either/or situation. Companies that implement private cloud capabilities need to determine how the public cloud can and should be integrated.
“You need to ensure that you can take advantage of hybrid cloud capabilities wherever possible,” said Rapoza. “These can be vital for handling surges in demand and also to provide good disaster recovery capabilities.”
In the end, said Rapoza, private cloud is about having a more agile and flexible IT infrastructure. “Instead of the old school IT days where requesting and building a new application can take months (and results in something that doesn’t meet the original requirements), a good private cloud lets organizations quickly and efficiently get the applications and services that the business needs up and running.”
Deb Donston-Miller has worked as a tech journalist and editor since 1990. If you have a story you would like profiled, contact her at Debra.Donston-Miller@penton.com.
The IT Innovators series of articles is underwritten by Microsoft, and is editorially independent.
This first ran at http://windowsitpro.com/it-innovators/it-innovators-private-cloud-what-focus-and-why
10:57p
Strategic Channel Alliance: An Emerging Partnership Type in the Channel
By The WHIR
Brought to You by the WHIR
As channel partnerships continue to deepen and evolve, we see the continuum of partnership types expand. Years ago we could clearly delineate a channel partner from a strategic alliance. Today, those lines are blurring. Datapipe and Equinix represent a classic example of this.
What started 16 years ago as a customer and vendor relationship between Datapipe and Equinix has evolved into a real channel-alliance partnership. Datapipe, a managed hosting and cloud services provider, needed to expand its business across multiple geographies. Equinix, an industry-leading data center company with a global footprint, was also growing at a rapid rate.
“We don’t enter into partnerships lightly,” said Rich Dolan, SVP of Marketing at Datapipe. “We make sure our partners are like-minded and innovative, and will assist us in providing clients with the strongest custom solutions, managed services, security, and reliability.”
Datapipe is also a highly valued, strategic partner of Equinix’s and was one of the first companies to join its global channel partner program in 2015. “By collaborating with them over the years, we have been able to work with enterprise companies worldwide to remove many of the common barriers to cloud adoption. Together, Datapipe and Equinix help enterprises to deliver and maintain scalable and dynamic cloud solutions, including infrastructure as a service and platform as a service within highly secured and reliable data centers to fit every need,” said Chris Rajiah, Vice President of Worldwide Channels and Alliances at Equinix.
Through the partnership, the two companies enable domestic and global enterprise and government customers to scale both traditional and cloud solutions across the Americas, Asia Pacific, and Europe. Together they have also adopted a number of strategic alliance best practices that are now built into both companies’ DNA:
Executive alignment: The alliance fosters a close relationship between the two companies’ executive teams. This relationship enables quick buy-in from the top for new strategies, clients, implementations, and partnerships and helps ensure the alliance’s success.
Alignment on new market opportunities: Datapipe and Equinix recognized early on the shift from traditional to hybrid cloud happening within their client base. The established alliance with Equinix enabled Datapipe to offer clients a hybrid cloud, which continues to be one of the most significant growth opportunities in the world for service providers serving enterprise IT.
Practical growth based on real customer opportunities and honing a vertical market approach: Datapipe builds out new offerings and new geographies based on demand. The alliance with Equinix gives Datapipe the ability to quickly respond to and scale customer requests. They can turn on new services and rapidly enable customers to expand into new geographies.
The company also has a vertical approach, recently making an acquisition to accelerate its success in the US federal market. This successful acquisition has enabled both Datapipe and Equinix to expand their presence into the military and civilian government markets.
Joint marketing, cobranding and demand generation: Datapipe and Equinix co-sponsor marketing and industry events, and mutually invest proactively in co-marketing and lead generation. They conduct joint thought leadership and marketing events in cities around the country, produce collateral and data sheets about the companies’ offerings, and publish joint success stories. Both believe that real client success stories across a wide range of verticals and applications are critical to helping other customers understand the value. Charlie Colletti, Datapipe Channel Marketing Manager, also shared that he holds a regular cadence of calls with the marketing team at Equinix to drive innovative initiatives and execute programs that are key to both companies’ success.
A culture of partnering embedded into the DNA: Datapipe has a culture that rewards successful partnerships across sales, marketing, and executive management. The two companies have very different cultures, and work closely together to drive and maintain alignment at the executive, sales, and marketing level to keep the partnership on track. In addition, both Equinix and Datapipe channel partners can leverage the alliance and the solutions for their customers. There are clear goals and teams aligned from both companies to make those goals happen.
Continuing education on the alliance: Datapipe and Equinix conduct field boot camps, ongoing webinars and education for both companies’ sales teams and the channel partners. The teams believe in continuing to reinforce key partnership messages and success stories for sales teams and partners to replicate.
Business processes that serve as foundation for the alliance: Datapipe and Equinix measure the success from all leads for both companies through their lead flow process in Salesforce. Both companies mutually track and report on a comprehensive set of leads through the lead-to-close process, using provisions that tie those leads back to the respective partners they belong to. This serves three purposes: it gives an accurate metric for progress, ensures Datapipe and Equinix are aligned, and provides partners opportunities and drives their accountability in the process.
The Datapipe and Equinix relationship exemplifies a partnership that has progressed from customer and vendor, to channel partner, to a true strategic channel alliance leveraged by both companies’ channel partners in the marketplace.
This article is brought to you by HostingCon, the Cloud and Service Provider Ecosystem event. Join us in New Orleans, Louisiana, July 24-27, 2016 to hear Theresa and other thought leaders talk about issues and trends in the cloud, hosting and service provider ecosystem. Save $100 off your HostingCon All Access Pass with coupon code: H1279
This first ran at http://www.thewhir.com/blog/strategic-channel-alliance-an-emerging-partnership-type-in-the-channel