Data Center Knowledge | News and analysis for the data center industry
Monday, May 13th, 2013
11:45a
HP Updates Cloud Management Software

HP (HPQ) has released the next generation of its software for automating the management of data centers and cloud infrastructure, the company said today. HP Operations Orchestration 10 is an integrated portfolio of software and services to help automate complex distributed systems and heterogeneous environments.
HP Operations Orchestration (OO) 10 has out-of-the-box support for over 5,000 IT operations, including new support for Amazon S3 storage, HP ArcSight, HP Fortify, OpenStack and SAP applications. The HP Server Automation (SA) 10 server lifecycle management platform allows IT to manage more than 100,000 physical and virtual servers, and improves operational economics by reducing the administrator-to-server ratio by up to 60 percent. HP SA 10 is also offered as a virtual appliance for smaller organizations or department-level IT teams to begin managing server environments in less than one hour.
HP Database and Middleware Automation (DMA) improves administrator efficiency by automating administrative tasks associated with database management. It has over 1,000 out-of-the-box best practices to provision, patch, upgrade and release application code into databases and middleware servers such as DB2, Oracle, SQL Server, Sybase and WebSphere. HP Cloud Service Automation 3.2 is a comprehensive, unified cloud management platform for building, brokering and managing enterprise-grade application and heterogeneous infrastructure cloud services. It simplifies management of heterogeneous environments by leveraging HP OO and HP SA while providing support for Amazon EC2 public cloud services, HP Cloud Services, KVM OpenStack and Microsoft Hyper-V.
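For a concrete sense of the kind of routine, error-prone task these suites are built to automate, here is a minimal sketch using the open-source openstacksdk client. This is not HP’s tooling or API; the cloud profile, image, flavor and network names are placeholders assumed for illustration:

```python
import openstack

# Connect using a named cloud profile from clouds.yaml (hypothetical profile name).
conn = openstack.connect(cloud="demo-cloud")

# Look up the pieces an operator would otherwise click through manually.
image = conn.compute.find_image("ubuntu-server")   # assumed image name
flavor = conn.compute.find_flavor("m1.small")      # assumed flavor name
network = conn.network.find_network("private")     # assumed network name

# Provision the server and block until it is active -- one repeatable,
# reviewable step instead of an error-prone manual runbook.
server = conn.compute.create_server(
    name="app-server-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.status)
```

Orchestration products wrap thousands of operations like this into reusable, auditable workflows, which is where the out-of-the-box content counts cited above come in.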
“Our IT employees were bogged down being enterprise ‘fire fighters’ instead of proactive business partners,” said Andy Smith, vice president, Application Hosting Services, McKesson. “HP cloud and automation software enabled us to improve our IT operations by automating routine, repetitive tasks prone to human error, encouraging our employees to focus on innovative IT services. As a result, we can now deliver both IaaS and PaaS in under an hour, and we reduced IT service outages by 78 percent, the occurrence of critical IT incidents by 65 percent and have been able to deploy 40 percent more IT systems.”
HP OO 10, HP SA 10, HP DMA 10 and HP Cloud Service Automation 3.2 will be available individually worldwide, directly from HP or through its ecosystem of channel partners.

12:30p
Beating the Storage Odds in Age of Big Data

Ambuj Goyal is general manager, IBM Systems, Storage and Networking.
Technical evolution moves at different rates and for different reasons. Unlike other areas of computing, for example, storage solutions for distributed systems have evolved through proliferation rather than for the more traditional reasons of price, performance and technical advancement. In other words, when organizations have bought a particular storage technology, they’ve grown with it whether they planned to or not.
That’s largely because storage vendors have spent a lot of time creating products based on a variety of individual architectures and protocols. Once an organization commits to one of those architectures, it’s difficult to even consider adding or transitioning to a different one, even if that alternative offers cost, performance, or management benefits. Being painted into this proverbial corner leads directly to storage sprawl, underutilized storage systems, and complex management, all of which reduce productivity and add cost.
Storage Controllers at the Center
One area of repeated isolation has been the storage controller, the brains of the storage system. For various reasons, the industry has had a propensity to create separate storage controllers for different protocols, such as block, file or object. Even though the media on which these controllers store the information is the same, each storage system supports only the protocol it was designed to serve. The software (the so-called microcode) simply interprets the protocol and stores the information.
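To make that concrete, here is a toy sketch in Python (not any vendor’s microcode; the class and method names are invented for illustration) showing that the same store-and-retrieve core can sit behind thin front-ends that merely interpret different protocols:

```python
class StorageCore:
    """Protocol-agnostic store/retrieve layer -- stands in for the shared media."""
    def __init__(self):
        self._blocks = {}

    def put(self, key, data: bytes):
        self._blocks[key] = data

    def get(self, key) -> bytes:
        return self._blocks[key]


class ObjectFrontend:
    """Object (key/value) protocol: flat namespace of named objects."""
    def __init__(self, core: StorageCore):
        self.core = core

    def put_object(self, bucket, name, data):
        self.core.put(f"{bucket}/{name}", data)

    def get_object(self, bucket, name):
        return self.core.get(f"{bucket}/{name}")


class FileFrontend:
    """File protocol: hierarchical paths mapped onto the same core."""
    def __init__(self, core: StorageCore):
        self.core = core

    def write(self, path, data):
        self.core.put(path.strip("/"), data)

    def read(self, path):
        return self.core.get(path.strip("/"))


core = StorageCore()
ObjectFrontend(core).put_object("logs", "day1.txt", b"hello")
print(FileFrontend(core).read("/logs/day1.txt"))  # same bytes, different protocol
```

The point of the sketch is that the protocol layer is a thin interpretation step; there is no technical reason the media and the core logic beneath it must be siloed per protocol.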
So the question becomes, why has the industry produced so many different controllers? One reason is that technology has a tendency to be “fast out of the gate.” The industry is rife with examples of technologies that have raced to production and market only to be reined in at a later point with standards or consortium-led initiatives that enable more competition, ease of use, or ease of management. And to be honest, it’s often in the vendor’s best interest to push the concept of “engineered” or “optimized” boxes for each protocol.
The Revolution is Here
The storage situation is not dissimilar to what the industry experienced with the original x86 ecosystem, where suppliers and vendors succeeded by creating a certain technology proliferation in the enterprise. Today, however, that ecosystem has been revolutionized: through workload consolidation technologies implemented in private and public clouds, there is now higher utilization and greater consistency of management. Note that in the mainframe and Unix worlds, workload consolidation and the resulting improved utilization have been the norm for more than a decade.
The storage environment is ready for the same kind of revolution. It’s ready for solutions that abandon the proliferation strategy of days gone by and help organizations avoid lock-in through wide protocol support, and encourage scalability through openness. That’s what we’re working on at IBM. Our Storwize platform of high-capacity systems, for example, tackles these issues head on.
Do Your Research
But don’t take my word for it. Ask yourself, what if there was a way to abstract the protocols from the basic store and retrieve functions? What if you could use old storage and new storage simultaneously, thus maximizing the return on capital investments? What if an application provider could automatically manage the life cycle of storage without getting a storage administrator engaged?
That’s where the storage industry should be headed.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

2:03p
IO Partners With TMI to Bring Modules to Asia-Pacific

A look at IO’s modular data center technology in a recent deployment. The company has partnered with TMI to distribute its “Data Center 2.0” technology in Singapore, Malaysia and Brunei. (Photo: Rich Miller)
IO has been aggressively working to expand global distribution of Data Center 2.0, its modular data center solution. The company is partnering with Tractors Machinery International (TMI) to distribute its data center modules in Singapore, Malaysia and Brunei. TMI will serve as the exclusive distributor of the IO.Anywhere platform in these countries, providing the local presence IO needs to expand in the region.
Modular design was initially viewed as a niche play by many in the data center industry, but it has been gaining traction. Once thought to be limited to mobile requirements, temporary capacity, or novel designs like cloud computing facilities, modular approaches have seen an influx of wins, discussions and partnerships. The IO-TMI partnership extends that trend to new geographies.
“IO’s Data Center 2.0 products are embraced by the most demanding enterprise technology users in the world, and Asia Pacific is a highly sophisticated market,” said Oon Ho Tan, General Manager of Tractors Machinery International Pte. Ltd. “With IO, we are empowered to deliver next-generation technology that optimizes service delivery, reduces risks and aligns the data center with the needs of business and IT.”
This latest partnership is a good one for IO, which has been looking to expand its presence in Asia-Pacific and previously announced a facility in Singapore. TMI will also distribute IO.OS, which IO bills as the first true data center operating system, integrating modular and legacy infrastructure with the entire IT stack to provide visibility, insight and control for optimizing data center performance.
By partnering with IO, TMI gains exclusive rights going forward to distribute IO.Anywhere products within the territory, as well as access to IO training and certification programs, technology resources, and sales and marketing support. As an IO global distribution partner, TMI is poised to generate incremental revenue by leveraging IO’s next–generation technology platform and global brand equity.
“With the global rise of mobility, cloud and big data, companies everywhere must rely on their data centers to deliver unprecedented business agility,” said Adil Attlassy, IO Senior Vice President of Global Operations. “TMI is ideally situated to help IO’s global clients improve data center efficiency, agility, security, reliability and sustainability.”

6:59p
NY Times: Data Centers Acting as ‘Wildcat Power Utilities’ 
The New York Times has resumed its critique of the data center business, suggesting that the industry has become a “wildcat power utility” by reselling power to customers at a profit. The Times report examines the use of a common business structure – the real estate investment trust, or REIT – by data center operators, “allowing them to eliminate most corporate taxes.”
The latest piece by Times staff writer James Glanz, titled “Real Estate or Utility? Surging Data Center Industry Blurs Boundaries,” follows a pair of sharply critical stories that ran in September 2012. In the new story, which has appeared online but apparently not yet in the Times’ print editions, Glanz examines power provisioning in data centers.
“Electrical capacity is often the central element of lease agreements, and space is secondary,” Glanz writes. “A result, an examination shows, is that the industry has evolved from a purveyor of space to an energy broker — making tremendous profits by reselling access to electrical power, and in some cases raising questions of whether the industry has become a kind of wildcat power utility. Even though a single data center can deliver enough electricity to power a medium-size town, regulators have granted the industry some of the financial benefits accorded the real estate business and imposed none of the restrictions placed on the profits of power companies.”
Pricing Policies at Issue
The Times bases its assessment of data centers as electric utilities on the use of flat-rate pricing, in which a customer pays for access to power capacity, whether it uses all of that capacity or not. The practice is one of several approaches to pricing by colocation and data center providers.
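To illustrate what flat-rate capacity pricing means in practice, here is a back-of-the-envelope sketch in Python with assumed, purely illustrative numbers (none of them come from the Times report):

```python
# All figures below are assumed for illustration only.
contracted_kw = 500          # power capacity the tenant leases
actual_avg_kw = 300          # average draw against that capacity
hours_per_month = 730
capacity_rate = 150.0        # flat rate, $ per contracted kW per month (assumed)
metered_rate_kwh = 0.08      # what a metered utility tariff might charge per kWh (assumed)

capacity_bill = contracted_kw * capacity_rate                             # $75,000
metered_equivalent = actual_avg_kw * hours_per_month * metered_rate_kwh  # ~$17,500

print(f"Flat capacity bill:  ${capacity_bill:,.0f} per month")
print(f"Metered equivalent:  ${metered_equivalent:,.0f} per month")
```

The tenant’s bill is fixed by the contracted kilowatts whether it draws 300 kW or the full 500 kW; the gap with a metered bill, however, also has to cover space, cooling, redundancy and connectivity, which is the nuance the industry argues the utility comparison misses.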
“The capacity pricing by data centers, which emerged in interviews with engineers and others in the industry as well as an examination of corporate documents, appears not to have registered with utility regulators,” the Times writes. “Interviews with regulators in several states revealed widespread lack of understanding about the amount of electricity used by data centers or how they profit by selling access to power.”
As in its earlier stories, the Times presents a selective version of the facts. The story notes that one of a data center’s primary tasks is providing cooling for the thousands of servers it houses, as well as the ability to connect customers with a wide array of network services. Chris Crosby, the CEO of Compass Datacenters, points out in the Times article that data centers also provide emergency backup power to keep customers online. All of these are core components of the data center business and its value proposition, and they go beyond the traditional roles of a power utility.
As was the case with the previous installments of the Times’ “Cloud Factories” series, the latest article is likely to prompt discussion within the industry about regulation and business structure, as well as the accuracy and fairness of the coverage in the Times. What’s your take? Share in the comments.

8:00p
Digital Realty Trust Launches DCIM Software 
Data center developers provide the bricks, mortar, power and ping to support their tenants. But they’re increasingly finding the need to get into the software side of the data center business, offering tools to make management easier. The latest to do so is turnkey wholesale giant Digital Realty Trust, which today launched EnVision, a comprehensive data center infrastructure management (DCIM) solution.
Digital Realty says EnVision is a DCIM solution built by a data center operator for data center operators. The software will provide increased visibility into data center operations through a user-friendly interface, offering access to historical data as well as predictive capabilities. The EnVision rollout will begin this month and take approximately 18 months to complete across Digital Realty’s global data center portfolio, which consists of 122 properties in 32 markets.
“Up until now, data has been collected, but it has not necessarily been easily accessed or arranged in an intuitive manner that is helpful to a data center operator,” said David Schirmacher, senior vice president of portfolio operations at Digital Realty. “The goal in rolling out EnVision across our global portfolio is to give our customers a common database that is structured around the specific needs of data center operators and can therefore manage the millions of data points that are found in today’s large-scale facilities.”
Diversifying its Capabilities
The announcement further blurs some of the traditional lines in the data center business, and reflects Digital Realty’s move to diversify its business to offer a broader set of capabilities. In recent years Digital Realty has expanded into colocation and dark fiber services. It’s not the first infrastructure provider to develop its own management software (one early example is IO, which entered the DCIM market in 2011), but as the world’s largest data center landlord, Digital Realty has the resources to be a player very quickly. To speed the process, last year Digital Realty hired Schirmacher, who previously worked at DCIM specialist Fieldview and helped automate infrastructure at Goldman Sachs.
The new product will let current and future Digital Realty customers analyze data within specific racks, buildings, entire states, or across the entire global portfolio, providing insight at either a granular or a high level.
“EnVision links data center IT and infrastructure metrics in order to give our customers real-time, historical and predictive views into their operations,” said Michael Foust, chief executive officer at Digital Realty. “This will benefit our customers in a variety of ways. For example, it will provide improved efficiency analysis and help operations teams to support future planning. We are excited to bring EnVision to market and feel that it represents the next critical stage in the ongoing evolution of DCIM solutions.”
EnVision provides an organized view, not only saving time and adding efficiency but also addressing a key data management challenge: pulling together siloed, or stranded, data and presenting it in context, providing a complete real-time view into the environment rather than a view into just a portion of it over a “slice” of time.
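For a rough sense of what “pulling together siloed data” looks like in practice, here is a simplified Python sketch (hypothetical records and field names, not EnVision’s actual schema) that joins facility power readings with an IT asset inventory into one per-rack view:

```python
# Hypothetical siloed sources: facility power readings and an IT asset
# inventory, normally kept in separate systems.
power_readings = [
    {"rack": "A-01", "kw": 4.2, "ts": "2013-05-13T12:00:00Z"},
    {"rack": "A-02", "kw": 3.1, "ts": "2013-05-13T12:00:00Z"},
]
asset_inventory = {
    "A-01": {"tenant": "acme", "servers": 18},
    "A-02": {"tenant": "globex", "servers": 12},
}

def unified_view(readings, inventory):
    """Join facility and IT data into one record per rack -- the 'single
    pane of glass' idea behind DCIM, sketched in a few lines."""
    view = []
    for r in readings:
        asset = inventory.get(r["rack"], {})
        servers = asset.get("servers", 0)
        view.append({
            "rack": r["rack"],
            "tenant": asset.get("tenant", "unknown"),
            "servers": servers,
            "kw": r["kw"],
            "kw_per_server": round(r["kw"] / servers, 3) if servers else None,
            "observed_at": r["ts"],
        })
    return view

for row in unified_view(power_readings, asset_inventory):
    print(row)
```

A production DCIM platform does this continuously across the millions of data points Schirmacher describes, but the underlying idea is the same join of facility and IT data into a single context.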
Schirmacher will discuss Digital Realty’s DCIM initiative tomorrow in one of the afternoon keynote sessions at The Uptime Symposium in Santa Clara, Calif.