Data Center Knowledge | News and analysis for the data center industry
Monday, March 18th, 2013
1:03a
Industry Perspectives Focus on Efficiency

Efficiency in the data center was the common theme of this week’s Industry Perspectives columns. From more effective storage and monitoring (such as eBay’s new efficiency dashboard) to selecting the right drives, consolidating the data center and using water more efficiently, these guest articles from industry experts can guide you toward improving efficiency across your data center operations. Enjoy reading!
Reducing the Storage Footprint & Power Use in Your Data Center – When space is tight and storage costs are mounting, deduplication is a method to squeeze more out of current data center space, according to Eric Bassier of Quantum (a minimal sketch of the technique appears after this list).
Why eBay’s Digital Service Efficiency Changes the Game – eBay has provided a role model, organized around a common metric, for optimizing the overall effectiveness of IT, and has openly disclosed the metrics and indicators used to do it. This is a step forward in transparency, writes Winston Saunders of Intel.
Six Tips for Selecting HDD and SSD Drives – Today, selecting the right drives can be a challenge, says Gary Watson of Nexsan. This article offers six tips for navigating this complexity to help you pick the right solutions for your needs.
Consolidation: Shrinking Our Way to Data Center 3.0 – As with everything in the IT industry, there is no magic solution that fits all situations. However, the trend toward shrinking, just-in-time data center deployments is growing, and they are becoming a significant option in the arsenal of data center operators, writes Antonio Piraino of ScienceLogic.
Do You Know the Hydro-Footprint of Your Data Center? – The complex relationship between water and energy use in the data center is outlined in this column by Ron Vokoun of JE Dunn and Harold Simmons of United Metal Products.
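The deduplication approach Bassier describes can be sketched in a few lines. The snippet below is a minimal illustration of block-level deduplication, not Quantum’s implementation: data is split into fixed-size chunks, each chunk is identified by its SHA-256 digest, and a chunk that has been seen before is stored only once, with a list of digests serving as the recipe to rebuild the original stream.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunking; commercial products often use variable-size chunks


def deduplicate(data: bytes) -> tuple[dict[str, bytes], list[str]]:
    """Store each unique chunk once, keyed by digest; return the store plus a rebuild recipe."""
    store: dict[str, bytes] = {}
    recipe: list[str] = []
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # a duplicate adds only a digest, not another copy
        recipe.append(digest)
    return store, recipe


def rebuild(store: dict[str, bytes], recipe: list[str]) -> bytes:
    return b"".join(store[d] for d in recipe)


if __name__ == "__main__":
    data = b"A" * 8192 + b"B" * 4096 + b"A" * 8192  # highly redundant sample
    store, recipe = deduplicate(data)
    assert rebuild(store, recipe) == data
    stored = sum(len(c) for c in store.values())
    print(f"logical {len(data)} bytes -> stored {stored} bytes "
          f"({len(data) / stored:.1f}:1 reduction)")
```

On the redundant sample above, the store holds just two unique chunks, a 2.5:1 reduction; the savings in a real backup environment depend entirely on how repetitive the data is.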
12:18p
DataBank Expands With Acquisition of VeriSpace

Data center service provider DataBank is expanding beyond the Dallas market with the acquisition of VeriSpace, a colocation provider based in Minneapolis, the company said today.
DataBank operates six data centers within 400 South Akard, a seven-story building constructed in 1921 to house the U.S. Federal Reserve Bank of Dallas. In 2000 the building was renovated as a technology and telecom hub, taking advantage of security features from its use as a banking facility. Having filled its original building with customers, DataBank is about to open a new data center in Richardson, a suburb of Dallas.
Now DataBank is entering a new market with its deal for VeriSpace, which operates a 10,000 square foot data center in Edina, Minnesota, where it offers carrier-neutral colocation space. Following the acquisition, DataBank will operate more than 180,000 square feet of data center space.
DataBank becomes the latest in a series of providers that have established their business in one geographic market and then expanded to other cities. The company has been eyeing new geographies since it was acquired by Avista Capital in 2011, at which point it expressed its intent to increase its data center footprint.
“DataBank has been working diligently on expansion into new markets, and VeriSpace offered us the perfect vehicle,” said Tim Moore, CEO of DataBank. “VeriSpace has a strong market presence and provides service to a number of the region’s top businesses. We are continuing to execute on our plan. This first step in Minnesota gives us a great location in which to launch our expansion efforts and we will be developing additional high-quality data center capacity here, in the coming year.”
VeriSpace was founded in 2002 by Minnesota commercial real estate developer Dave Frauenshuh, who also owns the building in Edina that houses the VeriSpace data center.

12:30p
The Customer is Always Right: Remaining Relevant in the World of Data Centers

Jim Smith is Chief Technology Officer of Digital Realty (NYSE:DLR) and is responsible for overseeing data center development, delivering more than 500MW of data center projects totaling more than a billion dollars of capital investment. Smith also leads Digital Realty’s sustainability and energy efficiency initiatives.
Jim Smith, Digital Realty
As more and more enterprises turn to the cloud for their data transfer and storage needs, demand for leading-edge data centers is increasing. And the ability of a data center owner/operator to understand and embrace the evolving IT needs of its customers can be the difference between remaining relevant and becoming extinct in today’s highly competitive business environment.
In the spirit of carrying out 2013 New Year’s resolutions geared toward customer satisfaction, data center owner/operators could do worse than to focus on one or more of the following areas: availability, multi-tiering, Data Center Infrastructure Management (DCIM), financial flexibility and rapid delivery.
Striving for the ‘Five Nines’
In the realm of data centers, availability, the degree to which a system is operable and committable, is the go-to metric. Everybody wants high availability, but the concept is rarely mentioned during the pre-sale process because it has become an assumed deliverable. Nonetheless, we should all be striving for the “five nines,” which translates to just over five minutes of downtime per year. However, only a limited number of data center providers have either the operating history or the ability to demonstrate a track record of five nines of availability.
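The arithmetic behind that figure is worth making explicit. The annual downtime budget is simply (1 − availability) multiplied by the minutes in a year, and the short calculation below (an illustration, not a vendor benchmark) shows what each additional nine buys:

```python
MINUTES_PER_YEAR = 365.25 * 24 * 60  # ~525,960 minutes, averaging in leap years

for nines in (3, 4, 5):
    availability = 1 - 10 ** -nines  # e.g. five nines -> 0.99999
    downtime = (1 - availability) * MINUTES_PER_YEAR
    print(f"{availability:.{nines - 2}%} availability -> "
          f"{downtime:,.2f} minutes of downtime per year")
```

Running this shows the jump from three nines (about 526 minutes, nearly nine hours) to five nines (about 5.26 minutes), which is why each additional nine is dramatically harder, and more expensive, to deliver.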
While high availability remains the industry standard, with the development of virtualization, new types of facility-aware software and more flexible operating systems, some data center users are becoming more interested in multi-tiering.
Multi-tiering for Redundancy
Multi-tiering refers to an owner/operator’s capacity to provide adequate back-up systems with different maintainability and reliability characteristics in a single facility, campus or data hall. Whether a customer requires one-megawatt or one-kilowatt increments, the ability to adjust capacity and to adapt it to a specific set of applications is invaluable in today’s competitive environment. In other words, redundancy increases relevancy.
One way to deploy a redundant architecture is to construct a building containing multiple, discrete data centers with variable tiers that share network, storage and monitoring services, giving customers the option to deploy on different dimensions. A small sketch of this placement idea follows.
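As a toy model of that placement idea (the hall names, tiers and rates below are hypothetical, not Digital Realty’s catalog), a building can be represented as a set of data halls with different redundancy levels and remaining capacity, with each deployment placed in the cheapest hall that satisfies its requirements:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Hall:
    name: str
    tier: int            # Uptime-style tier: higher means more redundant
    free_kw: float       # remaining critical power capacity
    price_per_kw: float  # hypothetical monthly rate


def place(halls: list[Hall], load_kw: float, min_tier: int) -> Optional[Hall]:
    """Pick the cheapest hall that meets both the tier and the capacity need."""
    candidates = [h for h in halls if h.tier >= min_tier and h.free_kw >= load_kw]
    if not candidates:
        return None
    best = min(candidates, key=lambda h: h.price_per_kw)
    best.free_kw -= load_kw  # reserve the capacity for this deployment
    return best


building = [
    Hall("Hall A", tier=2, free_kw=2000, price_per_kw=110),
    Hall("Hall B", tier=3, free_kw=1500, price_per_kw=150),
    Hall("Hall C", tier=4, free_kw=800, price_per_kw=220),
]

# A dev/test workload tolerates lower redundancy; a payments platform does not.
print(place(building, load_kw=250, min_tier=2).name)  # -> Hall A
print(place(building, load_kw=100, min_tier=4).name)  # -> Hall C
```

The point of the model is the one the column makes in prose: by keeping halls of different tiers under one roof, the operator can match each workload to the redundancy it actually needs instead of forcing everything into the most expensive tier.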
Allowing for Incremental Deployment
Because of the complicated nature and rigorous cooling requirements of the facilities themselves, data center development projects are capital intensive – probably more so than any other real estate construction undertaking (logistics, retail, mixed-use). As such, they have been receiving a heightened level of scrutiny since the onset of the global financial crisis in 2008.
While this increased attention has on occasion created challenges for commercial owners and operators in the data center sector, it has also encouraged the industry to take a prudent approach to the development of these high-cost projects. In general, this is a very positive development for the data center space and has encouraged owner/operators to focus on providing their customers with a financial flexibility that allows them to deploy capital resources incrementally.
In other words, several planned investments can be made over an agreed-upon period of time rather than committing the entire capital deployment all at once. Customers appreciate this flexibility.
Minimizing Downtime via Rapid Delivery
Downtime is the bane of enterprises that rely heavily on the cloud for the delivery of their products, which brings to mind the troubles that a popular video streaming provider experienced on Christmas Eve 2012 when an AWS outage impeded its service on some devices. As you can imagine, consumers were up in arms as they have come to expect immediate gratification when it comes to downloading content on their handheld gadgets, tablets, laptops, personal computers and home media systems.
Data center customers that serve these consumers expect their existing facilities to be highly available. And in terms of expansion, whether building a brand new facility or performing a system upgrade, they want rapid delivery of new or improved data centers in order to minimize the risk of downtime—thereby minimizing the risk of alienating consumers.
In a recent survey conducted by Digital Realty’s operations teams, 65 percent of our customers said that expanding their data centers’ IT capacity in increments of 250 kilowatts at a time would be ideal. Ultimately, the key consideration should be how to scale up as non-invasively as possible.
Keeping Your Eye on Customer Needs
The key to longevity for a data center owner/operator is to keep a finger on the pulse of its customers – to not only ask the hard questions but to ask them early and often in any given relationship. The most successful relationships are collaborative in nature, with owner/operators and their customers partnering closely over the long term to satisfy the evolving IT requirements of the business.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

12:30p
Requirements Analysis Process for a Custom Data Center

This is the second article in a series on the DCK Executive Guide to Custom Data Centers.
Before considering building a custom data center, you need to carefully examine your organization’s requirements and compare them to the deliverables offered by the standard “off-the-rack” designs that make up the mainstream data center industry. In many cases the request for a custom-designed data center is driven by the IT department, perhaps because of previous experiences with standard designs that did not meet some of its specialized requirements. If this is the case, then a very specific requirements list needs to be presented early in the requirements analysis. Moreover, IT architectures are constantly evolving to meet customer-driven business requirements, as well as being affected by new equipment, all of which can impact the data center design. In addition, as was discussed in part 4 of this series, “Creating Data Center Strategies with Global Scale,” business requirements are changing radically and rapidly, and that impacts long-term IT roadmaps, which must be carefully considered before rolling out any new data centers.
Therefore, the requirements analysis process that first needs to occur within your organization must be clearly defined. This internal analysis needs to involve the key business, IT and facility management personnel to scope out the immediate and long-term requirements, as well as to review the overall expectations of the information systems. Once this has been established, the details need to be more closely analyzed by the CIO, CTO and IT architects. This, in turn, should result in an IT roadmap that can project the amount and type of computing hardware required to support and deliver the stated business goals.
Once the IT roadmap is established, a cost or technology-advantage justification needs to show how and why your IT architecture, systems and strategies require a custom design rather than a standard data center. Moreover, a highly customized data center that deviates significantly from a more generic standard design may not adapt well in the future, should your highly specialized IT requirements change significantly. This is not to say that you should forgo a custom design out of fear of early obsolescence due to a technology shift; it is simply a caveat to consider when weighing the justification for a custom design against your long-term IT roadmap and business goals.
You can download a complete PDF of this article series on the DCK Executive Guide to Custom Data Centers courtesy of Digital Realty.

1:55p
Dark Fiber Challenges Derail Project in Upstate NY

An aerial view of the Yahoo data center in Lockport, N.Y.
It’s not easy being green. At least not for data center providers attracted to the cool breezes and hydroelectric power in upstate New York. The region is home to a Yahoo data center that has been cited by Greenpeace and others for its energy efficiency and sustainable design.
So in an era of increasing scrutiny of data center sustainability, why aren’t more companies building in this ideal location? A number of data center projects have scouted sites in the area and opted not to build for a variety of reasons. Prior to Yahoo’s arrival, several prospects reportedly experienced challenges with power provisioning. A proposed Verizon project was slowed by legal challenges from area landowners, and later shelved when the company opted to buy Terremark instead of building new space.
In the latest example, local officials in Lockport, N.Y. say a data center company that spent two years evaluating a site near Yahoo has backed out due to a lack of dark fiber at the site. The town’s Industrial Development Agency is now considering whether to invest hundreds of thousands of dollars to make dark fiber available at the site, citing the need to compete on future deals, according to The Buffalo News.
One question in the mix is whether the cost of extending dark fiber to a site should be borne by the data center operator, the developer or an economic development agency. With more states extending tax breaks and other economic incentives to attract data centers, the competitive landscape is shifting. Even as Greenpeace and other groups pressure data center operators to use more renewable energy, doing so often requires tradeoffs once the larger economics of site selection are considered.
So a question for our readers: Can sustainability and data center economics align? If so, what are the areas that offer the best opportunities?
9:00p
OVH Raises $181 Million to Build Data Centers in U.S.

OVH uses a cube-shaped design at its data center in Roubaix, France. The company has raised $181 million to fund its North American expansion. (Photo: OVH)
European hosting giant OVH has lined up $181 million to build new data centers in the U.S., as it continues to expand its business into the North American market. The French company is known for hyper-growth and an innovative approach to infrastructure design, building custom servers, containers and data centers shaped like giant cubes.
OVH said the credit line was the largest ever arranged for a European hosting firm. It was put together by a group of 10 banks, which said OVH’s steady growth convinced them to support this expansion phase. The company was founded by Octave Klaba in 1999 and operates more than 140,000 dedicated servers in 11 data centers.
“This syndicated loan assures the next two years of development for the company, by financing 70 percent of the North American and European investments to come, the remaining being self-financed,” said Nicolas Boyer, CFO at OVH.com. “It therefore offers us the visibility to carry on a very sustained development and an ever-growing client demand. Moreover, this financing allows us to consolidate our relationships with financial partners that have trusted us for over 10 years and who have reaffirmed, through this operation and the magnitude of this fundraising, their willingness to accompany OVH.com’s development.”
The company designs and builds its own servers and data centers, including some unusual designs:
- OVH’s first North American data center in Beauharnois, Quebec, is housed in a former Rio Tinto Alcan aluminum plant with an airflow design reminiscent of the Yahoo Computing Coop, allowing waste heat to rise and exit through a central ceiling vent. The building is located alongside a dam that will provide 120 megawatts of hydropower to support the facility. OVH estimates that at full buildout, the facility could house as many as 300,000 servers.
- In 2011 OVH opened an innovative Cube-shaped data center in Roubaix, France, which houses servers in an exterior corridor built around an open center, allowing for easy airflow through the facility.
- In early 2012, OVH opened a new data center in Strasbourg, France that recreates many elements of the cube design using stacks of shipping containers housing servers. The facility features 12 containers, which are stacked three-high in two rows. Outside air enters the facility through louvers in the exterior wall. The air travels through the racks of servers, and then exits the IT corridor via large fans behind the racks, which vent the air outside into the open center.
Last year OVH announced plans to enter the North American hosting market, and acquired the property for the Quebec facility. At the time, Klaba indicated that the expansion would include additional data centers in the U.S.
“North America is a territory so vast that it takes at least three large data centers to cover the entire population,” he said. “We are in phase A, which is taking account of the East coast. Later we plan to do the same for the west and certainly the center.”
For additional coverage, see French Web Host OVH Raises $181 Million at The WHIR