Data Center Knowledge | News and analysis for the data center industry
Thursday, October 3rd, 2013
12:30p
Diesel Fuel: It’s Critical for Standby Power

Alastair Trower is the President of Puritas Energy, Inc., a green tech company distributing diesel fuel polishing systems.
 ALASTAIR TROWER
Puritas Energy
Data center downtime is a familiar topic in the industry; however, avoiding downtime through proactive fuel management is not yet common knowledge in the field, despite its growing relevance. The vast majority of data centers rely on diesel to supply their backup power systems.
Historically, diesel could be stored for extended periods and still perform reliably when needed. Unfortunately, this is no longer the case. Government mandates have reduced the environmental burden of diesel by lowering its sulfur content and introducing biodiesel blends, but in doing so they have created a need to actively manage stored fuel.
Changes in Diesel
By 2007, governments in the US, Canada and the European Union had mandated reductions in the sulfur content of fuels. The process used to remove sulfur from diesel can affect the fuel’s performance, so refiners compensate with additives. Some of these additives, such as certain lubricants and de-icers, along with biodiesel itself, increase the fuel’s ability to absorb water.
Effects on Your Generator
Water in diesel causes an array of problems and can lead to diesel generators either failing to start or failing mid-operation, just when standby power is needed in an emergency. This has been seen in disasters such as Hurricanes Sandy and Katrina, where data centers and other critical facilities, such as hospitals, faced severe downtime because backup generators did not function as expected.
Effects on Your Business
Organizations lose an average of $138,000 for one hour of data center downtime, an increase of 38 percent from 2010 to 2012 (implying an average of roughly $100,000 per hour in 2010). Operators spend millions ensuring power reliability, and still these systems fail because the fuel is not managed.
Fuel Management: Fuel Testing
A comprehensive fuel management strategy begins with knowing what type of fuel you have and the state in which it is being stored. Research the biodiesel mandates in your area, and perform regular onsite and offsite testing to measure the biodiesel content, water and microbial contamination of your fuel (microbial growth is a sign that trouble lies ahead).
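To make that testing regimen concrete, here is a minimal sketch of how a facility might log lab results and flag tanks that need attention. The thresholds are illustrative assumptions (the 0.05 percent water-and-sediment figure echoes the ASTM D975 diesel specification; the others are invented for the example).

```python
# Illustrative sketch: flagging stored-fuel test results against thresholds.
# The limits below are assumptions for demonstration, not regulatory values.

from dataclasses import dataclass

@dataclass
class FuelTest:
    tank_id: str
    water_sediment_pct: float    # percent by volume, from lab testing
    biodiesel_pct: float         # blend level (e.g., B5 reads as 5.0)
    microbial_cfu_per_ml: float  # colony-forming units per milliliter

WATER_SEDIMENT_MAX = 0.05  # echoes the ASTM D975 water-and-sediment cap
MICROBIAL_MAX = 1000.0     # invented action level for microbial growth

def review(test: FuelTest) -> list:
    """Return human-readable warnings for one tank's test results."""
    warnings = []
    if test.water_sediment_pct > WATER_SEDIMENT_MAX:
        warnings.append(f"{test.tank_id}: water/sediment at "
                        f"{test.water_sediment_pct}% -- polish or treat the fuel")
    if test.microbial_cfu_per_ml > MICROBIAL_MAX:
        warnings.append(f"{test.tank_id}: microbial growth detected -- "
                        "schedule remediation before trouble starts")
    if test.biodiesel_pct >= 5.0:
        warnings.append(f"{test.tank_id}: B{test.biodiesel_pct:.0f} blend "
                        "absorbs more water; shorten the testing interval")
    return warnings

print(review(FuelTest("TANK-A", water_sediment_pct=0.08,
                      biodiesel_pct=5.0, microbial_cfu_per_ml=2500.0)))
```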
Fuel Management: Fuel Polishing
According to Polaris Laboratories, “in systems prone to water contamination,” (such as fuel storage tanks) “it is imperative that the contaminated oil be able to shed water, or demulsify in order to maintain lubricity, viscosity and prevent the formation of acids.”
To begin creating a fuel management protocol, evaluate the tank, piping and generator setup to highlight areas of weakness, and consider the impact of likely site temperature and humidity ranges. The Uptime Institute’s technical paper, titled Biodiesel, suggests finding a fuel polishing system that uses coalescing filters, which have been proven to remove water suspended within the fuel (emulsified water). The Society of Automotive Engineers (SAE) has updated its test for measuring a filter’s ability to remove emulsified water under the SAE J1488:2010 protocol. An automated fuel polishing system is recommended: it continuously removes water and particulates, ensuring emergency-ready fuel at all times.
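As a rough sketch of what an automated polishing system’s control logic might look like, the loop below polishes on a daily schedule and also triggers a cycle when a (hypothetical) water sensor crosses a threshold. The sensor and pump interfaces are placeholders, not any vendor’s actual controls.

```python
# Illustrative control loop for an automated fuel polishing system.
# The sensor reading, pump interface and thresholds are placeholders.

import time

POLL_SECONDS = 60               # how often to sample the water sensor
WATER_PPM_TRIGGER = 200.0       # assumed free-water level that forces a cycle
SCHEDULED_INTERVAL = 24 * 3600  # polish at least once per day regardless

def read_water_ppm() -> float:
    """Placeholder: a real system would query a water-in-fuel sensor."""
    return 0.0

def run_polishing_cycle() -> None:
    """Placeholder: circulate tank contents through coalescing filters."""
    print("polishing cycle: removing emulsified water and particulates")

last_cycle = float("-inf")
while True:
    now = time.monotonic()
    overdue = (now - last_cycle) > SCHEDULED_INTERVAL
    if overdue or read_water_ppm() > WATER_PPM_TRIGGER:
        run_polishing_cycle()
        last_cycle = now
    time.sleep(POLL_SECONDS)
```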
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

12:59p
Prudential, Digital Realty Deal Could Boost Data Center Investment

This data center in Silicon Valley is among those included in a new joint venture between Digital Realty and Prudential Real Estate Investors. (Photo: Digital Realty)
Prudential Real Estate Investors (PREI) has teamed with Digital Realty Trust on a $369 million joint venture to operate fully-leased corporate data centers. The deal provides cash for Digital Realty, which is contributing nine facilities. The participation by Prudential is a vote of confidence in data centers as an asset class that is attractive to real estate investors.
By packaging properties with high-quality corporate tenants, Digital Realty (DLR) has made the portfolio attractive to “core” real estate investors like Prudential, who typically pursue a conservative strategy that focuses on stable properties with low investment risk. Establishing data centers as core assets opens the sector up to a broader range of institutional real estate investors, which in turn could boost data center investment by making it easier to find financing and buyers for projects.
“We are delighted to be partnering with an institution of PREI’s caliber, and we believe this transaction represents an important validation of the appeal of data centers as an asset class to a sophisticated, core real estate investor,” said Michael Foust, Digital Realty’s Chief Executive Officer.
Focus on Powered Base Buildings
The joint venture includes nine fully-leased data centers that were leased as Powered Base Buildings, an approach in which Digital Realty provides the tenant with a shell that the tenant builds out for data center use. The buildings, which total 1.06 million square feet, are valued at $366.4 million (excluding $2.8 million of closing costs), or $346 per square foot. The PREI-managed fund will take an 80 percent interest in the joint venture and Digital Realty will retain a 20 percent interest.
PREI receives rent from the properties (estimated at $24.5 million a year for 2013), while Digital Realty will receive $328 million for the sale to the JV and ongoing fees for managing the properties.
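As a quick sanity check, the sketch below recomputes the per-square-foot valuation from the figures above and, using the $185 million joint-venture loan detailed at the end of this article, shows one plausible (unconfirmed) reconciliation of Digital Realty’s $328 million in proceeds.

```python
# Back-of-the-envelope check of the joint-venture figures reported here.
portfolio_value = 366.4e6  # nine buildings, excluding closing costs
square_feet = 1.06e6
loan = 185.0e6             # JV bank loan, detailed at the end of this article

print(f"Per square foot: ${portfolio_value / square_feet:,.0f}")  # ~$346
print(f"Loan-to-value:   {loan / portfolio_value:.0%}")           # ~50%

# One plausible (unconfirmed) reading of Digital Realty's ~$328M proceeds:
# the loan proceeds plus Prudential's 80% share of the remaining equity.
equity = portfolio_value - loan
print(f"Loan + 80% of equity: ${(loan + 0.80 * equity) / 1e6:,.1f}M")  # ~$330M
```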
“The long lease terms and contractual rental rate increases on these Powered Base Building data centers provide a stable rental income stream that represents a good fit with our investment objectives,” said Cathy Marcus, managing director at PREI and senior portfolio manager of the firm’s core U.S. real estate strategy. “These institutional quality properties are fully leased to a diversified roster of credit tenants, and we look forward to realizing a stable return on this portfolio over the course of a long-term relationship with Digital Realty.”
High Tenant Credit Quality
Digital Realty didn’t identify the tenants. But based on an analysis of Digital’s portfolio, DCK believes the tenants include Equinix, Amazon, Verizon and CenturyLink (Savvis). The quality of the tenants is critical to making these data centers more attractive to investors.
The data center sector has long been viewed with caution by institutional real estate investors, who have seen it as a specialized, capital-intensive property type with a history of tenant credit problems dating to the dot-com bust, when many companies that built data centers suffered losses as unprofitable startup tenants went bankrupt or defaulted on their leases. The risk profile of the sector has changed dramatically over the past decade, as developers like Digital Realty have pursued new deployment models that manage capital more carefully. Operators have also focused on tenant credit quality.
That approach has won over Prudential Real Estate Investors, which has a global portfolio valued at about $53 billion. The joint venture could set a precedent by establishing data centers as a successful core real estate investment, which would broaden the pool of potential investors. Thus, while the deal has immediate benefits for Digital Realty, over the long term it could create more options for data center developers seeking buyers for their properties (other than Digital Realty).
The joint venture has arranged a $185 million five-year unsecured bank loan from US Bank and SunTrust at LIBOR plus 180 basis points, representing a loan-to-value ratio of approximately 50 percent.

2:15p
Cray Deploys Liquid-Cooled CS300 Supercomputer

The Cray XC-30 supercomputer (Photo: Cray)
Cray announced that it has won a contract with Mississippi State University to provide a Cray CS300-LC system, a liquid-cooled version of the CS300 cluster supercomputer. Expected to be delivered later this year, the 322 teraflop supercomputer will include Intel Xeon E5-2600 v2 processors and Intel Xeon Phi coprocessors. Its liquid-cooled design uses warm-water heat exchangers instead of chillers to directly cool the compute processors and memory, allowing for more efficient removal of system heat.
The new system will be located at the university’s High Performance Computing Collaboratory (HPC2) and will serve as the primary high performance computing system for shared research. HPC2 supports a coalition of member centers and groups whose work spans a wide range of application disciplines; several of the member centers have extensive physical and experimental modeling and analysis capabilities to complement their computational efforts.
“Our mission is to serve the University, State and Nation through excellence in computational science and engineering, and we are pleased to have the resources of a Cray supercomputer to support our efforts,” said Trey Breckenridge, director of high performance computing at Mississippi State. “With the Cray CS300 system and its advanced liquid-cooling architecture, we will provide our vast user community with a cost-effective and energy-efficient high performance computing system that is also a powerful and technologically-advanced tool for scientific research.”
Cray Adds GPU and Phi to XC30
Cray also announced that it has broadened its support for accelerators and coprocessors and is now selling the Cray XC30 series of supercomputers with NVIDIA Tesla K20X GPU accelerators and Intel Xeon Phi coprocessors. Use of the Intel Xeon Phi and NVIDIA Tesla GPU accelerators continues Cray’s support for innovative hybrid supercomputing technologies. In addition to now being offered in both the Cray XC30 and Cray XC30-AC systems, Intel Xeon Phi coprocessors and NVIDIA Tesla GPU accelerators are also available in the Cray CS300 line of cluster supercomputers.
“We designed the Cray XC30 supercomputer to be the realization of our Adaptive Supercomputing vision, which is about providing customers with a powerful, flexible tool for solving a multidisciplinary array of computing challenges,” said Peg Williams, Cray’s senior vice president of high performance computing systems. “Integrating diverse accelerator and coprocessor technologies into our XC30 systems gives our customers a variety of processing options for their demanding computational needs. Equally as important, Cray XC30 supercomputers feature an innovative software environment that allows our customers to optimize their use of diverse processing options for their unique applications.”

3:48p
Uptime Certification Funny Business: Design vs. Construction

Are these two certifications the same? In fact, they’re different, according to executives from Compass and the Uptime Institute.
What does it mean to be Tier-certified? Misleading language and confusion over what it means to be “design certified” versus “construction certified” mean that customers may not always know what they are getting.
Some industry leaders believe the confusion around Uptime Institute Tier Certification has reached the point that it reflects poorly on the entire industry. Compass Datacenters CEO Chris Crosby and Uptime Institute SVP Julian Kudritzki are seeking to shine a spotlight on the issue, in hopes that it helps clean up the problem and helps customers protect themselves. They recently spoke with Data Center Knowledge about their concerns.
Both Crosby and Kudritzki argue that misleading language around tier certification is becoming more common.
“A ton of guys that claim a tier level, but don’t have certification,” said Crosby. “As a customer you have to go in and figure it all out. As the industry matures, those claims become more meaningful.”
Tier Certification on the Rise
Tier certification is an important issue, particularly in light of two trends: the increasing standardization of the industry as a whole, and the growing number of multi-tenant data center providers seeking certification.
Uptime Institute created the Tier Classification System standard to evaluate data center infrastructure in terms of a business’s requirements for system availability. The Tier Classification System provides the data center industry with a consistent method to compare facilities based on expected site infrastructure performance, or uptime. Uptime defines four tiers, with Tier I being the least reliable and Tier IV the most reliable.
Uptime Institute offers paid certification of a data center’s tier level, which aims to do three things:
- Verify benefits defined within a standard are actually delivered.
- Act as an insurance policy for the customer.
- Provide binary evaluative criteria for customers.
The Uptime Institute has awarded 278 certifications around the world. But the lion’s share of those certifications – 218 of them – have been based upon design documents rather than actual operational data center buildings. Just 67 have thus far completed certification of a constructed facility. About 62 percent of operators that have received design certification are expected to eventually certify their completed data center.
Standardization occurs as an industry becomes commercial. “If I look back on my career, when the baby Bells deregulated there was a big influx of new equipment providers,” said Crosby. “You were NEBS (Network Equipment Building System) compliant or you weren’t. Data centers are similarly going to be consumer compliant. We’ve grown up as an industry. Having third parties come in to certify is the logical evolution. When it truly becomes commercial, from a consumer perspective, that’s when the certification comes in.”
In the data center industry, there are several ways to certify. In addition to the Uptime Institute Tier Standard, ASHRAE TC 9.9, LEED certification and Power Usage Effectiveness all come into play, according to Crosby and Kudritzki. Standards and third-party certification are hallmarks of a maturing industry.
Blueprint vs. Build
More and more data center providers are talking about how their data center builds are certified, sometimes giving the impression that they have gone through the rigorous certification process for a completed structure when in fact they hold only a Design Certification for the blueprint. Design certification is a much lower threshold to meet, which creates problems: if you have a design certification but build differently from that design, the certification no longer reflects the facility.
There are only two wholesale providers with Tier-certified constructed facilities in the US: Compass Datacenters with two, and Digital Realty with one. Why the disparity?
Meeting full construction certification is a much higher standard. “The disconnect is that many customers believe that this is what they’re getting,” said Crosby. “It’s builders taking advantage of customer confusion about the different types of certification. It’s a deceptive practice that means customers aren’t getting what they paid for.”

3:49p
iomart Acquires Cloud Backup Provider Backup Technology for $37M

Brought to you by The WHIR.

iomart Group announced on Tuesday that it has acquired cloud backup and disaster recovery services provider Backup Technology (BTL) for $37 million. Established in Leeds in 2005, BTL has around 200 enterprise clients, including Siemens and the British Red Cross.
The acquisition of BTL comes less than a month after iomart acquired dedicated server and managed services provider Redstation for $12.4 million.
“We are delighted to welcome BTL to the Group as they have achieved ground breaking progress in the delivery of cloud backup and disaster recovery and have been on our radar for some time,” Angus MacSween, CEO of iomart Group said. “BTL gives iomart a solid and well-established platform to grow further from, with a very good enterprise customer base and little crossover from the existing Group base. It’s a very good strategic fit, complementing our portfolio of existing products.”
Cloud backup has been adopted by 30 percent of IT professionals in EMEA and North America, according to a recent Spiceworks survey. An additional 13 percent plan to implement cloud backup within the next six months, indicating strong growth in this cloud segment over the short term.
“This is a really exciting moment for BTL,” Simon Chappell, founder and owner of BTL said. “We very much admire what Angus and his team have done in establishing iomart as the leading player in the market for cloud services in the UK and can see that our backup expertise and services are a great fit. iomart will be a great home for BTL.”
Chappell is leaving BTL to pursue other opportunities, but BTL’s team of 16 people will stay on at iomart.
BTL was advised by Livingstone Partners, and Pinsent Masons. iomart was advised by Shepherd and Wedderburn.
Article originally published at: http://www.thewhir.com/web-hosting-news/iomart-acquires-cloud-backup-provider-backup-technology-for-37m

6:27p
Methode, NORLINX Team on DCIM Smart Rack

At the Data Center World conference this week, Methode introduced a “smart rack” with integrated DCIM capabilities. (Photo: Colleen Miller)
Methode Data Center Technologies this week announced a partnership with NORLINX, combining Methode’s hardware expertise with NORLINX’s software. The result is what the companies call a turnkey, integrated DCIM (data center infrastructure management) solution: in essence, a smart rack with full asset and environmental monitoring, along with power management.
The companies were on hand at Data Center World showing their all-in-one rack solution. It’s an appliance approach to DCIM, with everything you need fully integrated and built into the rack.
“This strategic partnership between Methode and Norlinx provides our customers with a comprehensive DCIM solution for improved monitoring, reporting and control,” said Tim Hazzard, president of Methode Electronics Data Solutions Group. “With this unique integration of hardware and software, we offer streamlined efficiencies and capabilities, while reducing the total cost of ownership.”
The company wants to make buying it like buying a refrigerator. “This is the Cadillac of racks – it can hold whatever you want it to and do full DCIM,” said Hazzard. All the pieces to collect the data, such as RFID readers, are in the cabinet, along with PDUs and cable management. NORLINX provides the analytics to take that data and do something with it.
History of Automotive
Methode Electronics would know what the Cadillac of racks would look like. The company’s solid-state touch sensitive switches are used in many of today’s appliances and automobiles. That touchscreen in the console that’s in charge of all of the “infotainment” in those fancy Ford models? That’s Methode. It builds user interface consoles for new kitchen products, including for many of the world’s largest white goods OEMs. If you’ve noticed those fancy beverage selection interfaces at your favorite restaurant – like that touch screen that lets you pick a variety of options? – there’s a good chance Methode is behind that, too. The company’s global manufacturing capabilities not only allow it to “productize” this new data center cabinet, but also to cut out the middle man and directly manufacture it at an attractive price point.
The front of the data center cabinet also has a touchscreen, which the company has brought in from its experience in the automotive industry with companies like Ford. There’s also a keyboard tray and keyboard to allow for easy KVM interaction without entering the cabinet.
RFID asset tracking, with readers mounted in the side rails, knows the instant a server enters or leaves. Humidity sensors mounted at the top of the rack provide up-to-the-second readings, and airflow sensors mounted at both the front and rear of the cabinet allow for true airflow analysis. There are also six temperature read points, front and back, at the top, middle and bottom of the rack.
In terms of power management, there is individual control of each PDU port along with breaker management, with consolidated power usage reporting at both the PDU and port level. All of these hardware pieces inside the rack interact tightly with the DCIM software suite provided by NORLINX; the software and hardware communicate out of the box.
“Built into the rack is the ability to manage, control, to know what’s going on,” said Hazzard. “It’s a good insurance policy. Ninety percent of data center floor mistakes are not malicious.”
Lost in the Shuffle
Hazzard refers to several instances where equipment is misplaced or lost on the data center floor, or where servers no longer in active use continue to draw power simply because they are lost in the general data center shuffle.
A single-IP-address Cabinet Control Module (CCM) serves as the collection point for environmental monitoring, asset tracking, electronic lock controls, the cabinet touch console and Power Distribution Unit (PDU) management. Data is delivered from the CCM to NORLINX’s analytics software in one of three applications: GSM Rack, GSM Power and GSM Air and Space.
“This shows everything that’s happening on a granular level: the temperature, humidity, the airflow, it’s all monitored,” said Hazzard. What started as an RFID project evolved into deep asset tracking within the cabinet. “We’re making it easier to run predictively vs. reactively,” said Hazzard. “Having this data on a granular level takes away troubleshooting time in the long run. The DCIM piece of it coalesces info, getting data and rolls it up in one central repository.” It sends out auto alerts, such as when a lock is opened or a server leaves.
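To picture how rack telemetry like this turns into alerts, here is a minimal sketch of a client polling a single-IP cabinet controller. The endpoint, JSON field names and thresholds are hypothetical illustrations, not Methode’s or NORLINX’s actual API.

```python
# Hypothetical sketch of polling a CCM-style cabinet controller over HTTP.
# The URL, JSON field names and thresholds are illustrative assumptions.

import json
import time
import urllib.request

CCM_URL = "http://10.0.0.42/telemetry"  # assumed address of the cabinet module
TEMP_MAX_C = 27.0                       # assumed inlet-temperature alert threshold

def poll() -> dict:
    """Fetch one snapshot of cabinet telemetry as JSON."""
    with urllib.request.urlopen(CCM_URL, timeout=5) as resp:
        return json.load(resp)

def check(snapshot: dict) -> list:
    """Translate one telemetry snapshot into human-readable alerts."""
    alerts = []
    # Six temperature read points: front and back at top, middle and bottom.
    for probe in snapshot.get("temperature_probes", []):
        if probe["celsius"] > TEMP_MAX_C:
            alerts.append(f"{probe['location']}: {probe['celsius']}C over threshold")
    # Electronic lock status and RFID asset-tracking events.
    if snapshot.get("door_open"):
        alerts.append("cabinet lock opened")
    for tag in snapshot.get("departed_rfid_tags", []):
        alerts.append(f"asset {tag} left the cabinet")
    return alerts

while True:
    for alert in check(poll()):
        print("ALERT:", alert)  # a real system would page or email
    time.sleep(30)
```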
Issues Going Forward
It’s hard to position what Methode is selling here because the data center industry is an industry of habit. The company started marketing a micro-container, but it didn’t catch on. The biggest problem Hazzard sees is people who have chosen to standardize: by creating an all-in-one rack solution, Methode is asking many users to abandon familiar processes and standards. It is competing against the “dumb” racks that have long been in place in the data center. The company has built in PDUs, but Hazzard says some customers insist on using the PDUs they’ve always used.
“There’s been nothing new in PDUs in a long time,” said Hazzard, “but that’s still a request.” The company is competing against routine and standards. So although it has sought to build the ultimate cabinet, it faces an uphill battle convincing people to make the additional investment.
That’s why Methode is very open to letting companies trial the equipment. It will lend out one of these smart cabinets to a lab so they can test and see these advantages for a while before deciding to purchase.
Methode has a lot of experience in other industries, and it has collected all the pieces here into one form factor. The appliance-based approach to cabinets and DCIM means the company has chosen to focus on new builds, where companies are more open to trying new things. Hazzard believes it’s an affordable form factor, coming in at around $6,700. Considering these racks can contain millions of dollars of equipment, Hazzard says this should be seen as a small investment in order to gain more control of what’s going on in the data center.