Data Center Knowledge | News and analysis for the data center industry
 

Tuesday, December 24th, 2013

    1:00p
    The Top 10 Data Center Stories of 2013

    Bahnhof’s new modular “space station” data center includes an inflatable vestibule. (Photo: Bahnhof)

    Big projects, big budgets, design innovation and repurposing the old economy’s infrastructure for the new economy were key themes during 2013. That’s reflected in our review of the most popular stories on Data Center Knowledge in 2013, which also featured mystery barges, data centers that look like space stations, and the NSA. Here are the top 10 stories of 2013, ranked by page views:

    With Ubiquity, Sears is Turning Shuttered Stores into Data Centers – Our readers were fascinated with Sears Holdings’ entry into the data center business. Sears formed a new unit to market space from former Sears and Kmart retail stores as a home for data centers, disaster recovery space and wireless towers. With the creation of Ubiquity Critical Environments, Sears hopes to convert the retail icons of the 20th century into the Internet infrastructure to power the 21st century digital economy. Sears Holdings has one of the largest real estate portfolios in the country, with 3,200 properties spanning 25 million square feet of space. That includes dozens of Sears and Kmart stores that have been closed over the years.

    Google Has Spent $21 Billion on Its Data Centers – Google has invested more than $21 billion in its Internet infrastructure since the company began building its own custom data centers in 2006. The company’s spending has intensified in recent quarters as Google has launched a global expansion of its data center footprint, which has led to quarterly spending in excess of $1 billion. The company invested a record $1.6 billion in its data centers in the second quarter of 2013.

    Microsoft’s $1 Billion Data Center – This was a pivotal year in the emergence of the billion dollar data center campus, as tech titans continued to invest in key data hubs. With its latest expansion, Microsoft’s investment in its data center campus in southern Virginia has reached $997 million. Microsoft also provided a first public glimpse of its new data center design, which features pre-fabricated modules housing thousands of servers, some of which sit on a slab, open to the sky and the outdoors.

    The Space Station Data Center – The team behind the stylish “James Bond villain” data center in Stockholm began deploying its first modular data center. As you might expect, the project embraces a futuristic design that doesn’t resemble your typical data center. “The goal with this installation is to make it look like a space station,” said Jon Karlung, the CEO of Bahnhof. The design features a spacious double-wide module built with bullet-proof steel that will house servers, which attaches to “The Dome,” an inflatable central vestibule that houses security staff.

    NSA Building $860 Million Data Center in Maryland – The NSA was in the news throughout the year. While the agency’s Utah data center became the focus of global attention, the NSA continued expanding its data center footprint in other areas. The agency broke ground on an $860 million data center at Fort Meade, Maryland that will span more than 600,000 square feet, including 70,000 square feet of technical space. Last month the NSA and the U.S. Army Corps of Engineers began building the High Performance Computing Center-2, an NSA-run facility that will be located on base at Fort Meade, which is home to much of the agency’s existing data center operations.

    1:30p
    It’s That Time of Year: How to Prepare Your Data Center for 2014

    Lars Strong, senior engineer and a recognized expert on data center optimization, leads Upsite Technologies’ EnergyLok Cooling Science Services, which originated in 2001 to optimize data center operations. He is a certified U.S. Department of Energy Data Center Energy Practitioner (DCEP) HVAC Specialist.

    LARS STRONG
    Upsite Technologies

    As 2013 draws to a close, demand is as high as ever for increased data center efficiency, capacity, and reliability. Given this focus on improving computer room cooling efficiency, and given that the typical data center today has cooling capacity nearly four times its IT load, data centers could reduce their operating expense by an average of $32,000 annually simply by improving airflow management (AFM). In addition, the stranded cooling capacity released by these improvements could defer capital expenditure that would otherwise be required to add cooling units or build a new data center. Releasing stranded capacity also makes it possible to increase computer room density and reduce carbon emissions.

    Upsite Technologies’ recent research of 45 data center sites reveals there is much room for improvement. Because of poor AFM, nearly half of the conditioned air in data centers escapes through unsealed cable openings and misplaced perforated tiles. When your data center has openings like these, you must run more fans to deliver vital conditioned air to your heat load. This inefficiency is a prime example of bypass airflow: any conditioned air supplied by a cooling unit that does not pass through (bypasses) IT equipment before returning to a cooling unit. Cable openings in a raised floor and excessive volumes of cold air delivered to a cold aisle are two principal sources of bypass airflow.
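To illustrate the definition above, the bypass fraction can be estimated from total supply airflow versus the airflow actually drawn through IT equipment intakes. This is a hypothetical sketch, not an Upsite tool; the example figures are chosen only to echo the "nearly half" finding quoted above.

```python
def bypass_airflow_fraction(supply_cfm, it_intake_cfm):
    """Fraction of conditioned supply air that bypasses IT equipment.

    Assumes all supply air not drawn through IT intakes returns to the
    cooling units unused -- the definition of bypass airflow above.
    """
    if supply_cfm <= 0:
        raise ValueError("supply airflow must be positive")
    bypass = max(supply_cfm - it_intake_cfm, 0.0)
    return bypass / supply_cfm

# Illustrative numbers: 100,000 CFM supplied, IT gear draws 55,000 CFM
print(round(bypass_airflow_fraction(100_000, 55_000), 2))  # 0.45
```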

    As Power Usage Effectiveness (PUE) analysis reveals, the cooling infrastructure is the largest consumer of power in a data center, and AFM remains the easiest and lowest-cost way to improve cooling infrastructure efficiency and capacity. However, even if your site makes strides to improve AFM and you do it well, your efforts can easily erode over time, and some infrastructure components will require performance validation that is often not part of a standard maintenance agreement.
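PUE itself is a simple ratio of total facility power to IT power; a minimal sketch (the example numbers are not from the article):

```python
def pue(total_facility_kw, it_load_kw):
    """Power Usage Effectiveness: total facility power / IT equipment power.

    The gap between the two is overhead -- cooling, power distribution,
    lighting -- of which cooling is typically the largest share.
    """
    return total_facility_kw / it_load_kw

# A facility drawing 1,500 kW to support a 1,000 kW IT load
print(pue(1500, 1000))  # 1.5
```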

    Is Your Site Calculating Key AFM Metrics Monthly?

    You need to calculate your key AFM metrics monthly and analyze them annually for trends and capacity planning. Key AFM metrics include:

    • Cooling Capacity Factor (CCF) – The ratio of total running manufacturer’s rated cooling capacity to 110 percent of the critical load. Ten percent is added to the critical load to estimate the additional heat load of lights, people, etc.
    • Perforated tile and grate placement – Perforated tiles and grates should only be located in front of equipment that requires conditioned air for cooling. The percentage of properly located perforated tiles and grates should be 100 percent. Place perforated tiles and grates to make all IT equipment intake air temperatures as low and even as possible. Replace all perforated tiles and grates located in dedicated hot aisles and open spaces with solid tiles.
    • IT equipment intake temperatures – The primary purpose of a computer room is to provide a stable and appropriate intake air temperature for IT equipment. As such, computer rooms fall into one of two categories: those with and those without intake air temperature problems.
      Of the 45 sites that Upsite researched, 20 percent of cabinets had hot spots and 35 percent of cabinets had cold spots on average. ASHRAE recommends an allowable IT equipment intake air temperature range of 64°F (18°C) to 80.6°F (27°C). The percentage of cabinets with intake temperatures outside of the ASHRAE recommended range should be 0 percent.
    • Raised floor open area percentage – Raised floor bypass open area is made up of unsealed cable openings and penetrations, and perforated tiles placed in hot aisles or open areas. The percentage of raised floor bypass open area is calculated by dividing the total bypass open area by the total open area in the raised floor. The percentage of bypass open area should be less than 10 percent.
    • Blanking panel utilization – Install blanking panels that seal effectively, with no gaps between panels, in all open spaces within cabinets. Spaces between cabinets and under cabinets need to be sealed to retain conditioned air at the IT equipment face and to prevent hot exhaust air from flowing into the cold aisle. The percentage of open U spaces filled with blanking panels should be 100 percent. Close all open space of the vertical plane of IT equipment intakes. Install blanking panels, seal under cabinets, and seal between mounting rails and sides of cabinets.
    • Rack space utilization – The utilization of rack space is important to understanding how well the valuable space of a computer room is being utilized. Cooling capacity and planning are closely related to rack space utilization.
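The first two metrics above are straightforward ratios. The following sketch is a hypothetical illustration of the definitions as stated (the example loads are invented, not Upsite data):

```python
def cooling_capacity_factor(running_rated_kw, critical_load_kw):
    """CCF: total running rated cooling capacity divided by 110 percent
    of the critical load (the extra 10 percent estimates lights, people,
    and other incidental heat)."""
    return running_rated_kw / (critical_load_kw * 1.10)

def bypass_open_area_pct(bypass_open_sqft, total_open_sqft):
    """Raised-floor bypass open area as a percentage of the total open
    area in the raised floor; the article's target is under 10 percent."""
    return 100.0 * bypass_open_sqft / total_open_sqft

# A room with 1,400 kW of running rated capacity and a 350 kW critical
# load has a CCF near 3.6 -- capacity roughly four times the IT load,
# matching the "typical" profile described earlier.
print(round(cooling_capacity_factor(1400, 350), 2))
print(bypass_open_area_pct(50, 500))  # 10.0 -- right at the threshold
```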

    Another key aspect of your overall AFM improvement strategy is to regularly validate your IT cooling equipment performance:

    • Return air temperatures vs. standard rated conditions – Manufacturers rate their cooling units at standard return-air conditions, typically 75°F with 45 percent relative humidity (RH). However, since most sites run their cooling units with set points lower than standard conditions, the rated capacity cannot be delivered. The result is the very costly condition of more cooling units running, because a unit’s cooling capacity decreases at lower return-air temperatures. For example, a common 20-ton (70 kW) cooling unit has 20 tons (70 kW) of total capacity at a 75°F return-air temperature and 45% RH. At a 70°F return-air temperature and 48% RH, however, the same 20-ton cooling unit has a sensible cooling capacity of only 17 tons (59.7 kW).
    • Presence of latent cooling – In some IT configurations, high relative humidity (RH%) can result in condensation forming on cooling unit coils (i.e. latent cooling). Moisture condensing on cooling unit coils actually gives off heat that consumes some of a cooling unit’s cooling capacity, stranding capacity that could otherwise be used to reduce the air temperature of the supply air to IT equipment.
    • Calibration of cooling unit return-air temperature and relative humidity sensors – To accurately assess cooling unit return-air temperatures and latent cooling conditions, ensure that you regularly calibrate all cooling unit return-air temperature and relative humidity (RH) sensors.
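Using the article's own 20-ton example, the stranded capacity per unit is easy to quantify. A minimal sketch (the ton-to-kW conversion is the standard refrigeration-ton figure, roughly consistent with the 70 kW the article quotes for 20 tons):

```python
TON_TO_KW = 3.517  # one refrigeration ton of cooling in kilowatts

def stranded_capacity_tons(rated_tons, sensible_tons_at_setpoint):
    """Cooling capacity lost per unit by running below standard
    return-air conditions (lower set point and/or latent cooling)."""
    return rated_tons - sensible_tons_at_setpoint

# The article's example: 20 rated tons, but only 17 sensible tons
# delivered at a 70 degree F return-air temperature and 48% RH.
lost = stranded_capacity_tons(20, 17)
print(f"{lost} tons (~{lost * TON_TO_KW:.1f} kW) stranded per unit")
```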

    Infrared Temperature Surveys

    Conducting an infrared (IR) temperature survey will help confirm that previously identified thermal management issues remain resolved, as well as identify any new issues that may arise.

    1. Use an infrared thermometer to measure the intake air temperatures. If they are all cool and the ceiling is cool, then more conditioned air is being delivered to the aisle than needed.
    2. Remove a perforated tile and measure the intake air temperatures again.
    3. Repeat Step 2 until the intake air temperatures start to increase, then add tiles back until temperatures return to acceptable levels. This process establishes the optimum airflow needed for that aisle.
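The steps above amount to a simple feedback loop. The sketch below simulates it with a hypothetical `measure` callback standing in for the IR thermometer readings; the temperature model is purely illustrative, not measured data:

```python
def tune_aisle_tiles(tiles, max_intake_f, measure):
    """Remove perforated tiles one at a time until intake temperatures
    would exceed max_intake_f, mimicking Steps 2-3 above.

    `measure(n)` is a hypothetical stand-in for an IR-thermometer survey:
    it returns the hottest intake temperature (deg F) with n tiles in place.
    """
    while tiles > 1:
        if measure(tiles - 1) > max_intake_f:
            break          # removing another tile would create a hot spot
        tiles -= 1         # aisle still cool: remove one more tile
    return tiles

# Toy model: intakes hold at 68 deg F until fewer than 4 tiles remain,
# then rise 5 deg F per additional missing tile (invented numbers).
model = lambda n: 68 if n >= 4 else 68 + 5 * (4 - n)
print(tune_aisle_tiles(10, 72, model))  # 4 -- the optimum for this aisle
```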

    Keep Fine Tuning

    A computer room is a dynamic environment, so it’s unrealistic to expect these key AFM metrics not to drift over time. Closely tracking each one will help ensure that your cooling infrastructure operates at maximum capacity and reliability, and at the lowest operating cost (and best PUE), in 2014.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

