Data Center Knowledge | News and analysis for the data center industry
 

Monday, March 4th, 2013

    1:00p
    Telus Warms Condos With Heat From Its Servers

    Offices and condos in the Telus Garden mixed-use complex in Vancouver will be warmed by a district heating system fed by the hot aisles in an adjacent Telus data center. (Photo: Telus)

    For every knock at data center energy waste, there are two examples of innovative thinking and recycling going on. One of the latest design trends is using waste heat from a data center hot aisle in a district heating system for nearby offices and condos. Canada’s Telus Corp. provides the latest example, tapping waste heat from its data center in Vancouver to power the heating and cooling systems of its adjacent $750 million mixed-use Telus Garden development.

    FortisBC will operate a regulated utility within Telus Garden, in partnership with Telus and Westbank, creating a District Energy System (DES).

    “The TELUS Garden District Energy System represents a shift in how we think about and utilize energy,” said Andrea Goertz, senior vice-president of TELUS Strategic Initiatives and Communications. “By recovering energy that would normally be lost and putting it to good use, we are innovating through design to create one of the most environmentally-friendly urban communities in North America.”

    Provides 80 Percent of Energy for Tower

    Waste heat from the data center and the cooling system of Telus’ Robson Street headquarters will provide 80 percent of the energy needed to heat and cool the development’s one million square feet of space, as well as heat domestic hot water for both of its towers. Capturing and recycling the waste heat will help reduce carbon dioxide emissions by one million kilograms a year.

    “Our collaboration with Westbank and TELUS is an example of the innovation and energy savings available to customers using district energy systems,” said Doug Stout, vice-president of Energy Solutions and External Relations for FortisBC, which is featured in a video presentation about the DES.

    The DES will help reduce overall energy use and protect residents and employers from rising energy costs in the future. The British Columbia Utilities Commission has approved the partnership’s construction of the TELUS DES and authorized FortisBC to own and operate the energy system once it is commissioned.

    This is one of the first systems in Vancouver to use waste heat from a neighboring site to heat and cool a new development. But it isn’t the first worldwide.

    Other Examples

    • Across the pond in London, Telehouse began using excess heat in a Docklands data center to heat nearby homes and businesses in 2009. It was the most ambitious effort at the time to reuse the excess heat from data centers.
    • IBM has a data center in Switzerland that warms a nearby community swimming pool.
    • An unusual concept was put forth by researchers from Microsoft and the University of Virginia in a paper published in 2011. It suggested that large cloud infrastructures could be distributed across offices and homes, which would use exhaust heat from cabinets of servers to supplement (or even replace) their on-site heating systems.

    Interested in getting in on this heat recycling love fest? Check out DCK’s Guide to Heat Recycling.

    1:30p
    Mind the IT Skills Gap and the STEM Cliff!

    Bob Supnik is vice president of engineering and supply chain for Unisys Corporation. He is also directly responsible for software and hardware development.

    Bob Supnik, Unisys

    In discussions with customers’ IT executives – especially those whose data centers rely on mainframes – one often hears demographics and aging IT staff as recurring themes. In many organizations, key technical staff members are retiring or nearing retirement, and it’s difficult to find replacement staffers with the right skills.

    This skills gap is part of a trend in the larger STEM (Science, Technology, Engineering, and Mathematics) workforce in the United States.  According to one study, half of the STEM workers in this country will retire in the next 10 years.

    However, the demographic “time bomb” is about more than the loss of programmers who know COBOL or of test engineers who know transaction processing. There’s also been a major shift in how companies organize their IT staffs, and this has implications for supporting mainframe environments and related business applications.

    Programmer-analysts designed many organizations’ business-critical applications. These analysts are not only on the verge of retirement, but are also very difficult to replace. That’s not just because they have skills in legacy technologies, but also because IT departments separated the roles of business analyst and programmer a long time ago and can’t find people who do both.

    For IT, Age Isn’t Just a Number

    Historically, customers who came to IT firms with a problem to solve expected that the programming team would master the relevant business case, processes and technical issues and specify a comprehensive solution, from specification to design and, ultimately, code.

    This “community” approach extended all the way through the IT organization. The task of mastering the customer’s world was everyone’s responsibility, so that the resulting system would be a coherent and consistent solution to the initial problem.

    Of course, the industry has progressed and has taken a more streamlined approach to applications programming. The world has become infinitely more complex, and factors such as global competition, new technologies and regulations make it unrealistic to expect every programmer to understand every nuance of a business process.

    The new approach resembles an assembly line, where everyone has a specific role. Analysts create specifications for new applications and develop use cases. Architects and designers decompose the use cases into an overall structure and individual functions. Programmers write and unit-test the functions. Integration testers put the pieces together and test sub-assemblies and the whole application against the use cases that the analysts provide.

    The Culture Factor

    This change in programmer-analyst roles has an even more radical analogue in changing generational cultural attitudes, which spill over into IT.

    For example, veteran Unisys engineers continually work to enhance the security in the operating environment of our ClearPath systems because, as one recently said, doing so “is in our DNA.” From long experience, he understood that one careless programmer can undermine the security integrity of an entire system.

    That response points to the sometimes profound differences in cultural attitudes between generations – differences that often manifest themselves in work style.

    For example, Millennials, today’s youngest workers, can demonstrate profoundly different attitudes from their older Boomer co-workers on matters such as IT security.

    The Millennials grew up in a consumer-focused world full of social networking sites and search engines, where security implementation is a secondary concern. At a recent event with a largely Millennial audience, one of our engineers asked how many in the room worried about the security and privacy implications of Facebook and Google Apps. Only one Millennial raised her hand; all but one of the Unisys engineers raised theirs.

    Dealing with Demographic Destiny

    So, what must organizations do to sustain the quality of their mission-critical IT environments in the face of generational flux?

    One simple solution: seek help. Outsourcing system operations, application management and even application development to an expert provider can cut the need to sink resources into development and implementation of a new management approach. However, the provider would need personnel as well versed in the technology and system use cases as the departing internal workers.

    A second option: get educated. Educate younger workers in your business processes and mission-critical systems. Rebuilding your skills base, from application design and coding to performance tuning and system operations, can have significant long-term benefits. It often takes more time and effort than outsourcing, but it’s a worthwhile investment. It could also be an opportunity to increase the diversity of your organization by opening new paths for women IT workers to replace a male-dominated cadre on the verge of retirement.

    Regardless of approach, it’s imperative to recognize that demographics are destiny. The impending “STEM Cliff” in the United States is a major challenge, but it also presents a real opportunity to build new skill sets and ultimately help your increasingly younger IT team work smarter.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    2:00p
    The Iceotope Liquid Cooling System in Action

    This cross-section of an Iceotope server module provides an overview of the liquid cooling system and how the water and Novec function inside the chassis. (Photo: Iceotope)

    Iceotope has moved from the show floor to the lab. The UK cooling company, which launched in 2009, has developed a liquid cooling system that encapsulates servers in heat pipe modules containing 3M’s Novec fluid as its heat removal medium. Last March we provided an update on the company’s demo at the Cebit trade show, and a year later the company has a system running in the lab at the University of Leeds.

    The company makes some attention-getting claims about its liquid cooling technology, saying it can reduce data center cooling costs by 97 percent, IT power load by 20 percent and overall infrastructure costs by 50 percent. But Iceotope has advanced slowly since its debut at the SC09 conference, testing its technology at the University of Sheffield and now at the University of Leeds, which said this week that it has installed its first Iceotope production system after two years of testing prototypes.

    In Iceotope’s approach, each server motherboard is completely immersed in a sealed bath of liquid coolant which passively transfers heat away from the electronics to a heat exchanger formed by the wall of the module, where water is continuously re-circulated by low-power pumps. The system is nearly silent and requires no cooling outside the cabinet, which in theory would allow data center operators to eliminate expensive room-level cooling schemes. Iceotope says its system uses just 80 watts of power to harvest the heat from up to 20 kilowatts of IT equipment.
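    To put those figures in perspective, here is a minimal back-of-the-envelope sketch (in Python) using the 80-watt and 20-kilowatt numbers above. The half-watt-of-cooling-per-watt-of-IT baseline is our own illustrative assumption for a conventional air-cooled room, not an Iceotope or industry figure.

        IT_LOAD_W = 20_000             # up to 20 kW of IT equipment, per Iceotope
        PUMP_POWER_W = 80              # 80 W of pump power to harvest that heat, per Iceotope
        BASELINE_COOLING_RATIO = 0.5   # assumed watts of conventional cooling per watt of IT (illustrative)

        cooling_overhead = PUMP_POWER_W / IT_LOAD_W              # watts of cooling per watt of IT
        baseline_cooling_w = IT_LOAD_W * BASELINE_COOLING_RATIO  # assumed conventional cooling power
        reduction = 1 - PUMP_POWER_W / baseline_cooling_w

        print(f"Cooling power per watt of IT: {cooling_overhead:.3f} W")    # 0.004 W
        print(f"Cooling-only partial PUE:     {1 + cooling_overhead:.3f}")  # ~1.004
        print(f"Cut vs. assumed baseline:     {reduction:.0%}")             # ~99%

    Even allowing for a far more efficient baseline than the one assumed here, the pump power is a rounding error next to the IT load, which is at least consistent with the order of magnitude of the 97 percent claim.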

    “The fact that this system is completely enclosed raises a host of possibilities,” said Dr. Nikil Kapur, from the University of Leeds’ School of Mechanical Engineering. “It does not interact with its environment in the way an air-cooled server does, so you could put it in an extreme environment like the desert. It is also completely silent. You could have it on a submarine or in a classroom.”

    “Extraordinary Stuff”

    “The liquid we are using is extraordinary stuff,” said Dr. Jon Summers, Kapur’s colleague at the University of Leeds. “You could throw your mobile phone in a tub of it and the phone would work perfectly. But the important thing for the future of computing and the Internet is that it is more than 1,000 times more effective at carrying heat than air.”

    Iceotope uses Novec, a non-conductive chemical with a very low boiling point, which easily condenses from gas back to liquid. It’s made by 3M, which is also developing its own immersion cooling technology around Novec. That technology, known as “open bath immersion cooling,” is in the early stages of commercialization. But another liquid cooling vendor, Green Revolution Cooling, has placed several installations in production and is working with Intel to adapt motherboards for its immersion cooling systems.
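    As a rough sanity check on Summers' "more than 1,000 times" figure, the sketch below compares the volumetric heat capacity of air with that of a generic engineered dielectric coolant. The coolant properties are approximate values assumed for illustration, not published Novec specifications.

        # Volumetric heat capacity: joules carried per cubic metre per kelvin of temperature rise.
        AIR_DENSITY_KG_M3 = 1.2               # approximate, at room temperature
        AIR_SPECIFIC_HEAT_J_KG_K = 1005.0     # approximate

        FLUID_DENSITY_KG_M3 = 1400.0          # assumed for a generic dielectric coolant
        FLUID_SPECIFIC_HEAT_J_KG_K = 1300.0   # assumed

        air_vhc = AIR_DENSITY_KG_M3 * AIR_SPECIFIC_HEAT_J_KG_K        # ~1.2 kJ/(m^3*K)
        fluid_vhc = FLUID_DENSITY_KG_M3 * FLUID_SPECIFIC_HEAT_J_KG_K  # ~1.8 MJ/(m^3*K)

        print(f"Air:     {air_vhc:,.0f} J/(m^3*K)")
        print(f"Coolant: {fluid_vhc:,.0f} J/(m^3*K)")
        print(f"Ratio:   ~{fluid_vhc / air_vhc:,.0f}x")  # on the order of 1,000x, consistent with the quote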

    Iceotope executives say they are pleased with their progress and see big things ahead for the technology.

    “More than five years of research, innovation and collaboration have gone into Iceotope’s technology,” said Peter Hopton, Iceotope’s Chief Technology Officer. “The basic principle of the design has many applications and, while a few years away, there is no reason why every home shouldn’t make better use of the surplus heat from consumer electronics. Imagine having your PC or TV plumbed into the central heating system.”

    In this video, Summers shows the Leeds system and demonstrates the novel capabilities of Novec, illustrating what happens when you put an iPhone in a beaker of the liquid:

    3:00p
    Brocade Data Center Repels Zombie Attack

    Some of the “Zombies” that showed up on the Brocade campus. (Photo: Brocade)

    Why are zombies always attacking data centers? We can only assume that it’s because their brain-eating habits direct them to the biggest brains they can find.

    Zombies have featured in an exercise by the Google disaster recovery team, which tested its readiness with a scenario in which zombies invaded Georgia and sought to devour the brains of tech staff in their Atlanta data center. The data center teams at RagingWire and DataCave have also explored the zombie readiness of their facilities.

    But none of these have gone as far as the team at Brocade, which has published several videos documenting a zombie attack on their corporate campus in San Jose, Calif. It turns out the networking vendor was having some fun with a customer request to test their network’s survivability from a Zombie Apocalypse. I guess we should be grateful they didn’t request a Dancing Zombie Apocalypse, or we might have wound up with another “Harlem Shake” video and a meme collision.

    Jed Bleess, Director of the Strategic Solutions Lab at Brocade, explains that these are a new species of Data Center Zombies that “hunt for networks seeking the data that courses through switches, the life-blood of a company, then ripping them apart to byte on the bits spurting out of a fatally wounded network.” Will it work on the Brocade VDX switch?

    In an accompanying blog post, Brocade suggests the Data Center Zombies headed elsewhere for their dinner, using a Google map to offer an inside joke for folks familiar with the corporate geography of San Jose.

    Here’s Part One:

    Now here’s the second installment:

    3:30p
    7 Key Considerations for Big Data Adoption

    There’s little argument that the exchange of information and data continues to increase. Large organizations are now working with more customers and end-users, all of whom have data-related requirements. As data growth continues, many IT environments are seeing these data sets become very large. And, in many cases, these big data sets are difficult to control.

    Let’s analyze some numbers (a quick scale check follows the list):

    • According to IBM, the end-user community creates 2.5 quintillion bytes of data every day — so much that 90% of the data in the world today has been created in the last two years alone.
    • Every hour, Walmart handles over 1.5 million customer transactions. All of this information is fed into databases holding close to 3 petabytes of data.
    • According to FICO, its credit card fraud detection system currently protects over 2 billion accounts around the globe.
    • Currently, Facebook processes over 220 billion photos from its entire user base. And this number is growing by the minute.
    • Finally, The Economist recently pointed out that we are now able to decode the human genome in under one week – whereas it originally took 10 years.
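    As a quick scale check, the sketch below simply restates a few of the figures above in more familiar units; it assumes the IBM figure is per day, as it is usually cited, and introduces no new data.

        QUINTILLION = 10**18                 # one quintillion bytes is one exabyte

        daily_bytes = 2.5 * QUINTILLION      # IBM figure, taken as per day
        yearly_zettabytes = daily_bytes * 365 / 10**21

        walmart_per_hour = 1_500_000         # customer transactions per hour
        walmart_per_year = walmart_per_hour * 24 * 365

        print(f"Data created per day:          {daily_bytes / 10**18:.1f} exabytes")
        print(f"Data created per year:         {yearly_zettabytes:.2f} zettabytes")  # ~0.91 ZB
        print(f"Walmart transactions per year: {walmart_per_year:,}")                # ~13.1 billion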

    7 Key Considerations for Big Data Adoption lays out the key points to watch for when working with large amounts of information. Big data can be a very powerful tool when that information is properly utilized, quantified and controlled.

    This white paper outlines the seven criteria to plan around when big data becomes a serious consideration for your organization. That means being able to answer questions about, and understand, the following:

    • Data Warehousing.
    • Data Sources.
    • Business and IT Needs.
    • Data Governance.
    • Accountability for Insights.
    • Adoption Roadmap.
    • Service Providers.

    Remember, data needs will only continue to grow. As IT consumerization gains more momentum and more user data becomes available, there will be a direct need to manage and control big data sets. Download this white paper today to understand the 7 key considerations around big data adoption.

