Data Center Knowledge | News and analysis for the data center industry

Thursday, January 3rd, 2013

    1:00p
    QTS Acquires Herakles to Expand into Sacramento

    One of the power rooms at the Herakles Data Center in Sacramento, Calif., which has been acquired by QTS (Quality Technology Services).

    QTS (Quality Technology Services) has significantly expanded its footprint in California, acquiring the Herakles data center, the company said today. The Herakles facility is a 92,000 square foot Tier III data center in Sacramento, with 52,600 square feet of raised floor space and 9 megawatts of power capacity. The terms of the deal were not disclosed.

    QTS now has three wholly owned and operated facilities in California; its other two facilities are in Santa Clara. “The acquisition of this technologically advanced data center adds a strategically located asset to our portfolio,” said Chad Williams, CEO of QTS. “Located just 120 miles north of our Santa Clara, Calif., data center and less than 90 miles from San Francisco, this facility expands our Northern California presence and offers California-based and national customers in each facility regional disaster recovery options as well as colocation and cloud services.”

    Sacramento is a popular location for disaster recovery data center infrastructure, as it’s outside the state’s earthquake zone and about a two-hour drive from both the Bay Area and Silicon Valley. Sacramento also offers moderate power pricing from the Sacramento Municipal Utility District (SMUD). The Sacramento market features a cluster of commercial data centers, as well as some facilities housing infrastructure for California’s state government.

    The Herakles facility, which will be dubbed QTS Sacramento, will be fully integrated into QTS’ data center portfolio. The company said California is among the most requested markets for QTS customers looking to expand the size of their data center footprint.

    Herakles has been providing data center services in the Sacramento, Calif. market since 2001, delivering uptime exceeding 99.999% over the years. Data Center Knowledge posted a video look inside the facility a few years back.

    “We’re proud of what our team has accomplished at Herakles, and when it came time to transition the asset to the current ownership, it was important to us to make the right decision on behalf of our customers,” said Lou Kirchner, president and CEO of Herakles. “QTS has an impeccable reputation in the data center industry for combining reliability with flexibility and product innovation. Additionally, their commitment to making a difference in the markets they serve will make QTS an important addition to the Sacramento business community.”

    Herakles customer Next Cloud offered a statement regarding the acquisition as well, noting the advantages QTS’ national footprint brings to Herakles customers.

    “QTS’ acquisition of the Herakles data center now gives us the opportunity to remain in the data center environment that has proven to be very beneficial to our business while giving us the advantages of a relationship with a national data center provider,” said Gary Lamb, founder and chief technology officer of Next Cloud, Inc. “As we explore options for cloud services and business continuity, we now have the convenience of expanding with a single service provider.”

    The Bank Street Group served as exclusive financial advisor to Herakles in the transaction.

    1:30p
    Water Consciousness Continues in the Data Center

    Ron Vokoun is Director of Mission Critical and leads the Mission Critical Division of Gray Construction. He is a 24-year veteran of the construction industry with a focus on mission critical facilities and design-build. You can find him on Twitter at @RonVokoun.

    Ron Vokoun
    Gray Construction

    “Water is the new oil.” That is a statement made by a “futurist” at a leadership forum I attended back in 2006. It’s an idea made increasingly popular by Steven Solomon in his book “Water: The Epic Struggle for Wealth, Power, and Civilization.” I am not arguing this political position or debating the accuracy of this statement, but rather using it as a starting point for a conversation on water.

    Let that statement sink in for a moment though. To compare water to oil is to say that water is rare, of great value, and something that countries are willing to go to war over. If true, it certainly should be given more attention in our data center discussions.

    There is a complex relationship between the use of water and energy in the data center, which will be discussed in my next column. For now, I want to focus on cooling technologies that are proven to reduce the consumption of water, and the availability and alternative sources of water for data center cooling, to highlight the sustainable possibilities that exist.

    Water Reduction vs. Resource Optimization

    Before diving into the various cooling technologies, it is important to stress that both water and energy consumption can be reduced by implementing the ASHRAE TC 9.9 guidelines for both temperature and humidity. Outside air (OSA) economization should also be part of any cooling system, to take advantage of the free cooling that TC 9.9 enables.
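
    To make the free-cooling point concrete, here is a minimal sketch of how an outside-air economization check might be expressed in code. The temperature and humidity limits are illustrative placeholders rather than the published TC 9.9 figures, and the function name is hypothetical.

```python
# Minimal sketch: deciding when outside air (OSA) can carry the cooling load.
# The envelope limits are placeholders, NOT the published ASHRAE TC 9.9 values;
# substitute the class (A1-A4) envelope that applies to your equipment.

ENVELOPE = {
    "dry_bulb_c": (18.0, 27.0),              # assumed supply-air temperature range
    "relative_humidity_pct": (20.0, 60.0),   # assumed humidity range
}

def can_use_outside_air(dry_bulb_c: float, relative_humidity_pct: float) -> bool:
    """Return True if outside air falls inside the assumed envelope."""
    t_lo, t_hi = ENVELOPE["dry_bulb_c"]
    rh_lo, rh_hi = ENVELOPE["relative_humidity_pct"]
    return t_lo <= dry_bulb_c <= t_hi and rh_lo <= relative_humidity_pct <= rh_hi

if __name__ == "__main__":
    # A mild, dry afternoon: free cooling is available.
    print(can_use_outside_air(dry_bulb_c=22.0, relative_humidity_pct=45.0))  # True
    # Hot and humid: mechanical (or evaporative-assisted) cooling is needed.
    print(can_use_outside_air(dry_bulb_c=33.0, relative_humidity_pct=70.0))  # False
```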

    If one focuses purely on reducing water use, technologies such as air-cooled chillers and heat wheels are easy choices that use no water in the cooling process. But if taking a more holistic view, you may pay a premium in the form of increased energy use with these systems.

    Evaporative cooling is often assumed to use more water than a traditional water-cooled chiller system, but as is illustrated below, this is not the case, at least in Phoenix.

    In many locations there isn’t a single answer for resource optimization. Rather, a combination of technologies that optimizes water and energy use while staying within thermal guidelines is best. To illustrate this, as well as the differences in water use between technologies, compare the water and energy use for the four cooling options in the data below.

    [Chart: Water consumption for the four cooling options]

    [Chart: Annualized power consumption for the four cooling options]

    As you can see, direct evaporative cooling coupled with either air-cooled chillers or water-cooled chillers uses far less water than water-cooled chillers alone. Air-cooled chillers use no water, but result in increased energy use. And direct evaporative cooling also uses less power than water-cooled chillers. The caveat is that water use and the performance of specific technologies vary based upon location.
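
    For readers who want to run this kind of comparison for their own site, here is a minimal sketch of the calculation. The per-option rates are invented placeholders, not the Phoenix figures from the charts above; only the shape of the comparison is the point.

```python
# Sketch of an annualized water/energy comparison across four cooling options.
# The rates are invented placeholders (NOT the Phoenix data from the charts);
# replace them with modeled or measured figures for your own location.

# per-kW-of-IT-load placeholder rates: water in gal/year, energy in kWh/year
COOLING_OPTIONS = {
    "water-cooled chillers":                      {"water": 100.0, "energy": 100.0},
    "air-cooled chillers":                        {"water":   0.0, "energy": 130.0},
    "direct evaporative + air-cooled chillers":   {"water":  40.0, "energy":  90.0},
    "direct evaporative + water-cooled chillers": {"water":  45.0, "energy":  85.0},
}

def annual_totals(it_load_kw: float) -> dict:
    """Scale the per-kW placeholder rates up to a given IT load."""
    return {
        name: {
            "water_gal": rates["water"] * it_load_kw,
            "energy_kwh": rates["energy"] * it_load_kw,
        }
        for name, rates in COOLING_OPTIONS.items()
    }

if __name__ == "__main__":
    for name, totals in annual_totals(it_load_kw=1000.0).items():
        print(f"{name}: {totals['water_gal']:,.0f} gal, {totals['energy_kwh']:,.0f} kWh")
```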

    These are just a few of the technologies available to reduce your cooling water use. Now let’s turn our attention to the availability of water.

    2:30p
    Ten Predictions: Why 2013 Will Be a Big Year for DCIM

    There are a number of reasons why 2013 might be the year for Data Center Infrastructure Management (DCIM). DCIM software has had a number of years to mature, addressing the need for high levels of discipline in the data center. Various factors have contributed to the emergence of DCIM as a priority for data center operators, even if many of them are still sorting out the best way forward.

    “The economy led to a break/fix mentality in the data center at first, but we seem to be out of that now,” said Mark Harris, Nlyte Software’s vice president of marketing and strategy. “The energy crisis put a lot of adult supervision into the data center. A lot of questions were being asked, and a lot of those questions were about the physical makeup.”

    DCIM provides a holistic view of the entire data center ecosystem, dynamically recognizing all the pieces and how they’re interrelated. It helps operators plan ahead, both for growth and for potential disasters (what happens when a piece of equipment is removed?). It is being touted as the ERP application for the data center, and as with ERP (Enterprise Resource Planning), it is addressing a complicated challenge: when a device is introduced, changed, or fails, it alters the makeup of these complex facilities.
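
    As a rough illustration of what “recognizing all the pieces and how they’re interrelated” can look like in practice, here is a toy sketch of a rack-and-device model. The classes and fields are hypothetical, not any DCIM vendor’s actual data model.

```python
# Toy sketch of the asset model a DCIM tool maintains -- hypothetical classes,
# not any vendor's schema. Every device is tied to a rack with a power budget,
# so introducing, changing, or removing a device updates the capacity picture.

from dataclasses import dataclass, field

@dataclass
class Device:
    name: str
    rated_power_kw: float

@dataclass
class Rack:
    name: str
    power_budget_kw: float
    devices: list[Device] = field(default_factory=list)

    def used_kw(self) -> float:
        return sum(d.rated_power_kw for d in self.devices)

    def headroom_kw(self) -> float:
        return self.power_budget_kw - self.used_kw()

    def add(self, device: Device) -> None:
        # Planning check: refuse a move that would overload the rack's budget.
        if device.rated_power_kw > self.headroom_kw():
            raise ValueError(f"{self.name}: not enough headroom for {device.name}")
        self.devices.append(device)

    def remove(self, device_name: str) -> None:
        self.devices = [d for d in self.devices if d.name != device_name]

if __name__ == "__main__":
    rack = Rack(name="A-01", power_budget_kw=8.0)
    rack.add(Device("web-01", rated_power_kw=3.5))
    rack.add(Device("db-01", rated_power_kw=4.0))
    print(rack.headroom_kw())   # 0.5 -- can another 1 kW server land here? No.
    rack.remove("db-01")        # decommissioning frees capacity the planner can see
    print(rack.headroom_kw())   # 4.5
```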

    Here are 10 predictions for what 2013 holds for DCIM, based on conversations with industry executives:

    1.   DCIM Will Evolve to Track More of the Business Happening in the Data Center

    There are many DCIM offerings that provide limited infrastructure visibility, focusing on one part of the equation like energy consumption. Those that have the most success in the DCIM space will evolve to address monitoring and management of the entire data center environment. DCIM became a buzzword and hot topic predominantly because of increases in the price of power, the single largest line item in terms of operational cost. DCIM presented a way to lasso some of these costs and rein them in. But power management is just one—albeit important—piece of the total DCIM suite.

    “More customers are recognizing the strategic value (of DCIM),” said William Bloomstein, market strategist for iTRACS. “It’s not just about power monitoring or point products, it’s about a holistic view.”

    2.  More Understanding = Better Informed Customers

    The consensus among vendors is that a growing number of customers are asking informed questions.  In terms of the hype cycle, potential customers are starting to understand what DCIM means to them. “Customers are starting to get the value. The learning curve is accelerating,” said Bloomstein.

    A lot of vendors like Nlyte reported stronger sales in the second half of 2012, and believe this is the result of customers finally grasping the value of DCIM in early 2011.

    “Very few vendors have broken out and said we’re knocking it out of the park,” said Harris. “Everyone’s made a few sales …but that started to change in the second half of 2012. The first half was still in the ‘wait and see’ mode. The second half saw business ramp up significantly.”

    Most vendors we spoke with said that they’ve seen a tremendous amount of budgeting for DCIM on the part of customers. “There’s been a lot of curiosity, and this year [2012] we’re finding RFPs and budget items,” said Harris. 

    3.  It’s Still Early, But the Market is Crowded. Consolidation Lies Ahead.

    There are between 80 and 100 vendors claiming to offer DCIM solutions, and many will start sounding alike. “Most companies are scrambling,” said Harris. “They are doing something in the DCIM category in the next 12 months. There’s been a tremendous amount of segmentation, and it’s confusing for the customer as to how to get it all working together.

    “(The vendors) all do different things – one guy is bringing paint, the other guy is bringing beachballs,” Harris added. “You’re getting two stratifications, one is the integration suite, and the second is the enhancers.”

    When the dust settles, expect around a dozen integration suites at the most. These suites will build up their capabilities through M&A and through organic means, thinning out the herd and forming more complete offerings.

    4.   Expect New Competition

    “There are a number of companies realigning themselves and declaring themselves as DCIM vendors,” said Harris. The consensus is that there will be a bigger DCIM play from the big four – BMC, HP, IBM and CA Technologies – in 2013, with DCIM becoming an extension to their overall strategies.

    5.   Pricing Will Become Clearer

    A recent Gartner report noted that the industry needs to make pricing more understandable and transparent, recommending that vendors move to what equates to a rack-based pricing model. In 2013 we will see a lot of simplification in terms of pricing DCIM, and the current favorite model is rack-based pricing.

    “Complexity of pricing is getting in the way,” agreed Bloomstein. “Pricing is a very important part. Every vendor seems to have a different pricing model. The faster that people can come up with a consensus, the faster adoption can occur. The more customers can make comparisons between pricing models, the better for everyone. Make it easier for organizations to justify the purchase of your DCIM solution by shifting to a simplified, rack-based pricing model.”

    “Pricing models are going to continue to be all over the board; per rack pricing will be the winner,” said Harris. “The others will figure out how to be compatible (with the per rack model).” Harris also commented on the pricing discrepancies. “The numbers are all over the board, $500 to $5,000 per rack depending on the amount of functionality you want. Most suites will be $1,000 or less per rack.”
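
    As a back-of-the-envelope illustration of what that per-rack range means for a buyer, here is the arithmetic for a hypothetical 400-rack facility; the rack count is assumed, not from the article.

```python
# Quick arithmetic on the quoted per-rack range ($500 to $5,000 per rack, with
# most suites expected at $1,000 or less). The 400-rack count is hypothetical.

RACKS = 400

for label, per_rack in [("low end", 500), ("typical suite", 1_000), ("high end", 5_000)]:
    print(f"{label}: ${per_rack * RACKS:,} for {RACKS} racks")

# low end: $200,000 for 400 racks
# typical suite: $400,000 for 400 racks
# high end: $2,000,000 for 400 racks
```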

    2:50p
    Building a Data Center Network for 10GbE Servers

    The current data center infrastructure has evolved beyond simple server design and workload deployments. Now, cloud computing, software services, shared storage and virtualization dominate many data center infrastructures. All of this was built around the need to create a more robust, scalable infrastructure capable of better supporting the end user and the computing experience. With BYOD, distributed environments and more users working on a “data-on-demand” schedule, engineers are working hard to find ways to deliver all of this data quickly.

    Join Chuck Girt (OneCommunity), John Scheidemantel (NYSE Euronext), and Calvin Chai (Juniper Networks) as they discuss the importance of a diverse infrastructure and how to better support the core networking platform. In the Building a Data Center Network for 10GbE Servers webinar, Juniper outlines how such an environment is capable not only of scale, but also of advanced networking efficiencies.

    The idea is simple: you can have the most advanced servers, a great storage backbone and top-of-the-line chassis equipment, but without a good core network, the user experience will be throttled. Administrators can use an advanced and powerful 10GbE network backbone to deliver virtualized workloads and support a more user-dense environment. The webinar also shows how to take an existing environment and make it scale. Having a powerful networking backbone not only increases resiliency, but also creates an easier-to-manage environment. The underlying goal, as outlined by the webinar, is to take an old, rigid IT environment and transform it into a flexible, virtualized infrastructure. Click here to access this webinar.

