Data Center Knowledge | News and analysis for the data center industry
 

Monday, July 15th, 2013

    12:23p
    Ballmer: Microsoft has 1 Million Servers

    The interior of a container packed with servers at a Microsoft data center in Chicago. (Image: Microsoft Corp)

    Microsoft now has more than 1 million servers in its data centers, according to CEO Steve Ballmer, who confirmed the number during his keynote address at last week’s Worldwide Partner Conference, in which he handicapped the size of the server platforms of the world’s leading cloud computing providers.

    “I claim there really are almost no companies in the world, just a handful, that are really investing in scaled public cloud infrastructure,” said Ballmer (see transcript). “We have something over a million servers in our data center infrastructure. Google is bigger than we are. Amazon is a little bit smaller. You get Yahoo! and Facebook, and then everybody else is 100,000 units probably or less. So the number of companies that really understand the network topology, the data center construction, the server requirements to build this public cloud infrastructure is very, very small.”

    With Ballmer’s comments, Microsoft becomes the first of the major search and cloud players to confirm how many servers it has running in its data centers. Google has released data suggesting it has at least 900,000 servers under management, and media reports suggest it has deployed at least 1 million servers over the life of the company, but the company has never confirmed a number.

    Ballmer didn’t indicate whether he had first-hand knowledge of other companies’ server counts or was guesstimating. It’s hard to imagine Google or Amazon sharing information about their server platforms with Microsoft, as those companies believe the details of their infrastructure are a competitive advantage. Even Facebook, which has open sourced many of its data center hardware designs, remains vague about its server count, saying only that it operates “hundreds of thousands” of machines.

    Will Ballmer’s statements end all the server secrecy? Will we see Larry Page and Jeff Bezos disclosing the size of their server armadas? In disclosing the number, Ballmer was making the point that massive scale matters in public cloud, and Microsoft has it.

    There are those who assert that server counts are a silly metric. The Internet disagrees, as our examination of Who Has the Most Servers? continues to be one of DCK’s most popular features. We’ve updated that story to add the new number for Microsoft, as well as updates for Facebook, OVH, Akamai, eBay and Rackspace. How many servers do these companies have? Look here to find out.

    1:51p
    New Numbers: Who Has the Most Web Servers?

    A look inside the network operating center for Akamai Technologies, which operates more than 127,000 servers around the globe. (Photo: Akamai)

    In light of Steve Ballmer’s confirmation that Microsoft has more than 1 million servers, we’ve updated our list of Who Has The Most Web Servers? to reflect the official Microsoft number. The updated list also features new totals from Facebook, OVH, Akamai, Rackspace and eBay. Check out the updated Who Has The Most Web Servers. The list includes companies that have publicly confirmed their server counts, with a separate section to discuss those that don’t disclose the data, but are likely large enough to merit discussion.

    2:32p
    Stream Data Centers Building in Minnesota Market

    A look at one of the dedicated utility yards at a Stream Data Center in Dallas. Stream is expanding into the Minnesota market with a new facility in Chaska. (Photo: Stream Data Centers)

    There has been a lot of data center news out of Minnesota this year, with several providers entering this key emerging market. The most recent is Texas-based Stream Data Centers, which is launching in the Minneapolis market with plans to develop a purpose-built, 75,675 square foot data center in Chaska, a southwest suburb of Minneapolis.

    “Our customers are some of the largest businesses in corporate America,” said Rob Kennedy, Co-Managing Partner at Stream. “This purpose-built data center will serve their enterprise IT and data center needs and give them access to Chaska’s robust power and fiber infrastructure.” This will be the company’s 16th facility in the U.S. and comes on the heels of the groundbreaking of a similar facility in San Antonio, Texas.

    Known as The Stream Private Data Center, the Chaska greenfield development will have dual-feed power from two utility substations and will be served by eight fiber providers. The building can be divided into three private suites or leased to a single tenant that controls the whole facility. Each suite will provide customers with 1,200 kilowatts of critical load and will have conduits and pads in place, allowing for expansion to up to 2,400 kilowatts.

    Each suite will offer a private utility yard as well as independent back-up generators, power and cooling infrastructure. Each will also provide disaster recovery office space, redundant private telco rooms and 10,000 square feet of raised floor in a Private Data Hall.

    The facility will be constructed to withstand 185-mph winds and uplift, and Stream Data Centers will build it out using a 2N electrical and N+1 mechanical configuration.
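
    Those figures allow a quick back-of-the-envelope density check. The sketch below is our own arithmetic based on the numbers quoted above (1,200 kilowatts of critical load and 10,000 square feet of raised floor per suite); it is illustrative only, not a Stream specification.

        # Back-of-the-envelope power density for one Chaska suite, using the
        # figures quoted above. Illustrative arithmetic only.

        critical_load_kw = 1_200      # initial critical load per suite
        expanded_load_kw = 2_400      # after expansion
        raised_floor_sqft = 10_000    # raised floor per Private Data Hall

        def watts_per_sqft(load_kw, area_sqft):
            """Convert a critical load in kW over an area in sq ft to W/sq ft."""
            return load_kw * 1_000 / area_sqft

        print(f"Initial density:  {watts_per_sqft(critical_load_kw, raised_floor_sqft):.0f} W/sq ft")
        print(f"Expanded density: {watts_per_sqft(expanded_load_kw, raised_floor_sqft):.0f} W/sq ft")
        # roughly 120 W/sq ft initially, 240 W/sq ft after expansion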

    More Activity In the North Star State

    The company notes that the presence of a new data center in the Minneapolis area also suggests opportunity for local economic development. Since data centers are typically long-term users of commercial real estate, they can boost tax rolls for cities without adding extra traffic to the streets or more students in local-area schools.

    Minneapolis-St. Paul is the third largest metropolitan area in the Midwest and home to corporate headquarters for 19 Fortune 500 companies and several large, privately-held firms.

    Minnesota offers tax incentives to encourage IT and data center investment throughout the state. Data center tenants can take advantage of tax abatements as well as sales tax rebates on the purchase of computers and related equipment, including networking and storage systems, cooling and power infrastructure, software and even electrical power.

    Other providers have recently made moves in Minnesota:

    • ViaWest recently announced it is building a 150,000 square foot facility.  
    • Cologix is growing in the Minnesota market, having purchased the Minnesota Gateway located in the carrier hotel at 511 11th Avenue South. It gave the company 20,000 square feet in the most connected building in Minnesota. The 511 Building is a 270,000 square foot building adjacent to the Metrodome.
    • DataBank acquired VeriSpace back in March, moving outside its primary Dallas footprint.
    • Compass Datacenters received a 50% property tax abatement for a planned 89,000 square foot facility in Shakopee, Minn.
    • Digital Realty Trust acquired a fully leased facility in Eagan, Minn. in April as a sale-leaseback.

    Stream Data Centers develops and operates data center facilities for corporate users, including fully commissioned Private Data Centers and powered-shell Ready-to-Fit Data Centers.  Stream Data Centers has a 14-year track record of providing space for enterprise data center users including Apple, AT&T, The Home Depot, Catholic Health Initiatives, Nokia and others.  During that time, Stream has acquired, developed and operated more than 1.5 million square feet of data center space in Texas, Colorado, Minnesota, and California representing more than 125 megawatts of power.

    Stream’s work in the Minneapolis area comes just weeks after the groundbreaking of a similar facility in San Antonio, Texas, in June 2013. That 75,840-square-foot building will be fully commissioned and ready for occupancy in April 2014.

    3:48p
    Are You Up in the Air with Cloud Computing Benefits?

    Alan McMahon works for Dell in enterprise solution design, across a range of products from servers and storage to virtualization, and is based in Ireland.

    ALAN McMAHON
    Dell

    Cloud computing is used in many different ways these days, but not everyone is completely on board the cloud train quite yet. For those holdouts who just aren’t sure whether they’re really benefiting from the cloud, a close look at the technology shows that you can improve almost any IT department with a bit of cloud computing. You don’t have to go as far as setting up a platform or complete infrastructure in the cloud; even a handful of cloud applications can deliver cost and time savings.

    Flexibility

    The main advantage of the cloud computing world is its flexibility, but initially this can seem confusing, as you can have private clouds, public clouds, or a mix of both. A private cloud is your own virtualized environment, while a public cloud is one that another company provides for you. Companies may opt for a mix of both, with their own proprietary applications in their private cloud and additional applications in the public cloud. This allows the business to run its own developed software in a cloud whose security it handles personally, while getting the cost-saving benefits of the public cloud by avoiding upfront equipment costs and other overhead.

    Private clouds do require costs in development and deployment, since these are typically built from scratch. The control that a company has over a private cloud is complete, as opposed to public cloud offerings. There are also service providers who focus on private cloud services, so you get the benefits of a private cloud without all of the associated upfront cost. It’s a good compromise between doing it all yourself and handing everything to a public provider, especially if you aren’t completely sold on the cloud computing idea. This method of cloud computing is commonly used by federal agencies that can’t risk their data being in even a slightly unsecured location.

    Shared Resources

    Public cloud computing is not for the sole use of a single company. Resources are shared among many users, and the provider builds its hardware platform to make the most efficient use of them. The biggest advantage of this type of cloud is the cost savings. You don’t have upfront costs, downtime for deployment, or time for implementation. You may have to spend some time training your employees to get used to the cloud, but it’s no bigger a time investment than adopting any other type of software.

    Cloud Bursting

    Then you have hybrid cloud computing. This type of cloud computing interfaces with both public and private clouds, so you can get the best of both worlds. One common use of hybrid cloud computing is “cloud bursting,” which uses public cloud resources to handle times of heavy traffic or increased load on the private cloud servers, such as an e-commerce store during Cyber Monday or the Christmas holiday season.
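
    As a rough illustration of the idea, the sketch below splits incoming load between a private cloud and public burst capacity. The names, threshold and numbers are hypothetical and not tied to any particular provider’s API.

        # Minimal sketch of a cloud-bursting decision: keep the baseline load on
        # private capacity and spill the excess to the public cloud during peaks.
        # The threshold and numbers below are hypothetical.

        PRIVATE_CAPACITY_RPS = 80   # requests/sec the private cloud can absorb

        def place_load(incoming_rps):
            """Split incoming load between private capacity and public burst."""
            private = min(incoming_rps, PRIVATE_CAPACITY_RPS)
            burst = max(incoming_rps - PRIVATE_CAPACITY_RPS, 0)
            return {"private_rps": private, "public_burst_rps": burst}

        print(place_load(50))    # a quiet weekday: no burst needed
        print(place_load(300))   # Cyber Monday: 220 rps spill to the public cloud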

    It’s going to take some time for the entire IT world to get on board with cloud computing, but the benefits are well worth it. A mix of private and public cloud services is a great compromise for companies that are worried about security issues in relation to the cloud. You’ll find that you spend much less time worrying about maintenance, support, and deployment when public cloud providers are taking care of software updates and other necessary tasks for you. You’ll also save budget, since you won’t have a lot of servers or hardware sitting around that you don’t need most of the time and that only reaches capacity during peak traffic. Even cloud computing on a small scale is worth a look, as it gives you access to much more powerful hardware than you would otherwise have.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    4:44p
    Animal Logic Contemplates Cloud Options for VFX Rendering

    Award-winning visual effects and animation studio Animal Logic is an Australia-based company behind many blockbuster movies. To accomplish rendering on movies and keep pace with rapidly growing compute demands, the company has had to constantly weigh on-premises equipment upgrades against cloud opportunities.

    Animal Logic operates a supercomputer to perform rendering, which helped it win the 2007 Best Animated Feature Film Oscar for “Happy Feet.” In 2012 the company made an upgrade to its system, which resides in a containerized data center from IBM, located outdoors next to offices at Fox Studios in Australia.  The supercomputer was upgraded with 450 new HP blade servers with Intel CPUs (16 cores and 64GB memory), to bring total capacity up to 10,000 cores. It also has an EMC Isilon clustered storage system with around 500TB of primary data storage. See IT News for some images of Animal Logic’s systems.
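
    The blade and core counts above imply the scale of the upgrade; the per-frame render cost in the sketch below is purely hypothetical, included only to show how core count translates into throughput.

        # Rough capacity arithmetic for the upgrade described above. Blade and
        # core counts come from the article; the per-frame cost is hypothetical.

        new_blades = 450
        cores_per_blade = 16
        total_cores = 10_000

        new_cores = new_blades * cores_per_blade          # 7,200 cores added
        print(f"Cores added by the upgrade: {new_cores}")
        print(f"Pre-upgrade cores implied:  {total_cores - new_cores}")

        core_hours_per_frame = 8   # hypothetical cost of one final-quality frame
        frames_per_day = total_cores * 24 / core_hours_per_frame
        print(f"Frames per day at that cost: {frames_per_day:.0f}")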

    Recently the company has been working on movies such as “The Great Gatsby”, “Walking with Dinosaurs” for the BBC and “Iron Man 3”. With the demands of all of this computing power, Animal Logic has had to look to its service provider Steam Engine for additional resources. The changing trends in cloud computing and supercomputers have left the company contemplating options for on-shore clouds and new supercomputers. There’s also Amazon Web Services, which announced its AWS Asia Pacific (Sydney) region late last year.

    “We have definitely looked with great interest at Amazon Web Services and what they are offering,” Animal Logic head of technical operations, Xavier Desdoigts, told the Sydney Morning Herald. “We are really trying to see whether it make sense for us to use those services.

    “It is a horses for courses kind of set-up whether it makes sense for us to invest in our own capabilities or… use someone else’s services,” said Desdoigts. “Looking at the future is about what is the best level of capabilities for the next two to three years. Will there be a mix of solutions? Absolutely.”

    For a sense of the kind of graphic effects Animal Logic produces, check out this before-and-after video from Chris Godfrey, the VFX supervisor on the film.

    Video: “The Great Gatsby VFX” from Chris Godfrey on Vimeo: http://vimeo.com/68451324

    6:58p
    Romonet’s Portal 2.0 Targets True TCO Measures for IT Services

    Estimating IT costs in the data center is a headache Romonet is looking to relieve with Portal 2.0, an updated version of its SaaS-based data center performance and lifecycle management software suite. The suite brings together an in-depth view of facilities, IT and business to swiftly, simply and intelligently forecast, plan and track business performance.

    Romonet wants to eliminate the historical practice of estimating IT costs, which makes it impossible to judge whether the data center is actually working as expected.

    “Management by exception has proven invaluable to the success of many businesses: bringing it into the data center is a logical step,” said Zahl Limbuwala, CEO of Romonet. “Portal 2.0 removes the headaches traditionally associated with capacity planning and cost modelling, allowing organizations to focus on the investment decisions that will best help their business.”

    With Portal 2.0, users can predict, analyze, and continuously improve the performance of the data center, minimizing Total Cost of Ownership (TCO). Businesses get an immediate view into how efficiently they are managing and utilizing their data center infrastructure spend.

    Predictive Modeling

    Portal 2.0 adds a new operational capability using the same predictive modeling technology currently employed by Romonet. The portal has the ability to compare “expected vs actual” performance down to individual sub-systems, enabling quick identification of operational issues so they can be fixed before they impact service. Romonet’s perspective is that it’s the quality of the information, rather than the volume, that makes the difference in data center cost.

    The ability to reveal the true costs of IT services means Portal 2.0 can prevent the data center from becoming a cost black hole, absorbing investment with no clear indication of how that investment benefits the business. For service providers, knowing total delivery costs means they can track margin per client or per service.

    “DCIM and sub-system metering has become very popular in the data center as operators grapple to manage the reliability & performance of their increasingly complex and costly data centers,” said Limbuwala. “However, having actual metered data at such a granular level without knowing what each meter ‘should’ be reading can actually lead operational teams into a false sense of security. With Portal 2.0 deployed on top of a DCIM/metering solution, operational teams can now see an ‘expected’ value against each sub-meter. Ultimately this allows operators to quickly spot divergences in performance anywhere in the system and stop potentially service-disrupting issues before they occur. That’s the true power and nature of predictive analytics.”
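
    To make the “expected vs. actual” idea concrete, here is a toy sketch of management by exception: flag any sub-meter whose reading diverges from its modeled value by more than a tolerance. The meter names, values and tolerance are invented for illustration; this is not Romonet’s code.

        # Toy "expected vs. actual" check: flag sub-meters whose readings diverge
        # from their modeled values by more than a tolerance. Illustrative only.

        from typing import NamedTuple

        class MeterReading(NamedTuple):
            name: str
            expected_kw: float   # value predicted by the model
            actual_kw: float     # value reported by the sub-meter

        def divergences(readings, tolerance=0.10):
            """Yield (reading, fractional delta) for meters outside the tolerance."""
            for r in readings:
                delta = abs(r.actual_kw - r.expected_kw) / r.expected_kw
                if delta > tolerance:
                    yield r, delta

        readings = [
            MeterReading("chiller-1", expected_kw=310.0, actual_kw=318.0),
            MeterReading("crah-07",   expected_kw=42.0,  actual_kw=55.0),   # divergent
            MeterReading("ups-a",     expected_kw=480.0, actual_kw=471.0),
        ]

        for meter, delta in divergences(readings):
            print(f"{meter.name}: {delta:.0%} off expected -- investigate before it impacts service")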

    The company says that unlike many DCIM solutions, Romonet’s Portal 2.0 can be deployed and delivering value in days, without disruptive and costly hardware or software agents. “Typically we can model any running data center in no more than 3-5 man-days and have them up and running in Portal, getting value in under two weeks,” said Limbuwala.

    Romonet Portal 2.0 helps organizations in the following ways:

    • Identifies exactly where and how many meters organizations actually need to use in their data center infrastructure to gain the best insight into their performance and justify an investment in the right level of metering.
    • Reduces risk in capital investment by accurately modeling the outcome of each investment option.
    • Optimizes performance by swiftly identifying discrepancies between expected and actual performance of new infrastructure while still at the commissioning stage, allowing issues to be quickly identified and addressed before going live.
    • Lowers Total Cost of Ownership by giving organizations insight into their IT costs and allowing them to see exactly where savings can be made.

