Data Center Knowledge | News and analysis for the data center industry

Monday, April 8th, 2013

    11:30a
    New Facilities: Grocery Store to Become Hospital Data Center

    There were a number of new data center project announcements last week. Here’s a review of some of them:

    Windstream Opens Virginia Data Center – Windstream Hosted Solutions held a special event last week to celebrate the grand opening of its enterprise-class data center in McLean, Va. The 65,000-square-foot facility was designed to meet the growing demand for cloud-based and dedicated managed services.

    Hospital to Convert Grocery Store into Data Center – A Pennsylvania hospital has received a state grant to help pay for the ongoing project to convert a former grocery store into a state-of-the-art data center that will house new electronic record-management systems. The new facility will house 27,000 square feet of data center space upon opening.

    123.Net to Open Remodeled West Michigan Data Center – 123.Net, a Michigan-based telecommunications company, today announced that it will be celebrating the opening of its newly remodeled Grand Rapids data center. The facility, located at 400 76th Street in Byron Center, Michigan, has approximately 3,500 square feet of data center space and more than 150 cabinets.

    Comlink Plans New Data Center in Grand Rapids – Comlink Inc., an East Lansing-based company that specializes in data hosting and “cloud” services, is entering the Grand Rapids market with a 9,000-square-foot data center. The facility, at 3950 Sparks Drive SE, will employ 20 people initially, the company said.

    12:20p
    Fusion-io Unveils High-Capacity ioFX for Workstations


    Heading into the National Association of Broadcasters (NAB) 2013 trade show this week, Fusion-io (FIO) announced that the ioFX workstation acceleration platform is now available with 1.6 TB of capacity, in addition to the original 420 GB form factor. The high-capacity ioFX is ideal for video editing and computer-aided design (CAD).

    “Digital production is undergoing a resolution revolution as production moves to 4K and beyond, while production budgets and deadlines continue to tighten,” said Vincent Brisebois, Fusion-io Director of Visual Computing. “To overcome these opposing forces, the Fusion ioFX can help digital artists efficiently deliver creative work faster, even when faced with the most demanding production requirements. Fusion-io is proud to collaborate with industry-leading software developers and hardware companies to deliver breakthrough acceleration for the tools used by professional artists worldwide.”

    Targeted at artists who compose, edit, play back and finish digital content, the ioFX is also ideal for encoding, transcoding, particle simulations and working with large amounts of cached data. The 1.6 TB ioFX connects via PCI Express, and significantly improves workstation application performance. Fusion-io has worked closely with the industry’s leading entertainment hardware and software providers to optimize the ioFX for visual effects production.

    “NVIDIA GPUs provide powerful performance to professional workstations, which is further boosted with the ioFX high speed memory platform,” said Greg Estes, industry executive, media and entertainment, NVIDIA. “ioFX dramatically increases the amount of high-resolution content that can be sent to NVIDIA Quadro graphics boards for processing at extremely high speeds, enabling better artist interactivity and, ultimately, better client satisfaction for our customers.”

    “From accelerating 3D painting in MARI, to reviewing shots in HIERO, to compositing in NUKE, Fusion ioFX adds powerful acceleration that can significantly enhance our applications,” said Bruno Nicoletti, Head of Technology and Founder at The Foundry. “All of our software is designed to remove as many technical barriers from production as possible, and Fusion-io acceleration takes that one step further with the ioFX integrated into artist workstations. As the amount of data artists work with in today’s high-resolution formats continues to increase, the ioFX can help creatives spend more time manipulating their work with much more interactivity than before.”

    Fusion ioMemory products such as the ioFX also include Fusion ioSphere remote monitoring and management software, allowing IT teams to monitor and manage all Fusion-io solutions deployed throughout a studio from a single interface.  The 1.6 TB Fusion ioFX will be available in summer 2013, and will be on display at NAB 2013, at a number of leading Fusion-io software and hardware Technology Alliance Program member booths.

    HP Z Workstations feature Fusion ioFX

    Fusion-io also announced that it is collaborating with global workstation leader HP to integrate the Fusion ioFX into the award-winning HP Z820, Z620 and Z420 Workstations. Additionally, professionals interested in adding the Fusion ioFX to their current HP Workstation can purchase the ioFX as a custom integration component.   The new solution will integrate the Fusion ioFX into the powerful HP Z Workstations with Intel Xeon processors. The architecture is designed to deliver an industry-leading platform for digital content creation applications by moving beyond the performance limitations and bottlenecks of traditional systems.

    “I consider HP one of the best engineering companies in the world, so I’m thrilled to see HP and Fusion-io working together to advance workstation computing architectures,” said Steve Wozniak, Fusion-io Chief Scientist. “The Fusion ioFX brings the intelligence of the Fusion-io approach to HP’s incredible workstations, adding even more powerful application performance to the precision engineering HP is known for around the world.”

    “HP is the workstation industry leader, and our customers demand to be the first to get cutting edge solutions that deliver performance, reliability and innovation,” said Jeff Wood, vice president of product management, Commercial Solutions Business Unit, HP. “Providing the Fusion ioFX in our high-end HP Z Workstations will offer customers improved performance to tackle their most challenging projects faster.”

     

    1:00p
    Best of the Data Center Blogs for April 8th

    Here’s a roundup of some interesting items we came across this week in our reading of data center industry blogs.

    Are You Asking the Right Questions: Standards – At the Compass Points blog, Chris Crosby looks at the devolution of industry standards: “Failure to ask for, and receive, objective evidence of a provider’s adherence to the standards that underlie their performance claims places the customer in the position of having to make their decision based more on the sizzle rather than the steak.”

    Meet DSSD, Andy Bechtolsheim’s secret chip startup – An interesting startup profile from Stacey Higginbotham at GigaOm: “For almost three years many of the creators of Sun’s Zettabyte File System have been slaving away in a Menlo Park, Calif. building trying to build a chip that would improve the performance and reliability of flash memory for high performance computing, newer data analytics and networking. Funded by Andy Bechtolsheim, the startup is called DSSD, and a recent hiring campaign plus the release of several patents offers some clues as to what this stealthy startup is about.”

    5 Ways to Fool-Proof Your Data Backup Strategy – At the RagingWire Enterprise blog, Jerry Gilreath shares tips on backup: “In honor of World Backup Day, I’d like to give you five points of advice. These are by no means complete. They’re just common-sense notes from the perspective of someone that has been in the thick of it.”

    Honey, I Positively Pressurized the Hot Aisle! – Aisle containment is an extremely effective efficiency strategy. But it pays to get the right expertise, as Data Centers Unclouded notes in looking at a project that didn’t: “The end result was a hi tech-looking pod that looked like a duck, walked like a duck…. but it didn’t quack like a duck.”

    Cutting Confusion over Open System Software in the Data Center - At the Schneider Electric blog, Damien Wells looks at the various meanings of open: “With the use of data center management software and DCIM on the rise, the requirement for applications to be Open System is increasing. However, there has been some confusion in the use of terminology, especially between Open System Software and Open Source Software which has also confused the specific benefits that each type presents for the end user.”

    1:15p
    The Importance of Intangibles in Cloud TCO Analysis

    Ravi Rajagopal, Vice President at CA Technologies, has led and managed organizations that delivered innovative and practical technical and business solutions for corporations and governments around the globe.

    RAVI RAJAGOPAL
    CA Technologies

    In my last post, I discussed how the cloud changes the economic value of IT, and revealed a new model for understanding TCO and ROI. That’s the only way an organization today can make rational decisions about IT investments.

    One of the most popular cases for adopting the cloud is that it promotes organizational agility. Once you go cloud, the argument goes, the organization can do things it never could do before, or can do established things much faster.

    Cloud Expands Horizons

    As an example, I know a company that recently switched from an on-premises call center to a cloud-based solution. Among other things, moving to a cloud service meant they could hire people all over the world, wherever there was an IP connection. Before, employees had to be on premises at a limited number of locations.

    They gained access to a much larger labor pool. They could offer more flexible hours to employees, and even let them work from home or while traveling. And they opened up to new geographic markets they couldn’t even dream of servicing before. That’s agility.

    If we’re talking about making rational economic decisions about the cloud, how can we account for the transformative impact it can have? This is hard to quantify beforehand, as are many hidden infrastructure costs in IT. Most organizations remain blissfully ignorant about the full impact of these intangible costs.

    Focus on the Intangibles

    That makes it hard to arrive at a good, hard-dollar decision. But if you don’t focus on the intangibles, you won’t have a complete picture of the hard numbers. Once you have a handle on tangibles, start perimeterizing the intangibles. They might not be core to the decision, but you can get a sense of their boundaries.

    And the more data you have, the better the organization will be at making the decisions. That company that moved to a cloud-based call center? Their move to the cloud was initially close to break-even. Their understanding of the intangibles served to reassure them that they were making the right decision, economically and strategically.

    What are Your Outcomes?

    One way to measure the intangibles is to focus on the outcomes rather than on the inputs. You could, for example, start looking at some of the customer statistics, both in absolute numbers and in overall trending. For that to happen, you need a good baseline. You must also work with the same questions and parameters so you can make a valid before-and-after comparison.

    For example, you establish a baseline of customer stats before you make the transition. Once you’ve made the transition, you look again at the same customer-stat parameters. That will tell you whether you’re moving in the right direction, and how you can optimize your execution.
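
    As a minimal sketch of that kind of before-and-after comparison (in Python, with hypothetical metric names and numbers that are not from this article), the same parameters can simply be recorded at both points and the change reported for each:

        # Illustrative sketch: compare a pre-migration baseline of customer stats
        # against post-migration readings for the same parameters.
        # All metric names and values below are hypothetical.

        baseline = {                     # captured before the cloud transition
            "avg_handle_time_sec": 410.0,
            "first_call_resolution_pct": 71.0,
            "customer_satisfaction": 7.8,
        }

        post_migration = {               # same parameters, measured afterward
            "avg_handle_time_sec": 365.0,
            "first_call_resolution_pct": 76.5,
            "customer_satisfaction": 8.2,
        }

        def compare(before, after):
            """Print the absolute and percentage change for each shared metric."""
            for metric in before:
                b, a = before[metric], after[metric]
                change = a - b
                print(f"{metric}: {b} -> {a} ({change:+.1f}, {100 * change / b:+.1f}%)")

        compare(baseline, post_migration)

    The script itself is trivial; the discipline it encodes is the point: the same questions and parameters, measured before and after the move.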

    Above all, it’s essential you understand your legacy environment, both tangible and intangible, so you can make a fully informed decision beforehand. Then, when you make the transition, you’re in a great position to compare.

    The broader discussion here is that there are substantial benefits to being a data-driven organization, which many organizations are not. Most businesses are measuring some things, but few are measuring everything. If you’re not a data-driven organization, taking a holistic approach to cloud TCO analysis is a great way to get started on becoming one—and the best, and perhaps only way, to really measure the cloud for business value.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    4:00p
    Sweden’s Östersund Gets in the Data Center Game

    The global IT infrastructure is evolving beyond centralized data centers and into a distributed system of globally connected points. With more users, more devices, and a lot more data, the data center has become an integral part of any organization. US-based firms are now trying to bring data closer to the user to help deliver better end-user performance.

    As cloud computing and IT consumerization continue to push technology forward, organizations will need to seek out new places to house their data centers. The city of Östersund, Sweden wants to be one of those new data center destinations, and is outlining the merits of its location for medium to web-scale data centers.


    The site at Torvalla Industrial Park, located just outside the city centre, is prepared for construction and is “shovel ready.” The Östersund site offering includes:

    • Location in the heart of the Power region
    • National electricity price zone 2 – among the lowest prices in Europe
    • Reduced energy tax, 34% lower than in southern Sweden
    • Extremely stable grid, with no interruptions in the last 30 years
    • Redundant electricity supply connected to the national grid

    Download this white paper on the new Östersund site to learn about an advanced location capable of delivering a powerfully redundant infrastructure. Designing a robust data center involves many considerations, and the white paper addresses each of them for this site.

    With a favorable climate, strong cooling capabilities, and the ability to deliver a packet to St. Petersburg in 16ms or less, the Östersund site creates a compelling opportunity for any organization looking to distribute its infrastructure. The focus on data delivery and cloud components will only continue to evolve, and with that evolution, organizations will need to find new places to locate their data centers to ensure the consistent availability of information for the end user.
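
    For a sense of what a latency figure like that implies, here is a rough propagation-time sketch in Python. The route length and signal speed are assumptions for illustration, not figures from the white paper.

        # Rough latency sketch. Light travels through optical fiber at roughly
        # 200,000 km/s (about two-thirds of its speed in a vacuum); the route
        # length below is an assumed fiber-path distance, not a quoted figure.

        FIBER_SPEED_KM_PER_MS = 200.0    # kilometers per millisecond in fiber
        route_km = 1400.0                # assumed fiber path, Ostersund to St. Petersburg

        one_way_ms = route_km / FIBER_SPEED_KM_PER_MS
        round_trip_ms = 2 * one_way_ms

        print(f"Propagation alone: {one_way_ms:.1f} ms one way, {round_trip_ms:.1f} ms round trip")

    Real-world figures add switching and queuing delay on top of propagation, which is why a measured number like 16ms matters more than the straight-line math.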

    4:30p
    In Dublin, Cool Climate Fuels Cloud Computing Cluster

    Google’s Paul Dunne (left) and Richard Bruton TD, Ireland’s Minister for Jobs, Enterprise and Innovation, inspect the air cooling system at the new Google data center in Dublin. (Photo: Google)

    DUBLIN, Ireland – In the rainy western suburbs of Dublin, the cloud draws near the earth, filling the halls of data centers for the world’s largest cloud computing services. This city has emerged as a primary hub for server farms supporting the growth of cloud services across Europe, as Microsoft, Google and Amazon build powerful facilities with halls packed with servers and storage.

    Dublin is unique amongst major European data center hubs in that its appeal is based on climate, rather than connectivity. While the thriving data center communities in London, Amsterdam and Frankfurt are built atop network intersections in key business cities, Dublin has become one of the world’s favored locations for free cooling – the use of fresh air to cool servers. It is a prime example of how free cooling is giving rise to clusters of energy-efficient facilities in cool climates.

    The free cooling revolution was unleashed by a simple realization – servers are much sturdier than was previously imagined. After many years of housing servers in digital meat lockers, research by Intel and Google (among others) demonstrated that IT gear can function in warmer environments with only a fractional increase in hardware failures.

    This has led to a shift in thinking about how to design and build data centers, which has allowed the industry’s largest players to slash millions of dollars from their electricity bills by using fresh air to cool their armadas of servers, rather than power-intensive air conditioners and chillers.
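
    As a back-of-the-envelope illustration of how a site can be evaluated for free cooling, the sketch below counts the hours in a year when outside air is cool enough to feed directly to server intakes. The 27°C intake limit and the synthetic temperature data are assumptions, not numbers from this article.

        # Illustrative sketch: estimate annual free-cooling hours from hourly
        # outdoor temperatures. The intake limit loosely follows the upper end of
        # common allowable ranges for IT gear; it is an assumption, not a quoted spec.

        import random

        INTAKE_LIMIT_C = 27.0

        # Stand-in for a real weather dataset: 8,760 hourly temperatures (Celsius).
        hourly_temps_c = [random.gauss(10.0, 6.0) for _ in range(8760)]

        free_hours = sum(1 for t in hourly_temps_c if t <= INTAKE_LIMIT_C)
        print(f"Free cooling available {free_hours} of {len(hourly_temps_c)} hours "
              f"({free_hours / len(hourly_temps_c):.1%} of the year)")

    A real assessment would use actual weather data for the site and account for humidity and filtration, but the same counting logic applies.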

    In Europe, Dublin has been the chief beneficiary of this trend, boasting an ideal climate in which the temperature virtually never exceeds the upper ranges for using fresh air to cool the data center. The growing interest in free cooling has helped Dublin build upon its status as a technology destination for major U.S. technology companies. Both Microsoft and Google have more than 2,500 workers in development hubs and office operations in Dublin. In recent weeks both Facebook and Yahoo have announced plans to add hundreds of employees at new offices in Dublin.

    • In late 2008, Amazon announced that it had expanded its cloud computing services to Dublin, adding a European region for its EC2 compute service. By 2010, Amazon’s Dublin cloud hub was experiencing robust growth, and the company has continued to acquire land in the Dublin area for future expansion of its cloud operations.
    • In 2009, Microsoft opened its Dublin data center, investing $500 million in the 550,000 square foot facility, which effectively functions as a large air handler, moving fresh air through the facility to cool servers. In 2012 Microsoft added a second phase, investing another $150 million to add 13 megawatts of power capacity.
    • In 2012, Google opened a €75 million data center in Dublin’s Profile Park. Like the Microsoft facility, the Google data center is optimized to use fresh air to cool tens of thousands of servers.

    These technology titans have boosted the existing data center ecosystem in Dublin, home to at least 13 data centers for providers such as Digital Realty Trust, Interxion, TelecityGroup and SunGard.

    The expansion of the data center sector has been welcome news for the Irish economy, which has been hit hard by the economic downturn that began in the fall of 2008.

    “Our technological infrastructure is rapidly improving and cloud computing is one area where our climate gives us advantages,” said Richard Bruton, Ireland’s Minister for Jobs, Enterprise and Innovation. “The Government will build on announcements like this with more ambitious policies to take advantage of this potential and contribute to our recovery.”

    5:00p
    Closer Look: Microsoft’s European Cloud Hub


    If there’s a poster child for Ireland’s ideal climate for free cooling, it would be the huge data center in Dublin that powers Microsoft’s online services in Europe. Our photo feature, Inside Microsoft’s European Cloud Hub, examines how Microsoft has optimized its data center design to make efficient use of fresh air, and follows the path of the air through the giant facility.

    7:32p
    HP Project Moonshot: Low-Power Chips To Increase Density


    HP is now selling its first Project Moonshot systems – the bleeding edge of servers – which HP describes as “the world’s first software-defined server to run Internet scale applications.” The Moonshot 1500 uses a low-power processor – specifically Intel’s Atom S1260, the kind of chip found in cell phones – that uses less energy and less space and reduces complexity and cost.

    There’s been an overall movement toward ultra-low-power servers, and Project Moonshot is HP’s entry into that space. HP clearly sees an opportunity in building low-power, many-core servers, which can slash power usage across large footprints of Internet infrastructure.

    As CEO Meg Whitman said, “We’re living in a period of enormous change. There will be hundreds of billions of devices going to be connected.” As the IT world enters the era of the Internet of Things, where every device and appliance is connected, creating and storing data, an increasing demand for compute is emerging. “It’s no longer about petabytes, but brontobytes,” said Whitman. “And all of this takes a lot of elements in the background. We’re on a path that is not sustainable from a space, cost, and energy perspective.”

    Converged Infrastructure

    The Moonshot 1500 platform uses a converged infrastructure, with workload-optimized, extremely low-energy “server cartridges” in a unique enclosure that pools resources across thousands of servers using HP Converged Infrastructure technology. This allows the sharing of resources, including storage, networking, management, power and cooling.

    The HP Moonshot 1500 System chassis is similar to a blade chassis, but on steroids. It is a 4.3U (7.5 inches tall) chassis that hosts 45 independent hot-plug ProLiant Servers, all attached to multiple fabrics.

    One Moonshot system can hold 180 servers, including built-in switches. High-speed uplinks connect all the servers, with 10 terabits per second of I/O. One rack of Moonshot systems can replace eight racks of traditional 1U two-processor servers, using 89 percent less energy and 80 percent less space, with 97 percent less complexity, which leads to 77 percent less cost.
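
    Taking those headline percentages at face value, a quick back-of-the-envelope sketch shows how they translate at rack level. The traditional-baseline figures below are assumptions for illustration; only the percentage reductions come from the announcement.

        # Back-of-the-envelope sketch of HP's stated Moonshot reductions.
        # Baseline figures for a traditional deployment are assumptions;
        # the percentage reductions are the ones quoted above.

        baseline = {
            "energy_kw": 30.0,       # assumed power draw of a traditional deployment
            "rack_units": 42,        # assumed space used by the traditional deployment
            "cost_usd": 500_000.0,   # assumed cost of the traditional deployment
        }

        reductions = {"energy_kw": 0.89, "rack_units": 0.80, "cost_usd": 0.77}

        for key, value in baseline.items():
            moonshot_value = value * (1 - reductions[key])
            print(f"{key}: traditional {value:,.0f} -> Moonshot {moonshot_value:,.0f} "
                  f"({reductions[key]:.0%} less)")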

    While the first Moonshot version on the market uses Intel processors, additional servers shipping later in 2013 will use chips from multiple partners such as AMD, Calxeda, Applied Micro and Texas Instruments.

    Project Moonshot represents a new class of server designed to run Internet-scale workloads, targeting specific workloads such as those that support gaming, genomics, telecom, video analysis and more.

    According to HP, client-server infrastructure was not designed to handle the level of computing that Internet-scale organizations are running, and without a new approach the economics behind social, mobile, cloud and big data will deteriorate.

    The company also announced the Pathfinder Innovation Ecosystem, a program focusing on servers for different workloads. There are Internet-scale organizations today operating more than one million servers, and many enterprises in finance have tens of thousands of servers. HP sees an opportunity to market a solution that meets the needs of these kinds of businesses, and is working on how to move large enterprises from general-purpose servers to a new era of software-defined servers. The future is all about the software-defined server, according to HP: servers specifically designed for different workloads, intended to power a range of applications.

    HP first revealed it was building such low-power machines in the fall of 2011.

    Close-up of the Moonshot 1500 from HP. The company today rolled out the units, which use low-power processors.


