Data Center Knowledge | News and analysis for the data center industry - Industry's Journal
 

Monday, April 7th, 2014

    Time Event
    11:30a
    HBO Streaming Service Crashes During ‘Game of Thrones’ Premiere

    Winter is coming. Unfortunately for users of HBO GO, it was preceded by downtime. The streaming service crashed just after 9 p.m. last night, just as millions of users were tuning in for the eagerly-awaited premiere of season 4 of “Game of Thrones,” the premium cable service’s flagship series.

    “HBO GO did experience issues due to overwhelming demand around the premiere of Game of Thrones,” the company said in a statement to the Hollywood Reporter. “The service has returned to several platforms and we are working hard towards full recovery, which we expect soon.”

    The HBO GO team used its Twitter feed to alert users to alternate methods of viewing the premiere, using Game of Thrones puns that were likely lost on irritated viewers.

     

    12:00p
    Equipping The Next Generation of Data Center Professionals

    As Google and Facebook expand their Internet infrastructure, they are building massive data centers far beyond Silicon Valley. In towns like Council Bluffs, Iowa and Prineville, Oregon, these companies are bringing more than just servers into the community. They’re providing local teenagers with hands-on experience with server hardware and data center environments.

    As data centers become more complex, the task of training a new generation of professionals to staff them becomes more important. Recent news stories illustrate several initiatives in which the Internet’s largest companies are building interest in data center careers.

    Student-Run Data Center

    In Iowa, students at Abraham Lincoln High School are operating a $67,000 data center, funded through donations from Google (which operates a data center in Council Bluffs) and Echo Group, a local IT company. The working facility houses medical research data from the University of Washington, the Karlsruhe Institute of Technology in Germany and the Stanford University School of Medicine, as well as providing IT support for the Council Bluffs Community School District.

    The facility, which opened this week, was profiled in the Daily Nonpareil, a local newspaper. Students in the district’s Emerging Technologies Academy monitor the servers at the data center, which took about six months to build.

    Chris Russell, an operations manager for Google in Council Bluffs, said the data center exposes students to real world experience and gets them interested in technology careers. “The real benefit for kids is to work in a real world with high expectations,” said David Fringer, the district’s chief technology officer.

    Google has awarded $820,000 in community grants to organizations in the Council Bluffs area since 2009.

    Servers at School Spark Enthusiasm

    In Oregon, Facebook’s old servers are seeding innovation among teens near the company’s facility in Prineville. Facebook recently donated 15 used servers to Crook County High School, which spurred the creation of a computer technology club where students are learning about server components. Assistant Principal Joel Hoff sees this as the nucleus of something big.

    “In five to 10 years, we hope to have a full Computer Science major,” Hoff told the Central Oregonian. “It would be great to provide opportunities for our students to be eligible for good, high-paying jobs right out of high school.”

    Facebook provided $8,000 in funding for the high school science department and tech club as part of its larger Local Community Action Grant program in Prineville, where the social networking company last month awarded $105,000 to 21 different local programs. That brings the total volume of Facebook grants in Prineville to $415,000 since 2011.

    A group of the grant recipients recently had a tour of the Facebook Prineville data center, including a visit inside the server hall. Here’s a photo via the data center’s Facebook page:


    1:04p
    The Time is Right for DCIM as a Service

    Daniel Tautges is the CEO of Pinpoint Worldwide Consulting

    The Data Center Infrastructure Management (DCIM) market has been a bit of a paradox: everyone now seems to understand the value of DCIM, yet the market continues to grow at a much slower pace than projected.

    In Pinpoint Worldwide’s DCIM adoption survey, the main blockers to DCIM adoption were “cost” and “lack of features.” Would a lower-cost, full-featured option lead to greater market adoption? Could DCIM-as-a-Service be the answer? Pinpoint believes that the market is ready for DCIM-as-a-Service (DCIMaaS) and that it could provide a foundation for quicker DCIM growth.

    SaaS Has Multiple Benefits

    A direct advantage of SaaS over enterprise software is the upfront acquisition cost. Most SaaS models are priced by user and/or consumption, whereas typical enterprise models are priced by size: you pay whether you use the system or shelve it. In the early days of DCIM, the complexity of administering the systems became a major roadblock to usability, and DCIM products were often never put into production.

    Feature expansion is a difficult process for most companies: even when upgrades are covered by maintenance, allocating and managing the resources to make the change is difficult. SaaS has the distinct advantages of a low upfront cost, paying only for what you use, and always running the latest version of the code.

    The operating expense of enterprise software can be significant, as the hardware and operational costs can easily exceed the initial software expense. When calculating Return on Investment (ROI), the hardware, software, and operations over the life of the solution all need to be included to determine the real ROI. With a SaaS delivery model, the provider manages the support infrastructure, which allows much quicker turn-up and eliminates new equipment procurement, provisioning, and commissioning. By removing the operating cost from the solution, ROI is more easily attained, with a much quicker payback.
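To make the lifetime-cost argument concrete, here is a minimal sketch comparing the two models. All figures are hypothetical illustrations, not vendor pricing from the survey:

```python
# Hypothetical sketch comparing lifetime cost of an enterprise DCIM
# deployment with a subscription (SaaS) model. Every number below is
# an illustrative assumption.

def lifetime_cost_enterprise(license_fee, hardware, annual_ops, years):
    """Upfront license + hardware, plus yearly operations/maintenance."""
    return license_fee + hardware + annual_ops * years

def lifetime_cost_saas(monthly_fee, years):
    """Pay-as-you-go: no hardware purchase or internal operations cost."""
    return monthly_fee * 12 * years

enterprise = lifetime_cost_enterprise(
    license_fee=100_000, hardware=40_000, annual_ops=30_000, years=5)
saas = lifetime_cost_saas(monthly_fee=2_500, years=5)

print(f"Enterprise 5-year cost: ${enterprise:,}")  # $290,000
print(f"SaaS 5-year cost:       ${saas:,}")        # $150,000
```

The point of the sketch is that the enterprise total is dominated by costs outside the software license itself, which is exactly why the article argues ROI must be computed over the life of the solution.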

    Mobility is a key ingredient in leveraging the power of a SaaS product.  In SaaS the application can be accessed virtually anywhere there is an Internet connection.  With DCIMaaS, the service would be available whether accessed from a desktop, notebook, or mobile device irrespective of the physical location.

    SaaS Needs Reliable Infrastructure

    A SaaS service requires high levels of reliability, and the risk for vendors that cannot provide it is huge. I would argue that the risk of downtime is much greater for a SaaS vendor than it is for internal IT: if a SaaS service becomes unavailable, it is typically unavailable for a large portion of the vendor’s clients. Such an outage could result in both lost customers and, in some cases, actual SLA penalties.

    To reduce the possibility of performance-related problems, most vendors either operate their own data centers or contract with leading hosting companies that ensure there is enough capacity to support their current clients. I would propose that most SaaS applications are held to a much higher standard of operation than typical enterprise applications and can support a better level of both performance and reliability.

    For most organizations, DCIM features span a range of access and data-sensitivity levels. For example, asset management, modeling, capacity planning, business planning, and change management do not require access to any real-time systems. If the DCIM application is unavailable, it could have an impact, but it will not bring the data center down. Further, with some exceptions, the data in these systems is typically not highly sensitive: if an outsider gained access to it, that would not be ideal, but it would not introduce huge risk.

    The only DCIM capabilities that represent warranted risk are control and device access. If DCIM is performing control, changing device configurations, and/or doing discovery, then a much higher security level needs to be considered. A true SaaS approach might not be realistic where control is involved, but even in this case a SaaS hybrid or middleware approach could be employed.

    DCIMaaS is in its First Stages

    Today, there are not many options for DCIMaaS. The major vendors have all announced a DCIMaaS offering, but only Optimum Path Inc. offers a completely rewritten, true single-instance, multi-tenant solution. The other vendors’ products appear to be repurposed entirely from their existing enterprise products. This limitation will directly affect their ability to price their products at a low enough level to realize the true cost benefits of DCIMaaS.

    DCIMaaS could be the key to unlocking the potential of Data Center Infrastructure Management. It promises a high level of features, lower costs, and better performance and availability than traditional enterprise approaches, all with a manageable level of risk.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

     

    1:20p
    Teradata Unveils QueryGrid Analytics Platform

    At the Teradata Universe event in Prague, Teradata (TDC) announced QueryGrid, a cohesive analytic environment for big data with integrated processing within and outside of its Unified Data Architecture. The event conversation can be followed on Twitter hashtag #tduniv.

    Optimize, Simplify and Orchestrate Data Processing

    Teradata QueryGrid provides optimized analytics across the enterprise and beyond. It gives users seamless, self-service access to data and analytic processing across different systems from a single Teradata Database or Aster Database query. Whatever analytic engines and file systems are in use, QueryGrid lets users concentrate on accessing and analyzing data without special tools or IT intervention, while minimizing data movement and duplication by processing data where it resides.

    The aim of Teradata’s Aster Discovery platform launched last year was to leverage multiple analytical capabilities. Teradata cited IDC research that showed only 10 percent of organizations have the features and functionality needed to explore data and discover insights. QueryGrid will reach out and integrate with Aster functions such as SQL-MapReduce, graph, Teradata databases, relational databases, and several programming languages such as Perl, Python, R, and Ruby. It will also provide remote, push-down processing in Hadoop by using Apache HIVE as a mechanism to project structure onto data and query it using a SQL-like language called HiveQL.

    “Teradata pioneered integration with Hadoop and HCatalog with Aster SQL-H to empower customers to run advanced analytics directly on vast amounts of data stored in Hadoop,” said Ari Zilka, CTO, Hortonworks. “Now they are taking it to the next level with pushdown processing into Hadoop, leveraging the Hive performance improvements from Hortonworks’ Stinger initiative, delivering results at unprecedented speed and scale.”

    In a recent blog post, Teradata Labs President Scott Gnau talks about what the company refers to as liquid analytics, or the bi-directional ability to bring analytics to the data or continue sending data to the analytics. Liquid analytic algorithms and systems can help channel the appropriate data to the right part of your analytic architecture, or the right analytics to where the relevant data resides.

    “Attempts at federation have been unsuccessful for many reasons. To deliver value from big data, customers should create an architecture that allows the orchestration of analytic processes across parallel databases rather than federated servers. Teradata QueryGrid is the most flexible solution with innovative software that gets the job done,” said Scott Gnau, president, Teradata Labs. “After the user selects an analytic engine and a file system, Teradata software seamlessly orchestrates analytic processing across systems with a single SQL query, without moving the data. In addition, Teradata allows for multiple file systems and engines in the same workload.”

    Database 15

    Teradata also announced version 15, the next major release of its database. Queries can be initiated from the Teradata Database to access, filter, and return subsets of data from Hadoop, Aster, and other database environments to the Teradata Database for additional processing. The analysis can incorporate data from both the Teradata Database and Hadoop. At its last partner conference, Teradata talked about integrating JSON. JSON (JavaScript Object Notation) is a lightweight data-interchange format that is widely used for the Internet of Things. With JSON, Teradata is able to take multi-structured data and incorporate it into its data warehouse. JSON enables frictionless data loading and agile, schema-on-read access for extremely fast analytics on Internet of Things data.
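The "schema-on-read" idea is easy to demonstrate outside of any particular database: a JSON document carries its own field names, so structure is imposed when the data is read, not when it is loaded. A minimal sketch, using an invented IoT sensor payload:

```python
import json

# Illustrative "schema-on-read" example: an IoT reading arrives as a
# JSON document with no predeclared schema, and fields are extracted
# at query time. The payloads below are invented for illustration.
payload = ('{"sensor": "rack-12-inlet", "ts": "2014-04-07T13:04:00Z",'
           ' "temp_c": 24.6, "humidity": 41}')

reading = json.loads(payload)                # parse with no predefined schema
print(reading["sensor"], reading["temp_c"])  # rack-12-inlet 24.6

# A later reading can add or drop fields without breaking consumers;
# missing fields are handled at read time rather than load time:
payload2 = '{"sensor": "rack-12-inlet", "temp_c": 25.1, "fan_rpm": 3200}'
reading2 = json.loads(payload2)
print(reading2.get("humidity", "n/a"))       # n/a
```

This flexibility is what makes JSON loading "frictionless": records with evolving structure can be ingested as-is and interpreted later.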

    EDW 6750

    All of this big data and analytics is supported by the Teradata Enterprise Data Warehouse 6750 appliance. The EDW 6750 now has three times more flash memory, with up to 1.5TB of active memory per cabinet and 40 SSDs per cabinet. This means that 43 percent more hot data can be kept on SSDs. The EDW 6750 uses Intel Xeon processors and delivers 40 percent more processing power, with 12-core processors possible. The appliance has achieved a four-fold energy efficiency improvement compared with three years ago.

    1:30p
    ElasticBox Raises $9 Million to Advance Modular Application Development

    After launching version 2.0 of its platform last summer, cloud company ElasticBox announced that it has raised $9 million in a Series A funding round led by Nexus Venture Partners and Intel Capital. The company previously received $3.4 million in seed funding from Andreessen Horowitz and Sierra Ventures.

    “The cloud has fundamentally improved how people access and use infrastructure. But developing cloud based applications is still a lengthy, expensive and broken process that is stuck in the dark ages, like way back in the days of bare metal,” said Ravi Srivastav, CEO and co-founder of ElasticBox. “ElasticBox empowers the developer with preconfigured Boxes that they can mix and match to create applications — similar to the way a DJ blends beats and samples to create new music. It brings Dr. Dre-like creativity to enterprise application development.”

    ElasticBox introduced a modular way to develop applications in the cloud through a new approach called Boxes: encapsulated, fully configured components of your application architecture that can be combined to create and run applications in the cloud. Enterprises create hundreds or thousands of internal applications to support business processes and create competitive advantage. ElasticBox empowers these enterprises to build better applications faster, which in turn helps them innovate more quickly. Netflix is one such company.

    “ElasticBox has provided us with the technology that we had been looking for, but had never found a great solution,” said Mike Kail, VP of IT Operations at Netflix. “With ElasticBox we are able to both create custom Boxes as well as leverage their preconfigured Boxes to deploy our internal applications, which allows us to focus on innovation instead of orchestration.”

    “ElasticBox’s pioneering work removes the complexity and cost associated with developing, deploying and managing apps in the cloud and truly delivers the promise of cloud elasticity to enterprises,” said Arvind Sodhani, president of Intel Capital and executive vice president of Intel Corp. “We look forward to helping Ravi and team accelerate their growth with enterprises around the world.”

    2:00p
    EMC Expands Data Protection Portfolio

    EMC announced new products addressing a broad range of data protection requirements, from archive to continuous availability solutions. EMC adds new integrations with primary and protection storage platforms, as well as hypervisors and enterprise applications.

    The EMC Data Protection Suite features snapshot management for EMC Isilon, EMC VNX and NetApp arrays, support for VMware and Microsoft cloud infrastructures, full integration between Avamar and Data Domain, and enhanced security features and Linux support for Mozy public clouds.

    With the introduction of a new EMC Data Domain Operating System, EMC also introduces DD Boost for Enterprise Applications, which empowers application admins to protect their own environments and includes support for Oracle, SAP solutions, the SAP HANA platform, IBM DB2 and Microsoft SQL Server. A new secure multi-tenancy feature for cloud deployments enables Data Domain systems to deliver secure isolation for large enterprises and service providers and to function effectively as a protection storage platform in data-protection-as-a-service deployments.

    New releases of VPLEX and RecoverPoint now combine to deliver the MetroPoint topology, an industry-unique three-site business continuity capability for mission-critical applications. The new VPLEX Virtual Edition was also launched, a virtualized solution that offers both continuous availability and data mobility in a low-cost, 100 percent software deployment. With it, EMC extends the market for continuous availability and mobility, as customers of all sizes can now take advantage of proven VPLEX technology. VPLEX Virtual Edition will also be made available as part of VSPEX Proven Infrastructure solutions.

    “Over the past few years we’ve delivered numerous integrations that, together with a workload-agnostic protection storage platform as an anchor, provide a Protection Storage Architecture,” said Guy Churchward, President, EMC Data Protection and Availability Division. “Today, we are announcing products and new integrations that enable customers to deliver a complete spectrum of data protection and availability capabilities as a holistic, consumable service. Data protection-as-a-service is a key tenet for establishing effective protection for software-defined data centers—a ‘bedrock’ that needs to be in place for organizations to confidently transition to third-platform software infrastructures.”

    3:00p
    WesterosCraft: Game Of Thrones Meets Minecraft, Powered by Linode

    If you just can’t get enough of “Game of Thrones” after last night’s season 4 premiere, you can immerse yourself in the world of Westeros – inside Minecraft.

    Hundreds of open source developers are working together to build WesterosCraft, a replica of the world of Game of Thrones within Minecraft. It’s a collision of two cultural phenomena, and it’s being hosted on virtual servers at cloud hosting firm Linode.

    WesterosCraft is a community of users recreating the entire world within the confines of Minecraft. The projected completion date is the end of 2014. When you look at progress so far, it is astounding, considering I’d have trouble making a table.

    Once the world is completed, it will become the virtual landscape for a Game of Thrones role-playing game (RPG). Founder and project manager Jacob Granberry needed to find a hosting provider that could handle the potential traffic and high volumes of concurrent users, all contributing to building, and later playing, the game.

    “Once we took a look at Linode and discovered its many capabilities, we knew we’d found the perfect match and stopped the search,” says Granberry. Linode will handle thousands of users once this is up and running fully. It is a unique project worth keeping an eye on.

    Cultural Phenomena Collide

    Minecraft is an open-world sandbox game that has become a global phenomenon. While its graphics leave something to be desired upon first entering the world, the ability to create and build is unparalleled. A quick YouTube search for Minecraft reveals that it has evolved into an amazing ecosystem of creations. If you’re not playing Minecraft, chances are your kids are.

    Created by Swedish programmer Markus “Notch” Persson, Minecraft has captured the cultural zeitgeist. It sold into the double-digit millions on Xbox, sold 15 million copies on PC, and the pocket edition sold over 16.5 million copies. It is a huge phenomenon.

    Game of Thrones is an HBO fantasy/sci-fi drama that makes dragons even more awesome than they already are. Complex plots, political intrigue, and a complete disregard for the lives of characters (that you’ll undoubtedly fall in love with) have made it a major smash hit.

    See a gallery of WesterosCraft progress so far here.

    7:00p
    A Day in the Life of a Modular Data Center Factory

    Where do data center modules come from? From the module factory, of course. Here’s a look inside the process at the new IO factory in Chandler, Arizona, where the company builds its IO.Anywhere modular data centers. IO uses large air casters – also known as “air skates” – to move the modules down the production line. The air casters use compressed air to raise heavy loads (similar to a hovercraft) and allow one or two IO staffers to move modules as heavy as 20 tons. The facility employs more than 100 workers, including industrial and manufacturing engineers, assembly technicians and specialized technology subcontractors, to produce modules on assembly lines. Once the modules are completed, they are loaded onto trucks for shipping to their destination. This video runs about 3 minutes, 30 seconds.
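The air-caster principle is simple statics: the lifting force equals supply pressure times bearing area (F = P × A). A back-of-envelope check of the 20-ton figure, with an assumed supply pressure (not IO's published spec):

```python
# Rough check of the air-caster principle for a 20-ton module:
# how much bearing area does compressed air need to float it?
# F = P * A, so A = F / P. The supply pressure is an assumed
# typical value for commercial air casters, not IO's figure.
load_kg = 20_000                 # 20 metric tons
g = 9.81                         # gravitational acceleration, m/s^2
weight_n = load_kg * g           # ~196 kN of weight to support

supply_pressure_pa = 240_000     # ~35 psi, assumed
area_m2 = weight_n / supply_pressure_pa

print(f"Required bearing area: {area_m2:.2f} m^2")  # ~0.82 m^2
```

Under a square meter of bearing area spread across a few casters is entirely practical, which is why one or two workers can push such a load once friction is nearly eliminated by the air film.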

    8:40p
    DataTank: Immersion Containers for Industrial Bitcoin Mining

    As larger players enter the Bitcoin mining space, data center providers are customizing solutions to deliver greater density and efficiency for custom mining infrastructure. The latest example is DataTank, a new offering from Allied Control that houses ultra-high density Bitcoin hardware in immersion cooling tanks inside a data center container.

    Allied Control, a Hong Kong-based engineering company, says DataTank can “future proof” Bitcoin infrastructure, allowing miners to quickly refresh their hardware as more powerful systems are unveiled in the fast-moving technology arms race in cryptocurrency mining.

    The containerized solution, which was introduced for the Inside Bitcoins conference in New York, can support 1.2 megawatts of mining gear, housed in tanks filled with Novec, a liquid cooling fluid created by 3M. Each container can house six tanks, each supporting 200kW of hardware.

    Inside each tank, densely-packed boards of ASICs (Application Specific Integrated Circuits) run constantly as they crunch data for creating and tracking bitcoins. As the chips generate heat, the Novec boils off, removing the heat as it changes from liquid to gas.
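The physics of two-phase immersion cooling comes down to latent heat: each kilogram of fluid that boils carries away a fixed amount of energy. A rough sketch of the boil-off rate implied by a 200kW tank, using an approximate latent-heat value for a Novec-class fluid (an assumption, not 3M's published spec):

```python
# Rough estimate of the vapor generation rate in a 200 kW immersion
# tank. Energy balance: Q = m_dot * h_fg, so m_dot = Q / h_fg.
# The latent heat below is an approximate literature value for a
# Novec-class engineered fluid, assumed for illustration.
heat_load_w = 200_000        # per tank, per the article
h_fg_j_per_kg = 88_000       # ~88 kJ/kg latent heat of vaporization (assumed)

boil_off_kg_per_s = heat_load_w / h_fg_j_per_kg
print(f"Vapor generated: {boil_off_kg_per_s:.2f} kg/s")  # ~2.27 kg/s
```

In practice the cycle is closed: the vapor condenses on cooling coils above the tank and drips back into the bath, so no fluid is consumed and no fans, pumps to the chips, or heatsinks are needed.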

    Modular: the Right Form Factor for Bitcoin Mega-Mines?

    Allied Control says modular form factors are ideal for Bitcoin mining, limiting the amount of infrastructure needed. No raised-floor environment or room-level temperature control is required, allowing the system to run with extreme efficiency (Allied Control claims a PUE of 1.01) in any geography, including warmer climates in Asia. Hardware can be reduced to chips on boards and easily changed out as more powerful systems are released, for a “true wash-rinse-repeat experience” of refreshes.

    “Imagine you can make computers that consist of not much more than chips on boards,” said Kar-Wing Lau, VP of Operations at Allied Control. “You don’t have to worry about heat dissipation, power delivery, fans, heatsinks, waterblocks, pumps, or the mechanical infrastructure to stitch all that together. Systems cost less to make and don’t produce more e-waste than the strict minimum. They basically make money faster for the business that uses them, and they run extremely efficient with almost no energy wasted for cooling.”

    Allied Control says it is in talks with several large Bitcoin mining operations about setting up data centers in the U.S. that could scale to 10 megawatts of capacity. It says the 1.2 megawatt capacity of the containers allows them to be located in areas with modest power capacity and to be distributed across multiple markets.


    Allied Control’s vision for what a multi-module Bitcoin mining center might look like. (Image: Allied Control)

