Data Center Knowledge | News and analysis for the data center industry
 

Friday, July 11th, 2014

    12:30p
    University of Delaware Puts Nail in Coffin of Data Center and Power Plant Project

    After more than a year of debate, the University of Delaware has killed a data center project proposed by a company called The Data Centers on one of its campuses.

    The proposal included a 279 megawatt power plant at the project’s site, a plan that became contentious after a group of residents in Newark, Delaware, where the facility would be built, started protesting it out of concern about the project’s environmental impact. The company was planning to invest about $1 billion in the project, which would include up to 900,000 square feet of data center space.

    This week the university terminated its lease agreement with the company, effectively putting a bullet in the project’s head, Delaware Online reported. The decision came after a group of faculty and administrators issued a report that concluded that the proposal was inconsistent with the institution’s plans for the property.

    Plant too risky, impact unknown

    One of the report’s key conclusions was that the massive Combined Heat and Power cogeneration plant’s potential environmental impact is unknown, since no such plant has ever been built in the country. “The proposed 279 MW cogeneration facility is significantly (by at least two times) larger than any other on-site power generation facility known to us at data centers in the United States,” the authors wrote.

    The scale and “unorthodox” design of the plant make it difficult to verify that the developer can meet its environmental and commercial goals. The review group also noted that TDC has not submitted a detailed design, providing only a conceptual description, which makes the proposal difficult to evaluate for environmental impact.

    The unorthodox design they referred to includes operation in “island mode,” or unplugged from the electrical grid. “Switching between island mode and on-grid is non-trivial, and the necessary technologies to achieve this mode of operation should be demonstrated to ensure that TDC’s plans are viable,” the report read.

    Innovation hub won’t have a power plant

    TDC submitted its proposal to the University of Delaware because the institution was looking for sustainable infrastructure projects to build on its STAR campus. The plan is to build an innovation hub that combines facilities for research in energy and the environment, health and life sciences, and defense and national security.

    After TDC submitted its proposal, a university steering committee reviewed it and signed a land lease with the company in 2012.

    Last year, however, some information TDC provided led the university to believe the company had shifted its focus to place greater emphasis on the power plant and on selling excess energy to the grid, which did not seem to align with the university’s plans for the campus. This is what led to the creation of the review group whose recent report resulted in cancellation of the lease.

    1:00p
    Digital Realty to Give One Lucky Startup Free Data Center Space and Power

    Digital Realty Trust has kicked off a contest for startups, offering the winner a free 4 kW cabinet with power and fiber connectivity at one of its data centers as well as access to conference and break rooms in its facilities.

    The contest is part of the San Francisco-based developer and data center provider’s Digital Accelerator program. Digital Realty has grown along with a number of tenants that started using its services as startups and matured into successful businesses, and the new accelerator is a way to encourage that dynamic to continue.

    It is also a good way to meet lots of potential customers for the company’s retail colocation business, which the predominantly wholesale provider has been expanding since 2012.

    Digital Realty provides data center space for companies like SoftLayer, Facebook, CenturyLink, NetApp and SingleHop, among many others. SoftLayer is an example of a firm that started its relationship with the wholesale data center provider early on in its life and has grown to the point where its owners were able to sell to IBM last year, reportedly for about $2 billion.

    Digital Realty is now accepting applications for the contest. Applications will be reviewed by its senior management, and the winner will be announced around the end of September.

    Matt Miszewski, senior vice president of sales and marketing at Digital Realty, said, “Everyone at Digital Realty, from our engineers and infrastructure analysts to our senior management team, is looking forward to sharing our resources and knowledge with the next generation of digital innovators.”

    Applications are due by August 31, 2014. They can be submitted at digitalaccelerator.co

    Here’s what the winning startup will get:

    • Access to conference rooms for up to two hours at a time in any of Digital Realty’s data centers worldwide
    • 24/7 access to break rooms in any of Digital Realty’s data centers worldwide
    • Informal meetings with members of Digital Realty’s management team, subject to availability
    • One locking cabinet with 4 kW of basic power capacity at a Digital Realty data center
    • Two 208-volt circuits delivered in a primary/redundant configuration with plug strips (see the quick power math after this list)
    • One fiber cross connect
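
    A back-of-the-envelope check on what those power specs imply (our arithmetic, with assumed breaker sizes, since the listing doesn’t state circuit amperage):

    ```python
    # Back-of-the-envelope math on a 4 kW cabinet fed by 208 V circuits.
    # Assumptions not in the listing: single-phase feeds and the common
    # practice of loading breakers to 80% of their rating.
    CABINET_KW = 4.0
    VOLTS = 208.0

    amps_at_full_load = CABINET_KW * 1000 / VOLTS  # ~19.2 A
    print(f"Full 4 kW draw: {amps_at_full_load:.1f} A at {VOLTS:.0f} V")

    for breaker_amps in (20, 30):
        usable_amps = breaker_amps * 0.8           # 80% continuous derating
        usable_kw = usable_amps * VOLTS / 1000
        print(f"{breaker_amps} A breaker: {usable_amps:.0f} A usable, "
              f"{usable_kw:.2f} kW per circuit")
    # A derated 30 A circuit (~5 kW) covers the full 4 kW cabinet on its
    # own, consistent with the primary/redundant pairing above.
    ```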

     

    1:30p
    EFF Releases NSA Data Center Photo Into Public Domain

    Hoping to support conversations about the U.S. National Security Agency’s surveillance programs, activist organization Electronic Frontier Foundation has released an aerial shot of the NSA’s newest data center in Bluffdale, Utah, into the public domain, which means anybody can use it for any purpose.

    The photo was taken last month from Greenpeace’s massive blimp, which flew over the data center site to protest the NSA’s overzealous data collection from electronic communication networks. The protest action was organized by Greenpeace, EFF and the Tenth Amendment Center, a California think tank that advocates for shifting power from the federal government to the states.

    The EFF’s position is that the NSA’s approach of collecting as much data as it possibly can, which pushes it to build massive data centers around the country, is misguided, simply creating “ever-larger haystacks in pursuit of needles,” EFF activist Parker Higgins wrote in a blog post.

    The $1.5 billion Utah data center was completed only recently, and the agency is already building another $860 million data center in Maryland.

    EFF, Greenpeace and the Tenth Amendment Center organized a flyover in June 2014 above the NSA’s Bluffdale, Utah, data center. (Photo: Greenpeace)

    First Look Media has a collection of stunning aerial shots of several NSA facilities around the country.

    2:00p
    CloudSigma Lowers Barriers to Using Big Data Available in Public Domain

    Public cloud provider CloudSigma has a vision for a cloud-based “data discovery” model that will take Big Data generated around the world and beyond (yes, in space) and put it in the hands of users. The company is building ecosystems out of publicly available data, storing it and making it accessible for users to build services around. “Data has gravity, and that naturally draws computing into the cloud,” said Robert Jenkins, CEO of CloudSigma.

    There is a lot of data in the public domain, but the barriers to accessing and leveraging it are high. The Zurich-based company is trying to address this problem of “elitism” by making the data available on demand in its Big Data cloud, so users can easily access it and temporarily mount it on virtual machines to build services. There is a wealth of possibilities across industries and disciplines.

    “Even though it’s accessible, most people can’t use it,” Jenkins said. “What we’re doing is storing this data for free and making money on the compute end of things, as users access it on an as-needed basis to build better systems and offerings.”

    Inspired by European science cloud

    The work was born out of the Helix Nebula Marketplace, launched to provide European scientists easy access to commercial cloud services. CloudSigma is one of the companies represented on the marketplace. Helix Nebula is a public-private collaboration launched to support massive IT and data requirements of European research organizations, including European Molecular Biology Laboratory, European Space Agency and European Organization for Nuclear Research (CERN), best known for its Large Hadron Collider.

    The vision is a cloud-based discovery model. “Making public data extensible is a big movement, and there’s been huge progress,” said Jenkins. “It’s becoming yesterday’s problem. The problem now is the coordination problem. The challenge with Big Data is you can’t move it around. Cloud is in a unique position to solve this problem.”

    “Say I’m a user that has an idea for a better climate model or trading algorithm, but the bar is too high to set up the proper resources,” he said. “We’ve been working on putting the data in public cloud so it’s much more useful. We dedupe the data, replace ownership with access. All you need is short access, next to on-demand, ad-hoc computing. We’re happy because you’re doing the computing, and we can give public institutions the free storage.”
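
    In code, the lifecycle Jenkins describes might look something like the sketch below. It is purely illustrative: the base URL, endpoints and helper functions are hypothetical stand-ins, not CloudSigma’s actual API; only the pattern of mounting shared data read-only, computing against it and releasing it comes from the article.

    ```python
    # Hypothetical sketch of "access instead of ownership": temporarily
    # mount a provider-hosted public dataset read-only on a VM, compute,
    # then release it. The endpoints and field names below are invented
    # stand-ins, not CloudSigma's real API.
    import requests

    API = "https://cloud.example.com/api"   # placeholder base URL
    AUTH = ("user@example.com", "api-key")  # placeholder credentials

    def attach_dataset(vm_id: str, dataset_id: str) -> None:
        """Mount a shared public dataset onto a VM, read-only."""
        resp = requests.post(
            f"{API}/vms/{vm_id}/volumes",
            auth=AUTH,
            json={"dataset": dataset_id, "mode": "read-only"},
        )
        resp.raise_for_status()

    def detach_dataset(vm_id: str, dataset_id: str) -> None:
        """Release the mount; the provider bills only for compute time."""
        resp = requests.delete(
            f"{API}/vms/{vm_id}/volumes/{dataset_id}", auth=AUTH
        )
        resp.raise_for_status()

    attach_dataset("vm-1234", "public-climate-dataset")
    # ... run the climate model or trading backtest on the VM ...
    detach_dataset("vm-1234", "public-climate-dataset")
    ```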

    From ash-cloud data to MRI scans

    CloudSigma is already working with a number of institutions and has several agreements in place with organizations like the European Space Institute and an Icelandic organization that maps ash clouds and volcanic activity.

    It’s working with institutions that do neurological research as well. “We’re building a cloud backend for MRI scanners that sends it up to the cloud, renders and creates a much higher quality image,” said Jenkins. “They get a better quality service and it takes the computing out of the operating room.” Most general MRIs aren’t done at the highest resolution because of cost, but the cloud provides on-demand processing power to produce better images, which helps with Alzheimer’s research, for example.

    Many patients choose to donate their scans to science, and the company is aggregating publicly available scans from hundreds of hospitals in Europe and around the world. This will allow neurological researchers to examine time series of brain scans.

    Jenkins also spoke of a satellite launched in Europe for Earth observation. It has magnetic field sensors and can track ground water movement and the movement of the Earth with down-penetrating radar. The satellite generates a terabyte of data a day, and Jenkins wants to put this data in people’s hands as well.
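
    For scale, a quick conversion of that figure (our arithmetic, not numbers from the article):

    ```python
    # What one terabyte per day means as a sustained ingest rate.
    TB_PER_DAY = 1.0
    SECONDS_PER_DAY = 24 * 60 * 60

    mb_per_second = TB_PER_DAY * 1_000_000 / SECONDS_PER_DAY
    print(f"{mb_per_second:.1f} MB/s sustained")   # ~11.6 MB/s
    print(f"{TB_PER_DAY * 365:,.0f} TB per year")  # ~365 TB/year
    ```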

    A more closed financial data product in the making

    The possibilities are endless, as there is a growing movement to make publicly available data genuinely useful to the public.

    CloudSigma’s data ecosystems also go beyond public data. Going forward, the company is looking to provide a similar ecosystem for financial data. “We’re looking into the financial services industry and seeing how we can expose it securely to service providers,” said Jenkins. This kind of data is of course more proprietary and sensitive, so the company is building a secure service for it.

    4:58p
    Cray’s $174M Supercomputer on Order by Agency Overseeing US Nukes

    Supercomputer maker Cray has been awarded one of its largest contracts ever, to provide a $174 million XC supercomputer and Sonexion storage system to the U.S. National Nuclear Security Administration (NNSA), an agency within the Department of Energy that oversees management of the country’s nuclear stockpile.

    Dubbed “Trinity” by the NNSA, the new system will be a joint effort between the New Mexico Alliance for Computing at Extreme Scale (ACES) at the Los Alamos National Laboratory and Sandia National Laboratories as part of the NNSA Advanced Simulation and Computing Program (ASC). The agency’s fastest computer system to date will be used to ensure the safety, security and effectiveness of the nuclear stockpile.

    The NNSA says Trinity is the first Advanced Technology (AT) system for the ASC program and will kick off a new computing strategy that requires all systems to serve the NNSA mission workload while preparing ASC applications for the transition to future advanced architectures. The XC will be the first to use the “Burst Buffer” concept and “Advanced Power Management” as part of the platform. These technologies will be provided as part of a fully integrated system consisting of compute nodes, memory, a high-speed interconnect and a parallel file system.
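
    The burst-buffer idea, in general terms, is to absorb a compute node’s bursty checkpoint writes into a fast intermediate storage tier and drain them to the slower parallel file system in the background, so computation isn’t stalled. Below is a minimal sketch of that general pattern; it is not NNSA’s or Cray’s implementation, and the tier paths are placeholders:

    ```python
    # Minimal illustration of the burst-buffer pattern: checkpoints land
    # in a fast staging tier immediately, and a background thread drains
    # them to the slower parallel file system. Paths are placeholders.
    import queue
    import shutil
    import threading
    from pathlib import Path

    FAST_TIER = Path("/burst_buffer")      # stand-in for node-local SSD
    PARALLEL_FS = Path("/lustre/scratch")  # stand-in for the parallel FS

    drain_queue: "queue.Queue[Path]" = queue.Queue()

    def write_checkpoint(name: str, data: bytes) -> None:
        """Absorb a bursty checkpoint write at SSD speed, then hand off."""
        staged = FAST_TIER / name
        staged.write_bytes(data)
        drain_queue.put(staged)  # compute resumes immediately

    def drainer() -> None:
        """Background thread: move staged files to the parallel FS."""
        while True:
            staged = drain_queue.get()
            shutil.move(str(staged), str(PARALLEL_FS / staged.name))
            drain_queue.task_done()

    threading.Thread(target=drainer, daemon=True).start()
    ```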

    “Both Los Alamos and Sandia have a long history with Cray, going back to the beginning of the supercomputing era and most recently with the Cielo platform,” said Gary Grider, High Performance Computing Division leader at Los Alamos. “That history continues with the Trinity platform that will provide next-generation supercomputing in support of the U.S. nuclear security enterprise.”

    The new Trinity system is expected to deliver more than eight times the application performance of the current Cray XE6 system, named Cielo. Cray says it will deliver the new XC supercomputer in a phased deployment that will include a multi-petaflop supercomputing system and a multi-petabyte Sonexion storage system. The system will be located at Los Alamos National Laboratory and will use the next generation of Intel Xeon “Haswell” and “Knights Landing” Xeon Phi processors.

    Cray says the new supercomputer’s storage system will have 82 petabytes of capacity and 1.7 terabytes per second of sustained performance. Leveraging the Lustre file system, it will feature a unique design that allows scalability from five gigabytes per second to more than a terabyte per second in a single file system.
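
    Taken together, those figures imply the system could write its full capacity in well under a day. The arithmetic (ours, using the numbers above):

    ```python
    # Time to write the full 82 PB at the sustained rate of 1.7 TB/s.
    capacity_pb = 82
    throughput_tb_per_s = 1.7

    seconds = capacity_pb * 1000 / throughput_tb_per_s  # 1 PB = 1,000 TB
    print(f"{seconds:,.0f} s, or about {seconds / 3600:.1f} hours")
    # -> 48,235 s, roughly 13.4 hours at sustained peak throughput
    ```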

    “NNSA’s selection of the Cray XC supercomputer, powered by future Intel Xeon and Intel Xeon Phi processors, will deliver great application performance for a wide set of codes while the binary compatibility between the processors will allow the NNSA to reuse existing codes,” said Charles Wuischpard, vice president and general manager of workstations and HPC at Intel. “Intel is excited to build upon our longstanding and successful collaboration with Cray to deliver this vanguard HPC system to the NNSA.”

    5:15p
    Friday Funny: Clean Suit

    Attention, attention, DCK friends – another weekend is (practically) here! Let’s have a little fun this afternoon with a brand new Friday Funny!

    Diane Alber, the Arizona artist who created Kip and Gary, has a new cartoon for Data Center Knowledge’s cartoon caption contest. We challenge you to submit a humorous and clever caption that fits the comedic situation. Please add your entry in the comments below. Then, next week, our readers will vote for the best submission.

    Here’s what Diane had to say about this week’s cartoon: “You have to have your data center cleaned every once in a while . . . but this seems a little extreme?”

    Congratulations to the last cartoon winner, Shahid Javed, who won with, “That blue cabling must be connecting us to the cloud…”

    For more cartoons on DCK, see our Humor Channel. For more of Diane’s work, visit Kip and Gary’s website.

    5:31p
    Amazon Intros 800-Pound Gorilla Into Cloud Storage and Collaboration Space

    Amazon Web Services (AWS) is deepening its service portfolio for mobile cloud developers and expanding into the online storage and collaboration market with a new service called Zocalo, which promises to be a fierce competitor to the likes of Box, Dropbox, SugarSync and Carbonite.

    Zocalo is a consumer-friendly storage and collaboration service with a nice front end, but the backend is secure and tailored for enterprise needs.

    While S3 provides the foundational online storage piece of Amazon Web Services (the company’s massive public cloud infrastructure business), Zocalo is more of a packaged commercial service with all the trimmings and an interface built in. It is an attempt to provide a service free of the complexity that the company claims has been plaguing enterprise solutions in this space.

    Secure, low-cost, any device

    Many use email as a way to share documents and files, but the approach is disorganized and doesn’t provide the security needed in an enterprise setting. Zocalo lets users share and gather feedback through a secure central tool on the device of their choice, including laptops, iPads, Kindle Fires and Android tablets. It integrates with existing corporate directories, such as Active Directory, and provides administrators with flexible sharing rules, audit logs and control over where data is stored.

    All data is encrypted in transit and at rest. Zocalo is also available with Amazon WorkSpaces, the recently launched Desktop-as-a-Service offering.

    Zocalo is priced at $5 per user per month, including 200 GB of storage. WorkSpaces customers get up to 50 GB for free and pay $2 a month instead of $5 for the 200 GB package.

    As with other easy-to-use online storage and collaboration services, there’s no hardware to configure and no software to deploy – it’s meant to just work. This means serious competition for the online storage and collaboration space from a service that provides a consumer experience within an enterprise setting.

    Business appropriate for a giant

    As we noted in our coverage of the latest $150 million funding round for Box, online storage and collaboration is an expensive business to be in and is often unprofitable without serious scale. Amazon’s size, coupled with the tie-in with WorkSpaces, translates into a strong offering that is very likely to last.

    Many users of early startups in the space had to worry about their providers potentially folding — and rightfully so, as several companies in the space had closed their doors, forcing mass data migrations. One particularly illustrative example was the demise of Nirvanix.

    “Customers have told us that they’re fed up with the cost, complexity and performance of their existing old-guard enterprise document and collaboration management tools,” said Noah Eisner, general manager for Amazon Zocalo. “AWS was increasingly being asked to provide an enterprise storage and sharing tool that was easy to use, allowed users to quickly collaborate with others and met the strict security needs of their organizations. That’s what Amazon Zocalo was built to do.”

    More mobile development tools

    The recent mobile development additions address the still-rising tide of mobile applications. More users are accessing the Internet through phones and tablets, and development specifically for mobile experiences continues to grow.

    The company launched Amazon Mobile Analytics, a mobile Software Development Kit (SDK), and Cognito.

    Mobile Analytics provides developers with usage reports within an hour of data being sent by the app.

    Cognito is a user identity and data synchronization service that lets developers create apps that can authenticate users through public log-in providers (signing in with Facebook accounts, for example). It also keeps app data, including user preferences and game state, synced between devices.
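
    For a rough sense of that flow, here is a sketch using Python and boto3 (a later, general-purpose AWS SDK rather than the mobile SDKs described here); it exchanges a public provider’s login for a Cognito identity, then trades that identity for temporary AWS credentials. The identity pool ID and Facebook token are placeholders.

    ```python
    # Sketch of the Cognito identity flow with boto3. The identity pool
    # ID and the Facebook access token below are placeholders.
    import boto3

    cognito = boto3.client("cognito-identity", region_name="us-east-1")

    # 1. Exchange a public provider login (Facebook here) for a Cognito ID.
    identity = cognito.get_id(
        IdentityPoolId="us-east-1:00000000-0000-0000-0000-000000000000",
        Logins={"graph.facebook.com": "<facebook-access-token>"},
    )

    # 2. Trade that identity for temporary AWS credentials, which the app
    #    can use to read and write its synced data (preferences, game
    #    state and so on).
    creds = cognito.get_credentials_for_identity(
        IdentityId=identity["IdentityId"],
        Logins={"graph.facebook.com": "<facebook-access-token>"},
    )["Credentials"]

    print(creds["AccessKeyId"], creds["Expiration"])
    ```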

    The SDK helps developers across several platforms access these new services in addition to the ones they already use.

    8:00p
    Data Center Jobs: Vision Technologies

    At the Data Center Jobs Board, we have a new job listing from Vision Technologies, which is seeking a Project Manager in Chantilly, Virginia.

    The Project Manager is responsible for providing support for site QA walk-throughs (internal and with the client), monitoring punch-list closeout, monitoring the documentation process, documentation delivery, warranty application, verifying that job costs are accurate, reviewing jobs for open materials and sub POs, and final invoicing.

    To view full details and apply, see job listing details.

    Are you hiring for your data center? You can list your company’s job openings on the Data Center Jobs Board, and also track new openings via our jobs RSS feed.

    8:30p
    Service Providers and Bitcoin: Huge Opportunity, But Tough Economics

    This is the second feature in our three-part series on Bitcoin mining infrastructure. Read the first one here.

    Is Bitcoin mining a huge opportunity for data center service providers? Or will the bulk of infrastructure for the virtual currency be built out by Bitcoin entrepreneurs?

    There’s plenty of activity on both fronts as the Bitcoin network experiences explosive growth. As large mining operations build out low-cost hashing centers, some data center providers are leasing significant amounts of space and power to Bitcoin specialists, while others remain wary of the density requirements and economics of the deals.

    Bitcoin customers are seeking to acquire lots of capacity. But they want high-density space, prefer their contracts short and cheap, and operate in a speculative and rapidly changing business.

    “There’s about 150 megawatts of Bitcoin capacity being shopped around right now,” said Mark MacAuley, a Bitcoin enthusiast and managing director at RampRate, which helps enterprise clients find data center space. “There’s this massive demand. It’s a huge opportunity, but this is a different animal.

    “Most of the data center facilities out there are Tier III and designed for mission-critical loads,” said MacAuley, who has done consulting work for cryptocurrency specialists. “Bitcoin is not mission-critical. Street power is fine. Cost is the biggest driver for them, and most traditional providers can’t meet their price point. None of these guys want long-term contracts. The service providers want to get into this business, but their product isn’t suited to this demand.”

    N+1 or N+2 don’t cut it, but “N-0.5” makes sense

    Some providers have managed to build thriving Bitcoin businesses by adapting their offerings for the sector’s specialized needs. One company that has embraced the opportunity is C7 Data Centers in Utah. Bitcoin customers represent 17 percent of C7’s business, according to CEO Wes Swenson, who said his firm houses 4.9 megawatts of Bitcoin capacity, with another 5 megawatts scheduled to come online this fall.

    “It’s an interesting challenge,” said Swenson. “Most of the data centers that I know are not taking this business. Many of these Bitcoin miners can’t find data centers that meet their requirements for cost and concentration.”

    To accommodate Bitcoin customers, C7 has rolled out a new data center design that Swenson describes as “N-0.5.” It offers high-density space with lower reliability and no service-level agreement. The room is cooled using ambient air and cold-aisle containment, with no UPS or generator backup for the mining rigs. The design requires less infrastructure and up-front investment, which makes these deals work for the provider.

    “I’m making the margin I need to reinvest in the business and still have great pricing,” said Swenson. “But I don’t know how you can run an N+1 or N+2 data center and still make money on Bitcoin.”

    Space at $1.5 million per megawatt

    A recent deal illustrates how service providers are adjusting their offerings. CyrusOne recently signed a lease for 41,000 square feet of space in its Phoenix data center for a customer housing high-density equipment in immersion cooling tanks. CyrusOne delivered the space at a cost of $1.5 million per megawatt, significantly below the $7 million per megawatt the company typically spends on enterprise data center space with “five nines” of uptime.

    CyrusOne didn’t identify the tenant or its business, but industry observers note that the deal aligns closely with the requirements of Bitcoin miners. CyrusOne CEO Gary Wojtaszek says only that the Phoenix customer “does not require the same resiliency as traditional deployments.”

    “One of the key engineering challenges we were presented with was how to achieve the lowest operating expense possible for the customer,” Wojtaszek said in CyrusOne’s recent earnings call. “Our engineering teams work closely with the customer and we were able to deliver the environment for less than $1.5 million per megawatt.”
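
    The gap between those two build costs is what makes such deals work. A quick illustration using the article’s per-megawatt figures (the 10 MW deployment size is our invented example; CyrusOne did not disclose the deal’s capacity):

    ```python
    # Capex gap between the stripped-down Phoenix build and a typical
    # "five nines" enterprise build, per the article's figures. The
    # 10 MW deployment size is an invented example; the deal's actual
    # capacity was not disclosed.
    low_resiliency_per_mw = 1.5e6  # $/MW, the Phoenix deal
    five_nines_per_mw = 7.0e6      # $/MW, typical enterprise space

    savings_per_mw = five_nines_per_mw - low_resiliency_per_mw
    print(f"${savings_per_mw / 1e6:.1f}M less capex per MW")  # $5.5M

    example_mw = 10
    print(f"At {example_mw} MW: ${savings_per_mw * example_mw / 1e6:.0f}M "
          f"less up-front investment")                        # $55M
    ```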

