Data Center Knowledge | News and analysis for the data center industry

Monday, April 18th, 2016

    7:19p
    Can Nature Help Us Solve an Impending Data Storage Crisis?

    Will the cold storage data center of the future include a DNA synthesizer? According to a new research paper by the University of Washington and Microsoft, it’s a strong possibility.

    Today, we generate data faster than we can increase storage capacity. The volume of digital data worldwide is projected to exceed 16 zettabytes sometime next year, the paper’s authors wrote, citing a forecast by IDC Research. “Alarmingly, the exponential [data] growth rate easily exceeds our ability to store it, even when accounting for forecast improvements in storage technologies,” they said.

    A big portion of the world’s data sits in archival storage, where the densest medium currently is tape, offering maximum density of about 10 GB per cubic millimeter. One research project has demonstrated an optical disk technology that’s 10 times denser than tape.

    Nature’s Data Storage

    But there’s another approach that promises storage density of 1 Exabyte per cubic millimeter, or eight orders of magnitude higher than tape. That approach is encoding data the same way nature encodes instructions for building every living thing on Earth: DNA.
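
    For a sense of scale, the figures above imply a roughly hundred-million-fold density gap. A quick back-of-the-envelope check, using only the numbers cited in this article (variable names are ours), looks like this:

    # Rough comparison of the archival densities cited above (article's figures).
    tape_density = 10e9   # ~10 GB per cubic millimeter (tape)
    dna_density = 1e18    # ~1 exabyte per cubic millimeter (DNA)

    ratio = dna_density / tape_density
    print(f"DNA is roughly {ratio:.0e} times denser than tape")  # ~1e+08, i.e. eight orders of magnitude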

    In addition to density, DNA storage addresses another big limitation of archival storage: longevity. Tape can hold data for 10 to 30 years before its integrity starts to degrade, and spinning disks are rated for three to five years. DNA’s observed half-life is more than 500 years in harsh environments, according to the paper.

    The idea of storing data in the form of synthetic DNA has been around for a long time, but huge improvements in the cost and efficiency of synthesizing and sequencing genes in recent years have made it far more feasible. The state of the art went from a 23-character message in 1999 to a 739 kB message in 2013.

    As today’s booming biotech industry delivers orders-of-magnitude cost and efficiency improvements in DNA sequencing and synthesis, it keeps raising the ceiling on how much data can practically be stored using the method. Growth in sequencing productivity eclipses Moore’s Law, the paper’s authors wrote.

    Big DNA Storage Improvements Proposed

    The work presented in the paper pushes the technology forward in two big ways: the researchers propose a way to improve the integrity of stored data (current DNA storage error rates are about 1 percent per nucleotide) and a way to access individual pieces of data at random (with current approaches, an entire DNA pool has to be sequenced and decoded to retrieve a single byte).

    The paper proposes an architecture for a DNA storage system that includes a DNA synthesizer, a storage container, and a DNA sequencer. The synthesizer encodes data to be stored, the container holds pools of DNA that map to a volume, and the sequencer reads DNA sequences and converts them to digital data.
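
    To make that division of labor concrete, here is a minimal sketch of the write and read paths such an architecture implies; the class and method names are illustrative, not taken from the paper.

    # Illustrative sketch only; all names are hypothetical.
    class DNAStorageSystem:
        def __init__(self, synthesizer, container, sequencer):
            self.synthesizer = synthesizer  # write path: digital data -> nucleotide sequences
            self.container = container      # holds DNA pools, each pool mapped to a volume
            self.sequencer = sequencer      # read path: nucleotide sequences -> digital data

        def write(self, volume_id, data):
            strands = self.synthesizer.encode(data)    # encode bytes as DNA strands
            self.container.store(volume_id, strands)   # add the strands to this volume's pool

        def read(self, volume_id):
            strands = self.container.fetch(volume_id)  # retrieve the pool for this volume
            return self.sequencer.decode(strands)      # sequence the strands and decode back to bytes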

    It addresses the error problem with redundancy, an approach that has been proposed before but without regard to the impact of redundancy on storage density. The new encoding scheme introduced in the paper offers “controllable redundancy,” where you can specify a different level of reliability and density for each type of data.
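
    As a rough illustration of that trade-off (this is not the paper’s actual encoding scheme), a single redundancy parameter could control how many extra copies of the payload strands are generated, trading density for reliability:

    # Toy illustration of a controllable-redundancy knob; not the paper's scheme.
    def encode_with_redundancy(strands, redundancy=0.5):
        """redundancy = 0.0 -> maximum density, least protection;
        redundancy = 1.0 -> every strand duplicated, i.e. half the density."""
        n_extra = int(len(strands) * redundancy)
        return strands + strands[:n_extra]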

    The problem of random access is solved by using the same technique molecular biologists use to isolate specific regions of a DNA sequence in research. Polymerase Chain Reaction is a technique used to “amplify” a piece of DNA by repeated cycles of heating and cooling. The DNA storage researchers use PCR to amplify only the desired data, which they say accelerates reads and enables specific data to be accessed without sequencing the entire DNA pool.
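
    In storage terms, that amounts to tagging each stored object with its own primer pair and amplifying only the matching strands before sequencing. A sketch of the idea (all names here are hypothetical, not the paper’s API) might look like this:

    # Sketch of PCR-based random access; names are hypothetical.
    primers_for_key = {"photo_archive_001": ("FWD_PRIMER_17", "REV_PRIMER_17")}

    def random_access_read(key, pool, pcr, sequencer):
        fwd, rev = primers_for_key[key]          # look up the primer pair tagging this object
        amplified = pcr.amplify(pool, fwd, rev)  # PCR copies only strands flanked by these primers
        return sequencer.decode(amplified)       # sequence just the amplified subset, not the whole pool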

    While DNA storage is not practical today, the rate of progress in DNA sequencing and synthesis in the biotech industry and the “impending limit of silicon technology” make it something computer architects should seriously consider today, the researchers conclude. They envision hybrid silicon and biochemical archival storage systems as the ultimate cold storage of the future.

    7:49p
    GoDaddy CTO Elissa Murphy Resigns to Join Google
    By The WHIR

    GoDaddy chief technology officer and executive vice president of its cloud division Elissa Murphy resigned last week to join Google.

    According to a regulatory filing by GoDaddy, Murphy’s resignation is effective on May 17, 2016.

    GoDaddy told Fortune that GoDaddy CIO and infrastructure officer Arne Josefsberg will take over Murphy’s position. Josefsberg joined the company in 2014 after a stint at enterprise cloud company ServiceNow and, prior to that, a 26-year career with Microsoft.

    GoDaddy CEO Blake Irving told Fortune that Murphy was “instrumental in the evolution of the company, and while we will miss her greatly, we wish her the very best as she takes on a new and exciting challenge at Google.”

    Read more: GoDaddy Launches Cloud Servers and Bitnami-Powered Cloud Apps Worldwide

    Irving worked with Murphy at Microsoft and Yahoo, where she was VP of engineering. Murphy joined GoDaddy as Irving was committed to changing the face of the company, including dropping its sexist ads. According to a report by Wired, GoDaddy now has more women in technical and engineering jobs than companies like Google and Facebook. Murphy was instrumental in setting up GoDaddy’s partnership with the Anita Borg Institute.

    With Murphy, GoDaddy updated its online infrastructure, including building atop software tools like OpenStack, Hadoop, and Spark. While some wonder whether GoDaddy will be able to keep up the momentum it built overhauling its technology with Murphy, the company continues to roll out new cloud services. For example, on Monday GoDaddy said it is building on its existing partnership with Microsoft and now offers GoDaddy domains and SSL certificates through the Microsoft Azure App Service platform.

    This first ran at http://www.thewhir.com/web-hosting-news/godaddy-cto-elissa-murphy-resigns-to-join-google

    9:53p
    What’s the Storage Engineer’s Future Role in IT?

    Matt Watts is Director of Technology and Strategy for NetApp EMEA.

    Over the next four years, 35 percent of the core skills you have today will change. As a storage engineer, what do you think these changes might be?

    I recently had an interesting meeting with storage engineers who were considering a move to a commodity type storage infrastructure with little or no data management capabilities.

    They were so enamored of the technical specifications of the new equipment that they said they were quite comfortable handing all of the data management capabilities inherent in their existing storage array over to the virtualization layer or to the applications themselves. Basically, all of the value they used to offer, such as data management, data protection, and replication, was going to be handed to other teams in the organization.

    If you take all the value you used to deliver and give it to someone else to do, what exactly will you do in the future and why would your company need so many of you?

    As hypervisor- and application-level protection becomes more robust, you effectively give up a significant part of the value you used to offer. You need a clear view of what you believe your role will be in the future, because it will clearly be quite different from the role you have today.


    I’ve been in the storage business for 20 years. Back then, our focus was really granular. We had to choose which area of the physical disk to store the data on: outer edge, outer middle, center, inner middle, or inner edge. Each location provided different performance.

    We then evolved to choosing different RAID levels. Each type of RAID offered benefits but also drawbacks. RAID 0 was fast but offered no protection. RAID 10 was fast as well, but it had a significant capacity overhead. RAID 4 and 5 were efficient, but their parity overheads could limit performance.
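
    Those trade-offs are easy to quantify. A quick sketch, using a hypothetical shelf of eight 4 TB drives, shows how much usable capacity each classic level leaves you:

    # Usable capacity under a few classic RAID levels (illustrative figures only).
    def usable_capacity(n_drives, drive_tb, level):
        if level == "RAID 0":   # striping only: full capacity, no redundancy
            return n_drives * drive_tb
        if level == "RAID 10":  # mirrored stripes: half the raw capacity
            return n_drives * drive_tb / 2
        if level == "RAID 5":   # single parity: one drive's worth of capacity lost
            return (n_drives - 1) * drive_tb
        raise ValueError(f"unknown level: {level}")

    for level in ("RAID 0", "RAID 10", "RAID 5"):
        print(level, usable_capacity(8, 4, level), "TB usable from 8 x 4 TB drives")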

    Today, RAID has evolved so far that, whilst it’s still a consideration, it’s just a tiny part of what we concern ourselves with. Now we worry about volumes and policy. We think about what volumes are going to be created, what level of protection they require, and what replication and application integration they need. This policy-type management is where we are now.

    Going back to my meeting with the storage engineers, what was really interesting was that even this no longer seemed to be their concern. They are now handing the data management capabilities over to the teams that manage the application or virtualization layer.

    So what’s left? The logical conclusion in this case is that managing storage is becoming less complex and therefore requires fewer people.

    The Business of the Business

    Within IT, the focus has to move toward understanding what the business wants to do with its applications and the data they create. Storage engineers need to be closer to these teams. The connection between those using the service and those delivering it has to be much stronger. You used to look at the physical layer, but now the question is much more about why the data is being created: how to keep it simple, fluid, and agile, and how to accelerate cycles. It’s a much more service-oriented approach that ties into best practices.

    With more companies moving to a hybrid cloud approach, some of the storage you are going to deploy may not be your own. If a service is required for test or development and you need to deploy quickly, then maybe AWS is an option.

    You need to help create the policy framework for keeping data on premises or deploying it to a hybrid cloud. And if you choose a hyperscale provider such as AWS, then you’d better be sure you can easily bring data back.

    Technology moves at an incredibly fast pace. It’s incumbent on all of us to consider what technologies may be appropriate for the future. But equally important is for us to consider what our role will become as these changes occur.

    The changes in technology are both exciting and intimidating. As an IT professional, it is critical that you see yourself as a business professional and not simply a technology specialist. Finding ways to contribute to the success of the business will be a key skill in the future.

    10:42p
    Coca-Cola Selling Atlanta Data Center as it Shifts Apps to Cloud

    Coca-Cola is one of the big corporations shrinking the amount of data center capacity they operate on their own by moving more and more applications to cloud service providers.

    The company’s Atlanta data center is up for sale, according to the commercial real estate firm Jones Lang LaSalle. The facility is nearly 90,000 square feet.

    Amazon, Google, and other cloud providers host more than 20 percent of Coke’s computing environment today, which could go up to 50 percent in the next two to three years, the beverage company’s CIO Ed Steinike recently told the Wall Street Journal.

    Like many other corporate giants, Coke wants to take advantage of the infrastructure flexibility enabled by cloud services and the new technologies cloud allows it to use: technologies that would take the company a long time and a lot of money to build support for on its own. Those include things like support for a mobile workforce, the latest security tools, and user interface design capabilities.

    Read more: How Juniper Went from 18 Data Centers to One

    The WSJ article also highlights corporate data center consolidation via cloud by General Electric and mentions a recent warning to investors by JPMorgan Chase that traditional IT infrastructure vendors are under serious threat.

    Gartner expects spending on public cloud services worldwide to grow 16 percent this year, reaching more than $204 billion, while forecasting global IT spending to decline half a percent from last year to $3.49 trillion.

    Read more on the WSJ site.

