Data Center Knowledge | News and analysis for the data center industry
 

Monday, November 2nd, 2015

    1:00p
    Data Center Network Traffic Four Years from Now: 10 Key Figures

    As more people get mobile devices and as people and businesses use more cloud services, cloud traffic will grow faster than data center traffic.

    The latest update to Cisco’s Global Cloud Index, an annual assessment and future projection of global network traffic trends, forecasts that while data center networking traffic will triple from 2014 to 2019, global cloud traffic – considered a subset of data center traffic by Cisco – will more than quadruple within the same timeframe.

    In fact, by 2019 cloud traffic will account for most data center network traffic, according to Cisco.

    Here are 10 key figures from the latest cloud index:

    26 months: The length of time the world’s entire population in 2019 would have to stream music to generate the amount of traffic Cisco expects to travel to, from, and between the world’s data centers that year (a rough back-of-the-envelope check of this figure follows the list).

    10.4 zettabytes: The total volume of data center traffic projected for 2019. Cisco defines data center traffic as “all traffic traversing within and between data centers as well as to end users.”

    3.4 zettabytes: The volume of data center traffic in 2014.

    16 terabytes: Data storage capacity of a Samsung solid-state drive currently considered to be the world’s largest hard drive. For scale, a zettabyte is about 1 billion terabytes.

    8.6 zettabytes: Total volume of cloud traffic projected for 2019. That’s most of the data center traffic projected for that year. The report defines cloud traffic, a subset of data center traffic, as traffic generated by cloud services hosted in scalable, virtualized cloud data centers, accessible through the internet.

    8.4: The number of workloads per physical server in cloud data centers in 2019. That’s up from 5.1 in 2014.

    3.2: Workload density per physical server in traditional (non-cloud) data centers expected in 2019 – a small increase from 2 workloads per server in 2014.

    56 percent: Percentage of cloud workloads that will run in public cloud data centers in 2019 – up from 30 percent in 2014.

    44 percent: Percentage of cloud workloads that will run in private cloud data centers in 2019 – down from 70 percent in 2014.

    >2,500 kilobits per second: Download speed requirement for advanced cloud applications. The upload requirement is 1,000 kbps or more, and latency requirement is under 100 milliseconds.
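
    Some of those headline figures lend themselves to a quick sanity check. The short Python sketch below confirms that 10.4 zettabytes is roughly a tripling of 3.4 zettabytes, restates the zettabyte-to-terabyte conversion, and reproduces the 26-month streaming estimate. The world population (about 7.6 billion in 2019) and the music-streaming bitrate (about 160 kbps) are our assumptions for illustration, not inputs published by Cisco.

        # Back-of-the-envelope check of the Cisco Global Cloud Index figures above.
        # Assumptions (not from the report): 2019 world population and a typical
        # compressed-audio streaming bitrate.
        ZETTABYTE = 1e21   # bytes
        TERABYTE = 1e12    # bytes

        dc_traffic_2019 = 10.4 * ZETTABYTE   # projected data center traffic, 2019
        dc_traffic_2014 = 3.4 * ZETTABYTE    # data center traffic, 2014

        print(f"Growth 2014 to 2019: {dc_traffic_2019 / dc_traffic_2014:.1f}x")  # ~3.1x, roughly triple
        print(f"1 zettabyte = {ZETTABYTE / TERABYTE:.0e} terabytes")             # 1e+09, about 1 billion

        population_2019 = 7.6e9    # assumed world population in 2019
        bitrate_bps = 160e3        # assumed streaming bitrate, bits per second

        bytes_per_person = dc_traffic_2019 / population_2019
        seconds_streaming = bytes_per_person / (bitrate_bps / 8)
        months = seconds_streaming / (3600 * 24 * 30)
        print(f"Continuous streaming per person: about {months:.0f} months")     # lands in the mid-20s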

    Here’s a Cisco infographic on global data center traffic projections:

    [Infographic: Cisco Global Cloud Index 2015 – global data center traffic projections]

    4:00p
    Debunking Unfounded Cloud Security Fears

    Working with modern cloud and data center technologies isn’t always easy. We’re seeing a boom in cloud adoption and data center utilization. A recent Cisco report estimates that by 2019, more than 86 percent of workloads will be processed by cloud data centers. Still, with all the success of cloud computing, it’s practically impossible to avoid the cloud security conversation.

    Let’s face it: your data is valuable, and the bad guys are constantly looking for ways to get at it. An entire economy has grown up around hacking, and the value of stolen information keeps rising. Juniper Research projects that the rapid digitization of consumers’ lives and enterprise records will push the cost of data breaches to $2.1 trillion globally by 2019, almost four times the estimated cost of breaches in 2015.

    Today, we’re seeing breaches of unprecedented scale. What’s often lost in the conversation, however, is that the vast majority of these breaches happen within traditional data center environments. Have we seen a massive cloud security breach at Amazon Web Services? What about Microsoft Azure? Yes, we’ve seen cloud outages for various reasons – nothing is ever perfect – but the biggest cloud service providers have yet to suffer a massive security breach. The reality is that the cloud may be a lot safer for a business than you think:

    • FEAR: I’m worried about this whole “shared” cloud infrastructure!
    • REALITY: Modern cloud environments are specifically designed for secure multi-tenant workload delivery. That’s their DNA. Providers secure the underlying infrastructure, keep tenants’ systems isolated from one another, and let you build your own environment on top. If you’re still concerned, simply ask for a dedicated server or dedicated virtual infrastructure. You might pay a bit more, but you’ll “own” that space. Even so, a properly designed multi-tenant environment is a good way to consolidate users, reduce costs, and still maintain a secure operating space.
    • FEAR: I can never have my compliance or regulation-bound workloads in the cloud.
    • REALITY: Today, organizations across all verticals are migrating to the cloud. Providers now allow all sorts of workloads to live on or pass through their infrastructure, and a multitude of certifications and frameworks have been updated to accommodate it, including PCI DSS, SOX, HIPAA, FISMA, IEEE standards, and many more. Even the government got into cloud with FedRAMP. Here’s an example: HIPAA compliance has traditionally been a cloud nightmare, but the HIPAA Omnibus Rule extended the “business associate” designation to cloud providers, allowing them to handle protected health information under a business associate agreement. Similarly, you can build powerful cloud e-commerce gateways that are also PCI DSS compliant.
    • FEAR: My business leaders worry about not being able to manage what they can’t see.
    • REALITY: Cloud management has come a long way. In fact, one of the dominant cloud models today is the hybrid cloud. In a recent report, Gartner analysts said that cloud use continues to grow and will make up the bulk of new IT spend by 2016, a defining year in which private cloud begins to give way to hybrid cloud; nearly half of large enterprises are expected to have hybrid cloud deployments by the end of 2017. In other words, organizations are finding ways to connect their on-premises environments with the cloud. It’s important to see your cloud environment as an extension of your business: you will need to build in good management, sound resource-control policies, and security best practices. Cloud providers, for their part, now offer granular management and visibility controls, so you have more control over your data, applications, and even users in the cloud than ever before. One of the best ways to see for yourself is to test it out – almost every major cloud provider will let you trial a small piece of its cloud with a group of users or a few applications.

    Many of these fears stem primarily from a lack of knowledge about the available cloud options. There are many use cases for cloud computing, and yours may be unique. Some organizations plan entire data center migrations, while others only want to move one app. Either way, it’s a good idea to work with a provider whose core business is cloud services. At the very least, testing out cloud platforms is easier than ever before. Platforms like AWS, Azure, and Google Cloud Platform were designed with secure multi-tenant delivery in mind, and many providers build their own systems for security, network routing, load balancing, and automation.

    Before you dive into cloud with both feet, however, know that there is no silver bullet for security: poorly designed workloads and cloud environments can still leave you vulnerable. Know what you’re hosting and how it’s being accessed, and always apply security policy best practices. That means testing your systems and deploying an architecture that can fence off and stop an intrusion quickly if an attack does occur. Cloud and security can absolutely get along.
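
    As a minimal illustration of that kind of fencing, the sketch below uses AWS’s boto3 SDK to lock an application tier down so it accepts traffic only from a web-tier security group. The VPC ID, security group IDs, and port are hypothetical placeholders, and a real deployment would add network ACLs, logging, and monitoring on top; this is one example of the granular controls providers expose, not a complete security design.

        # Sketch: least-privilege segmentation in a public cloud using AWS security
        # groups via boto3. All identifiers below (VPC ID, group IDs, port) are
        # illustrative placeholders, not values from the article.
        import boto3

        ec2 = boto3.client("ec2", region_name="us-east-1")

        # Create a security group for the application tier inside an existing VPC.
        app_sg = ec2.create_security_group(
            GroupName="app-tier",
            Description="App tier: reachable only from the web tier",
            VpcId="vpc-0123456789abcdef0",   # hypothetical VPC
        )

        # Allow inbound traffic on the app port ONLY from the web tier's security
        # group -- not from the internet and not from other tenants or tiers.
        ec2.authorize_security_group_ingress(
            GroupId=app_sg["GroupId"],
            IpPermissions=[{
                "IpProtocol": "tcp",
                "FromPort": 8443,
                "ToPort": 8443,
                "UserIdGroupPairs": [{"GroupId": "sg-0fedcba9876543210"}],  # hypothetical web-tier group
            }],
        )
        # With no other ingress rules, the web tier is the only path to the app
        # tier, which makes an intrusion far easier to fence off and shut down.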

    5:15p
    ‘DataGeddon’ and Data Center Location

    Lisa Rhodes is Vice President of Marketing and Sales for Verne Global.

    Most of the world’s data has been created in the past few years, and thanks to the proliferation of IoT and Big Data, there doesn’t seem to be an end in sight. Some even refer to this phenomenon as “DataGeddon.” So, what exactly is DataGeddon? Put simply, it is the sheer volume of data being collected, managed and stored. There is a direct correlation between the tailored communications campaigns companies run for loyal consumers and the amount of data generated.

    Companies have audiences that trust them, and those companies must create content that triggers an emotional response to act, purchase or consume. Consumers show greater brand loyalty because the company knows more about their preferences and tailors its communications accordingly. Someone who wants the brands they buy to have a social or environmental benefit, for example, will likely want all the information they can get about both the brand and the causes it represents.

    Companies need to consider where this data is going to be stored and the infrastructure that has to be in place to support it. This is becoming a critical business decision as the offices of the CIO, the CMO and the data scientists converge. As companies move toward a more connected and interactive experience for their customers, the devices involved need access to a large number of servers that provide the necessary processing and data.

    At the heart of this lies the data center, where all of this content is stored and processed. Data centers, however, require a substantial amount of power, which is not cheap and not always readily available or reliable. A recent report showed that in 2013, U.S. data centers consumed an estimated 91 billion kilowatt-hours of electricity – enough to power all the households in New York City twice over – and are on track to reach 140 billion kilowatt-hours by 2020. For this reason, companies are looking for more creative ways to address their growing storage needs, such as moving their data centers to Nordic locations that offer a reliable power grid, renewable energy, a cool climate and high-speed networks.

    With companies generating more and more data as they aim to create a customized experience for their customers, how and where that data is stored has become a key topic not only for companies but for their customers as well. Executives see the need to be more creative, and as they consider how to combat the threat of “DataGeddon,” the old adage of “location, location, location” might just be part of the answer.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    6:37p
    Bloomberg Buys 3MW of Solar for New York Data Center

    Bloomberg has agreed to purchase energy from a high-capacity solar power project planned by SunEdison to supplement the energy mix powering Bloomberg’s new data center in Orangetown, New York, a suburb on the Hudson River, north of New Jersey.

    The New York-based market data and news giant has contracted for 2.9 MW of the future solar project’s output. The plant is expected to generate only 5 percent of the 7 MW data center’s energy requirements, but the associated reduction in carbon emissions is estimated at more than 11,600 metric tons over the course of the agreement, according to SunEdison. That’s equivalent to taking 6,800 cars off the road, the renewable energy company said in a statement.

    Launched earlier this year, the Orangetown facility is one of three Bloomberg data centers. The company built it after its older data center in Manhattan came close to an outage as a result of flooding brought on by Hurricane Sandy in 2012.

    Read more on Bloomberg’s data center infrastructure in this Data Center Knowledge feature

    Bloomberg is buying the energy through a long-term power purchase agreement with the developer – an increasingly common way to buy renewable energy for data centers, employed to a great extent by companies like Google, Microsoft, and Facebook.

    Such PPAs don’t supply energy directly to the data centers. A typical PPA provides funding to build a renewable energy project in the data center’s vicinity that will add generation capacity to the same grid that powers the data center.

    The data center operator usually gets renewable energy credits equivalent to the amount of energy generated by the new plant it contracts for and uses them to offset the facility’s carbon footprint.
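
    The numbers in this particular deal line up with that accounting. The rough sketch below shows how a 2.9 MW solar plant can cover about 5 percent of a 7 MW facility’s annual consumption; the roughly 12 percent solar capacity factor is our assumption for the New York area, not a figure from SunEdison or Bloomberg, and the sketch assumes the facility draws its full 7 MW around the clock.

        # Rough REC accounting for the Bloomberg / SunEdison deal described above.
        # The solar capacity factor is an assumption (typical for the Northeast),
        # not a figure disclosed by either company.
        HOURS_PER_YEAR = 8760

        solar_capacity_mw = 2.9
        solar_capacity_factor = 0.12   # assumed average output vs. nameplate rating
        solar_mwh = solar_capacity_mw * HOURS_PER_YEAR * solar_capacity_factor

        datacenter_load_mw = 7.0       # assumes the facility draws full load 24/7
        datacenter_mwh = datacenter_load_mw * HOURS_PER_YEAR

        share = solar_mwh / datacenter_mwh
        print(f"Solar generation: about {solar_mwh:,.0f} MWh per year")
        print(f"Data center consumption: about {datacenter_mwh:,.0f} MWh per year")
        print(f"Share covered by the PPA's RECs: about {share:.0%}")   # roughly 5 percent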

    There are also other, more complex schemes, employed by Google’s parent company Alphabet, where the data center operator acts as an energy buyer and seller on the wholesale market. Most data center operators that use PPAs today, however, don’t take that route, as it requires the operator to register a company as a utility.

    SunEdison has scored much larger renewable PPAs for data centers recently. Earlier this year Equinix, one of the world’s largest data center service providers, contracted with the developer for energy from a future 150 MW solar project in an agreement expected to offset energy consumption of Equinix’s entire California data center footprint.

    Also this year, HP signed a 112 MW PPA with SunEdison to generate enough solar power to cover HP’s massive Texas data center operations.

    8:01p
    Anexio Buys Bankrupt Net Data Centers’ East Coast Assets

    Anexio Technology Services, a Raleigh, North Carolina-based IT services firm, has acquired the East Coast data center assets of the bankrupt service provider Net Data Centers, formerly Net2EZ, giving some relief to one of the troubled company’s landlords, DuPont Fabros Technology.

    Squeezed by high operating costs and competition in a busy market, Net2EZ filed for bankruptcy early this year and changed its name to Net Data Centers shortly after. Three Twenty-One Capital Partners, the investment banking firm Net retained to help sell its assets, said the client’s financial problems were caused by its “above-market” data center rental rates.

    On DuPont Fabros’s third-quarter earnings call last week, the wholesale data center provider’s executives acknowledged that Net’s rates were substantially above market. Anexio has signed new leases with DuPont at all four East Coast data centers Net uses at lower rental rates and for less capacity.

    Anexio leases will generate $0.085 per kW annually for the landlord, compared to $0.16 per kW that was generated by Net leases, DuPont CFO James Foster said on the call. “Given that Net’s leases were well above market, the base rent on the new leases decreased 34 percent per kW month on a cash basis and 18 percent per kW per month on a GAAP basis,” he said.

    Anexio leased about 2 MW less capacity, taking about 4 MW across the three facilities in Virginia and one in New Jersey.

    Prior to the Anexio deal, DuPont had a revenue-sharing agreement with Net, which could not continue paying rent. Under the agreement, the data center provider had been receiving more than 80 percent of all revenue from Net customers in its facilities, and it had made $3.6 million by the time the Anexio deal closed and the agreement ended.

    DuPont CEO Christopher Eldredge said on the call the lease would give Anexio the ability to provide managed services on top of the Net data center footprint. The company has not had a substantial data center footprint until now, he said.

    Andrew Power, CFO of Net’s other big data center landlord in California, Digital Realty Trust, said Digital was “on the cusp of a happy ending to the Net Data Centers bankruptcy.”

    Net, he said on Digital’s third-quarter earnings call last week, has chosen to assume all of its leases at Digital’s downtown Los Angeles and El Segundo facilities with no changes to rental rates. Net has remained current with Digital on the obligations it has incurred since its bankruptcy petition.

    The tenant still owes Digital $1 million in pre-petition rent, which the landlord expects to collect this quarter.

    8:30p
    Enterprise Storage Startup ClearSky Data Raises $27M


    This article originally appeared at The WHIR

    Storage network provider ClearSky Data announced on Monday that it has raised $27 million in a Series B investment round led by Polaris Partners. ClearSky said it will use the funding to grow its sales, marketing and operations organizations and add new points of presence.

    Previous investors General Catalyst and Highland Capital Partners also participated in the round, as did Akamai Technologies with a strategic investment. With this latest round, ClearSky’s total funding has reached $39 million.

    ClearSky uses a managed services delivery model for enterprise storage, which Polaris Partners said represents “the future of enterprise infrastructure.”

    “Managing primary storage is a huge and poorly served challenge in the enterprise infrastructure market. Legacy systems have failed to address customer needs and no one has figured out how to leverage the economics and agility of the cloud – until now. ClearSky has addressed this challenge head-on, creating a breakthrough software-as-a-service (SaaS) offering for primary storage,” managing partner of Polaris Partners Dave Barrett said in a statement. “We’ve entered the next generation of the enterprise and ClearSky has a great opportunity to build the next major infrastructure company.”

    Barrett will join the ClearSky board of directors, and Andy Champagne, vice president and chief technology officer of Akamai Labs at Akamai Technologies, will join as a board observer.

    “ClearSky’s global storage network is the way for enterprises to move their data to the cloud—with the security, performance and availability they need to succeed,” Champagne said in a statement. “At Akamai, we know firsthand how to build a global network that meets the stringent requirements of enterprise customers, and we look forward to working with the ClearSky team to help customers worldwide as they adopt this new approach to enterprise storage.”

    Based in Boston, ClearSky has colocation facilities in Boston, Philadelphia and Las Vegas. The company currently has 40 employees.

    Earlier this year, storage startup Scality raised $45 million in a Series D funding round.

    This first ran at http://www.thewhir.com/web-hosting-news/clearsky-data-raises-27-million-to-scale-enterprise-storage

    8:31p
    Netflix Updates Open Source Projects with Docker Containers, More


    This post originally appeared at The Var Guy

    Netflix is taking steps to make its collaboration with open source developers easier by overhauling the Netflix Open Source program. Among other changes, the company will now release its open source projects as Docker containers to simplify access.

    Netflix began aggressively open-sourcing certain parts of its code base and related products in 2012. It has also offered prizes to developers who create new open source code that is useful for Netflix, particularly in the cloud.

    Now, the company has announced an overhaul of its open source initiative. That’s not because open source hasn’t served it well. On the contrary, according to Netflix, the changes reflect new challenges that have arisen as the number of open source projects the company supports has grown very large.

    To make it easier for developers to find projects of interest, the company has updated its GitHub portal for open source projects. Projects are now categorized according to type, with clearer explanations of how projects relate to one another.

    In addition, Netflix will make most of its open source projects available as Docker containers, a change intended to make it easier for developers to get code running quickly. “We have found that it is far easier to help our users’ setup of our projects by running pre-built, runnable Docker containers rather than publish source code, build and setup instructions in prose on a Wiki,” the company explained.
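
    To make the shift concrete, here is a minimal sketch of what consuming such a pre-built container can look like from the developer’s side, using the Docker SDK for Python. The image name, tag and port are illustrative placeholders rather than an actual Netflix OSS artifact, and the equivalent docker pull and run commands on the CLI would work just as well.

        # Sketch: pulling and running a pre-built open source container instead of
        # cloning, building, and configuring the project from source. The image
        # name and port below are placeholders, not a real Netflix OSS image.
        import docker

        client = docker.from_env()

        # A single pull replaces the "source code, build and setup instructions"
        # workflow the Netflix announcement describes.
        image = client.images.pull("example/netflix-oss-service", tag="latest")

        # Run the service locally, mapping its HTTP port to the host.
        container = client.containers.run(
            image.tags[0],
            detach=True,
            ports={"8080/tcp": 8080},
            name="oss-demo",
        )
        print(f"{container.name} is running; try http://localhost:8080/")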

    That’s not all. There are more changes to come in Netflix’s open source endeavors, the company says, including more transparency about development activity on projects, as well as documentation. It did not announce a timeline for implementing those changes.

    This first ran at http://thevarguy.com/open-source-application-software-companies/110215/netflix-updates-open-source-projects-docker-containers-an

