Data Center Knowledge | News and analysis for the data center industry

Tuesday, December 22nd, 2015

    1:00p
    Data Centers in 2016: Vendor Predictions

    Every year starting around mid-November we get flooded with emails from vendors in the data center market (and outside of it) with their predictions for the following year. Some are interesting, some aren’t, and most are of course self-serving.

    We’ve recently gone through all the predictions vendors have sent us so far this year and tried to distill the ones we think are most interesting and meaningful for the data center industry.

    Here they are, the top data center predictions for 2016 from vendors in the space:

    Data Center Networking

    Doug Murray, CEO, Big Switch Networks

    The top five hyperscale players (Google, Facebook, Microsoft, Amazon, Alibaba) will exceed 20 percent of total worldwide infrastructure spend, further exacerbating the “lead (read: actually be the top five), follow (read: do what the top five does), or get out of the way (read: outsource to cloud)” environment that’s upsetting traditional IT provider businesses.

    Open Source in the Data Center

    John Engates, CTO, Rackspace

    What many companies don’t want is vendor lock-in. Choosing the wrong technology or provider ranks up there with security as one of the biggest worries keeping CIOs up at night. And that’s why I predict open source will continue to play a critical role in cloud growth. Last year I described OpenStack, at age five, as boring, and explained why that was good. Boring means stable, and that stable foundation will allow enterprises in 2016 to more fully embrace open source cloud solutions and make them part of their overall cloud strategies.

    Flash Storage for Big Data

    John Schroeder, CEO and co-founder, MapR

    Storage, particularly flash, becomes an extremely abundant resource as next-generation, software-based storage technology enables multi-temperature (fast and dense) solutions. Flash memory is a key technology that will enable new product designs in the consumer, computer, and enterprise markets. Consumer demand for flash will continue to drive down its cost, and flash will begin to appear in Big Data deployments. The optimal solution will combine flash and disk to support both fast and dense configurations. In 2016, this new generation of software-based storage that enables multi-temperature solutions will proliferate, so organizations will not have to choose between fast and dense; they will be able to get both.

    Applications First, Infrastructure Second

    Docker

    The ability to develop apps that can be deployed anywhere using containerization lessens the demands, and therefore the focus, on the data center and software-defined data center architectures. Dev, Ops, and IT as a whole will be free to think more in terms of what benefits end users most, unconstrained by the costly and time-consuming infrastructure management, expansion, and enablement technologies that have confined them to date.

    IoT and Data Center Interconnection

    Tony Bishop, VP, Global Enterprise Vertical Strategy and Marketing, Equinix

    Record-breaking adoption and expansion of IoT devices and sensors will continue to accelerate, resulting in a flood of data that severely strains network capacity and security. We predict that in 2016, enterprises will work to more seamlessly combine networked intelligence with the data being processed by sensors and actuators. This will enable the enterprise to gain more control of its information and enhance its ability to use the IoT to quickly adapt to changing conditions, create new value, and drive new growth. Enterprises can improve their agility in rapidly changing IoT environments by deploying infrastructure that enables direct and secure connections between the multiple components that must be in sync to exploit the real-time insights the IoT offers. That kind of interconnection ensures employees, partners, and customers can get the information they need, in the right context, using the devices, channels, and services they prefer. Businesses will realize this, and robust interconnection will become a more prominent solution in this space in 2016 as an intersection point between the IoT, clouds, and the enterprise becomes increasingly necessary.

    Data at the Jagged Edge

    Scott Gnau, CTO, Hortonworks

    Businesses must look beyond the edge of their data centers all the way out to the jagged edge of data. Data flows now originate outside the data center from many devices, sensors and servers on, for example, an oil rig in the ocean or a satellite in space. There is a huge opportunity to manage the security perimeter as well as to provide complete data provenance across the ecosystem. “Internet of Anything” creates a new paradigm that requires new thinking and new data management systems, and these solutions will mature and permeate the enterprise next year.

    From Centralized to Distributed

    John Hawkins, VP, marketing and communications, vXchnge

    Location. Location. Location. Producers are leveraging a federated model when it comes to data centers, relying on a larger number of strategically located facilities rather than a single central hub. For example, rather than storing massive amounts of data in a few select data centers, application providers are moving their applications to “the edge” (locations where they can serve customers locally and reach more businesses and more consumers in more markets) in order to be closer to the consumer, reduce latency, and improve performance.

    6:36p
    Pivotal Buys CloudCredo in European Enterprise PaaS Market Push


    Article courtesy of theWHIR

    Pivotal, the EMC-controlled provider of Big Data and Platform-as-a-Service products geared toward enterprise developers, has acquired London-based CloudCredo, a European specialist in enterprise PaaS based on Cloud Foundry, the companies announced Monday.

    Cloud Foundry is an open source PaaS created by VMware, which later passed the helm to Pivotal, its sister company. Cloud Foundry is now governed independently as an open source project, but Pivotal continues to contribute substantially.

    CloudCredo was founded in 2013 and provides enterprise PaaS deployment and services, as well as log analysis technology through its stayUp subsidiary, which is also included in the deal.

    “CloudCredo enhances Pivotal’s powerful next-generation portfolio of products and services by bringing extensive knowledge of deploying, running and customizing Cloud Foundry for some of the world’s largest and most admired brands,” said Rob Mee, CEO of Pivotal. “With this expertise, we can better help our customers transform their enterprises by embracing and leveraging Pivotal’s Cloud Native platform more quickly.”

    The companies tout CloudCredo’s “unrivaled experience” with SLA-driven Cloud Foundry deployments, and Fortune reports it was the first company to integrate Cloud Foundry with Docker. Cloud Foundry added native Docker support in November, around the time Microsoft Azure was launching support for the Platform-as-a-Service (PaaS). Support for Docker, and containers in general, was also a point of emphasis in Pivotal Cloud Platform Group VP and GM James Watters’ conversation with the WHIR about Cloud Foundry in March.

    These two strengths indicate what Pivotal may integrate into its services from its new acquisition, while CloudCredo will continue to operate in London and push Pivotal Cloud Foundry into the European enterprise market.

    “The pool of truly elite Cloud Foundry systems talent, in other words BOSH, in Europe is limited. So is the pool of services companies with a proven track record of moving the dial on training and management in cloud native development,” said James Governor, analyst and founder of RedMonk. “Pivotal gets both by acquiring CloudCredo.”

    According to Fortune, at least one other Cloud Foundry Foundation member company, possibly several, was interested in buying CloudCredo. Foundation members, in addition to Pivotal, EMC, and VMware (which are all related through the pending Dell deal), include IBM, SAP, Oracle, and HP Enterprise.

    As a “blowout” year for cloud and hosting acquisitions closes out, it should be little surprise that strengths in container support, SLA-driven deployments, and global market reach are attractive to buyers.

    This first ran at http://www.thewhir.com/web-hosting-news/pivotal-adds-cloud-foundry-expertise-european-clout-with-cloudcredo-acquisition

    6:42p
    Five Key Criteria for CIOs When Choosing an IaaS Provider 

    Yoav Mor is a Cloud Solution Evangelist for Cloudyn.

    Enterprises are the new focus of public cloud providers, and in turn, the flexibility and scalability of the public cloud have sparked genuine interest among enterprises across the globe. The initial challenge enterprises face once they decide to make the move to the cloud is choosing an infrastructure as a service (IaaS) provider that’s right for them.

    If you’re a veteran CIO, you may be used to face-to-face interactions with a managed service provider or colocation provider whenever your organization needed new infrastructure capacity. But the cloud has changed the way business is done. With the cloud’s enterprise appeal, you are suddenly faced with the option to purchase infrastructure services online, with the click of a button, and it’s up to you to make the right decision.

    The question is, how do you qualify IaaS solutions? How can you increase your speed of innovation without harming your current environment? How can migrating to the cloud help you reach your business objectives? Below are five key criteria for CIOs to consider when choosing an IaaS provider:

    Understand Your Cloud Provider’s Service Level Agreement (SLA)

    In the public cloud, SLAs cover a range of elements that affect your service, such as performance, high availability, and security. Now that more and more enterprises are migrating to the cloud, cloud providers are under pressure to create and abide by enterprise-class SLA guidelines that actually suit their enterprise customers’ needs. It’s your job to make sure the SLA looks realistic, and, contrary to what you might think, it is open to negotiation.
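
    As a quick sanity check when reading an SLA, it helps to translate an availability percentage into an actual downtime budget. A minimal Python sketch (the 730-hour month is an approximation; use the SLA's actual measurement window):

        def downtime_budget_minutes(availability_pct, period_hours=730):
            # 730 hours is roughly one month
            return (100 - availability_pct) / 100 * period_hours * 60

        # A 99.95% monthly SLA allows roughly 22 minutes of downtime
        print(downtime_budget_minutes(99.95))  # ~21.9
        # A 99.9% SLA allows about 44 minutes
        print(downtime_budget_minutes(99.9))   # ~43.8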

    Find the Balance Between Performance and Cost

    While lower, more flexible costs are well-known characteristics of the cloud, to really maximize cost efficiency your cloud environment needs to run in the most efficient way possible. You should constantly strive to find new ways to reduce costs, so long as you continue to meet your workloads’ needs (e.g., performance, security, and availability). Additionally, you should be able to provision resources based on business objectives, such as differentiated performance between dev/test and production environments.
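
    One way to make that balance concrete is to compare instance types by cost per unit of delivered performance rather than by sticker price alone. A minimal Python sketch with made-up prices and benchmark scores (not any provider's real catalog):

        # Hypothetical catalog: hourly price (USD) and a relative benchmark score
        catalog = {
            "small":  {"price": 0.05, "perf": 1.0},
            "medium": {"price": 0.10, "perf": 2.1},
            "large":  {"price": 0.22, "perf": 4.0},
        }

        def cost_per_perf(name):
            entry = catalog[name]
            return entry["price"] / entry["perf"]

        # The cheapest instance is not necessarily the most cost-efficient one
        best = min(catalog, key=cost_per_perf)
        print(best, round(cost_per_perf(best), 3))  # medium 0.048

    The same ratio can be computed per workload class, so dev/test can run on cheaper, slower tiers while production gets the performance it needs.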

    Migrating From an On-Premises Environment to the Cloud Doesn’t Happen Overnight

    You need to learn how your on-premises environment will work in the cloud before any workloads can be transferred over, and ultimately prepare your on-premises environment so that it can extend or migrate to the public cloud. The public cloud welcomes such migration setups, bringing you one step closer to making the usage and costs in each environment completely transparent. As CIO, it’s your duty to ensure that the migration is successful and that you use the right cloud management tools, so there is complete visibility into what is going on both on-premises and in the cloud.

    Vendor Lock-In

    Once you migrate to the cloud, you have to decide how locked in you want to be. The public cloud provides a lot of freedom of choice: you can leave whenever you want by simply releasing your VMs. With on-premises environments, by contrast, you generally go through lengthy processes to purchase or lease expensive hardware with long-term commitments, so the hardware is harder to get rid of if it’s no longer needed. However, when you start using “up the stack” public cloud capabilities like AWS Elastic Beanstalk or Database-as-a-Service, or make upfront financial commitments such as AWS Reserved Instances or Enterprise Agreements in Azure, vendor lock-in gets a bit trickier. A certain level of experience is required to compare different vendors’ similar offerings and to understand just how locked in you could be. To avoid lock-in, you need to first know your own objectives, as well as the operational features of the application you want to move to the cloud (or vice versa). Then analyze the types of services you would use to run your application to understand the level of lock-in you would experience once you migrate.
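
    One rough way to make that analysis concrete is to score each service your application depends on by how portable it is across providers. The scores below are illustrative assumptions for the sketch, not measured data:

        # Illustrative portability scores (1.0 = trivially portable, 0.0 = fully proprietary)
        PORTABILITY = {
            "plain_vm":        0.9,  # standard VMs move between clouds fairly easily
            "managed_db":      0.5,  # Database-as-a-Service needs schema and feature mapping
            "paas_runtime":    0.3,  # platform services such as AWS Elastic Beanstalk
            "proprietary_api": 0.1,  # provider-specific APIs with no direct equivalent
        }

        def lock_in_score(services):
            """Average 'stickiness' (1 - portability) across an application's services."""
            return sum(1 - PORTABILITY[s] for s in services) / len(services)

        print(round(lock_in_score(["plain_vm", "managed_db", "paas_runtime"]), 2))  # 0.43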

    Do Your Homework

    When you choose a new vendor, it’s important to verify that they have real use cases, and that at least one of those use cases is applicable to what your organization needs. In the on-premises world, this is known as talking to references; in the cloud, you first have to understand what your specific use case is, so that when you look in a potential vendor’s case-study repository, you can find one that suits your needs. Be sure to drill into the technical details of your specific needs (e.g., compliance, regulations, security) to really understand how certain vendors have handled other customers in similar situations. This is especially important in the current state of the market, because certain capabilities you are used to having on-premises may be limited in the cloud.

    Final Thoughts: Transparency Is Key

    The initial stages of public cloud migration are low risk. Think of them as a test drive for cloud vendors. However, to understand whether a certain vendor is right for you, you should also know whether you have the right tools to manage and control the vendor’s infrastructure and services. You can do this by calculating your cloud costs, current or projected, and by making sure your vendor’s offerings serve your objectives. These practices will ensure that the cloud remains a great advantage for your organization.
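
    As a starting point for that cost calculation, even a back-of-the-envelope projection from planned usage is useful. A minimal Python sketch; the rates are placeholders, not any provider's actual pricing:

        # Placeholder workloads and rates for a rough monthly projection
        workloads = [
            {"name": "web tier", "instances": 6, "hourly_rate": 0.10, "hours": 730},
            {"name": "batch",    "instances": 2, "hourly_rate": 0.40, "hours": 200},
        ]

        monthly = sum(w["instances"] * w["hourly_rate"] * w["hours"] for w in workloads)
        print(f"Projected monthly compute cost: ${monthly:,.2f}")  # $598.00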

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    10:04p
    Salesforce Buys Wind Power for its Entire Data Center Load

    Salesforce has contracted for 40 megawatts of wind power from a West Virginia wind farm, becoming the latest cloud giant to enter into a utility-scale renewable-energy purchase agreement for its data centers.

    The purchase covers more energy than all of the cloud-based business software giant’s servers consume in the data centers that host them. Unlike other cloud giants, Salesforce doesn’t own and operate its data centers, instead leasing capacity from commercial data center providers.

    While companies like Google, Facebook, and Microsoft, which own and operate much of their data center capacity, have been signing larger and more frequent renewable energy purchase agreements, there has also been an uptick in renewable energy investment by data center providers this year. The uptick indicates there’s now more interest from major data center customers, such as Salesforce, in carbon-neutral colocation.

    Equinix, the world’s largest retail colocation provider, made renewable energy purchases first for its California data centers in September and later for the rest of its sites in North America. In January, its major competitor, Digital Realty Trust, started offering customers one year of premium-free renewable energy anywhere in the world. Las Vegas-based Switch has invested in a 100 MW solar farm near Reno, Nevada, where it is building a massive data center campus.

    The wind farm that will generate energy for Salesforce is expected to produce 125,000 MWh a year – more than all Salesforce data centers combined used in fiscal 2015.
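
    Those two figures imply a plausible output level for a wind farm. A back-of-the-envelope check, assuming the contracted 40 MW is the farm's nameplate capacity:

        nameplate_mw = 40
        expected_mwh_per_year = 125_000
        hours_per_year = 8_760

        max_output_mwh = nameplate_mw * hours_per_year  # 350,400 MWh if running flat out
        capacity_factor = expected_mwh_per_year / max_output_mwh
        print(f"Implied capacity factor: {capacity_factor:.0%}")  # ~36%, typical for onshore wind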

    The wind power will not go directly to Salesforce data centers, but the wind farm, due to come online late next year, is on the same power grid as the bulk of the company’s data center load, Salesforce director of sustainability Patrick Flynn wrote in a blog post announcing the agreement.

    Flynn didn’t say where those data centers were, but as we reported earlier, the company has data centers in Northern Virginia and Chicago. Both of those regions and West Virginia, where the wind farm is being built, are served by the electrical transmission system operated by PJM Interconnection.

    Salesforce made a commitment to powering all of its operations with renewable energy in 2013. As part of the commitment, the company promised to favor data center sites with access to renewable energy as a matter of policy and encourage utilities that serve its data centers to increase supply of renewables in their generation mix.

    “This is our biggest step yet toward powering 100 percent of our global operations with renewable energy,” Flynn wrote about the recent wind power deal.

    The agreement is what is often referred to as a “virtual power purchase agreement.” Such agreements offer a way around the many roadblocks to procuring large amounts of renewable power.

    Not all states allow wholesale power purchases by non-utilities, and not all utilities sell renewable power as an end-user product. Power purchase agreements like the one Salesforce has entered into decouple location of the generation plant from the place of consumption.

    Salesforce will pay the developer of the wind farm – which it did not name – a fixed rate in return for renewable energy credits, which it will then apply to regular grid energy consumed by its data centers. The developer will sell the energy it generates on the wholesale market and either pay Salesforce if the wholesale price is higher than the fixed rate or collect the difference from the cloud company if the sale price is lower.
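
    The settlement mechanics described above amount to what energy markets call a contract for differences. A minimal sketch of the cash flow, with made-up numbers:

        def cfd_settlement(fixed_rate, wholesale_price, mwh):
            """Positive result: developer pays Salesforce; negative: Salesforce pays the developer."""
            return (wholesale_price - fixed_rate) * mwh

        # Hypothetical month: 10,000 MWh generated against a $30/MWh fixed rate
        print(cfd_settlement(30.0, 35.0, 10_000))  # +50000.0: developer owes Salesforce
        print(cfd_settlement(30.0, 25.0, 10_000))  # -50000.0: Salesforce covers the shortfall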

