
Monday, April 13th, 2015

    Time Event
    12:00p
    Missouri (Finally) Passes Data Center Tax Incentives

    Missouri has approved data center tax incentives after several previous attempts came up short. Governor Jay Nixon signed the bill into law; a few key differences helped it succeed this time around.

    The features that made it more favorable include a minimum investment threshold, job creation requirements, and a cap on total benefits. “It’s a more fiscally responsible law,” said David Orwick, an attorney whose practice focuses on the real estate and construction industries and specializes in data centers.

    The new incentives require a $25 million investment and 10 new jobs for new data centers, and a $5 million investment and five new jobs for expansions. To qualify, the jobs must be high quality and pay at least 150 percent of the county average wage. Benefits are calculated based on an economic model, and the thresholds can apply to both the data center developer and the user.
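
    As a rough illustration of how those thresholds work together, here is a minimal sketch; the investment, job, and wage figures come from the description above, while the function itself and the example numbers are purely hypothetical.

        # Hypothetical eligibility check based on the thresholds described above.
        # Only the investment, job, and wage thresholds come from the article;
        # the function and the example figures are illustrative.
        def qualifies(investment_usd, new_jobs, avg_wage, county_avg_wage,
                      is_expansion=False):
            min_investment = 5_000_000 if is_expansion else 25_000_000
            min_jobs = 5 if is_expansion else 10
            wage_floor = 1.5 * county_avg_wage  # at least 150% of county average
            return (investment_usd >= min_investment
                    and new_jobs >= min_jobs
                    and avg_wage >= wage_floor)

        # Example: a new $30M facility creating 12 jobs paying $75,000
        # in a county where the average wage is $45,000.
        print(qualifies(30_000_000, 12, 75_000, 45_000))  # True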

    Orwick believes previous attempts failed in part because the bill was tied to other incentives the government couldn’t agree on. “Since this incentive was always tied, it never made it through. It passed this time because it stood on its own – capped at the net fiscal benefit to the state. They can now point to this as a revenue-neutral event.”

    Missouri Chamber of Commerce President and CEO Dan Mehan called the legislation a “game-changer.”

    Data center provider Cosentry, which does business in the Midwest with two facilities in Missouri, also saw the news as favorable.

    “Based on Cosentry’s experience in Nebraska and Kansas, we think the new data center tax laws passed in Missouri will help drive more business and job growth in the state,” said Cosentry CEO Brad Hokamp. “More importantly, the new laws will also benefit our end customers who are considering utilizing our multi-tenant data centers in St. Louis and Kansas City, as the cost to build will be lower.”

    The state recognized that job creation is fairly modest with data centers, but that they provide high-paying, quality jobs.

    “It’s not necessarily the direct job creation, it’s the ripple effect it has on other jobs and other aspects of the community,” said Orwick. “If a data center has only 10 people working directly, when you count vendors coming in to service, individual users of colo space, and the employees who service those fields, the ripple effect is huge.”

    Nixon had vetoed a previous attempt, pejoratively calling it “Friday Favors”; those measures were arguably too broad and not ready to become law.

    “Consistent with the fiscally responsible approach to economic development we’ve pursued from day one, this bill will help attract high-tech data center investments and jobs – without putting our budget or taxpayers at risk,” the governor told The Missouri Times. “I thank the General Assembly for including the safeguards and accountability measures necessary to protect taxpayers.”

    Taxes are just one of many factors in deciding where to build. Missouri also touts good connectivity and climate, and is considered a pro-business state overall. There’s also a free-cooling opportunity for about half the year, according to NOAA data.

    For those looking into bunker-style data centers, the state also has an abundance of underground sites. One big recent project is LightEdge’s data center in SubTropolis, the underground business complex in Kansas City.

    Missouri’s industrial electricity costs are about 10 percent lower than the U.S. average, according to a 2013 study by The Missouri Partnership. However, much of that power is generated from coal. And while the NRDC considers Missouri an optimal environment for wind power, a state map shows very little wind activity there, despite significant activity in surrounding states.
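
    To put that 10 percent rate advantage in perspective, here is a back-of-the-envelope sketch; the IT load, PUE, and baseline rate below are illustrative assumptions, not figures from the study.

        # Back-of-the-envelope estimate of a 10% industrial-rate advantage.
        # The 1 MW IT load, PUE, and baseline rate are assumptions for the example;
        # only the "about 10 percent lower" figure comes from the cited study.
        it_load_kw = 1_000            # assumed critical IT load
        pue = 1.5                     # assumed power usage effectiveness
        us_avg_rate = 0.07            # assumed U.S. average industrial rate, $/kWh
        mo_rate = us_avg_rate * 0.90  # roughly 10 percent lower

        annual_kwh = it_load_kw * pue * 8_760  # hours per year
        savings = annual_kwh * (us_avg_rate - mo_rate)

        print(f"Annual spend at U.S. average rate: ${annual_kwh * us_avg_rate:,.0f}")
        print(f"Estimated annual savings in Missouri: ${savings:,.0f}")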

    Missouri has established a Renewable Energy Standard that will require 15 percent of the state’s energy to come from renewable sources by 2021.

    Big data center projects have spurred renewable energy development before, as seen in North Carolina, so there’s a chance the industry’s influence could do the same in Missouri.

    The Midwest is a burgeoning data center location, with several of its metros collectively dubbed “The Silicon Prairie.” Kansas City, for example, was the first city to welcome Google Fiber.

    In terms of multi-tenant data centers, St. Louis is home to 365 Data Centers and Midwest-focused player Cosentry, and Ascent is headquartered there, to name a few.

    Minnesota and Arizona have done particularly well at attracting data centers after instituting tax rebates, so Missouri should benefit as well.

    1:00p
    CenturyLink Uses Natural Gas to Power Data Center

    Fuel cells that run on natural gas continue to make their way into the data center market. Following several major web-scale data center deployments, Sunnyvale, California-based Bloom Energy has sold a 500kW on-site power generation plant to CenturyLink, which has deployed it to power an expansion of its data center in Southern California.

    The company initially announced the deal in 2013. The data center expansion and the Bloom installation are now complete, awaiting commissioning this month.

    The fuel-cell system will provide part of the expansion’s 2MW load, making the Irvine facility the first multi-tenant data center in Southern California to use natural gas as a source of energy. Until now, Bloom’s data center business has been primarily with single-tenant facilities. An Apple data center in North Carolina and an eBay data center in Utah are two marquee examples.

    Using natural gas may help the colocation provider make its services more attractive to customers that care about powering their infrastructure with clean energy. While there reportedly isn’t a lot of interest in clean energy among typical colocation customers, there is some, and there are also signs that the level of interest is growing.


    Bloom Energy fuel cells at CenturyLink’s Irvine, California, data center (Photo: CenturyLink)

    CenturyLink regularly gets customer inquiries about renewable energy in its data centers, Drew Leonard, the company’s vice president of global colocation, said.

    “That is only going to increase over time,” he said. “It’s going to make or break a lot of companies’ decisions about where they’re going to … colocate equipment.”

    Savings are also part of CenturyLink’s business case for buying the Bloom fuel-cell system. The company expects operational savings over time, because gas is a lot cheaper than electricity in Irvine, and because gas lines are more reliable than electrical transmission lines. There are also federal tax incentives for using fuel cells, as well as state incentives in California.
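
    A rough sketch of why cheap gas can translate into cheaper kilowatt-hours is below; the prices and the fuel-cell efficiency are illustrative assumptions, not CenturyLink or Bloom figures, and capital and maintenance costs are ignored.

        # Illustrative fuel-only comparison of grid power vs. on-site fuel-cell power.
        # All figures are assumptions for the sketch; the article says only that
        # gas is much cheaper than electricity in Irvine.
        grid_rate = 0.14                # assumed commercial electricity rate, $/kWh
        gas_price_per_therm = 0.80      # assumed natural gas price, $/therm
        kwh_thermal_per_therm = 29.3    # 1 therm of gas is about 29.3 kWh of heat
        fuel_cell_efficiency = 0.55     # assumed electrical efficiency

        gas_cost_per_kwh = gas_price_per_therm / (kwh_thermal_per_therm * fuel_cell_efficiency)

        print(f"Grid power:            ${grid_rate:.3f}/kWh")
        print(f"Fuel cell (fuel only): ${gas_cost_per_kwh:.3f}/kWh")  # about $0.05/kWh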

    While the system is serving critical load, it is a pilot project. CenturyLink has more than 50 colocation data centers around the world, and if the deployment goes as expected, it could turn into a much bigger deal for Bloom.

    As eBay demonstrated with its latest data center in Utah, Bloom fuel cells make a very unusual, lean electrical design possible. Fuel cells are the primary source of all power for the data center, with the utility grid serving as backup, making uninterruptible power supplies, transfer switches, and generators unnecessary.

    “That’s the beauty of this architecture,” Peter Gross, vice president of mission critical systems at Bloom, said. “This is what you call a mission-critical solution, where this replaces the UPS, and this replaces the generator.”
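
    A simplified way to see why two independent power sources can stand in for the traditional UPS-and-generator chain is basic availability math; the figures below are illustrative assumptions, not vendor numbers, and real designs also depend on how switchover between sources is handled.

        # Simplified availability math for a "fuel cells primary, grid backup" design.
        # Both availability figures are assumptions; failures are treated as independent
        # and switchover as instantaneous, which real designs must engineer for.
        fuel_cell_availability = 0.999
        grid_availability = 0.9995

        # The load is down only when both sources are down at the same time.
        combined = 1 - (1 - fuel_cell_availability) * (1 - grid_availability)
        downtime_minutes = (1 - combined) * 365 * 24 * 60

        print(f"Combined availability: {combined:.7f}")
        print(f"Expected downtime: about {downtime_minutes:.2f} minutes/year")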

    Gross couldn’t say exactly how many data centers Bloom fuel cells power today, citing confidentiality agreements with customers, some of whom are government agencies. But the number is somewhere close to 10, he said.

    3:00p
    Puppet, Cumulus, Dell Partner on Network Automation

    Puppet Labs has announced a partnership with Cumulus Networks and Dell, providing an integration that will improve network automation for customers.

    The new integrations provide customers with a full-stack solution for their hardware, operating systems, and configuration management. While server automation is mature, the network remains largely manual. These integrations extend automation to the network, providing a more flexible network and eliminating the operational inefficiencies that come with manual processes.

    Puppet Labs created a native Puppet agent for Cumulus Linux and a new module that allows customers to use the same change management processes across servers and switches.
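
    Because Cumulus Linux presents switch ports as ordinary Linux network interfaces, the underlying idea, describing desired interface state as data and rendering it into configuration, can be sketched in a few lines. The Python below is a language-neutral illustration of that idea, not the Puppet module’s actual interface; the data format and function are invented for the example.

        # Language-neutral illustration of declarative interface configuration,
        # the general idea behind the Puppet module described above. This is NOT
        # the module's actual interface; the data format and function are invented.
        desired_state = {
            "swp1": {"address": "10.0.1.1/31", "mtu": 9216},
            "swp2": {"address": "10.0.1.3/31", "mtu": 9216},
        }

        def render_interfaces(state):
            """Render desired state into ifupdown-style stanzas (Cumulus Linux
            uses standard Linux networking constructs under the hood)."""
            stanzas = []
            for name, attrs in sorted(state.items()):
                stanzas.append(f"auto {name}\n"
                               f"iface {name}\n"
                               f"    address {attrs['address']}\n"
                               f"    mtu {attrs['mtu']}\n")
            return "\n".join(stanzas)

        print(render_interfaces(desired_state))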

    “Now data center IT teams can automate and manage both the network and the compute infrastructure exactly the same way,” said Reza Malekzadeh, VP of business for Cumulus Networks, in a press release. “Puppet Enterprise native support for the Cumulus Linux OS equates to immediate OpEx savings as adoption of open networking expands.”

    In an outside-the-box solution, Dell and Cumulus Networks have partnered to offer a reference platform on the S6000-ON switch. It allows customers to use a unified change management process across their servers and switches, enabling faster application deployment.

    “As more organizations deploy OpenStack, we expect to see open networking solutions as a major part of those implementations,” said Adnan Bhutta, director of global strategy for Dell’s Open Networking, in a press release. “Dell customers can now build a complete, highly automated and cost-effective solution with Dell Open Networking switches and Cumulus Linux OS, with Puppet Enterprise for automation across both servers and networking.”

    3:30p
    Managing Software Defined Data Center Through Single Pane of Glass

    David Eichorn, AVP and Global Data Center Practice Head, Zensar Technologies, has more than 20 years of experience in IT and telecommunications.

    Just a few years ago, “Software-Defined Data Center” was thought by many to be the next buzz term in the IT industry. Since then, the “software-defined-everything” movement has taken off with the realization there is value not just in the hardware, or in the product, but in the software itself. In fact, industry research firm Enterprise Management Associates named 2014 the year of the Software Defined Data Center (SDDC). How will the industry evolve this year? I predict organizations will seek new solutions to manage this type of data center, ultimately leading to increased operational efficiencies.

    The Evolution of the Data Center

    Let’s take a look back at how the adoption of software defined data centers began. Over the last several years, there has been a significant shift toward converged infrastructure. Traditional servers, storage, and networks were distinct products managed separately by multiple management platforms. We then saw the emergence of converged systems, such as Cisco’s Unified Computing System (UCS). However, these systems still consisted of multiple components and systems from various product vendors cabled together.

    The next evolution of the converged system was a move toward the software defined model. In this model, the components (servers, storage, and networks) are no longer just a compilation of parts from various product vendors; they are managed as a single unified framework, enabling the organization to tap into compute and storage resources more gracefully. Organizations can now add more compute and storage power through software, rather than by adding systems. Ultimately this provides increased performance, resiliency, and ease of management.
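
    Conceptually, “adding capacity through software” means requesting resources from a shared pool rather than racking new gear. The sketch below uses an invented resource-pool class to illustrate the idea; it does not represent any particular vendor’s API.

        # Hypothetical sketch of growing capacity "through software" against a
        # pooled, software-defined resource manager. The class and its methods
        # are invented for illustration; no real product API is implied.
        class SddcPool:
            def __init__(self, total_vcpus, total_tb):
                self.free_vcpus, self.free_tb = total_vcpus, total_tb

            def allocate(self, workload, vcpus, tb):
                """Carve compute and storage for a workload out of the shared pool."""
                if vcpus > self.free_vcpus or tb > self.free_tb:
                    raise RuntimeError("pool exhausted; time to add physical capacity")
                self.free_vcpus -= vcpus
                self.free_tb -= tb
                print(f"{workload}: +{vcpus} vCPU, +{tb} TB "
                      f"({self.free_vcpus} vCPU / {self.free_tb} TB still free)")

        pool = SddcPool(total_vcpus=512, total_tb=200)
        pool.allocate("web-tier", vcpus=64, tb=10)   # grown by an API call, not new racks
        pool.allocate("analytics", vcpus=128, tb=50)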

    In the software defined data center model, the management of the converged data center is handled by software. As a result, organizations are benefiting from the optimization and pooling of resources to improve efficiencies, ensuring that servers are never over- or under-utilized, and taking advantage of the full expanse of the data center’s physical assets.

    Organizations can also reap the benefits of the cloud while maintaining their legacy applications. They can more easily complete the migration to, and management of, hybrid cloud environments. This enables them to lower costs by reducing the infrastructure resources needed, providing scalable infrastructure, and enabling the efficient roll-out of software upgrades. In addition, Open Source technology prevents the organization from being locked into a particular vendor or protocol by providing a more malleable platform. Open Source also allows for access to multiple technologies, which can all be managed under one umbrella, resulting in increased productivity and decreased costs.

    Managing the Software Defined Data Center

    In order to take advantage of the software defined data center, it is necessary to take a holistic view of the various layers that make up the data center stack, i.e., the virtualization, software, middleware, database, and application layers, as well as the hardware and cloud environments that connect everything. All of these components should be operated and centrally managed on a common software-based management platform.

    However, it is not easy to manage so complex a system while maintaining the integrity of every layer within the SDDC. For example, a glitch in one layer can impact other areas of the data center environment. It is important to understand the interdependencies between the various pieces of the data center in order to seamlessly manage the environment and solve any problems that may occur.

    Working with a Managed Service Provider

    Many organizations turn to infrastructure-as-a-service and software-as-a-service vendors to help manage their data center growth. Typically, though, these vendors are not equipped to deliver the services associated with managing the software environments they host. Managed service providers, by contrast, are not tied to a specific technology and can help organizations manage their data center environments, removing the burden of managing IT from the business. Managed service providers can also help the organization significantly accelerate the provisioning of virtual machines and related core services.

    When choosing a managed service provider, it’s important to look for one that can manage both hybrid IT environments and the associated hardware, OS, applications, and network layers.

    It can be difficult to find a single managed service provider that does not rely heavily on subcontractors. Organizations should choose one that offers a single point of contact and maintains continuity throughout the entire process, from planning to building to operating the data center. The provider should also ensure that a wrapper of security is built around the entire SDDC environment. In this context, security should be viewed as a vertical layer that spans across every horizontal layer of the data center stack.

    A Single Pane of Glass

    Organizations can benefit from working with a qualified managed service provider that offers a unified framework and a unified view into their data center operations. These offerings should be combined with proactive 24×7 monitoring of the complete environment, even across multiple data centers or geographies. This constant monitoring allows the managed service provider to remedy issues in real time and anticipate problems before they occur. It prevents bottlenecks and other IT holdups from impacting the end-user experience and helps keep the bottom line intact. When data centers are managed this way, organizations are able to move much more quickly. For example, provisioning that would previously have taken days can now be completed in just hours or minutes.

    By choosing a managed service provider that can manage the data center holistically across multiple locations and environments, organizations can take full advantage of the benefits this data center trend offers, including consolidation efforts, and increased efficiencies that ultimately improve the bottom line.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    4:00p
    Data Centers: Total Cost of Ownership and Cost of Risk

    Calculating the total cost of ownership (TCO) of a data center is growing in complexity. There are more significant elements to include in the calculation these days, says Mark Evanko of BRUNS-PAK, a company of facilities and IT professionals that creates design/build solutions for mission-critical data centers.

    “What we are seeing is the total cost of ownership of a data center, such as the design-build, colocation, cloud, disaster recovery and network costs need to be considered, and you also need to consider the associated risk elements,” Evanko said.

    At the spring Data Center World, which will convene in Las Vegas on April 19-23, Evanko will be presenting on how to approach TCO in today’s current environment, especially in light of major security issues experienced by enterprises recently.

    According to Evanko, the considerations are not all that simple any longer. The question is no longer simply “design or build, or colocation or cloud?” he said.

    The following are among the many factors that owners and operators need to weigh (a minimal cost-model sketch follows the list):

    • Facilities infrastructure
    • Energy efficiency
    • Computer hardware
    • Cloud (private or public)
    • Colocation
    • Disaster recovery
    • Migration
    • Software
    • Modular (scalability?)
    • Containers
    • Network costs
    • Service Level Agreements
    • Personnel
    • CAPEX vs. OPEX
    • Legal
    • Consulting Design-Engineer
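
    As a minimal illustration of how these factors can be rolled into one risk-adjusted figure, here is a sketch; every number in it is an assumption for the example, not data from BRUNS-PAK or the session.

        # Minimal, illustrative TCO-plus-risk sketch along the lines Evanko describes.
        # Every figure is an assumption for the example, not data from BRUNS-PAK.
        capex = 12_000_000           # assumed design/build cost
        amortization_years = 10
        annual_opex = 1_800_000      # assumed power, staff, maintenance, network, licenses

        breach_probability = 0.05    # assumed annual probability of a serious incident
        breach_cost = 20_000_000     # assumed cost of response, legal exposure, lost business

        annual_tco = capex / amortization_years + annual_opex
        annual_risk_cost = breach_probability * breach_cost   # expected value of the risk

        print(f"Annual TCO (without risk): ${annual_tco:,.0f}")
        print(f"Expected annual risk cost: ${annual_risk_cost:,.0f}")
        print(f"Risk-adjusted annual cost: ${annual_tco + annual_risk_cost:,.0f}")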

    Security of Data: A Big Concern and a Liability

    Security of customer data held by big retailers (and even health insurance providers) has been in the news recently because of multiple large data breaches. Forbes reported in March, “Home Depot was the latest in the chain of large companies under a cyber attack targeted at their payment terminals, where a security breach left approximately 56 million credit and debit card numbers exposed.” This is just one recent example; Target, Kmart, Staples, and Michaels, among others, were also hit by security issues involving consumer data.

    As could be predicted, lawsuits are now popping up. The New York Times reported in March about lawsuits related to Target’s data breach: “A federal judge on Thursday gave preliminary approval to a $10 million settlement of a lawsuit brought by customers of Target, which experienced an online attack involving confidential customer data during the holiday season in 2013.” This is a small amount compared to what the hackers have already cost the company, with $1 billion in total losses projected. Multiple C-level executives at Target were also fired.

    How does this relate to data centers? Evanko said the use of third-party services (such as colo or cloud) increases data breach and security risks and the potential for data loss and business disruption. “Colocation becomes ‘in vogue’,” he said. “There’s no thinking, you are going outside, and you are not tying up capital. It seems it is without the risk element. However, when an enterprise’s data is at a third-party provider, the liability of the third-party provider for damages is zero.”

    Evanko said this is changing the industry: “Everyone is becoming informed about TCO and risk. With hacking and security, if your data is off site, there is no recovery.”

    He also said legal action is inevitable, noting that consumer protection legislation sponsored by U.S. Sen. Bob Menendez (D-New Jersey) is in the works to hold providers accountable to customers for data loss.

    “I am not against colo or cloud,” Evanko said. “It’s right for a temporary app or non-critical data, but maybe you should keep the ‘crown jewels’ at home.”

    There are other elements of risk outside data breaches, such as a third-party provider going out of business or bankrupt, he said. “Then you’d have migration costs – sometimes it takes years to migrate.” He shared an example where a cloud provider filed for Chapter 7 and Chapter 11 protection, and clients were given 30 days to move their apps and data. “It’s impossible. You are going to experience an extended interruption in your service,” he said.

    When security risks and a multiplicity of other factors are accounted for, the total cost of ownership model changes significantly, Evanko said.

    For more information, sign up for the spring Data Center World, which will convene in Las Vegas on April 19-23, and attend Evanko’s session, “Data Center Total Cost of Ownership vs. Risk.”

    6:06p
    Oracle Data Center to Launch in Japan for Local Cloud Users

    Oracle will open a new data center in Japan this year, the company’s executive chairman and CTO Larry Ellison said while speaking at Oracle CloudWorld Tokyo earlier this month. It will be the 22nd Oracle data center worldwide.

    The data center will serve Oracle’s cloud-based applications, Platform-as-a-Service, and Infrastructure-as-a-Service to the Japanese market.

    As cloud meets enterprise IT, the need for local data centers serving these enterprises is growing. Enterprises like the idea of the hosted model, but often can’t deal with the uncertainty of generic cloud regions outside of a country’s borders or not knowing where data resides.

    Oracle is going after data sovereignty needs with its cloud, listing in-region hosting as a feature. The need for local Oracle data centers means the company has aggressively expanded its global footprint, with recent announcements about data centers in Germany and China.

    Oracle has also launched several targeted clouds, including one for government and, more recently, clouds for financial services and retail.

    At CloudWorld, Ellison described a fundamental shift occurring over the last decade in the large business-computing market. He said Oracle applications have been rewritten for cloud; they’re not just the same old applications in a hosted model. The applications have been modernized to realize true cost and operational benefits. Judging by its financial results, Oracle’s evolution is working.

    “It’s gone from an idea to a multibillion-dollar business in the blink of an eye, and growing very rapidly,” said Ellison at the recent event.

    At last year’s CloudWorld, the company rolled out close to 200 new Software-as-a-Service applications. Cloud is now big business for Oracle. More than 60 million users are on Oracle cloud.

    6:30p
    Hybrid Cloud Grows as On-Premise Hosting Expected to Fall 14% by 2018: Report


    This article originally appeared at The WHIR

    Hybrid cloud adoption is set to triple over the next three years, according to new research by Peer 1 Hosting released on Monday.

    The study included responses from over 900 IT decision makers from North America and the UK. While only 10 percent of respondents said they primarily use hybrid cloud, 28 percent said they will be using hybrid cloud by 2018.

    Use of on-premise hosting is expected to fall from 31 percent to 17 percent (14 percentage points) over the next three years, and private cloud is expected to drop from 52 percent to 41 percent during the same period.

    Companies motivated to adopt hybrid cloud cite cutting IT costs (49 percent) and improving processes and operational efficiencies (45 percent) as their top priorities. Challenges in achieving these priorities include security (53 percent) and data protection (46 percent).

    “The proposition of hybrid cloud is very compelling: cost savings, business agility and operational efficiencies are the qualities that IT decision makers are looking to bring into their organisations. From the research, it’s clear that hybrid cloud adoption is outpacing that of public and private cloud,” said Toby Owen, Vice President of Product at Peer 1 Hosting.

    “Hybrid cloud adoption appears to be held back by concerns largely related to security and data protection. Clearly these are areas where businesses cannot compromise. Fundamentally these concerns are about a perceived lack of control. As the industry responds to this, with truly scalable, flexible and controllable hybrid cloud solutions, I believe that IT decision makers will be quicker to adopt hybrid cloud than this research suggests.”

    Peer 1 has been tracking cloud adoption for a few years. In 2013, a smaller survey Peer 1 conducted of 120 IT professionals found that 78 percent preferred a hybrid cloud solution over making a full transition to cloud.

    This first ran at http://www.thewhir.com/web-hosting-news/hybrid-cloud-grows-premise-hosting-expected-fall-14-2018-report
