Data Center Knowledge | News and analysis for the data center industry
 

Friday, March 4th, 2016

    Time Event
    1:00p
    White House Orders Federal Data Center Construction Freeze

    US government agencies are no longer allowed to build or expand data centers unless they prove to the Office of the Federal CIO that it’s absolutely necessary, according to a new memo released by the White House’s Office of Management and Budget.

    The new Data Center Optimization Initiative replaces the now six-year-old Federal Data Center Consolidation Initiative and has much stricter goals and additional rules meant to reduce the government’s sprawling data center inventory and the amount of money it takes to maintain it.

    The government spent about $5.4 billion on physical data centers in fiscal year 2014. The new initiative’s goals are to reduce data center spending by $270 million in 2016, by $460 million in 2017, and by $630 million in 2018, for a total of $1.36 billion in savings over the next three years.
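The stated targets are internally consistent; a quick arithmetic check (figures as reported above):

```python
# OMB savings targets by fiscal year, in millions of USD (figures from the memo as reported above)
targets = {2016: 270, 2017: 460, 2018: 630}

total = sum(targets.values())
print(f"Total: ${total / 1000:.2f}B")  # → Total: $1.36B

# That is roughly a quarter of the $5.4B spent on physical data centers in FY2014
print(f"{total / 5400:.0%} of FY2014 physical data center spend")  # → 25%
```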

    According to a recent report by the Government Accountability Office, the government’s efforts to reform IT spending have had varying degrees of success. Overall, data center consolidation and optimization efforts to date have resulted in about $2 billion in savings over the last five years, according to the GAO. For perspective, the government’s total annual IT spend is about $80 billion.

    According to OMB’s most recent estimate, there are currently more than 11,700 government data centers.

    FDCCI and the subsequent Federal Information Technology Acquisition Reform Act required agencies to conduct data center inventories, identify facilities they could close and consolidate, and set goals for footprint reduction, and they established rules for regular progress reporting. They did not, however, preclude agencies from building new data centers.

    That’s no longer the case. If an agency wants to build a data center or expand an existing one, it must make the case to the OFCIO that there is no better alternative, such as using cloud services, leasing colocation space, or using services shared with other agencies.

    DCOI also raises the number of data centers agencies are required to close and puts in place additional requirements for energy efficiency, server virtualization, server and facility utilization, and use of data center management tools.

    Read more: Congress to Mull Government Data Center Efficiency — Again

    Get to PUE under 1.5 or Shut It Down

    Every agency now has to install equipment for measuring every data center’s Power Usage Effectiveness. Data centers not slated for closure have to have PUE lower than 1.5 by September 2018.

    If improving the PUE in that timeframe is not cost effective, an agency has to look for ways to run the workloads the data center supports elsewhere, such as a cloud service or a data center shared with another agency.
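PUE is total facility power divided by the power delivered to IT equipment, so a PUE of 1.5 means half a watt of overhead (cooling, power distribution, lighting) for every watt of IT load. A minimal sketch of the compliance check, using hypothetical metered readings:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical readings for one facility
reading = pue(total_facility_kw=750.0, it_equipment_kw=480.0)
print(f"PUE = {reading:.2f}")  # 750 / 480 ≈ 1.56

# DCOI threshold: tiered data centers not slated for closure need PUE < 1.5 by September 2018
print("compliant" if reading < 1.5 else "must improve or move workloads elsewhere")
```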

    This applies to all so-called “tiered” data centers. More on that below.

    If It Has a Server, It’s a Data Center

    The memo defines a data center as any room that contains at least one server, regardless of how it is being used, e.g. production, staging, test, or development.

    The definition is limited to servers only. A room with switching and routing gear or security hardware will not be considered a data center under the new rules.

    The latest initiative introduces the concept of tiered and non-tiered data centers. A tiered data center, under the definition in the memo, is one that has separate space for IT infrastructure, a UPS system, an independent cooling system, and a backup generator, or, in other words, a typical data center.

    Any other kind of room with servers will be considered a non-tiered data center.

    Agencies previously used a four-tier system to describe data centers in their portfolios. Facilities that fell into any of those four tier categories will now be considered simply tiered data centers.

    The new rules also do away with the previous “core” and “non-core” data center distinctions.

    Close More Data Centers Sooner

    Agencies’ previous goals were to close 22 percent of tiered data centers and 50 percent of non-tiered data centers. The new goals are to close 25 percent of tiered data centers and 60 percent of non-tiered data centers by the end of fiscal 2018.
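Applied to a hypothetical agency portfolio (the memo sets percentages, not counts; the split below is illustrative), the new targets work out as:

```python
import math

# Hypothetical portfolio split between tiered and non-tiered data centers
tiered, non_tiered = 40, 160

# DCOI closure targets by end of fiscal 2018 (vs. 22% and 50% previously)
to_close_tiered = math.ceil(tiered * 0.25)        # 25% of tiered facilities
to_close_non_tiered = math.ceil(non_tiered * 0.60)  # 60% of non-tiered facilities

print(to_close_tiered, to_close_non_tiered)  # → 10 96
```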

    Using DCIM No Longer Optional

    Not only do agencies have to replace manual inventory management and infrastructure monitoring with automated Data Center Infrastructure Management software in all of their data centers by the end of 2018, but the General Services Administration will also oversee a centralized DCIM software procurement program that they will have to use.

    The procurement program isn’t in place yet, so for now, agencies are free to select DCIM tools on their own. Once the program is in place, they will have to use it or make the case that it doesn’t fit their needs.

    7:17p
    Big Round of Layoffs at IBM’s Tech Services Division
    By WindowsITPro


    IBM is positioning itself as a leader in cloud and artificial intelligence, but on the way there it’s making some painful cuts.

    Yesterday, numerous employees reported that the company was undergoing a massive layoff — or as IBM calls it, a Restructuring Action — in which it is making deep cuts across the company, particularly in Global Technology Services, where it had previously announced coming eliminations.

    Mike Dorosh, a research director at Gartner who spent about 16 years at IBM, said that IBM’s cuts have become regular occurrences, but that this one was particularly dramatic.

    “Every couple of years they reorient their organization around labor arbitrage,” said Dorosh. “Nobody’s quite sure how many people have been laid off, but I heard from colleagues that it was pretty massive.” He said that reports that it was as much as one-third of the company’s U.S. workforce, however, were inaccurate.

    While IBM declined to comment on the cuts to other outlets, it did point to a variety of openings across the company, stating that it currently has 25,000 open positions.

    Indeed, the company is spending heavily to acquire talent, IP, and data in areas of growth, with billions in acquisitions spent in its Watson Health initiatives.

    Dorosh said the jobs being eliminated were likely ones like systems administration and storage management that could be off-shored to Hungary and India. US openings would likely be in hotter areas like OpenStack and virtualization.

    But the cuts will be particularly painful for those let go this time around: As ZDNet reported in January, IBM has cut its severance from six months pay to just one month.

    It sends a really strong message to the workforce that will be joining [IBM], said Dorosh. I worry that message is to be careful.

    The Facebook page Watching IBM was sharing stories of those impacted by the cuts.

    This first ran at http://windowsitpro.com/industry/ibm-undergoes-massive-layoffs-global-technology-services-gts-other-divisions

    7:54p
    Friday Funny: Lucky Data Center

    It’s got to be a lucky data center!

    Here’s how it works: Diane Alber, the Arizona artist who created Kip and Gary, creates a cartoon, and we challenge our readers to submit the funniest, most clever caption they think will be a fit. Then we ask our readers to vote for the best submission and the winner receives a signed print of the cartoon. Submit your caption for the cartoon above in the comments.

    Congratulations to Todd, whose caption won the Star Wars edition of the contest. His caption was: “Stop calling the cold aisle ‘The Dark Side,’ you Nerf herder!”

    Some good submissions came in for last month’s New Year Server Refresh edition – all we need now is a winner. Help us out by submitting your vote below!

    It’s the new year and it’s time for some upgrading!



    For previous cartoons on DCK, see our Humor Channel. And for more of Diane’s work, visit Kip and Gary’s website!

    8:13p
    NTT Unveils Broad Expansion of Enterprise Cloud Services
    By The WHIR


    NTT Communications has launched the latest version of its Enterprise Cloud this week, with immediate availability in Japan and availability in the UK, Singapore, US, Australia, Hong Kong, and Germany to follow.

    NTT Com said the improvements were driven by enterprise challenges, which include having to grapple with migrating traditional ICT operations to the cloud while simultaneously deploying cloud-native applications at a rapid pace.

    Based on OpenStack, the Enterprise Cloud solution includes dedicated bare-metal servers as part of its hosted private cloud service, where customers can access the automation and flexibility of a multi-hypervisor environment, NTT Com said. A Cloud Foundry-based Enterprise Cloud PaaS is also available to customers now.

    Read more: Huge Sacramento Facility Part of Global NTT Data Center Push

    “As enterprise businesses move towards digitalization, globalization and cloud, NTT Com will continue to help customers innovate business processes and create new business models with Enterprise Cloud, which has been enhanced to meet requirements of both secure and reliable ICT and flexible and agile ICT,” Motoo Tanaka, Senior Vice President of Cloud Services at NTT Com said in a statement.

    NTT Com has also launched Cloud Management Platform for the unified control of both Enterprise Cloud and third-party providers’ clouds, including Azure and AWS.

    The company said it plans to continue to enhance its capabilities, including SAP HANA, virtual private PaaS and enhanced cloud management capabilities.

    “The enhancements to NTT Com’s Enterprise Cloud offer enterprises the type of comprehensive platform needed for the digital transformation journey,” Melanie Posey, research vice president for IaaS/hosting services at IDC said in a statement. “Hybrid is the future state of enterprise IT. Enterprise Cloud accommodates the requirements of both traditional 2nd Platform enterprise applications and agile DevOps-oriented cloud-native 3rd Platform environments, while NTT Com’s cloud management platform is positioned to provide unified visibility, management, and control across the entire hybrid IT stack.”

    This first ran at http://www.thewhir.com/web-hosting-news/ntt-communications-updates-enterprise-cloud-to-address-hybrid-cloud-needs

    10:40p
    Demystifying the Data Center Sourcing Dilemma

    The data center plays a central role in any modern business. Strategies, go-to market initiatives, and organizational goals are all being built around IT capabilities. So, how do you make the most of the data center resources available to you? Most of all, how do you evolve the IT environment to support new business initiatives?

    Here are some key ways changes in the data center and IT landscape impact your business:

    • Increasing demand for IT services is driving data center growth, putting additional pressure on existing data centers.
    • Many data centers in operation are reaching end of life and require major renovation.
    • Before spending money on the existing aging environment, many companies are evaluating the spectrum of data center sourcing options, from greenfield builds to cloud services.

    There is a spectrum of sourcing options, and while each option has its advantages and disadvantages, variables such as size, location, and investment strategy are the drivers that influence which solution is the best fit.

    At the Data Center World Global conference in Las Vegas later this month, Laura Cunningham, business consultant at Hewlett Packard Enterprise, will talk about the critical considerations around data center sourcing, and what to do with legacy components.

    Cunningham brings a unique perspective to the data center community: extensive experience developing business cases that justify data center investments, emphasizing Total Cost of Ownership and Return on Investment. Specifically, she translates technology specifications into a business case that can be understood by those outside of IT.

    In the session, attendees will get a chance to:

    • Learn how to get buy-in from all stakeholders on your data center strategy.
    • Consider what the financial decision makers will be focused on when weighing data center sourcing options.
    • Understand how different data center credits and energy policies can influence decisions.
    • Learn about aligning data center sourcing strategies with capacity requirements over time.
    • Learn how data center location impacts TCO.
    • Learn to leverage your company’s capital preference to best position its data center sourcing strategy.

    Join Laura Cunningham and 1,300 of your peers at Data Center World Global 2016, March 14-18, in Las Vegas, NV, for a real-world, “get it done” approach to converging efficiency, resiliency and agility for data center leadership in the digital enterprise. More details on the Data Center World website.

