Data Center Knowledge | News and analysis for the data center industry
Tuesday, August 11th, 2015
12:00p
Congress to Mull Government Data Center Efficiency – Again

The US government is one of the world’s biggest data center users, its departments and agencies using IT housed in about 2,000 government-owned data centers. The government has been struggling to fix the inefficiency of this infrastructure for a long time.
One of the biggest efforts of recent years has been the Federal Data Center Consolidation Initiative, kicked off in 2010 by Vivek Kundra, who was then the federal CIO. FDCCI has gone through many stages of refinement, and agencies it applies to have achieved varying degrees of success.
While FDCCI and related application-centered infrastructure optimization efforts address redundancy, they don’t directly address energy efficiency of existing government data centers. Legislation currently on the docket for both the Senate and the House of Representatives seeks to write federal data center energy efficiency improvements into law.
Representative Anna Eshoo (D-Calif.), whose district encompasses much of Silicon Valley, has for more than two years been pursuing a law that would require government data center operators to improve energy efficiency.
The bill she reintroduced this year, called the Energy Efficient Government Technology Act, was first introduced in early 2013 and passed by the House about one year later as part of the broader Energy Efficiency Improvement Act of 2014. It failed to pass the Senate, however.
In March, Eshoo reintroduced the act (H.R. 1268) together with Representative Adam Kinzinger (R-Ill.) and with official support from two more Democrats and one Republican. A companion bill that carries the same name was introduced in the Senate last month (S. 1706).
An Eshoo spokesman did not respond to numerous requests for comment.
Better Efficiency, More Data
If enacted, the bill would require agencies to come up with plans to buy and use more energy efficient technologies and report on efficiency improvements in their data centers regularly. It would put the Office of Management and Budget in charge of tracking their progress.
It would also mandate creation of an “open data initiative” to collect and make public federal data center energy usage data.
Another part of the bill would require an update to the 2007 report by the Environmental Protection Agency on server and data center energy efficiency, which encompassed all data centers, not just the federal government’s. The original report estimated that data centers accounted for about 1.5 percent of all electricity consumed in the US. This and other findings from the report have been widely cited by companies and government organizations ever since, but the figures are severely outdated today.
Efficiency Seen as “Nice-To-Have”
While data center operators in the private sector have every incentive to increase energy efficiency because it directly impacts their companies’ bottom line, the dynamics are different in government. Duane Davenport, who recently joined Upsite Technologies to develop the company’s federal data center business, said that from his observations, data center energy efficiency is “a nice-to-have … rather than a requirement” for government agencies.
A mandate to improve efficiency has the potential to move the needle, in his opinion. “In the federal space, very little happens unless there is an edict or a mandate,” he said.
Davenport has been selling into the federal space for more than 25 years, most recently as an account executive at Hitachi Data Systems. He has done deals with a wide range of civilian and defense agencies and departments on behalf of Hitachi as well as Gateway, Commercial Data Systems, Sun Microsystems, and Digital Equipment Corporation.
Vendors like Upsite, which sells data center airflow optimization products, obviously stand to benefit from the bill’s passage. Many companies that sell data center efficiency products and services are headquartered in Eshoo’s district in California.
Little Incentive to Improve
One of the reasons federal agencies are so slow to improve energy efficiency of their data centers may be that many of them don’t feel any impact from the power consumption of the mission-critical facilities they host their applications in.
“The actual departments or agencies themselves don’t know exactly what their energy efficiency is, because, oftentimes, the building that they’re in is operated by GSA, or operated by other departments or agencies, so they don’t even see their power bill,” said John Lind, VP of sales into the public sector at QTS, a data center provider. Government data centers “are usually outdated,” but, unlike in the private sector, there has been little incentive to update them, he said.
3:00p

Windows 10 Compatible Remote Server Admin Tool Coming Soon

Taking a look back at the last two weeks in Microsoft news reveals a successful release of Windows 10, as illustrated by installations on more than 25 million devices and a doubling of usage on the MacOS version.
That’s all well and good for the masses, but IT administrators have been chomping at the bit for a Remote Server Administration Tool (RSAT) that’s compatible with the new operating system. The current version, updated back in January, only works with Windows 7 and Windows 8.1. However, users won’t have to wait much longer, as reported by our sister site Windows IT Pro. Gabe Aul, Microsoft’s corporate vice president, just announced on Twitter that the RSAT for Windows 10 will be made available before September, as will the third technical preview of Windows Server 2016.
While the RSAT update provides a much-needed tool that allows for remote control of roles and features installed on Windows Server 2012, it’s not expected that much will change with Windows Server 2016 beyond a few minor tweaks. As previously reported, the 2016 version will include Nano Server, a new “headless deployment option for Windows Server” that features “deep refactoring” focused on CloudOS infrastructure, born-in-the-cloud applications, and containers. According to Microsoft, Nano Server will “eliminate the need to ever sit in front of a server.”
As you’ll recall, Microsoft ended support for Windows Server 2003 back on July 14, after the rollout of Windows Server 2012, which some say is its best server OS in a long time. Architected from the ground up as a hybrid cloud enabler, it allows companies to take advantage of as little or as much of Microsoft Azure services as needed while still supplying a solid private data center OS.
3:30p

Six Reasons to Get User Experience Right

Simon Townsend is Chief Technologist, EMEA, for AppSense.
User Environment Management (UEM) technology optimizes the desktop computing experience while reducing IT management complexity and cost. Not only does a true end-to-end UEM solution deliver the fastest, easiest, and lowest-cost desktop possible, it can do so regardless of the mix of physical and virtual devices or the combination of devices, locations, and delivery mechanisms involved.
The benefits are both pervasive and persistent: a UEM solution produces a better user experience, lowers both capital and operational expenses, and provides a significant return on investment. Consider these six reasons to get the user experience right for both the business and its employees.
Achieve a Seamless User Experience
Advanced end-to-end UEM solutions deliver a seamless user experience across all desktops and devices. This includes efficient access to applications and data without slowing server performance or increasing storage requirements.
Easier Migration and Upgrades
Environments that use end-to-end UEM don’t need to worry about migrations or upgrades. Because the user has been decoupled from the underlying system, it’s effortless to migrate the user profile and data to new devices and operating systems. This reduces the time, cost and complexity of migration – a process that has traditionally been very tedious – and eliminates user disruption with literally zero-downtime migrations.
Better IT Control
The best end-to-end UEM solutions put granular control into IT’s hands so that they may manage corporate and application policies more effectively and efficiently. This can also assure accurate utilization of user privileges and work to prevent costly security breaches.
Better-Performing Desktops
An end-to-end UEM solution can dramatically speed up the user environment, improving employee acceptance and productivity. With more efficient distribution of profile and application policies, users no longer need to wait for pieces of their environment they don’t need to load. In addition, smart controls allocate CPU, memory, and disk resources to improve the overall quality of service, increase user density, and reduce hardware requirements.
Dramatic Cost Reductions Across the Board
End-to-end UEM reduces desktop infrastructure costs across the board, delivering both capital expense (CAPEX) and operational expense (OPEX) advantages. It can dramatically lower desktop and support costs, conserve infrastructure spending, and optimize application license expenditures. In addition, with the advantages listed above, end-to-end UEM fuels user productivity for greater workforce efficiency.
Efficient Data Access and Management
Enterprises and their users further benefit from seamless access to data via secure and efficient processes that offer additional granular policy control and end-to-end security. As a result, users can access their work content on any Windows PC, Mac, iPad, iPhone or Android-based device with confidence and ease. Furthermore, data is future-proofed as storage requirements change, with no need to migrate data when new devices are deployed.
When it comes to the user acceptance of new desktop implementations, it’s all about experience. To power user adoption while producing the greater IT efficiencies that will result in exponential value both today and into the future, select an end-to-end UEM solution that will help get user experience right. It will lower overall desktop costs, deliver the fastest log-on and in-session performance and give users the consistent, reliable experience they expect.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
4:26p

Equinix Cloud Chief Chris Sharp Leaves to Join Digital Realty as CTO

The shakeup at the top of Digital Realty’s executive management team continued Tuesday, when the company announced the appointment of Chris Sharp to the role of CTO. For the past two years, Sharp ran the cloud strategy of Digital’s biggest rival Equinix. Jim Smith, who has been the data center provider’s CTO since its founding more than a decade ago, is leaving the company, a Digital spokesman said.
The company has also appointed Philip Lin, who until now has led strategy and development for the Chinese data center services giant 21Vianet, to the newly created role of senior VP of strategy.
Both appointments are a strategic fit for San Francisco-based Digital. The company has put a lot of emphasis on growing the pool of cloud service provider partners its data center tenants can buy services from, and Sharp has led the development of what may easily be the world’s largest ecosystem of cloud providers that interconnect with customers and partners within Equinix data centers around the world.
Equinix execs have said that providing private network connectivity to cloud service providers is its fastest-growing business segment. The Redwood City, California-based company has been growing this ecosystem to include providers beyond Infrastructure-as-a-Service, focusing recently on Software-as-a-Service partnerships with the likes of Salesforce, Microsoft Office 365, and Google for Work.
Digital also recently acquired Telx, a major Equinix rival in the US, dramatically increasing the size of its retail colocation and interconnection business. The company’s primary business has always been wholesale data center leasing, but it has changed its model to include more services beyond selling large chunks of data center space and power capacity.
The appointment of Lin, who has led development at one of China’s biggest data center providers, signals a sharper focus on China and Asia Pacific in general. China is one of the world’s fastest-growing data center markets. While it has data centers in Hong Kong, Singapore, Japan, Melbourne, and Sydney, Digital doesn’t have a presence in China, which is known to be a difficult place to do business for foreign companies without local partners. That may change following Lin’s appointment.
Both Lin and Sharp will report to COO Jarrett Appleby, who himself came on board earlier this year. He is a former COO of CoreSite, another Digital rival. Prior to CoreSite, Appleby spent more than three years as chief marketing officer at Equinix.
Another major departure from Digital announced Tuesday was David Schirmacher’s. After more than three and a half years as senior VP of operations, Schirmacher had been appointed to lead design and construction in January of this year.
Digital’s executive team has been going through a major transformation, starting with the sudden departure of its founding CEO Mike Foust in March 2014. Foust was replaced by Digital’s former CFO William Stein. Digital hired career investment banker Andrew Power to replace Stein in the CFO seat.
In addition to changing the chief executive, the company created the role of CIO and hired Michael Henry, former CIO at Rovi, to fill it.
6:47p

Seagate Unveils SSDs for HPC Storage

Looking to give IT organizations two paths to accessing high-performance solid-state drives, Seagate today at the Flash Memory Summit announced SSD offerings that plug into an NVMe interface as well as a PCIe flash accelerator card.
The new drives are based on technologies Seagate acquired from LSI. Kent Smith, senior director of product marketing for Seagate Flash, said the long-time manufacturer of magnetic storage devices is increasing its investment in products based on non-volatile flash memory as part of an effort to expand its presence in high-performance computing environments.
The advantage that SSDs bring in the HPC storage context is that they don’t require developers to work as closely with storage architects to optimize performance by making sure that particular sets of data are laid down in a precise manner on an HDD.
The 2.5-inch Seagate Nytro XF1440 provides access to 1.8TB of storage, while the M.2 XM1440 provides 960GB of storage at throughput rates of up to 20GB per second through the NVMe interface. In terms of power consumption, that equates to 25,000 IOPS per watt, according to Seagate.
“Both these offerings are latency-optimized for the HPC segment,” said Smith. “They also draw half the power of other Flash storage systems for about one quarter of the price.”
Of course, the ability to use those SSDs depends on whether or not an IT organization has acquired relatively new servers that support the NVMe interface.
For those organizations looking for a more traditional way to connect over PCIe, Seagate is also making available the Nytro XP6500, a PCIe flash accelerator card that provides access to 4TB of raw capacity and sports 4GB of on-board memory.
While Seagate is investing more in non-volatile forms of memory for primary storage, Smith said, the company is not moving away from traditional hard disk drives any time soon. It still produces over one million hard drives a day that get used in everything from consumer devices to enterprise-class storage systems.
More IT organizations may be using SSDs for primary storage, but the price differential between SSD and HDD on a per-gigabyte basis means HDDs will continue to be used for primary, secondary, and tertiary storage for some time to come, Smith said.
While SSDs will clearly be playing a much bigger role in both HPC and traditional enterprise IT environments in the months and years ahead, most IT environments will be managing hybrid storage systems based on a mix of SSDs and HDDs for years to come.
7:34p

EIG’s Acquisition of Verio and Site5 Nets 86,000 New Subscribers
This article originally appeared at The WHIR
Endurance International Group (EIG) has acquired assets of web hosts Verio and Site5 for approximately $36 million, adding around 86,000 subscribers and bringing EIG’s total subscribers to 4.4 million.
The acquisitions were revealed in EIG’s second-quarter results and earnings call. The Verio acquisition closed in late May for around $13 million, and the Site5 deal closed in late June for around $23 million, according to EIG founder, president and CEO Hari Ravichandran.
“We expect to manage these businesses at break-even to marginally profitable for the rest of the year as we migrate their subscriber bases to our back-end platform,” Ravichandran said in the earnings call. “Once on platform, we expect to reach favorable economics and adjusted EBITDA contribution consistent with our previous framework for realizing synergies from acquisitions.”
In the latter half of 2014, EIG made several acquisitions, including web host Arvixe, which added 150,000 domains and 70,000 hosting accounts, as well as BuyDomains and Webzai.
EIG and other large players, such as GoDaddy and UK2, have been acquiring more and more web hosting properties, driving consolidation of mass-market web hosting. Many customers of hosts acquired by EIG and others have reported changes in their quality of service and frustrations with their new platform. However, in a sector where margins are small, consolidation and the associated economies of scale help web hosts turn a profit.
EIG finished the second quarter with $182.4 million in GAAP revenue, an increase of 20 percent over Q2 2014.
“We are pleased with our second quarter performance, delivering results solidly within our expectations,” Ravichandran said. “We continue to anchor our approach to our two-pronged strategy of growing subscribers and average revenue per subscriber, whether through acquired means or our own gateway products such as hosting, web builders, mobile, content management and related products. Given the opportunity ahead of us, we will continue to work toward building a long-standing business through enhancing our gateway products and brands, and will be experimenting with new marketing programs through the rest of this year, with the firm belief that longer-term, our targets for revenue and profitability will benefit from these efforts.”
This first ran at http://www.thewhir.com/web-hosting-news/eigs-acquisition-of-verio-and-site5-nets-86000-new-subscribers
9:40p

Ingram Micro Adds IBM’s SoftLayer to Cloud Marketplace

Ingram Micro just made automating the deployment of a cloud service a reality for its customers with the addition of IBM‘s SoftLayer to its Cloud Marketplace, as reported by our sister site, Talkin’ Cloud. The new offering allows access to both virtual and bare-metal servers.
The enhancement by Ingram Micro illustrates a commitment to building its marketplace, which makes it possible to automate the acquisition and provisioning of cloud services, and fortifies its long-standing partnership with Big Blue.
“IBM Cloud’s SoftLayer is a major player in IaaS, an area where competition is intensifying, quickly narrowing and growing fast,” Darren Bibby, vice president of channels and alliances research at IDC, told Marketwatch.com in a recent article. “IDC expects 36 percent growth in 2015 in IaaS, which makes it an excellent fit for the Ingram Micro Cloud Marketplace.”
Another benefit of this latest venture is that Ingram Micro can tap into IBM’s in-country compliance efforts, an extremely important consideration given the huge array of public and private clients in heavily regulated sectors such as healthcare, government, financial services, and legal. As a result, the company’s channel partners can now provide their regional customers with a secure, resilient, and scalable platform for placing sensitive data in the cloud.
The Ingram Cloud Marketplace is designed to provide a single portal through which solution providers can fully automate everything from provisioning those services to billing. Its portfolio of solutions covers all major business categories including: infrastructure, security, communication and collaboration, business applications and platform, and cloud management services.
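Ingram Micro’s portal layers its own ordering and billing workflow on top of SoftLayer, but the underlying platform has long exposed a public API of its own. As a rough illustration of the kind of provisioning call this sort of automation ultimately drives, here is a minimal sketch using SoftLayer’s open source Python SDK; the hostname, domain, and sizing values are hypothetical placeholders, and credentials are assumed to be supplied via environment variables.

```python
# Minimal sketch: ordering a SoftLayer virtual server via the public Python SDK.
# Assumes `pip install softlayer` and SL_USERNAME / SL_API_KEY set in the environment.
# Hostname, domain, and sizing values below are hypothetical placeholders.
import SoftLayer

client = SoftLayer.create_client_from_env()  # reads credentials from the environment
vs_manager = SoftLayer.VSManager(client)

# Provision a small hourly-billed virtual server in a chosen data center.
instance = vs_manager.create_instance(
    hostname='example-vs01',   # hypothetical hostname
    domain='example.com',      # hypothetical domain
    cpus=2,
    memory=4096,               # RAM in MB
    hourly=True,
    os_code='UBUNTU_LATEST',
    datacenter='dal05',
)

# Block until the order is provisioned, then print the assigned public IP.
vs_manager.wait_for_ready(instance['id'], limit=600)
details = vs_manager.get_instance(instance['id'])
print(details.get('primaryIpAddress'))
```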
After acquiring SoftLayer in 2013, IBM launched its own cloud marketplace the following year and, like Ingram Micro, has invested serious bankroll in cloud computing. How serious? It has plunked down more than $1.2 billion on a global data center expansion for SoftLayer, while another $1 billion is going toward its Bluemix PaaS. Meanwhile, the company has spent over $7 billion on 17 acquisitions.
Read the full report on Talkin’ Cloud