Data Center Knowledge | News and analysis for the data center industry
Friday, November 13th, 2015
1:00p
Who May Use the World’s First Floating Data Center?
After they get over the initial “huh?” when they hear about the idea of building a data center on a floating barge, infrastructure execs for big companies usually want to know how Nautilus Data Technologies achieves the kind of energy efficiency it claims its floating data center offers, Kirk Horton, the startup’s VP of sales and marketing, said.
“That whole water factor completely evaporates once the client comes onto the construction site,” he said. “They see this massive 230-foot barge, and the whole notion of this being on water is out of their mind.”
Nautilus came out of stealth earlier this year, announcing its plan to build an 8 MW floating colocation data center, promising high energy efficiency, competitive pricing, and, due to its unusual approach to real estate, mobility and higher security than data centers on land. The company has completed a smaller proof of concept, has raised $25 million in private equity, and is now building its first commercial facility in a US Navy port on Mare Island, a peninsula in Vallejo, California, about 20 miles northeast of San Francisco.
Nautilus staff have taken many IT execs on tours of the prototype and of the construction site on the barge, the company’s execs said. While they weren’t at liberty to name all of the organizations interested in the project, those that participated in the proof of concept include Silicon Valley’s A10 Networks and Applied Materials, as well as the US Navy itself, according to Horton.
Like many other federal agencies, the Navy is in data center consolidation mode and actively looking for alternatives for its massive data center fleet. The department had more than 200 data centers around 2013, when it started its consolidation efforts.
Last year, it set a goal of reducing the number of its data centers to 20 or fewer and outsourcing 75 percent of its data center needs to commercial providers, according to a conference presentation given earlier this year by John Pope, director of the Navy’s Data Center and Application Optimization program.
Nautilus execs couldn’t disclose any specifics about the company’s engagement with the Navy, but “we have top-secret clearance with the Navy, and we’re doing some sensitive work for them,” Arnold Magcale, the company’s co-founder and CEO, said.
The reason the department has so many data centers is that it has traditionally built them to support warfighter operations wherever they are, according to Pope’s slides. “Warfighters require world-wide, secure, reliable, and timely information,” one of the slides read. “Multiple independent data centers grew up organically to support the warfighter.”
One of the reasons the Navy may be interested in floating data centers is their mobility. A Nautilus data center can be deployed in any port that meets the security, power, and network connectivity requirements and moved elsewhere when it is no longer required.
In the public sector, there’s demand for the ability to deploy data center capacity in places where it’s not easy to build brick-and-mortar facilities. One example is so-called “edge” locations, where the number of people connected to the internet is growing and content and service providers need data center capacity nearby to serve those growing markets, Horton said.
Magcale said he expects to commission the company’s first floating data center, being built atop a vessel named Eli M, after his mother, in the first quarter of 2016. Once completed, the barge will be moved to another location on the West Coast, which the execs also could not disclose.
Nautilus claims its facility will use up to 50 percent less energy than a traditional data center of comparable size. The energy savings will come from its patent-pending data center infrastructure management software and a cooling system that uses sea water, a feature that will be especially welcome in drought-ridden California.
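To make the efficiency claim concrete, here is a minimal back-of-the-envelope sketch in Python of how a lower PUE translates into facility-wide energy use. The PUE values (1.8 for a conventional air-cooled facility, 1.2 for a seawater-cooled one) are illustrative assumptions, not figures from Nautilus; only the 8 MW IT load reflects the company's announced plans.

```python
# Back-of-the-envelope comparison of total facility energy at two assumed PUE values.
# PUE (power usage effectiveness) = total facility energy / IT equipment energy,
# so a lower PUE means less overhead spent on cooling and power distribution.

def annual_facility_energy_mwh(it_load_mw, pue, hours_per_year=8760):
    """Total annual facility energy (MWh) for a given IT load (MW) and PUE."""
    return it_load_mw * pue * hours_per_year

it_load_mw = 8.0  # mirrors the planned 8 MW IT capacity of the floating facility

conventional = annual_facility_energy_mwh(it_load_mw, pue=1.8)  # assumed air-cooled PUE
water_cooled = annual_facility_energy_mwh(it_load_mw, pue=1.2)  # assumed seawater-cooled PUE

reduction = 1 - water_cooled / conventional
print(f"Conventional facility: {conventional:,.0f} MWh/year")
print(f"Water-cooled facility: {water_cooled:,.0f} MWh/year")
print(f"Reduction in total facility energy: {reduction:.0%}")
```

Under these assumed numbers the water-cooled design trims roughly a third of total facility energy; the 50 percent figure remains the company's own claim.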
4:00p
One of World’s Oldest Data Center Construction Firms Building in Virginia
DP Facilities may not be one of the more recognizable names in the data center industry, but the company has designed and built some of the world’s first data centers.
Founded in the 1950s, the company, previously known as Data Processing Facilities, did some early data center work with IBM. It built a data center for the New York City Clearinghouse, the largest clearinghouse in the US, which has been around since the mid-nineteenth century.
DP has built data centers for the who’s who of New York’s financial services industry – the likes of Morgan Stanley, Dow Jones, The Wall Street Journal, and Lehman Brothers – as well as organizations like the New York City Fire Department, US Postal Service, and Carnegie Hall. The list goes on.
In addition to designing and building dedicated data centers, DP is also a data center provider, operating facilities internationally, including in Singapore and the Netherlands. Its latest project is a data center in Wise, Virginia – a small town in the state’s far southwestern corner, a long way from the massive data center cluster in Northern Virginia.
As far as the company’s CTO Tom Durkin is aware, there are no other data centers in Wise. Nor is there an abundance of corporate offices in the area.
The goal is to provide data center space for business continuity and disaster recovery, he said in an interview. The risk of natural disasters in the area is low; there is ample access to power and fiber infrastructure and a technically competent labor force because of the nearby University of Virginia at Wise.
DP will start with 65,000 square feet of data center space, offering from 5 kW to 20 kW per cabinet, Durkin said. There is enough power on the 22-acre site to double that capacity if necessary, he added.
4:33p
Linux Foundation Launches Open Source HPC Group
This post originally appeared at The Var Guy
The Linux Foundation, along with industry and academic partners, plans to drive innovation in open source high-performance computing through a new collaborative project, OpenHPC.
The Linux Foundation, a nonprofit organization that promotes the Linux kernel and other open source projects, has partnered with Dell, HP Enterprise, Intel, Fujitsu Systems Europe and a number of university research labs to create the OpenHPC project. The collaboration will center on four main goals:
- Producing a stable environment for testing HPC software
- Creating an open source framework for HPC environments that will reduce costs
- Developing a sophisticated HPC software stack suited to a variety of applications
- Building a configuration framework that offers developers and users flexibility to tailor HPC software to meet their needs.
The time is right for new investment in open source HPC software because such software is vital in fields like meteorology, astronomy, engineering and nuclear physics, yet it has not been developed in a central, efficient way, according to the Linux Foundation.
“The use of open source software is central to HPC, but lack of a unified community across key stakeholders — academic institutions, workload management companies, software vendors, computing leaders — has caused duplication of effort and has increased the barrier to entry,” said Jim Zemlin, executive director, The Linux Foundation. “OpenHPC will provide a neutral forum to develop one open source framework that satisfies a diverse set of cluster environment use-cases.”
This first ran at http://thevarguy.com/open-source-application-software-companies/linux-foundation-launches-open-source-high-performance-co
5:21p
Europe Greenlights Equinix-Telecity Merger, With Caveats
European antitrust regulators have approved the planned acquisition of TelecityGroup by Equinix, making the Redwood City, California-based data center provider the largest player in the European market.
The European Commission issued the approval on the condition that the two companies sell eight specific data centers in London, Amsterdam, and Frankfurt. One of the facilities slated for divestiture is presently an Equinix data center, while all the others are operated by London-based Telecity.
Equinix’s winning $3.6 billion bid to buy the European company in May snatched it away from Interxion, another major player in the market. Amsterdam-based Interxion had previously struck a deal to acquire Telecity for $2.2 billion.
Had the Interxion deal gone through, Equinix would have been relegated to the number-two position in Europe indefinitely. Equinix CFO Keith Taylor told us in an interview earlier this year that a successful Interxion-Telecity merger would have made it nearly impossible for Equinix to ever change that number-two status, which was part of the reasoning for going after Telecity aggressively.
On the company’s third-quarter conference call in October, Equinix CEO Stephen Smith said the company had made an offer to the European Commission with proposed commitments to nudge the review process toward a favorable outcome, but he did not disclose the proposal’s details. The planned divestment of some of the future combined entity’s data centers appears to have been at least the core of that offer.
Here are the facilities Equinix and Telecity have agreed to sell in exchange for a go-ahead from the regulators:
London
- Telecity’s Bonnington House data center
- Telecity’s Sovereign House data center
- Telecity’s Meridian Gate data center
- Telecity’s Oliver’s Yard data center
- Equinix’s West Drayton data center
Amsterdam
- Telecity’s Science Park data center
- Telecity’s Amstel Business Park I data center
Frankfurt
- TelecityGroup’s Lyonerstrasse data center
Together, the eight data centers generated about four percent of the revenue the combined company would have made in the first nine months of this year, Equinix said in a statement.
Not counting the assets slated for divestment, the deal will add 32 data centers to Equinix’s already massive portfolio in Europe. The company operates about 30 facilities in Europe and the Middle East.
The acquisition will give Equinix instant presence in new markets, such as Ireland, Italy, Sweden, and Finland, among others, and substantially expand its presence in the key European markets of London, Frankfurt, and Amsterdam.
Equinix officials expect to complete the acquisition in the first half of 2016.
6:00p
Verizon CFO: We’ll Continue to Support Cloud Business
This article originally ran at Talkin’ Cloud
Verizon CFO Francis Shammo this week denied reports that his company is considering selling some of its enterprise assets.
The news comes after Reuters last week reported that Verizon was exploring a sale of enterprise assets that could be worth as much as $10 billion. These assets may have included landline and internet services provider MCI and colocation and cloud hosting provider Terremark.
However, Fortune noted that Shammo said Verizon plans to keep funding its cloud business.
“This is part of our portfolio and we’ll continue to support it,” he said at the Wells Fargo Securities 2015 Technology, Media & Telecom Conference.
Verizon sold some of its non-core assets earlier this year as well.
In February, Frontier Communications agreed to acquire some of Verizon’s residential landline assets for $10.54 billion, and Verizon separately agreed to sell the rights to its tower portfolio for more than $5 billion.
Verizon recorded $33.2 billion in operating revenues in the third quarter of this year, which represented a 5 percent year-over-year increase.
In addition, Verizon CEO Lowell McAdam said his company continues to bolster its earnings by providing its customers with dependable support.
“Verizon continues to grow earnings by delivering network reliability and superior value that continues to attract new customers,” he said in a prepared statement.
This first ran at http://talkincloud.com/telco-hub/verizon-cfo-well-continue-support-cloud-business