Data Center Knowledge | News and analysis for the data center industry
Thursday, December 3rd, 2015
12:09a
SlideShare Moves from Hosting to LinkedIn Data Centers

Engineers at SlideShare, the popular online service for hosting and sharing slide decks LinkedIn acquired three years ago, have moved their application stack out of a managed hosting provider’s data center into a LinkedIn data center – a project that took more than one year to finish.
Data center consolidation after one company buys another is never quick and easy, and companies usually wait a long time to start moving systems between facilities. In another recent example, Instagram moved its application stack from Amazon Web Services into Facebook’s own data centers last year – two years after the social networking giant bought it.
In addition to consolidating infrastructure, in both cases the acquired companies cited the opportunity to use the new parent company’s technological resources as a reason to move.
LinkedIn site reliability engineer Anatoly Shiroglazov described the migration process in detail in a blog post this week. “It was clear that to sustain growth and integrate the best parts of both products, SlideShare needed to move to LinkedIn data centers,” he wrote.
LinkedIn had a growing data center infrastructure and needed its systems to work across multiple sites, while SlideShare appears to have hosted its stack in a single location. The parent company also had a much larger engineering team that could make bigger investments in technology and had already built sophisticated search and analytics systems, as well as large reliability engineering and database administration teams.
LinkedIn’s data center needs are growing rapidly. The company recently made changes to its data center strategy, switching from retail colocation to wholesale facilities, where it is for the first time using a custom infrastructure design.
The social network’s storage and compute requirements grew 30 percent over the last 12 months. It currently uses 30 MW of data center capacity in the US and overseas and is working to add more in Oregon and Singapore.
The SlideShare team had to change a lot to adjust to the way LinkedIn’s infrastructure was set up. For security reasons, the parent company doesn’t allow all of its servers to access the internet; only servers in the demilitarized zone, also known as the DMZ, have external access. A network DMZ acts as a buffer between a company’s internal network and the rest of the world.
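To make the model concrete, here is a minimal sketch of the kind of egress rule a DMZ enforces: only hosts explicitly placed in the DMZ zone may open connections to the public internet. The host names, zone assignments, and function below are hypothetical illustrations, not LinkedIn’s actual policy tooling, which the blog post does not describe.

```python
# Minimal sketch of a DMZ-style egress policy: only hosts explicitly
# assigned to the DMZ may initiate connections to the public internet.
# Host names and zone assignments are hypothetical, for illustration only.

HOST_ZONES = {
    "web-proxy-01": "dmz",        # reverse proxy; needs external access
    "app-server-01": "internal",  # application tier; internal only
    "db-server-01": "internal",   # database tier; internal only
}

def egress_allowed(host: str) -> bool:
    """Return True if the host may initiate connections to the internet."""
    return HOST_ZONES.get(host) == "dmz"

for host in HOST_ZONES:
    status = "allowed" if egress_allowed(host) else "blocked"
    print(f"{host}: internet egress {status}")
```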
All SlideShare hosts had access to the internet, and the company’s software development cycle depended heavily on this capability, Shiroglazov wrote.
Another example is the operating system. LinkedIn was using a much more recent Red Hat distribution of Linux than SlideShare was, and a lot of SlideShare’s code had to be recertified on the new OS. The team also had to change its Puppet code for infrastructure management and database operation practices.
To make sure the migration didn’t bring the service down, the SlideShare team first deployed the new stack in the managed hosting data center they had been using, verified that it worked, and then started diverting traffic to LinkedIn data centers – read traffic first, write traffic second.

Timeline of the transition from SlideShare’s hosting data center to a LinkedIn facility over 14 days. (Source: LinkedIn Engineering blog)
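The blog post doesn’t publish the routing mechanism itself, but the staged cutover described above – reads first, then writes – can be sketched roughly as follows. The class name, data center labels, and percentages are assumptions for illustration, not LinkedIn’s implementation.

```python
# Rough sketch of a staged traffic cutover: read traffic moves to the
# new data center first; write traffic follows once reads prove stable.
# Labels, class names, and stage percentages are hypothetical.

import random

OLD_DC = "managed-hosting"
NEW_DC = "linkedin-dc"

class MigrationRouter:
    def __init__(self, read_pct_new: int = 0, write_pct_new: int = 0):
        # Percentage of each traffic type sent to the new data center.
        self.read_pct_new = read_pct_new
        self.write_pct_new = write_pct_new

    def route(self, request_type: str) -> str:
        pct = self.read_pct_new if request_type == "read" else self.write_pct_new
        return NEW_DC if random.randrange(100) < pct else OLD_DC

# Stage 1: shift half of the reads while writes stay on the old stack.
router = MigrationRouter(read_pct_new=50, write_pct_new=0)
# Stage 2: once reads are stable, move everything.
router.read_pct_new, router.write_pct_new = 100, 100
print(router.route("read"), router.route("write"))
```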
Once the transition was completed, the company decommissioned its infrastructure in the managed hosting facility. The next step is to modernize the software stack even further: SlideShare is now working on breaking its monolithic application down into microservices, phasing out legacy components as they are replaced by LinkedIn equivalents, according to Shiroglazov.

5:01a
Hot Data Center Startup Vapor IO Raises First Round of Funding

While venture capital funding for startups focused higher up the technology stack – companies that make IT automation or cloud management software, for example – is common, it’s relatively rare for a company focused squarely on the physical aspects of the data center to announce a funding round.
Vapor IO, which came out of stealth earlier this year with a radical new design of the data center rack and sophisticated rack and server management software, has closed a Series A funding round, led by Goldman Sachs, with participation from Austin’s well-known VC firm AVX Partners.
Tom Jessop, a managing director at Goldman, is joining Vapor’s board of directors, and so is Chris Pacitti, general partner at AVX.
The Austin-based company hasn’t disclosed the size of the round, but its founder and CEO Cole Crawford said he was “very happy with the number.”
Not disclosing the size of a Series A is a philosophical choice for Crawford. “Everybody has an opinion about it: either you took too much or too little,” he explained.
Vapor is an attempt to disrupt the data center infrastructure industry at the rack level.
Instead of traditional rows of racks and straight data center aisles, Vapor’s hardware product, called Vapor Chamber, is a cylinder where six wedge-shaped server racks are arranged in a circle. Servers take cold air in from outside of the cylinder and push hot air into a column in the center, from where it is sucked out at the top.
The company claims the design is more space- and energy-efficient.
Vapor’s software stack includes OpenDCRE, an open source server hardware management system that aims to replace the aging Intelligent Platform Management Interface (IPMI) standard. Vapor’s commercial data center management product, Vapor Core, is built on top of OpenDCRE.
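The appeal of an HTTP-based management layer is that hardware telemetry can be queried like any other web service instead of through IPMI’s binary protocol. The sketch below illustrates that general style; the base URL, endpoint paths, and response fields are assumptions for illustration, not OpenDCRE’s documented API.

```python
# Illustrative sketch of polling a RESTful hardware-management service
# in the style OpenDCRE promotes as an IPMI replacement. The base URL,
# endpoint paths, and response fields are assumed, not the real API.

import json
from urllib.request import urlopen

BASE = "http://localhost:5000/opendcre/1.2"  # hypothetical endpoint

def get_json(path: str) -> dict:
    """Fetch a JSON document from the management service."""
    with urlopen(f"{BASE}/{path}") as resp:
        return json.load(resp)

# Hypothetical flow: enumerate boards and devices, then read a sensor.
inventory = get_json("scan")
reading = get_json("read/temperature/00000001/0002")
print(inventory, reading.get("temperature_c"))
```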
Having Goldman’s Jessop on the board is especially important for Vapor. Goldman has global reach, and Crawford said he hopes the startup will benefit from those connections, both in hiring talented staff and in meeting potential new customers.
AVX is a fund recently formed by Pacitti and his partners from Austin Ventures, a tech VC firm that dominated the Austin market in the dot-com era but whose dominance waned in the 2008 recession and never quite bounced back, according to the Wall Street Journal. The new fund leverages Austin Ventures’ network and reputation to go after mid-stage startups, which makes Vapor, an early-stage company, an exception.
In a statement, Pacitti explained that while AVX typically focuses on new companies that already have some early revenue traction, Vapor is “so well positioned and promising that we gladly make an exception.”
One of the reasons he cited was having Crawford at the helm. Crawford has been involved in projects that have made massive impacts on the data center industry.
He participated early on in the creation of OpenStack, the open source cloud infrastructure software that has become the primary alternative to building clouds with VMware’s technology. Crawford also held several senior leadership roles at the Open Compute Foundation, which oversees the Open Compute Project, Facebook’s initiative to bring the benefits and philosophy of open source to hardware and data center design.
For his part, Crawford said Pacitti was a “legend,” and that it was rare for a company to raise money in Austin without talking to Austin Ventures.

5:05p
Three Government IT Insights for Data Center Providers

Steven Dreher is the Director of Solution Architecture at Green House Data.
As the longtime CTO of the Wyoming Supreme Court, I’ve watched enterprise IT evolve from a very different perspective than many in the data center industry. For one thing, with less flexible funding that must be approved across various branches of government on any given cycle, all of our projects required significant lead time. Agility is difficult at the government level.
But we’re starting to see government organizations, from the federal level on down, turning toward the cloud, consolidating data centers, and using private companies as service providers. Amidst this shift, I’ve joined a private company myself.
Whether you’re freshly FedRAMP compliant and trying to woo government agencies as they consolidate their data centers, or just curious how to overcome your own internal political battles for IT priorities, these insights can help.
The Public Sector Space is Ripe for Big Data
The public sector is usually hesitant to adopt new technologies, but data is the exception. Being able to collect, compile, and analyze data across a given vertical can be a treasure trove for policy writers and decision makers.
Developing the talent and tools to do so, however, is a major barrier for public sector organizations, which often operate on a shoestring. Statewide or nationwide initiatives can cost billions of dollars and ultimately bear little fruit, or fail outright. One example is the initial rollout of healthcare.gov, widely considered a massive and expensive flop as millions struggled to access the site. Part of the problem is the number of vested interests and the turnover that occurs when officials leave office: the drivers of a given initiative may be long gone before the project is completed. At other times, the complexity of a government organization gets in the way, and responsibilities are too dispersed.
A provider with the expertise to offer a managed big data solution, with the resources to scale at an attractive price point, can offer an entryway to big data platforms for government groups.
There is Often a Gap Between Policy Writers, Decision Makers and Technology
I started with long lead times for a reason: there is a catch-up period between enterprise adoption of a technology and government pulling the trigger on it. That means that once you convince a government IT office of the value of your solution, it will have to work up the chain – first demonstrating the technology itself, then securing funding and approval for it.
Once you have helped your government prospect fight to secure a project with your company, don’t be surprised if it is changed midstream or even cancelled outright. The reason? Politics and shifting priorities are an inevitable tide in public sector work. The range of stakeholders on particularly large public projects is very wide, and sometimes undefined. An election cycle or a few key management retirements can throw a wrench into your entire deployment.
That is not unlike an enterprise environment, where different departments and managers have different goals and priorities, each of which helped drive a given IT project. Suddenly, that desktop virtualization project takes a back seat to disaster recovery because systems went down briefly last week, or a new CFO demands a new SaaS platform while revamping the department.
In either environment, these shifts create opportunities to build bridges and alignment with political (or employee) supporters, while presenting new challenges and handing talking points to critics. You must keep your projects a priority in order to complete them and give the organization the technology it needs to succeed.
Recruiting IT Talent Can Be Difficult in the Public Space
While government jobs have a reputation as cushy, guaranteed gigs with retirement benefits, the public sector often can’t offer compensation as competitive as private companies can for in-demand positions like IT.
For a data center provider, that means the more remote hands and managed services you can provide to a government entity, the more value you’re bringing to the table, especially if you can do it in a package without adding as much cost as a full-time employee would. Your expertise must supplement the staff they do have available.
Public access to information is in its infancy and will remain a major opportunity for data center providers. As a society, we expect ever more from public information tools and data. That means more convenient ways to interact with government bodies: Can I pay my property taxes online, or initiate a court case without having to physically drive to a courthouse? Even as we start to see these systems emerge, we are nowhere near the full potential of digital government.
Your company might not offer the platform for these solutions, but you can certainly host them. As more and more government offices go digital, data center providers must be there with the security and availability to offer vital services to citizens and public offices, from the local level up to federal programs.
Every organization faces its own political challenges, and the data center industry is no different. Major opportunities exist in the public sector; take a hard look at its previous mishaps to show how you, as a service provider, can take government IT to a new level and relieve some of those political pressures.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

6:26p
Google Makes its Biggest Renewable Energy Purchase Yet

As far as the general public is concerned, Google and its data centers have become synonymous with the internet, and when activists call for a greener internet, Google data centers are in the spotlight more than anyone else’s.
On the rare occasion that the public eye turns to the issue of carbon emissions associated with watching puppy videos, Google has a good story to tell. Today, that story got even stronger.
Timed to coincide with this week’s United Nations Conference on Climate Change in Paris, Google announced that it has made its biggest renewable energy purchase to date. The company has agreed to buy 842 MW of wind and solar power globally to offset energy consumption of its 14 massive data center campuses.
The bulk of the generation capacity is in the US, where most of the company’s data centers are. Google has made three wind power deals with three separate developers in America – 200 MW, 200 MW, and 225 MW – and a 61 MW solar deal with Duke Energy, the largest utility in the country. Google announced the agreement with Duke in November.
The company also agreed to buy energy from an 80 MW solar farm in Chile and 76 MW of wind power in Sweden.
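The six announced deals account for the full figure: 200 + 200 + 225 + 61 + 80 + 76 = 842 MW.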
As has been the case since 2010, when Google made its first utility-scale renewable energy purchase agreement, the long-term contracts, ranging from 10 to 20 years, provide developers with the necessary financing to build the massive solar and wind projects.
Google has been a pioneer in renewable energy purchasing among data center operators, devising complex schemes to make sure carbon emissions associated with powering its data centers are offset by renewable energy generation while at the same time ensuring the cost of energy makes business sense.
Other web-scale data center operators, companies like Facebook, Microsoft, and Amazon, have followed in Google’s footsteps, announcing similar deals. Major commercial data center providers recently started taking renewable energy a lot more seriously and made substantial financial commitments.
Google may be the most familiar internet company, but the bulk of the internet is powered by data centers around the world operated by those commercial providers, companies like Equinix, Interxion, TelecityGroup (in the process of being acquired by Equinix), and Digital Realty Trust.
Equinix has made massive renewable energy purchase agreements this year to offset its carbon emissions in the US, and Interxion has been one of the renewable-energy leaders among data center providers in Europe.
Digital Realty, whose specialty has been providing wholesale data center space – including to the likes of Equinix – offers its customers premium-free renewable energy anywhere in the world for one year.
Switch, which provides data center services to Google, Amazon, eBay, Intuit, and others, has made a commitment to powering its data centers with 100 percent renewable energy and invested in a 100 MW solar farm in Nevada.

7:06p
Ballmer: Microsoft Should Report More Cloud, Hardware Numbers
This article originally appeared at The WHIR
Former Microsoft CEO Steve Ballmer wants to see the company disclose profit margins and sales for its cloud and hardware businesses rather than annualized revenue run rate, which it currently reports.
In conversation with Bloomberg at Microsoft’s annual meeting this week, Ballmer said that profit margins and sales of those divisions are key metrics and should be reported as such.
Ballmer said that margin is important because, while gross margins for software are very high, they are far lower for hardware and cloud services.
Ballmer handed over the reins as CEO to Satya Nadella in February 2014 and stepped down from the Microsoft board that August.
In an open letter to Nadella announcing his departure from the company’s board of directors last year, Ballmer said “…[i]n the mobile-first, cloud-first world, software development is a key skill, but success requires moving to monetization through enterprise subscriptions, hardware gross margins, and advertising revenues. Making that change while also managing the existing software business well requires a boldness and fearlessness that I believe the management team has. Our board must also support and encourage that fearlessness for shareholders to get the best performance from Microsoft. You must drive that.”
During the shareholders call this week, Ballmer also said he wants to see Windows Phones support Android apps, dismissing Nadella’s answer to an audience member who questioned the lack of key apps on the company’s phone. According to Bloomberg, Nadella said Windows developers can write universal applications that work across devices, which Ballmer said “won’t work.”
While its Windows phone app ecosystem may be lacking, Microsoft is hoping to help its enterprise customers create and share business apps with the launch of Microsoft PowerApps earlier this week.
This first ran at http://www.thewhir.com/web-hosting-news/ballmer-microsoft-should-report-profit-margins-sales-for-cloud-and-hardware-businesses

7:40p
Gartner: Server Market Reverses Downward Trend

The 11 consecutive quarters of declining server sales continued to fade in the rear-view mirror, as server revenues rose 7.5 percent in the third quarter, while server shipments rose 9.2 percent, Gartner said in its latest report on the server market.
Server sales also rose in the second quarter, albeit more modestly, establishing a growth trend.
“The third quarter of 2015 produced growth on a global level with mixed results by region,” Jeffrey Hewitt, research vice president at Gartner, said in a statement. “All regions showed growth in both shipments and vendor revenue, except for Eastern Europe, Japan, and Latin America, which posted revenue declines of 5.8 percent, 11.7 percent, and 24.2 percent, respectively, for the period. Currency exchange rates are one of the main reasons for the disparity in regional server market performance.”
By chalking the soft numbers in underperforming regions up to currency fluctuations, Hewitt offered server vendors further reassurance of the market’s overall strength. All the major vendors posted revenue increases in the quarter, except for IBM.
IBM’s revenue fell about 42 percent, dropping its share of the global market from 18.5 percent to 9.8 percent, mostly due to the sale of its x86 server business to Lenovo. Lenovo picked up the difference, growing from 1.3 percent to 7.9 percent. The increase makes Lenovo the world’s fourth-largest server vendor, passing Cisco.
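A back-of-the-envelope check squares those figures: with overall market revenue up 7.5 percent, a share drop from 18.5 to 9.8 percent implies IBM’s revenue fell to roughly (9.8 / 18.5) × 1.075 ≈ 0.57 of its year-ago level – a decline of about 43 percent, consistent with the reported figure.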
In the segments where it remains active, IBM actually posted a small gain in share: a small drop in RISC product sales was more than offset by a significant jump in mainframe sales. The market as a whole saw the same pattern – a decrease in the RISC/Itanium Unix segment and an increase in the “other” CPU segment, which includes mainframes.
The global server market first rebounded in Q2 2014, following 10 quarters of declining revenues and 11 of declining shipments. Third-quarter 2015 revenues were actually down slightly from the second quarter despite increased shipments, as server prices continue to fall; the market saw the same pattern – slightly lower revenue on slightly higher shipments – from Q2 to Q3 2014.
Global server sales reached almost $51 billion in 2014, according to IDC, and they appear likely to approach $55 billion this year.

8:28p
One Week – Four Data Center Acquisitions

It has been a particularly active week in terms of data center acquisitions. Four companies announced deals in Canada, the UK, and Australia. Here’s a roundup:
Rogers Buys Internetworking Atlantic in Canada
Canadian telecommunications giant Rogers has acquired Internetworking Atlantic, which operates a fiber network on the country’s east coast and provides data center services. Internetworking Atlantic’s colocation facility in Halifax will be Rogers’ 16th data center in the country.
Financial terms of the transaction were not disclosed.
Virtus Buys Slough Data Center
UK data center service provider Virtus Data Centers bought a data center in Slough, just outside of London, from Infinity SDC. The deal doubles the company’s data center capacity, which is now 35 MW. Its other data centers are in Hayes and Enfield.
The acquisition, whose terms were not disclosed, follows a major investment in Virtus by Singapore’s ST Telemedia in January. ST Telemedia bought almost half (49 percent) of the UK company. The rest of Virtus is owned by a real-estate-focused fund manager called Brockton Capital.
iomart Acquires United Hosting for £7.5M
Cloud service provider iomart, based in Glasgow, has acquired United Hosting, a hosting company based in Hertfordshire, UK, where it also has a data center. United also leases data center space in London and Dallas.
The £7.5 million deal expands iomart’s managed hosting services capabilities for small and mid-size businesses. It is iomart’s second acquisition this year, following the purchase of cloud-focused IT consultancy SystemsUp in June.
Logicalis Buys Australian Provider Thomas Duryea
Logicalis, the major global IT service provider owned by South Africa-based Datatec, has acquired Thomas Duryea, a data center and cloud service provider in Australia, for an undisclosed sum. As part of the deal, Logicalis gains Thomas Duryea’s operations in Melbourne and Sydney.
The Australian company’s annual revenue is about $50 million, according to Datatec.