Data Center Knowledge | News and analysis for the data center industry
Tuesday, October 20th, 2015
12:00p
Amid Poor Results Overall, IBM’s Cloud Business Growing

IBM now makes $4.5 billion in revenue from cloud delivered as a service – that’s separate from private cloud infrastructure it helps customers set up in their own data centers – which is less than Amazon Web Services’ $7.3 billion 12-month revenue run rate but 45 percent more than IBM’s own cloud revenue one year ago.
Good news was scarce in IBM’s third-quarter earnings report released Monday, and its cloud business growth was one example. The company continues to see revenues shrink across all of its business segments, leading it to lower expected full-year earnings for 2015 to between $14.75 and $15.75 per share.
The upper end of the new range was the lower end of the previous one, which was between $15.75 and $16.50.
On a call with analysts, IBM CFO Martin Schroeter reiterated that this was all to be expected. The company is going through a transition, repositioning itself for the future it says will be dominated by cloud services and Big Data analytics.
“We always said this would play out over time,” Schroeter said. “This is a longer-term play. We’re creating new platforms and building ecosystems.”
IBM’s total revenue for the third quarter was $19.3 billion, down 14 percent year over year, though that comparison ignores the fact that IBM still owned its x86 server business a year ago. Adjusting for the sale of the System x unit to Lenovo in September 2014 and for currency fluctuations, Big Blue’s revenue was down only one percent year over year.
IBM reported earnings per share of $3.02, down 9 percent year over year. IBM stock was down almost 5 percent in after-hours trading on Monday following the earnings announcement.
Transforming Enterprise IT, One $1B Contract at a Time
Schroeter highlighted several recent customer wins that according to him illustrate where the business of providing IT infrastructure services to enterprises is headed. “We take over their IT systems and move them to the cloud,” he said.
IBM’s $700 million contract with Abu Dhabi’s Etihad Airways is for a wholesale transformation and outsourcing of the airline’s infrastructure to IBM. It includes a new data center in Abu Dhabi, which will be built and operated by IBM, as well as cloud-based services, such as analytics and IBM’s “cognitive computing” capabilities that go by the name Watson.
It also includes a mobile solution for Etihad, developed by IBM and Apple, and transition of 100 Etihad IT employees to IBM.
IBM has a similar outsourcing contract with the German airline Lufthansa. The seven-year €1 billion contract was signed last year.
Another recent customer win Schroeter highlighted was a $1 billion outsourcing deal with Evry, a Norwegian IT services company. The 10-year agreement also calls for transformation of the customer’s IT infrastructure, including cloud services hosted at IBM’s SoftLayer data center outside of Oslo.
Only Two Segments Up Year on Year
Global Technology Services, the segment that includes IBM’s cloud services, was one of two segments that reported increases in revenue in the third quarter, though GTS’s increase was only 1 percent. The other growing segment was Global Financing, whose revenue grew 7 percent year over year.
Results from the remaining three segments:
- Global Business Services: down 5 percent
- Software: down 3 percent
- Systems Hardware: down 2 percent
Systems Hardware revenue was down because of poor sales of storage systems. IBM recorded a 20 percent increase from sales of z Systems (the mainframes) and a 2 percent increase from sales of Power Systems (the high-end servers powered by Big Blue’s Power processors), which was not enough to offset a 14-percent year-over-year drop in storage sales.
Storage represents about 34 percent of IBM’s total hardware-revenue pie.
IBM’s future clearly depends on its ability to continue raking in big IT outsourcing contracts as it has done for many years, only now these contracts will also include its vast array of cloud services: its Infrastructure-as-a-Service offering, SoftLayer; its Platform-as-a-Service offering, Bluemix; and advanced data analytics capabilities.
The likes of AWS and Microsoft Azure may be actively pursuing enterprise data center workloads, but IBM’s pitch of a hybrid infrastructure, combining on-prem data centers with cloud services under a single vendor, has a lot to offer enterprises that cannot deploy all of their applications in public clouds.

3:00p
Companies Starting to Take Server Utilization Rate More Seriously

Matt Stansberry is the Director of Content and Publications for Uptime Institute.
In Uptime Institute’s 2014 Data Center Industry Survey, the majority of respondents didn’t believe comatose servers were a serious problem in their organizations. Yet 45 percent did not conduct any scheduled auditing to identify if they actually had a problem.
A recent non-Uptime study estimated that the value of idle servers sitting in data centers around the world was about $30 billion.
IT organizations tend to be in denial about this issue, partly because utilization is embarrassingly bad, but also because the people procuring and deploying servers have had no accountability for the costs associated with poor utilization. Increased adoption of chargeback will likely have a positive impact, however.
Still, how does it get that bad in the first place? One reason might be the incredibly long server refresh rates reported in the 2015 survey (see Figure 8). Nearly two-thirds of the respondents install a server in a rack and do not replace it for four years or more. Multiple factors can change over that time period in a dynamic IT organization, and so organizations can lose track of many pieces of hardware.

Uptime Institute’s Server Roundup winners have proven that without an incredibly disciplined server decommissioning program with dedicated resourcing, comatose and underutilized IT equipment will accrue and bloat IT budgets, hamper new projects, and idly consume expensive infrastructure capacity.
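A disciplined decommissioning program starts with a scheduled audit that flags likely comatose machines. The sketch below illustrates one simple approach; the inventory data, server names, and the 2-percent threshold are illustrative assumptions, not anything prescribed by Uptime Institute.

```python
# Sketch: flagging candidate "comatose" servers from utilization samples.
# Data format, server names, and threshold are hypothetical illustrations.

from statistics import mean

# Hypothetical inventory: server name -> CPU utilization samples (%)
# collected over an audit window.
utilization_samples = {
    "web-01":    [42.0, 55.3, 61.7, 48.9],
    "batch-07":  [1.2, 0.8, 1.5, 0.9],
    "db-02":     [25.4, 30.1, 22.8, 27.6],
    "legacy-12": [0.3, 0.2, 0.4, 0.1],
}

# Average CPU % below this flags a decommissioning candidate.
COMATOSE_THRESHOLD = 2.0

def find_comatose(samples, threshold=COMATOSE_THRESHOLD):
    """Return server names whose average utilization falls below threshold."""
    return sorted(
        name for name, series in samples.items()
        if mean(series) < threshold
    )

print(find_comatose(utilization_samples))  # → ['batch-07', 'legacy-12']
```

In practice, CPU averages alone can mislead (a backup target may idle all month and still matter), so a real audit would combine utilization with network traffic, last-login data, and application ownership records before decommissioning anything.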
Tracking server utilization is slowly creeping up the list of priorities: in the 2015 survey data, it ranked as the fourth most important data center metric overall (see Figure 9). PUE is still regarded as the top priority, a finding somewhat skewed by the job responsibilities of the survey sample, and also by what Uptime Institute sees as misapplication of PUE as a management metric.
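The misapplication point is easy to see from the metric's definition: PUE is total facility energy divided by the energy delivered to IT equipment, so it measures facility overhead only. A sketch with made-up figures:

```python
# Sketch: PUE (Power Usage Effectiveness) = total facility energy / IT energy.
# The kW figures below are made-up illustrations.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """A PUE of 1.0 would mean every watt reaches IT gear; real sites are higher."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# A facility drawing 1,500 kW in total, with 1,000 kW reaching IT equipment:
print(round(pue(1500.0, 1000.0), 2))  # → 1.5
```

Note that a rack full of comatose servers still counts as "IT equipment energy" in the denominator, so a data center can post an excellent PUE while wasting most of its power on idle machines, which is exactly why PUE alone is a poor management metric.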

However, the surprising finding in Figure 9 is how low carbon reporting ranks for data center professionals, given the potential havoc this metric can cause for companies in the sphere of public opinion. By now, most organizations are familiar with the reports from the environmental organization Greenpeace, scoring large data center operators on their environmental impacts. The reports largely focus on web-scale companies with a broad public-facing presence and the infrastructure providers that support those companies.
Greenpeace is well aware of the efficiencies the majority of web-scale operators have engineered into their data center facilities and the very fabric of their IT operations. Yet, the only metric that concerns the environmental organization is carbon emissions and an organization’s willingness to influence its utility providers to invest in renewable power generation.
Understandably, many data center operations professionals argue that these decisions are above their pay grades and not something they can influence. Yet no one in a data center organization will be able to claim that carbon is not their problem when their company is called out in media headlines and their executives are asking questions about what could have been done differently to avoid the negative exposure.
Ultimately, everyone associated with delivering data center capacity will need to be discussing this topic on a regular basis.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

5:25p
Hivelocity to Double Tampa Data Center Footprint

This article originally appeared at The WHIR
Web hosting provider Hivelocity is breaking ground on a new data center in Tampa, marking its second data center in the Florida city. The construction is expected to be completed in the first quarter of 2016.
Hivelocity’s second Tampa data center will be 30,200 square feet, effectively doubling its footprint in the area to 60,000 square feet. The new facility will also add 20,000 bare-metal servers to the company’s infrastructure over the next few years.
In addition to Tampa, Hivelocity offers services out of Atlanta, Miami, and Los Angeles. The company intends to grow that list soon, according to Hivelocity CEO and president Mike Architetto.
“We opened our flagship data center in Tampa back in 2012 and within two years found ourselves needing to build out an additional 10,000 square feet of raised floor space to accommodate growth,” Architetto said in a statement. “Our new data center will not only give us more space for growth but also allow us to offer an expanded portfolio of enterprise hosting services for businesses large and small. We have provided our customers with 100 percent uptime since we opened our current data center four years ago. The addition of our new facility will only enhance our ability to provide never-fail infrastructure and cloud services.”
Founded in 2002, Hivelocity offers dedicated servers, colocation, private cloud and other solutions, including data migration and cPanel managed hosting. The company has more than 50 employees that maintain more than 10,000 customer servers. In June, Hivelocity added Corero’s SmartWall Threat Defense System to its network defense infrastructure.
Hivelocity’s second Tampa data center will initially offer services including bare-metal dedicated servers, private cloud environments and colocation. Hivelocity said it will offer colocation solutions ranging from a single server to a private cage or pod in order to serve a wide variety of clients. The company will also offer hybrid colocation services to provide customers the option to complement their infrastructure with Hivelocity’s cloud storage and bare-metal servers.
This first ran at http://www.thewhir.com/web-hosting-news/hivelocity-set-to-double-tampa-footprint-with-30200-square-foot-data-center

5:42p
Physical Security in Enterprise IT: A Renaissance for Cloud-Based Security
This post originally appeared at The Var Guy
In an era where cybersecurity is at the top of every CIO’s list of business concerns, it can be easy to overlook the importance of physical security in protecting legacy hardware solutions and ensuring critical assets remain safe. But despite the lack of industry enthusiasm surrounding physical security solutions, advances in cloud technology and digital cameras have created somewhat of a renaissance in the physical security world.
From cloud-based cameras to motion detectors and even IoT-connected monitoring solutions, never have there been more options available to enterprises and consumers alike when it comes to protecting their most precious resources. Even better, many of these options are relatively simple to set up and deploy, even without a strong technology background.
New Solutions for New Problems
With the increase of targeted attacks against both virtual and physical assets, it’s only natural that the means by which we protect these resources become more advanced to prevent theft or property destruction. And while many people still think of physical surveillance solutions as a standard video camera connected to a CCTV network, the breadth and depth of available options for monitoring infrastructure and protecting hardware has changed rapidly within the past decade.
Some of the most commonly seen devices are cloud-based IP cameras, which take the old-fashioned CCTV model and add a persistent Internet connection, so data is stored in a remote database instead of on a dusty hard drive in the manager’s office. This not only allows companies to access their data from anywhere and at any time, but it also ensures that data is preserved in case of a fire, flood or other natural disaster.
IoT-connected cameras also are finding their place in the world of physical security and surveillance. Like their cloud-based brethren, these cameras store data remotely in a public or private server, but also allow users to control the pan, tilt and zoom of said devices remotely from a computer or smart phone. Currently, these devices are most commonly used for personal security in homes and in small businesses, but the rapid growth of IoT solutions is bound to make these security devices more prevalent throughout midmarket and enterprise companies.
The Rise of Cloud-Based IP Solutions
There’s no doubt cloud technology has played a critical role in the evolution of physical surveillance, especially in terms of protecting enterprise assets. But before we look at how cloud-based IP solutions have taken over the market, it’s important to understand what trends have influenced the mass migration of compute resources to the cloud in the first place.
Prior to the more widespread use of modern surveillance solutions, cameras had severe limitations, including resolution problems, limited storage, and issues with light flare and, subsequently, dark areas. However, the new breed of security solutions are more like computers than traditional surveillance equipment, and feature built-in capabilities to compensate for such issues, said Duston Miller, founder and vice president of engineering at NDM Technologies, a Spokane, WA-based security provider.
“When we first got into it [physical surveillance] it was a lot of retrofitting existing coaxial systems into IP-based video,” said Miller. “But now it’s pretty much all IP from point to point.”
With the steady increase of reliability in cloud services over the past several years, enterprises have continued to utilize public, private and hybrid clouds to house critical customer and company data as a way of protecting it from theft or loss. Currently, about 51 percent of all compute information is housed within the cloud, as more people begin to realize the benefit of storing their information outside of the physical location, said Dean Drako, CEO and founder of Eagle Eye Networks.
“There’s an argument saying [that data] isn’t as secure in the cloud as it is onsite, and the reality is that it’s actually false. It’s actually more secure in the cloud,” said Drako. “And the reason it’s more secure in the cloud is because the cloud provider can’t afford a data breach, so they actually spend money and get the proper personnel and put the energy and focus into making sure that they are really secure.”
And despite a minimal amount of physical security data currently being stored in the cloud, customers are increasingly showing greater interest in storing their data remotely, according to a recent survey from Eagle Eye Networks. The study of 250 respondents showed about 65 percent expressed interest in some cloud recording, with only 35 percent of respondents wishing to keep their physical security data entirely on premises, Drako said.
“This actually gets me really excited because it basically expands the market for physical surveillance and provides more value than just security, and I think that’s actually great for the industry,” said Drako.

7:19p
Cologix Enters New Jersey Data Center Market with Net Access Acquisition

Cologix, a Denver-based data center provider, has agreed to acquire New Jersey competitor Net Access, gaining its first three data centers in the Garden State and more than 700 new customers.
The deal is the latest example of consolidation in the data center provider industry, which has been happening at a steady pace. Also this week, data center provider TierPoint announced it had agreed to buy the data center business of the telco Windstream.
Terms of the Cologix-Net Access transaction were not disclosed. Net Access is owned by a private equity firm called Seaport Capital.
Like TierPoint, Cologix has been growing primarily through acquisitions of smaller players with data centers outside of the primary markets. The company raised $255 million in private equity to fund expansion earlier this year.
This is Cologix’s tenth acquisition, bringing its portfolio to 24 data centers in nine markets:
- Columbus
- Dallas
- Jacksonville, Florida
- Lakeland, Florida
- Minneapolis
- Montreal
- Northern New Jersey
- Toronto
- Vancouver
Each of the three Net Access facilities – two in Parsippany and one in Cedar Knolls – is about 30 miles away from New York City. They are interconnected with a dark fiber network that also connects them to major carrier hotels in the region.
Total gross square footage of the three facilities is 200,000 square feet.
In an apparent nod to data center outages caused by flooding in Manhattan when Hurricane Sandy ravaged the US Northeast in 2012, Cologix stressed in its announcement the fact that Net Access data centers were 250 feet above sea level.
“The acquisition will enable Cologix to address the vibrant New Jersey market and provide a compelling alternative to colocation options in New York City,” Cologix said in a statement.

8:36p
Nutanix, Cumulus Marry Converged Infrastructure and Open Networking

Nutanix, one of the biggest names in converged infrastructure – a category of preconfigured full-package IT solutions that are easy and quick to deploy in a data center – has validated Cumulus Linux, a Linux-based network operating system by the startup Cumulus Networks, as compatible with Nutanix Acropolis, the software layer that ties the physical infrastructure together into a unified scale-out fabric.
This means users can pair a Nutanix infrastructure with any of the 31 networking platforms on Cumulus’s hardware compatibility list, including switches by Dell, Edge-core, HP, Penguin, Quanta (QCT), Agema, and Supermicro. The thrust of Cumulus has been to bring an open network OS that is easy to use for IT staff already familiar with Linux and that can be deployed on a variety of hardware platforms, in contrast to the closed, tightly integrated switching solutions from Cisco and other “incumbent” vendors.
Most of those incumbent vendors, including Dell, HP, and Juniper, have changed their stance recently, embracing network software that’s not their own for some of their switch product lines. Cisco, the biggest of them all, has not.
Acropolis has advanced virtual networking capabilities, enabling users to set up virtual networks and switches on top of the physical infrastructure.
Market analysts at Gartner consider Nutanix to have the most complete vision in the converged infrastructure space. Other leaders are Cisco, EMC, Oracle, NetApp, and HP.
Nutanix’s ability to execute on its vision is ranked lower than Cisco’s, Oracle’s, and EMC’s, but higher than NetApp’s and HP’s, according to Gartner’s latest Magic Quadrant for the market segment.
Clinic in Perth Does Away With Fibre Channel SAN
In its announcement, Cumulus highlighted a deployment of Nutanix’s converged infrastructure with Cumulus Linux by Perth Radiological Clinic in Australia. The clinic deployed the infrastructure at a centralized data center to host medical images clinicians take at 15 sites around the city. The images are accessible by all authorized viewers, regardless of where they are.
“Before we embarked on this project, none of us wanted to learn a new switching model,” George Hewitt, infrastructure and development manager at the clinic, said in a statement. “We just wanted solid, reliable Ethernet that we could grow as the network required. With Cumulus Linux, we were able to move away from a Fibre Channel/SAN framework to a more modern way of supporting our business growth.”