Data Center Knowledge | News and analysis for the data center industry
Tuesday, July 18th, 2017
12:00p

DCK Investor Edge: Why Q2 Earnings are Crucial for Data Center Stocks

It is hard to believe that the first half of 2017 is already in the books.
Notably, this time last year data center REITs had run up by an average of 50 percent, which created a clear signal to take some chips off the table due to frothy valuations. The data center stocks gave back about half of those gains from July to November 2016.
This year the signals are somewhat more nuanced, with the six publicly traded data center REITs “only” returning 23.3 percent on average.

YChart created by author
Data center stocks have once again outperformed traditional asset classes such as office, industrial, and apartments, but they have already begun to pull back from 2017 highs going into earnings season. Many investors are on pins and needles, waiting to learn how the second half of the year will unfold.
Why Q2 2017 Is Critical
The wholesale-focused publicly traded data center REITs appear to be behind the curve for the first half of 2017 compared to the record leasing results they reported for the first half of 2016.
Last year, Northern Virginia’s data center alley notched 113MW of wholesale absorption in just three quarters, prior to the US presidential election. Then the rug was pulled out from under the wholesale leasing market.
It will be crucial to learn which of the hyper-scale players are expanding with build-to-suits or pre-leasing massive data halls in powered shells. Both Dallas and Northern Virginia now appear to have plenty of supply. However, plenty of questions remain to be answered this quarter.
Will near-term uncertainty created by the announcement of Digital Realty Trust’s acquisition of DuPont Fabros Technology impact new customer lease signings? Could this announced but not yet consummated merger create an opportunity for competitors such as low-cost provider CyrusOne (CONE) and NTT Group’s RagingWire in key markets like Northern Virginia?
Read more: Why RagingWire is a Data Center Company to Watch
CloudHQ is another serious super-wholesale player in Northern Virginia. Hossein Fateh, who co-founded DuPont Fabros, is the principal behind the new provider. Fateh’s firm reportedly signed a 30MW-plus super-wholesale requirement last year with Microsoft and is rumored to have recently landed another massive deal with the same customer.

Source: Data Center Hawk
The first CloudHQ data center, designated MCC1, is a 43.2MW state-of-the-art facility located in Manassas, Virginia. It will be ready for service during summer 2017 to serve Microsoft, leaving just 8.2MW of available capacity, according to the company’s LinkedIn page (there is zero public information available on the CloudHQ corporate website).
One of the reasons given on previous earnings calls for slow leasing during Q4 2016 was actually a “happy problem”: a lack of available supply in Tier 1 markets, including Silicon Valley, Dallas, Chicago, and the engine that drives the wholesale leasing train, Northern Virginia.
Uncertainty surrounding the US presidential election was another reason given for the lack of enterprise leasing during Q4 2016. However, the lack of wholesale leasing by traditional enterprise customers during Q1 2017 potentially points to a different narrative for these data center stocks: the long sales cycle involved with hybrid cloud solutions.
Read more: QTS’s Hybrid IT Strategy — Short-Term Pain, Long-Term Gain?
Notably, except for DuPont Fabros’s success with its new Apple pre-leases, there was not a great deal of super-wholesale leasing reported by the publicly traded data center REITs during Q1 2017. Investors should keep in mind that the timing of when large wholesale deals are signed is notoriously lumpy.
A Rapidly Evolving Sector
Investors should also keep in mind that there have been some significant changes in the landscape during the past few months that are sure to affect data center stocks.
The latest seismic move has been the merger agreement between Digital Realty (DLR) and DuPont Fabros (DFT). The driving force behind that merger was Digital’s need to gain some super-wholesale market share in Tier 1 US markets, especially in Northern Virginia.
Read more: Digital Realty Signs Biggest Hyper-Scale Deal After All
Digital Realty has over 2,300 customer relationships, while DuPont Fabros has fewer than three dozen. However, the DFT relationships with Microsoft, Apple, Facebook, and others have proven to be “birds in the hand,” as those hyperscale cloud businesses continue to grow over time.
Other recent seismic activity in the market:
Equinix (EQIX) closed its $3.6 billion carve-out acquisition of Verizon’s Americas data center portfolio. Equinix acquired several of Verizon’s former crown jewels, including: 1) the NAP of the Americas in Miami, a gateway to Latin America; and 2) Verizon’s highly secure Culpeper, Virginia, campus, home to numerous US government agencies and contractors.
CenturyLink (CTL) became the latest legacy carrier to exit the data center business to focus on its core network initiatives. The deal helps CenturyLink fund its blockbuster $34 billion acquisition of Level 3. Cyxtera Technologies, the new company led by former Terremark CEO Manuel Medina, acquired the CenturyLink data center portfolio to build on the platform with a focus on cybersecurity.
There has been quite a bit of private sector M&A activity, including:
Digital Bridge’s purchase of Vantage Data Centers in Silicon Valley. This gives Digital Bridge a wholesale data center platform to add to its Data Bank colocation business and its wireless infrastructure business. Notably, former Digital Realty CEO Mike Foust is now CEO of both Vantage and Data Bank.
GI Partners-backed Peak 10 acquiring ViaWest to expand its secondary-market footprint geographically. GI Partners has an eye for talent in the data center services market, having incubated and monetized Digital Realty, SoftLayer, and Telx.
While a rising tide tends to lift all ships, these sophisticated, privately held data center operators are looking to enter new markets and compete with the publicly traded REITs.
Investor Takeaway
One key takeaway is that there is a tremendous amount of institutional capital looking for a home in the data center asset class. This includes sovereign wealth, insurance companies, hedge funds, pension funds, and private equity.
The Q2 2017 earnings prints and conference calls will be an opportunity for investors to learn how the publicly traded REIT sales funnels and deal pipelines reported during the past two quarters have translated into new booked-but-not-billed lease contract backlog. Given the time required to build out large data halls, if a hyper-scale lease has not been inked by late July or early August, it reduces the likelihood of impacting full-year 2017 results in any meaningful way.
The long-term outlook for data center REITs remains constructive, given the sector tailwinds of cloud computing, wireless, streaming video, IoT, big data, and AI. However, a lot of good news is already baked into the shares, as data center REITs have returned about 25 percent on average during the first half of 2017, if you include dividends.
Meanwhile, any sell-off due to disappointing Q2 2017 results could provide a potential buying opportunity for data center stocks, as long as the long-term narrative for the individual data center REIT remains intact.

3:00p
Quantum Computing Could Make Today’s Encryption Obsolete
This is the first post in our new regular series on data center security. Scroll to the bottom of the article to learn more about the column and its author.
Researchers at top university and corporate labs around the world are in a furious race to create the first practical, usable quantum computer. Quantum computers, which use quantum bits, or qubits, are capable of running computations impossible for existing technology. They promise to open up new possibilities in areas like medical research, artificial intelligence, and security.
Oh, and they would also easily crack current encryption algorithms.
How close are quantum computers to becoming reality? The point at which quantum computers would surpass our current computers in capability is estimated at about 50 qubits.
In March, IBM announced that it had a 20-qubit quantum computer, and that outside researchers and developers could already start running simulations on the IBM Quantum Experience.
In June, Google raised the ante. Alan Ho, an engineer from Google’s quantum AI lab, told a conference in Germany that Google already had a 20-qubit system and was planning to build a 49-qubit computer by the end of the year.
See also: Google’s Quantum Computing Push Opens New Front in Cloud Battle
“Quantum computers are now commercially available if you have a lot of money,” said Mike Stute, chief scientist at Masergy, a networking, security and cloud communications technology company headquartered in Plano, Texas.
The problem is that dealing with qubits requires some tricky engineering involving quantum physics. Plus, quantum computers require built-in error correction to deal with the fact that qubits are not as well-behaved as the traditional zero-or-one bits of classical computing. These two challenges combine to make the development of larger quantum computers a difficult task.
Meanwhile, it’s not enough to just surpass current computers. In order to crack today’s encryption, quantum computers have to be a lot better than what we have today.
That will take between 500 and 2,000 qubits, said Kevin Curran, a senior member of the IEEE and a cybersecurity professor at Ulster University.
See also: One Click and Voilà, Your Entire Data Center is Encrypted
So, run-of-the-mill hackers won’t be breaking into banking systems right away. Government agencies, however, may have quantum computing technology a generation or two ahead of what’s commercially available, said Masergy’s Stute.
That means companies protecting data of interest to China, Russia, or the NSA might need to be particularly careful.
What You Can Do
Current encryption is based on the idea that there are some mathematical problems that are really hard for computers to solve.
For example, public-key encryption — where one key is used to encrypt the data, and a different key to unlock it — typically relies on just those kinds of problems.
“When quantum computing becomes a reality, then many public-key algorithms will be obsolete,” said Curran.
Symmetric encryption, where the same key is used to both encrypt and decrypt the data, is more robust and will last longer.
Companies that have data they want to protect may want to start planning ahead to make more use of symmetric encryption, as well as switch to longer keys.
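As a concrete illustration of that advice, here is a minimal sketch of 256-bit symmetric encryption using the third-party Python cryptography package (the package choice and names are ours, not something prescribed in the article). The key length reflects a common rule of thumb: Grover’s algorithm roughly halves the effective strength of a symmetric key, so doubling the key length restores the security margin.

```python
# A minimal sketch of symmetric encryption with a 256-bit key, using the
# third-party "cryptography" package (pip install cryptography).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 256-bit symmetric key
aesgcm = AESGCM(key)

nonce = os.urandom(12)                     # 96-bit nonce, unique per message
plaintext = b"customer records"
associated_data = b"db-shard-7"            # authenticated but not encrypted

ciphertext = aesgcm.encrypt(nonce, plaintext, associated_data)
recovered = aesgcm.decrypt(nonce, ciphertext, associated_data)
assert recovered == plaintext
```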
In addition, researchers are already working on new, quantum-proof encryption methods and will start testing them as soon as quantum computers become more widely available.
For companies that depend on having good encryption in place, the most important thing is not to hard-wire encryption systems into their applications.
Instead, they need to adopt a modular approach, so that they can easily replace old, obsolete algorithms with new, effective ones. With some advance planning, that’s not hard to do.
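To make the modular approach concrete, here is one possible sketch of a pluggable cipher registry in Python. The interface and names are hypothetical, but the pattern (looking ciphers up by name instead of hard-wiring them) is what lets an obsolete algorithm be swapped out in one place.

```python
# A sketch of "crypto agility": application code asks for a cipher by name,
# so retiring an obsolete algorithm means changing one registry entry.
from typing import Callable, Dict, Tuple

CipherFunc = Callable[[bytes, bytes], bytes]          # (key, data) -> bytes
CIPHERS: Dict[str, Tuple[CipherFunc, CipherFunc]] = {}

def register(name: str, encrypt_fn: CipherFunc, decrypt_fn: CipherFunc) -> None:
    """Register an (encrypt, decrypt) pair under a replaceable name."""
    CIPHERS[name] = (encrypt_fn, decrypt_fn)

def encrypt(name: str, key: bytes, data: bytes) -> bytes:
    return CIPHERS[name][0](key, data)

def decrypt(name: str, key: bytes, data: bytes) -> bytes:
    return CIPHERS[name][1](key, data)

# A toy XOR "cipher" stands in for a real algorithm; production entries
# would wrap vetted primitives (AES-GCM today, a post-quantum scheme later).
def _xor(key: bytes, data: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

register("xor-demo", _xor, _xor)

ciphertext = encrypt("xor-demo", b"key", b"swap me out later")
assert decrypt("xor-demo", b"key", ciphertext) == b"swap me out later"
```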
Introducing Our New Data Center Security Column
Cyberattacks with wide-reaching consequences are now commonplace. Last month’s attack on FedEx’s TNT Express will hurt its quarterly results. The same month, thousands of members of the British Parliament and their staff lost access to email as a precautionary measure taken to limit the damage from a massive cyberattack on the legislative body. If your job has anything to do with your organization’s data centers, cybersecurity is becoming a bigger and bigger part of it, which is why we’re introducing a new column focused exclusively on data center security.
It’s a great pleasure to introduce Maria Korolov, who will author the column. She is a Massachusetts-based technology journalist who writes about cybersecurity and virtual reality.
During her 20 years of experience covering financial technology and cybersecurity she wrote for Computerworld, was a columnist for Securities Industry News, ran a business news bureau in China, and founded a publication covering virtual reality. She has reported for the Chicago Tribune, Reuters, UPI, and the Associated Press.
Before switching to business and technology journalism, she was a war correspondent in the republics of the former Soviet Union and has reported from Chechnya, Afghanistan, and other war zones.

4:51p
Oracle to Add 1,000 Employees in European Cloud Push

Jeremy Kahn (Bloomberg) — Oracle Corp. is hiring 1,000 employees in Europe, the Middle East and Africa as it expands its cloud computing services in the region.
The company is looking for workers with between two and six years of experience to staff sales, management, finance, recruitment, marketing and human resources roles for its cloud computing service, Oracle said Tuesday. The Redwood City, California-based company did not specify which offices would be adding staff.
The move comes about a month after the company reported 58 percent year-over-year revenue growth in its cloud business, which allows corporate customers to manage data through a network of Oracle-run servers. The company sold $4.6 billion worth of cloud computing software and hardware last year, up from $2.9 billion the year before.
See also: Oracle’s Hurd Bullish on Cloud Business, Says Enterprise Market Largely Untapped
“Our cloud business is growing at incredible rates, so now is the right time to bring in a new generation of talent,” Tino Scholman, vice president of Oracle’s cloud computing for the region, said in a statement.
Cloud-related products now account for more than 12 percent of Oracle’s total sales. The company employs approximately 51,000 staff in the U.S. and 85,000 internationally.
Europe, the Middle East and Africa accounted for 28 percent of Oracle’s overall revenue last year, but sales in the region declined 2 percent to $10.6 billion, as customers shifted away from Oracle’s traditional enterprise computing software to cloud-based services.
See also: Oracle Closes Big Cloud Deal With AT&T, Inks Equinix Partnership
Public-cloud spending is expected to grow 27 percent annually to reach $82 billion by 2020, according to research firm IDC. Oracle’s “cloud infrastructure products are gaining traction and should become a major pillar of growth next year, amid increasing competition from Amazon,” Bloomberg Intelligence wrote in a July report.
Amazon.com Inc., Alphabet Inc.’s Google, Microsoft Corp., International Business Machines Corp. and others have all reported surging growth in cloud-computing sales. These companies have been adding data centers in Europe as the competition to deliver these services in the region heats up.

5:06p
You Can Now Earn a Bachelor’s in Data Center Facilities Engineering

An industry-first Bachelor’s Degree in Data Center Facilities Engineering, offered by the Institute of Technology Sligo in Ireland, will help students and existing data centers alike, Network World reported.
The Sligo-based school said the studies, developed after lengthy consultations with Google, Facebook, and Microsoft, will focus on traditional enterprise data center practices, with the hope of graduating a class ready and able to help fill the skills gap in technology management and operation of data center facilities.
“Google is proud to support IT Sligo’s pioneering new engineering degree in data center facilities engineering and management,” said Denis Browne, Google’s EU regional data center lead.
“Google’s data centers are some of the best in the world, and we look for the best talent to work with us. Thanks to IT Sligo this online course will increase the skills of people already working in the sector, and for those who wish to work in the industry going forward.”
Read also: Data Centers Go to College: New Masters Degree Offered By SMU
Most students of technology graduate with a broad education that has a few data-center-specific courses sprinkled in, but this degree expects graduates to hit the ground running and fill entry-level positions with little, if any, on-site training needed.
This degree will come in very handy for students looking to pursue a Master’s Degree in Data Center Facilities Engineering at SMU’s School of Engineering in Dallas (a program developed with input from Hewlett Packard), because a bachelor’s degree is a prerequisite for attaining the higher degree there.
While students in the U.S. could complete the undergraduate coursework online, they would need to attend lab sessions in Belgium, making the program more practical for Europeans. However, the school said that now that the template is in place, other institutions could easily follow in its footsteps, creating worldwide opportunities.
See also: How to Get a Data Center Job at Google
One UK university recently made headway. Leeds University has directed some of its postgraduate engineering students toward dissertations specific to data centers and has engaged industry experts to help students. There are also plans to offer a specific master’s degree in data center design.
According to Dr. Jon Summers of the University of Leeds, the university has recently beefed up its data center engineering undergraduate curriculum by developing industry-mentored projects focused on designing facilities for different climates. Universities in the U.S. have also begun investigating similar options.
Specific data center degrees are sorely needed, considering the workforce associated with data center operations tops 4 million, according to the U.S. Department of Labor. That number is growing, with an expected increase of 2 million by 2018. Approximately 70 percent of these workers have a bachelor’s degree or higher. Up until now, the primary college-level curriculum has been an online course from the Institute for Data Center Professionals at Marist College.
See also: How to Get a Data Center Job at Facebook

6:25p
Google Designs Data Center Appliance to Ship Client Data

For a large enterprise, one of the costliest and most time-consuming steps in moving to the cloud is transferring the enormous amount of data stored in its on-premises data centers to its cloud provider’s data centers. Network bandwidth is a precious resource, and even when you have tons of it, moving petabytes of data over a WAN can take way longer than is practical.
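To see why, consider a quick back-of-envelope calculation. The sketch below assumes a perfectly sustained link at full utilization, which real WANs rarely deliver, so actual transfers take even longer.

```python
# Rough transfer times for 1 PB at various sustained link speeds,
# assuming 100 percent utilization (real-world throughput is lower).
PETABYTE_BITS = 1e15 * 8  # 1 PB expressed in bits

for label, mbps in [("100 Mbps", 100), ("1 Gbps", 1_000), ("10 Gbps", 10_000)]:
    seconds = PETABYTE_BITS / (mbps * 1e6)
    print(f"{label}: {seconds / 86400:.0f} days")
```

Even a dedicated 10 Gbps link needs more than a week of uninterrupted, full-rate transfer for a single petabyte, which is why shipping physical storage remains competitive.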
Amazon Web Services solved this problem two years ago by introducing a service called Snowball. If you have lots of data you want to upload to AWS cloud storage, the company ships you a rugged storage appliance, which you connect to your internal network, upload your data to it, and ship it back to Amazon.
Today, Alphabet subsidiary Google announced the beta launch of a similar service, taking another step in its effort to catch up to AWS and Microsoft Azure in the enterprise cloud market. The service, creatively named Transfer Appliance, is slightly cheaper per TB than AWS Snowball, although the exact price difference will depend on your specific shipping costs.

Google’s time estimates for transferring data over networks with varying bandwidth (Image: Google)
Another difference is in the design of the appliance itself. Besides storing more data, Google’s Transfer Appliance is designed to be mountable in a standard 19-inch data center rack, while the Snowball looks more like a PC tower built for an active battlefield.
Each cloud provider offers two models of its data migration device. The two Transfer Appliance options are 100TB in a 2U box and 480TB in a 4U box. Snowball has a 50TB and an 80TB option.
The Google service costs $3 per TB or $3.75 per TB, depending on which of the two versions of the appliance you select. Curiously, the higher-volume version of the appliance commands the higher per-TB price. You’re also responsible for shipping the Transfer Appliance (the service uses FedEx), which will run you about $500 for the 100TB model and $900 for the 480TB one.
Amazon’s Snowball service costs $4 per TB for the 50TB model or $3.12 per TB for the 80TB one. You pay for shipping to an Amazon facility too, and unlike Google, Amazon doesn’t provide set pricing for shipping, saying it will depend on your location and the shipping option you choose (e.g. 2-day or overnight).
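Folding those shipping estimates into the per-TB fees makes the comparison between the two Google options concrete. This short worked example uses only the figures quoted above; Amazon’s total is omitted because its shipping charge varies by location and speed.

```python
# All-in cost for each Transfer Appliance option: capacity (TB),
# per-TB service fee, and the approximate FedEx shipping cost cited above.
options = {
    "100TB (2U)": (100, 3.00, 500),
    "480TB (4U)": (480, 3.75, 900),
}

for name, (tb, per_tb, shipping) in options.items():
    total = tb * per_tb + shipping
    print(f"{name}: ${total:,.0f} total, ${total / tb:.2f} per TB all-in")
```

Once the flat shipping charge is spread across more capacity, the larger box actually works out cheaper per terabyte, despite its higher headline per-TB fee.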
Amazon also made the shipping process itself easier by mounting its Kindle readers on the appliances. When a customer is done uploading their data to the device, the screen on the Kindle automatically displays the correct shipping label, and a shipping company is notified that the device is ready to be picked up.
In Google’s case, the customer has to email support to request a shipping label and wait until it arrives in the mail before they can ship the appliance.
Google may be outsourcing some portion of the data migration service to the enterprise data management giant Iron Mountain, although we weren’t able to confirm this. “Google doesn’t publicly disclose this information,” a company spokesperson said in an email.
Documentation for the Transfer Appliance includes instructions for users to grant permissions on their staging bucket in Google cloud storage to ironmountainmigration@gmail.com. That Iron Mountain is an official Google Cloud Platform partner is public knowledge, but the publicly available information about the partnership is limited to a service for transferring customer data stored on tape in Iron Mountain facilities to Google’s cloud data centers.

7:00p
Making Security a Priority in Connected Cars

Malte Pollmann is CEO of Utimaco.
The automobile has evolved from a machine that gets us from point A to point B into a computer on wheels, and even a shop on wheels. As a result, all eyes are on cybersecurity for the connected car.
New cars are now equipped with 3G and 4G connectivity, and consumers are getting used to the growing number of novel conveniences like parking assistance, internet connectivity, streaming media, traffic guidance systems and safety alerts. Though we’re moving closer to the reality of vehicle-to-vehicle communication (V2V), vehicle-to-infrastructure communication (V2I) and autonomous cars, we have yet to see all of the possibilities the connected car can bring. This is why, in parallel to these advances, it’s imperative that the industry give the connected vehicle a solid security foundation and make deeper investments in V2V security if it is ever to reach its full potential.
A recent survey from Accenture shows that consumers want to stream music and surf the internet, all while their car identifies issues and helps navigate traffic. They also hope to have access to social media and email dictation, and features like night vision, fatigue warning, blind spot devices, car-to-car communication and, finally, autopilot. More and more, this conglomeration of technologies will become a standard, not a luxury. But what is the cost when this suite of comforts can be accessed by a third party while the driver is behind the steering wheel? The connected car is a goldmine for hackers and highly susceptible to security breaches. IHS Markit tells us there are nearly 112 million connected vehicles around the world, and the global market for automotive cybersecurity is expected to grow exponentially.
The Hack: It’s not a Matter of if, but When
The growing number of Internet-connected devices and accessories in the connected car opens it up to new potential points of attack for cybercriminals. Connected vehicles are tied into a variety of outside networks for communications, navigation, maintenance, and even the ability to be directed by apps on smartphones, providing an ever-growing attack surface with an increasing number of points (or vectors) where an attacker could try to gain access to the environment. For the connected car network, it will not be a matter of if you are impacted by a vulnerability exploit or breach, but a matter of when.
Failing to properly secure the connected car means more than just putting your personal information at risk; it can take key components of the car offline, rendering it undriveable, or lead to something even more catastrophic. For example, it’s been made public via Wikileaks that CIA employees have worked to infect vehicle controllers with malware, under the code word “Vault 7.” Hacker groups like the Shadow Brokers have already exposed NSA tools that were used in the notorious WannaCry ransomware attack, which temporarily knocked key businesses offline, and they show no signs of stopping. It is just a matter of time before an exploit kit aimed at the connected car hits the dark web.
The threats to connected cars have been made clear, but the Herculean task now at hand is implementing top-level security practices under time-sensitive and high-pressure conditions. Car makers are in an arms race to develop these vehicles, hoping to gain a competitive edge and become the go-to name in the market. Complications rise to the surface as the hours spent developing the exciting, futuristic features of the connected car far outweigh the time spent examining the security issues that abound in integrated IT systems. More to the point, manufacturers are still adapting to the processes and structures that are standard in traditional IT production.
As Car Becomes Computer, There’s No Need to Start from Scratch
As the automotive industry realizes security must be at the core of the connected car, it faces the challenge of integrating proven IT and security solutions that reliably secure both networked production sites and the vehicles themselves. Consumers are clearly excited, but expectations are high: they want advanced connected cars and expect manufacturers to thoroughly secure their vehicles, as well as provide ongoing security updates. Since manufacturers are just dipping their toes into the business of cybersecurity and are under pressure to deliver quickly, it’s imperative they take their cues from highly regulated industries with deep security experience, like technology and finance.
The current state of the automotive market tells us that the transformation to connected, networked vehicles can only succeed as a cross-company effort, since industry standards (protocols and processes) must be implemented across the board. Legislators may be reluctant to force them upon the industry, leaving car makers to define best practices and de facto industry standards. As part of this process, they should consider current standards from other highly regulated sectors such as banking, and adapt them to their specific needs. Car makers may find themselves navigating financial regulations, for example, to ensure that connected vehicles can safely and securely execute transactions and simple payment processes when refueling or recharging at the (electrical) station, automatically billing parking tickets and purchasing new parts and gear as needed, among other scenarios.
Building the Connected Car Starts with the Box
As auto manufacturers attempt to ingrain themselves in the practices of IT and security to get their connected cars on the road, they have the advantage of learning from other industries. Security standards have already been established in the technology and finance sectors, and they can be adapted by the automotive industry to protect the data and systems in networked vehicles. At the center of consistent security is end-to-end encryption, in which hardware security modules (HSMs) provide constant protection via authentication. They are used, for example, in the following methods:
• Key Injection: As a component of the HSM, you can insert individual digital keys into semiconductors using a true random number generator. With the unique key of the components, the connected car is given a “digital identity” that authenticates the vehicle throughout its entire lifecycle. Authentication is used, for example, when the vehicle arrives at the workshop for maintenance, or eventually as cars communicate data and information among themselves (V2V).
• Authentication as the base layer for access control: Only those who have the digital key can make changes to the system in the vehicle – for example, downloading GPS updates or music would require authentication. In terms of any maintenance work on the vehicle, dealers and services can securely access the system using a Public Key Infrastructure (PKI).
• Code Signing: Software in the connected car will have already received an individual key during the development phase. This ensures that the code is both genuine and correct, and the integrity and authenticity of the software and its updates are safeguarded (see the sketch after this list).
• Protecting the exchange of user data: Personal information should only be stored in an encrypted database. The cryptographic key material is managed and stored on premises, yet separated from the database in an HSM. Data is then protected against any unauthorized access, even if the database contents fall into the wrong hands, whether the media or cybercriminals.
• Protecting monetary transactions: Ensure processes like tokenization and Host Card Emulation (HCE) are standard to securing the vehicle, as they are currently used in smartphone payments and transactions.
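As a concrete illustration of the code-signing item above, here is a minimal sign-then-verify sketch using Ed25519 from the Python cryptography package. It is a simplified stand-in for a production flow: in practice the private key would be generated and kept inside an HSM, and the firmware payload here is invented for the example.

```python
# A minimal code-signing sketch using Ed25519 from the "cryptography"
# package. In a real deployment the private key would live inside an HSM
# and never leave it; here it is generated in memory purely to illustrate
# the sign-then-verify flow a vehicle could run before accepting an update.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

signing_key = Ed25519PrivateKey.generate()   # HSM-resident in production
verify_key = signing_key.public_key()        # baked into the vehicle

firmware = b"v2.4.1 engine-control update"   # hypothetical payload
signature = signing_key.sign(firmware)       # produced at the factory

try:
    verify_key.verify(signature, firmware)   # raises if tampered with
    print("update accepted")
except InvalidSignature:
    print("update rejected: signature check failed")
```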
Security will Make or Break the Connected Car
As the automotive industry scrambles to develop vehicles with an impressive suite of IT features that stand out from the crowd, it is essential that security is seen not as an added feature but as a prerequisite. The connected car will only reach its full potential if security is made a top priority; safety risks within the vehicle, as well as threats to greater networks like the electric grid, have the potential to create serious safety issues and unwanted disruptions to service.
Facing this new phase in the industry, auto manufacturers that have traditionally been tasked with providing safe, sturdy and well-built vehicles are switching gears to build hyper-connected and equally secure next-generation cars with the sleekest, coolest tech that can play the field with smartphones and other devices. But when cars become computers, the everyday traffic jam is a hacker’s paradise. To ensure security is fundamental to the development of the connected car, auto makers and OEMs must implement practices that quickly resolve any detected safety gaps during the process of production and systems development. Similarly, big industry players will be encouraged to join forces to develop cross-company/industry standards and adopt and adapt established ones.
Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.