Data Center Knowledge | News and analysis for the data center industry
Tuesday, May 30th, 2017
4:30a
Taiwanese Firms to Sell Latest NVIDIA AI Hardware to Cloud Giants
NVIDIA, the Silicon Valley-based chipmaker that has emerged as the top supplier of computing muscle for Artificial Intelligence software, has partnered with four Taiwanese electronics manufacturing giants who will design and manufacture its latest AI servers for data centers operated by the largest cloud providers, such as Microsoft, Google, and Amazon.
Foxconn, Inventec, Quanta, and Wistron will build hardware powered by NVIDIA’s next-generation GPUs, codenamed Volta, using the reference architecture for the chipmaker’s own HGX supercomputer design, developed together with Microsoft for AI software workloads.
NVIDIA CEO Jensen Huang announced a lineup of Volta-based products, including the Tesla V100 data center GPU, at the company’s big annual conference in Silicon Valley earlier this month, saying the chips will be available later this year.
See also: NVIDIA CEO Says AI Workloads Will “Flood” Data Centers
“We’re working with everybody on the transition to Volta,” Keith Morris, NVIDIA’s senior director of product management for accelerated computing, said in an interview with Data Center Knowledge, referring to the top cloud providers. He said he expects these companies to start upgrading their platforms from the current-generation Tesla P100 GPUs this year.
A New Frontier in Cloud Wars
Providing hardware for machine learning – the fastest-growing and most widely used and researched type of AI today – as a cloud service is the latest frontier in the war for market share among cloud giants. GPU-powered servers are the most common type of hardware used for these workloads, but they’re very expensive and difficult to support in a data center.

Jensen Huang, CEO, NVIDIA, speaking at the GPU Technology Conference in San Jose in May 2017 (Photo: Yevgeniy Sverdlik)
GPUs are very power-hungry, and servers used for a subset of machine learning workloads called training can pack as many as eight of them on a single motherboard. This results in extremely power-dense data center deployments, where densities north of 30kW per rack are common. Most of the world’s data centers have been designed to support average densities of 3kW to 6kW per rack.
See also: Deep Learning Driving Up Data Center Power Density
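To put those figures in perspective, here is a rough back-of-envelope density estimate in Python. The 300W per-GPU draw matches the Tesla V100's published TDP; the host overhead and servers-per-rack numbers are illustrative assumptions rather than vendor specifications.

```python
# Back-of-envelope rack density estimate for GPU training servers.
# Assumed figures (illustrative, not vendor specs), except the 300W
# per-GPU draw, which reflects the Tesla V100's published TDP.

GPU_WATTS = 300            # per-GPU power draw under load
GPUS_PER_SERVER = 8        # dense training server, per the article
HOST_OVERHEAD_WATTS = 800  # CPUs, RAM, NICs, fans per server (assumed)
SERVERS_PER_RACK = 10      # hypothetical deployment density

server_watts = GPUS_PER_SERVER * GPU_WATTS + HOST_OVERHEAD_WATTS
rack_kw = SERVERS_PER_RACK * server_watts / 1000

print(f"Per server: {server_watts} W")  # 3200 W
print(f"Per rack:   {rack_kw:.1f} kW")  # 32.0 kW, well north of the
                                        # 3-6 kW a typical rack supports
```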
This is why renting GPU servers from cloud providers is an attractive proposition for companies either doing AI research or running AI software in production. They can pay as they go instead of spending large sums upfront to stand up this infrastructure on their own, and they can take advantage of the latest hardware as soon as it becomes available.
Use of Machine Learning on the Rise
A recent survey by MIT Technology Review and Google Cloud found that 60 percent of respondents had already implemented machine-learning strategies and committed to ongoing investment in machine learning. Additionally, 18 percent said they were planning to implement machine-learning strategies within the next 12 to 24 months. Just 5 percent said they had no interest in machine learning and no plans in that area for the foreseeable future.
Here’s a breakdown of the respondents active in machine learning by company size:
[Chart: respondents active in machine learning, by company size. Source: MIT Technology Review, Google Cloud]
GPU-Only Hardware
NVIDIA’s partnership with the four manufacturers is focused on one particular form factor for GPU servers: machines that contain only GPUs, rather than hybrid servers that combine CPUs and GPUs. Cloud companies plug these servers in as extensions of the regular CPU-powered machines in their data centers, Morris said.
This is one of the reasons this particular partnership is not with traditional data center hardware vendors like Hewlett Packard Enterprise or Dell Technologies, which tend to build boxes that combine GPUs and CPUs, he explained. “We’re working with all the main OEMs (the traditional vendors) on these products too.”
NVIDIA Anticipates Huge Data Center Revenue Growth
NVIDIA’s revenue from data center products more than doubled over the course of its fiscal 2017, from $143 million in the first quarter to $296 million in the fourth, and the segment accelerated further in the first quarter of fiscal 2018, when it generated $409 million, according to the company’s latest quarterly earnings report.
NVIDIA still makes most of its money by selling GPUs for videogames and doesn’t expect that to change in the near future.
IDC forecasts that companies will spend $10.36 billion on compute infrastructure for cognitive workloads (one way to describe AI software) in 2022, an average annual growth rate of nearly 19 percent over the next five years. The market-research firm also notes that investment growth in cloud computing infrastructure for these workloads will outpace investment in on-premises infrastructure.
“This presents a significant worldwide opportunity for silicon vendors, ISVs, and services vendors,” the analysts said.
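As a quick sanity check on the IDC figures, the implied 2017 baseline can be recovered by discounting the 2022 forecast at the stated growth rate. This short sketch simply reverses the compound-growth arithmetic; the resulting ~$4.3 billion base is an inference from the cited numbers, not a figure IDC published here.

```python
# Sanity check on the IDC forecast: if spending reaches $10.36B in
# 2022 after five years of ~19% average annual growth, the implied
# 2017 base is the 2022 figure discounted by (1 + CAGR)^5.

spend_2022_bn = 10.36   # IDC forecast cited in the article
cagr = 0.19             # "nearly 19 percent" average annual growth
years = 5

implied_2017_bn = spend_2022_bn / (1 + cagr) ** years
print(f"Implied 2017 spend: ${implied_2017_bn:.2f}B")  # ~$4.34B
```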
Competition from Google
Earlier this month, Google announced that in addition to its cloud GPU services it will provide similar services for TPUs (Tensor Processing Units), the custom AI processors the company designed in-house. The announcement means the cloud market for GPUs will be smaller than it would have been had Google continued to use TPUs strictly to run its own applications. The company operates one of the world’s largest clouds and is known for attracting some of the world’s brightest engineering minds.

A pod of TPU servers inside a Google data center (Photo: Google)
“Obviously … there will be some GPU workloads that will run on the TPU that could’ve run on the GPU,” Morris said. “Overall, I think Google has said that they’re going to continue to use NVIDIA GPUs, and I think the NVIDIA GPUs are a great platform for developing this technology. We expect them to continue to be a great customer of ours.”
3:00p
DCK Investor Edge: Florida is Now Courting Hyperscale Data Centers
Florida officially put out the welcome mat for wholesale data center developments with a “critical IT load” of 15MW or larger that meet the specific requirements discussed below.
Last week, Florida Governor Rick Scott signed the 2017-2018 budget bill for the fiscal year that begins on July 1, 2017. This massive bill contains a provision that could become a game changer for certain data center requirements.
The elimination of sales tax and use tax for data centers, infrastructure, equipment, personal property, and electricity is intended to jump-start new large-scale wholesale data center development in the Sunshine State and marks its official entry into the data center derby.
The Inside Scoop
Data Center Knowledge spoke with Lee Kestler last week immediately following the signing of the budget bill. Kestler is a long-time data center industry professional and former business development executive at DuPont Fabros Technology. He is currently a principal at Leesburg, Virginia-based Kopend Ventures. Kestler worked closely with CyberXperts principal Todd Weller on behalf of a small group of clients to help shepherd this provision through the Florida legislature after a similar effort failed last year.
Kestler and Weller were engaged as subject matter experts “to expand the level of understanding for legislators on the benefits other areas of the country have realized by encouraging large-scale data center campuses.” Given the massive size of the Florida budget bill, if you weren’t part of the process as a consultant, lobbyist, or scrivener, it would be easy to miss this noteworthy development.
Sales Tax Exemption
Here are a few highlights from the new legislation (a short sketch of how the thresholds combine follows the list):
- A minimum “cumulative capital investment” of $150 million after July 1, 2017, for acquiring, constructing, equipping, or expanding a data center. Landlords have until June 30, 2022, to put a data center into service to take advantage of the tax exemption.
- Must have a critical IT load of 15MW or higher and a critical IT load of 1MW or higher dedicated to each individual owner or tenant within the data center.
- “Data center is one or more contiguous parcels in this state, along with the buildings, substations and other infrastructure, fixtures, and personal property located on the parcels.” “The term also includes electricity used exclusively at a data center.”
- “However, the term does not include any expenses incurred in the acquisition of improved real property operating as a data center at the time of acquisition or within 6 months before the acquisition.”
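Purely as an illustration of how these thresholds combine, and not as legal guidance, here is a small Python sketch of the headline criteria above. The function and field names are hypothetical.

```python
# Toy check of how the exemption's headline thresholds combine.
# Illustration only; names are hypothetical, and the actual statute
# contains further conditions and a claw-back provision.
from datetime import date

def qualifies(investment_musd: float, in_service: date,
              total_it_load_mw: float, tenant_loads_mw: list) -> bool:
    """Return True if a facility appears to meet the headline criteria."""
    return (
        investment_musd >= 150                  # cumulative capital investment
        and in_service <= date(2022, 6, 30)     # placed in service by deadline
        and total_it_load_mw >= 15              # critical IT load floor
        and all(load >= 1 for load in tenant_loads_mw)  # per-tenant minimum
    )

print(qualifies(200, date(2021, 3, 1), 24, [6, 6, 12]))   # True
print(qualifies(200, date(2021, 3, 1), 24, [0.5, 23.5]))  # False: sub-1MW tenant
```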
The legislation creating the data center sales and use tax exemption requires a periodic review by the Florida Department of Revenue to assure continued qualification. It also contains a “claw back” provision, if the letter of the law is not met in its entirety.
Additionally, if you already own and operate a wholesale data center in Florida, you could now be at a competitive disadvantage. The ability to attract and retain existing tenants drawing 1MW or greater “critical IT load” could be negatively impacted going forward.
Custom Power Rates
Of course, the cost of power is crucial to the total cost of ownership for large data centers.
There were other interested parties working on behalf of this sales and use tax exemption, including Crystal Stiles, an economic development executive with Juno Beach-based Florida Power and Light, according to Kestler. A recent tool in the arsenal of Florida’s regulated electric utilities for attracting a large industrial customer (with a minimum load of 2MW) is the CISR, or commercial/industrial service rider agreement.
Beginning in 2014, this regulated investor-owned utility obtained approval to offer a CISR rate to help attract large customers with suitable load profiles. These sought-after customers can create a win/win situation: a large, stable electrical load helps the utility keep rates affordable for smaller users.
Bottom Line
There are many regions of the country that have had tremendous success with public/private partnerships to encourage the development of data centers.
Data centers developed in rural areas burdened with agricultural tax exemptions can help grow a commercial real estate tax base to boost local budgets. These data centers do not require significant investments in local schools, roads, and parks in return for the entitlement to develop. They also have state-of-the-art security and fire suppression systems which help limit the impact on fire and police resources.
The Florida legislature and governor have now taken a crucial step to make the Sunshine State more competitive. However, it isn’t a silver bullet. Notably, the Florida legislature removed economic incentives from Enterprise Florida, which only received $16 million in funding in this latest budget.
Florida still has challenges to overcome in attracting massive data centers that could benefit from the new legislation, including frequent lightning storms, hurricanes, and a harsh climate that requires robust HVAC systems to deal with sensible (heat) and latent (humidity) loads. There remains plenty of heavy lifting for the Florida peninsula to compete with greater Atlanta, North Carolina, and others to land large regional data center requirements.
Here is a link to the relevant Florida legislation. The data center sales and use tax exemption language begins on line 1299.
5:21p
British Air Data Center Outage Feeds Outrage at Airline Cost Cuts
Richard Weiss and Rebecca Penty (Bloomberg) — British Airways’ epic meltdown over a busy holiday weekend further fanned public outrage at an industry infamous for its focus on cost cuts over customer service, leaving the U.K. carrier scrambling to explain how a local computer outage could lead to thousands of stranded passengers.
Amid United Airlines’ dragging fiasco, mass cancellations at Delta Air Lines, and U.S. concerns about terrorists using laptops to down planes, the global aviation industry hardly needed another blow. But then on Saturday morning, a brief power surge knocked out British Airways’ communications systems, grounding the carrier’s entire London operations, leading to days of chaos and putting the new chief executive officer in the hot seat. A full flight schedule is due to resume on Tuesday.
Note from DCK: The outage was caused by “an exceptional power surge” at one of British Airways’ UK data centers, a spokeswoman for the airline told IDG. The surge “caused physical damage to our infrastructure and as a result many of our hugely complex operational IT systems failed.” The airline does have a backup system, she added, “but on this occasion it failed.”
With nearly 600 flights canceled and luggage left undelivered, images and horror stories quickly coursed through social media. Damages for rebooking and compensating customers are estimated at about 100 million euros ($112 million), or about 3 percent of the annual operating profit of parent IAG SA. The shares fell the most in almost seven months as trading resumed in London after Monday’s holiday.
The damage to the airline’s image could be even greater, as British Airways appears to have no idea how it all happened. “We’re absolutely committed to finding out the root causes of this particular event,” a grim-looking Alex Cruz, the airline’s CEO, said in an interview with Sky Television. He did, however, rule out a cyber attack, which suggests the faults are homegrown.
See also: How Amazon Prevents Data Center Outages Like Delta’s $150M Meltdown
The airline’s communications systems are now working again and British Airways will run full flight schedules at London’s Heathrow and Gatwick airports on May 30, it said in an emailed statement.
“It is tempting but increasingly questionable to view this as a one-off,” said Damian Brewer, an analyst with RBC Capital Markets. “Coming after a spate of other issues, the bad PR and potential reputational aftermath will likely hit future revenues beyond the likely material impact.”
 People queue to speak to British Airways representatives at Heathrow Airport on May 28, 2017 in London. (Photo by Jack Taylor/Getty Images)
While about 95 percent of flights are running, thousands of customers are still being re-routed. More than two-thirds of the 75,000 affected passengers were scheduled to reach their final destination by the end of Monday, Cruz said in a YouTube message. Analysts estimated the number of people due compensation at closer to 170,000.
The crisis puts the spotlight on Cruz, who took charge a year ago after running IAG’s Spanish budget unit Vueling for more than nine years.
While Cruz helped Vueling expand into Spain’s second-biggest airline, the airline suffered repeated flight cancellations and delays last summer due to a lack of available aircraft and crews. Vueling was the only airline in IAG’s portfolio where profit declined last year.
Technology Outsourcing
His four-year cost-cutting program at BA includes eliminating almost 700 back-office jobs, outsourcing some technology operations and switching to paid-for food on short-haul flights. The excessive focus on costs is to blame for the latest mess, according to the GMB union.
“They started on this journey to outsource and offshore this work and there have been a number of incidents now that have culminated in what has taken place this weekend,” Mick Rix, national officer for civil aviation at GMB, said in a phone interview on Monday.
For passengers, workers, and tabloids that have criticized the industry’s ruthless cost reductions for years, the disruptions seemed to prove that airlines have gone too far. The Daily Mail blamed Cruz and chastised his methods at Vueling, where he outlawed color printing, banned paper towels from washrooms, and offered visitors to business meetings only tap water. Critics on social media, meanwhile, questioned whether British Airways deserved to call itself the U.K.’s flag carrier after the perpetual cutbacks.
Cruz and the airline were keen to distance themselves from any notion that penny pinching led to the meltdown, saying instead that the outage was caused by damage at U.K. data centers, where work wasn’t outsourced.
“Two days later, they are not all back up and running completely, but we are making good progress in our recovery,” the airline said in a statement. “We do have a back-up system, but on this occasion it failed.”
It’s not the first problem involving British Airways. Last September, a computer network failure brought down the carrier’s check-in system, causing worldwide service delays, while earlier this month, London Gatwick airport reported problems with its baggage-sorting system.
‘It’s a Tragedy’
The latest gaffe hits British Airways as it faces increasing competition on lucrative transatlantic routes. Low-cost competitor Norwegian Air Shuttle ASA is ramping up service to the U.S., while Ryanair Holdings Plc is increasing feeder operations to connect with long-haul flights to destinations such as New York and Havana.
Customers who were sent away from Heathrow and Gatwick on Saturday were told to find hotels on their own for reimbursement later by British Airways. Payments will include 200 pounds ($260) per night for lodging, 50 pounds round trip between the airport and the hotel, and as much as 25 pounds for refreshments, according to leaflets from the company. Operations were gradually recovering by Monday, when just 50 flights to and from its main London Heathrow hub were canceled and operations were back on schedule at Gatwick.
“It’s a tragedy,” Cruz said. “We do apologize profusely for the hardship that these customers of ours have had to go through.”
IAG shares fell 4.1 percent to 589 pence at 8:15 a.m. in London. The intraday decline of 4.5 percent was the biggest since Nov. 9.
5:35p
A New Internal Threat to Your Environment? ‘Checkbox Security’
Mike Foley is a Senior Technical Marketing Manager at VMware.
Securing your virtual environment is a constantly evolving challenge with changing variables. Checkbox security, a strategy that focuses on compliance, does not make your environment secure. It is a strategy of complacency leading to eventual failure. A comprehensive risk strategy tackles compliance and security and can be achieved through governance automation.
Some may argue that if your environment is fully compliant with a stringent regulatory standard (PCI for example, as this is a particularly wide-reaching compliance standard), then your environment is “secure”. The assumption is that meeting a standard means that you have shored up any security vulnerabilities. This can be a fatal assumption. Compliance with a particular standard, be it FISMA, HIPAA, SOX or the aforementioned PCI, simply means that you are in alignment with a set of externally defined criteria with the ultimate goal of protecting sensitive customer or user data.
While there is an extra level of complexity that must be taken into account with dynamic virtual infrastructures, there are tools that can ensure compliance even in a virtual environment. Because compliance mandates are standardized and well defined, a “checkbox” approach to compliance does make sense.
That being said, while there are tools that provide the appropriate checks and audits needed to verify and maintain compliance, they often do not address actual security challenges or vulnerabilities. Compliance provides safeguards for specific types of security risks such as accessing credit card or health record data. Securing your virtual environment is a more fluid task that requires vigilance against both external and internal threats such as breaches, misconfiguration, access control changes, authentication and more.
A checkbox security approach breaks down in this scenario – there are simply too many variables outside the scope of compliance-focused toolsets to ensure the security of your environment. A checkbox security approach that relies on your compliance policies is, simply put, vulnerable. Being compliant does not mean your environment is secure; and conversely, just because your environment is secure does not mean it’s compliant.
Governance automation can go a long way in satisfying compliance requirements while also enforcing security policies to protect against internal and external threats. In a virtual or cloud-based (public, private, or hybrid) environment with constantly shifting and distributed resources and possibly shared services, automated governance tooling is indispensable for implementing a comprehensive risk strategy at scale, no matter the size of your organization.
A good governance solution will ensure that security tasks, such as identity and access management for personnel, are executed. Other tasks can be automated, including provisioning, authentication, and authorization, as well as more organization-specific, granular security processes. Governance automation can not only deliver key elements of good data stewardship, such as secure access, encryption, and loss prevention, but also recognize vulnerabilities, perform remediation, and ensure audit readiness. These benefits do not even take into account the additional advantages of a virtual or cloud environment, such as overall cost controls and the increased speed of business processes.
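To make the distinction concrete, here is a minimal sketch of what continuous governance automation might look like, as opposed to a one-time checkbox audit: declarative policies checked against live configuration, with drift remediated automatically. Every name in it (Policy, enforce, the example settings) is hypothetical and does not correspond to any real VMware or third-party API.

```python
# Minimal sketch of governance automation: declarative security
# policies checked continuously against configuration, with automatic
# remediation. All names here are hypothetical, not a real vendor API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Policy:
    name: str
    check: Callable[[dict], bool]      # True if the config is compliant
    remediate: Callable[[dict], None]  # fix the config in place

policies = [
    Policy(
        name="ssh-disabled-on-hosts",
        check=lambda cfg: not cfg.get("ssh_enabled", False),
        remediate=lambda cfg: cfg.update(ssh_enabled=False),
    ),
    Policy(
        name="disk-encryption-required",
        check=lambda cfg: cfg.get("disk_encrypted", False),
        remediate=lambda cfg: cfg.update(disk_encrypted=True),
    ),
]

def enforce(config: dict) -> list:
    """Audit a config against all policies; remediate and report drift."""
    findings = []
    for policy in policies:
        if not policy.check(config):
            policy.remediate(config)
            findings.append(policy.name)
    return findings

vm_config = {"ssh_enabled": True, "disk_encrypted": False}
print(enforce(vm_config))  # ['ssh-disabled-on-hosts', 'disk-encryption-required']
```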
It’s an all too common downside of the “checkbox security” approach that you don’t actually get the security you’re looking for. This problem is exacerbated in a virtual or cloud environment, where flexibility and scale open up a whole Pandora’s box of additional checks and processes that can overwhelm a limited toolset – especially if data is compromised or a vulnerability is exploited.
Governance automation provides controls for regulatory compliance and data protection while incorporating security policies to address vulnerabilities, protecting enterprises from both internal and external threats and eliminating the inadequacies of checkbox security.
Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
7:07p
Lincoln Rackhouse Buys into Hot Atlanta Data Center Market
Lincoln Rackhouse, the data center division of the commercial real estate firm Lincoln Property Company, has agreed to acquire a property in the Atlanta data center market.
The seller is a corporation that has been using the 5MW facility to host its IT systems. The corporation, whose name Lincoln did not reveal, is planning to lease a portion of the facility after the deal closes, with Lincoln planning to lease the remaining capacity to others. The companies did not disclose the acquisition price in the announcement issued Tuesday.
The Atlanta data center market is heating up, with existing players expanding and new ones entering. The demand is driven largely by corporate data center users moving out of on-premises facilities and into colocation data centers, according to a market report by the commercial real estate firm Jones Lang LaSalle. Those users come primarily from the banking and financial services, healthcare, telecom, and technology industries.
Digital Realty Trust is expanding its data center capacity in Atlanta; wholesale data center player Ascent recently entered the market via acquisition; Las Vegas-based Switch chose Atlanta as the place for its southeastern-US data center campus; and a report has surfaced saying CyrusOne may be planning a $500 million build in the market.
The biggest player in Atlanta is QTS, which at the end of last year was responsible for almost 75 percent of all wholesale data center capacity in the metro, according to CBRE, another commercial real estate company.
Read more: Atlanta Data Center Market Firing on All Cylinders
Lincoln is also a new entrant to the Atlanta data center market. Its other facilities are in the Charlotte, Boston, and Northern Virginia markets, as well as in Richmond, Virginia, and New York State.
9:30p
Big Data Experts in Big Demand
Data scientist, named the best job in America for 2016 by the job site Glassdoor, is a mashup of traditional careers, from data analysis, economics, and statistics to computer science and others.
Although tech companies Microsoft, Facebook, and IBM employ the most data scientists (227, 132, and 98, respectively), according to a report by RJMetrics, these professionals are also in demand in non-tech sectors. Kohl’s, AAA, and Publishers Clearing House are all searching for at least one on Glassdoor.
It’s no surprise that most people who choose this career begin by studying science, technology, engineering, and math (STEM), subjects at the very core of innovation and emerging high-tech fields. The positive contribution these subjects make to the US economy and to the nation’s competitiveness in the global high-tech marketplace is undeniable.
That’s why so many students are enrolling in universities that offer STEM disciplines, such as big data analytics degrees. The study of STEM subjects is tied to the fastest-growing industries, many of which lead to promising careers. These studies and the jobs that follow remain male-dominated; however, a push is underway in the IT industry to encourage women to pursue more technical studies.
Many careers of the future will rely heavily on big data analytics experts, who analyze and report on data that ultimately informs decision-making for businesses and organizations across various industries and sectors. According to the Computer Business Review, the big data market is predicted to grow to $46.34 billion by 2018 as more and more businesses adopt new technologies and a digital mindset.
Big Data Analytics Consultants
Big data analytics consultants can identify patterns and trends from the incredible amount of data that is created and stored on a daily basis. This emerging field allows businesses and organizations to interpret data and apply it directly to identifying business intelligence solutions. Consultants in big data analytics specialize in making data easier to understand and digest, allowing companies to react faster to trends and patterns that analysts identify.
Computer Systems Analysts
A study cited by Forbes showed that demand for computer systems analysts with big data expertise increased by 89.9 percent over the preceding 12 months. These individuals are expected to possess skills in programming languages and tools such as Python, Linux, and SQL. Computer systems analysts primarily leverage their knowledge of IT and business to improve an organization’s computer systems and processes, helping networks and the overall computer system run more effectively and efficiently.
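For a flavor of the Python-and-SQL work such analysts are expected to do, here is a small, self-contained example that loads operational records into an in-memory database and summarizes them. The table, columns, and data are all hypothetical.

```python
# Illustrative only: the kind of Python + SQL task a systems analyst
# might perform, i.e. loading event data and summarizing it. The
# schema and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (system TEXT, runtime_sec REAL)")
conn.executemany(
    "INSERT INTO jobs VALUES (?, ?)",
    [("billing", 12.0), ("billing", 18.5), ("reporting", 45.2)],
)

# Average runtime per system, slowest first.
for system, avg in conn.execute(
    "SELECT system, AVG(runtime_sec) FROM jobs "
    "GROUP BY system ORDER BY 2 DESC"
):
    print(f"{system}: {avg:.1f}s")
```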
Metrics and Analytics Specialist
Individuals in a metrics and analytics specialist role understand and document the organization’s requirements by analyzing and interpreting data from external sources. They also identify data required for extraction and work with other departments, such as web development and IT, to develop strategies for optimizing results, building data models, and compiling reports. Individuals in this role assist with research, development, monitoring, and reporting. Several industries hire metrics and analytics specialists, including B2B, B2C, healthcare, manufacturing, tourism, technology, and finance.
Solutions Architect
The definition of a solutions architect is still evolving. Individuals in this role develop solutions that fit within an organization’s structure in terms of information architecture, integration, and high-level business solutions. As companies adopt newer technologies and processes, a solutions architect analyzes the variables to ensure that each initiative and project stays within scope. Solutions architects address concerns and roadblocks within the organization’s high-level projects by creating conceptual models, formulas, and formal specifications.
Analytics Associate
For many companies, analytics is a key competitive resource. Companies across nearly every industry are collecting more information than ever before, and analytics associates are charged with identifying patterns and trends in that data. Analytics associates analyze data and provide practical insights that allow organizations to make strategic decisions and drive results. Today’s data goes beyond numbers; it is multifaceted and dynamic, blending both insight and technology.
According to Intel’s Peer Research Big Data Analytics survey, organizations and executives concluded that big data analytics is one of the top priorities for businesses. Big data is pervasive in almost every aspect of daily life and a primary reason why many companies hire graduates with a big data analytics degree.
As STEM fields advance, the knowledge and skills needed to contribute to this growth are increasingly important, illustrating why so many colleges–and companies in need of skilled workers–encourage students to pursue STEM degrees.
Those who graduate from college with STEM-based degrees often command six-figure incomes, with offers coming from multiple companies. And if you feel like you’re in a dead-end job, in or out of IT, and have the knowledge and education to fill a data scientist position, you’ll likely have the same success or better than those coming straight out of a university.