Data Center Knowledge | News and analysis for the data center industry - Industr's Journal
 

Tuesday, August 1st, 2017

    12:00p
    DCK Investor Edge: JLL Says Cloud Data Center Hunger Has Subsided

    Real estate brokerage Jones Lang LaSalle just published its North America data center market report for the first half of the year, highlighting new trends impacting the data center industry, such as security, high-performance computing, enterprise hybrid cloud, big data, and AI.

    But the report also found that absorption of data center space available for lease “returned to normal levels after record leasing in 2016.” Cloud providers, who were largely behind last year’s record leasing, are still a major force in the market this year, while data center construction across the US is heating up.

    Leasing “New Normal”

    Industry experts and investors have spent the past six months trying to figure out the “new normal” after a record leasing year in 2016, led by hyper-scale cloud service providers Microsoft and Oracle.

    The JLL report confirms what many industry observers have discussed anecdotally — hyper-scale cloud leasing has been dialed back.

    “US markets absorbed a total of 182.1MW from January through June this year, compared to 249.1MW across the same timeframe in 2016. Canadian markets on the other hand captured 34.7MW of absorption in H1 2017, nearly 10MW more than H1 2016, carried by sizeable upticks in Toronto and Montreal.”

    The cloud leasing train left the station so fast in 2016 that it ran out of steam toward the end of the year. The first half of this year saw more moderate leasing.

    Read more: Who Leased the Most Data Center Space in 2016?

    This more normal leasing pattern could reduce volatility in publicly traded data center REIT share prices. The six data center REITs’ shares ran up 50 percent during the first six months of 2016 before giving back half those gains during late summer and fall. This year data center REITs have performed well, up 23 percent, mirroring the more modest (but still strong) data center leasing during the first six months.

    Leasing Down in Northern Virginia, Silicon Valley

    While leasing in the Northern Virginia data center market is down 25 percent compared to last year, there is plenty of space under construction which will be delivered during the second half of this year.

    The San Francisco/Silicon Valley data center market has once again lived up to its reputation as being one of the “lumpiest” Tier 1 markets, down almost 95 percent year-over-year. A lack of entitled land limiting new supply makes it difficult to measure current demand after a record 58MW of critical load was leased by this time last year.

    So far, Montreal and Atlanta have doubled up on leasing success year-over-year, while new supply in Dallas has helped to jump-start leasing during the first half of 2017.

    Notably, JLL reports 41MW currently under construction in Toronto, a huge increase over prior years and nearly three times the impressive 13.9MW of space absorbed there year-to-date.

    Source: JLL 1H2017 N. America Report

    What’s Hot, What’s Not

    One prediction made earlier this year by JLL managing director Bo Bond, a Dallas market expert, was right on the money. Lease signings for block and tackle corporate requirements dried up in Dallas during the final quarter of 2016, partially due to lack of supply.

    Read more: JLL: Expect the Dallas Data Center Market to Have a “Chicago Year” in 2017

    Bond told Data Center Knowledge back in January that he expected to see “bottled up enterprise demand as a tailwind.” Here was his prediction:

    “The last quarter of 2016 saw a buzz of Fortune 1000 Enterprises touring spaces for 2017 deployments. Many of those customers are already placing purchase orders for new equipment to be placed in ‘soon to be delivered’ facilities. Expect Q1/Q2 2017 to report solid enterprise colo signings in Dallas.”

    The new supply and pent-up demand in Dallas resulted in solid enterprise leasing results. Dallas has indeed leapfrogged Chicago to become the second strongest US data center market behind Northern Virginia.

    Source: JLL 1H2017 N. America Report

    Chicago remains a hot wholesale market mainly because of hyper-scale leases. However, two-thirds of the Dallas leasing was due to more traditional corporate requirements, including enterprise hybrid IT deployments.

    The chart below shows that cloud leasing is still driving a significant amount of the absorption in several of the top markets.

    Source: JLL 1H2017 N. America Report

    However, the Pacific Northwest, Atlanta, Las Vegas/Reno, Toronto, and Phoenix joined Dallas in the camp where the majority of deals were inked with enterprise customers. A slowdown in hyper-scale signings has left some larger blocks of space available for corporate users who were shut out in some markets due to “The Great Absorption of 2016,” according to JLL.

    Another artifact of the massive cloud leasing during 2016 is a 43 percent increase in data center capacity reported to be under construction: 353MW was underway at this point last year, compared with 506MW currently.
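    The 43 percent figure above checks out against the two capacity numbers JLL reports:

```python
# Sanity-check the reported 43 percent increase in construction capacity,
# using the figures quoted from the JLL report above.
prior_mw = 353    # MW under construction at this point in 2016
current_mw = 506  # MW under construction now

increase_pct = (current_mw - prior_mw) / prior_mw * 100
print(f"Construction capacity up {increase_pct:.0f}%")  # prints "Construction capacity up 43%"
```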

    Emerging Trends

    JLL analysts interviewed three types of customers for the report to help spot IT trends: tech and big data, transaction/online payments, and online retail. Key takeaways included:

    • High Performance Computing (HPC) is becoming a more cost-effective solution than public cloud for some big data applications.
    • Heat generated by newer processors is requiring data centers designed for at least 200 watts per square foot (rendering some older data centers obsolete).
    • AI processor technologies allow companies to leverage predictive analytics to automate more tasks.
    • Recent hybrid-cloud announcements by Microsoft, Oracle, VMware, and others to support enterprise private cloud should accelerate deployments.
    • Security upgrades are requiring more rack space and must be carefully planned in conjunction with data center expansions.

    Expect More Deals

    M&A activity has skyrocketed in the first half of the year, reaching north of $13 billion, and is likely to continue. JLL believes that the publicly traded data center REITs will continue merger mania (“activity is nearly guaranteed”), as acquisitions of assets and takeover targets play out the remainder of the year.

    3:00p
    What Europe’s New Data Protection Law Means for Data Center Operators

    The new European General Data Protection Regulation goes into effect next May and applies to any company, anywhere in the world, that collects sensitive data about European customers or employees. GDPR also comes with onerous breach notification requirements and high penalties for failing to comply, and data center operators may become prime targets for regulators’ enforcement efforts once the new rules kick in.

    “Data center providers are an important piece in the GDPR compliance chain as they have ownership of the physical assets where information is stored,” said Jose Casinha, CISO at OutSystems, an enterprise software company based in Atlanta, Georgia.

    “The data center is ‘where the rubber meets the road’ for many aspects of GDPR,” said Ken Krupa, enterprise CTO at MarkLogic Corp.

    Often, it’s only the people who manage the infrastructure who really understand where all the copies of the data are, he said, especially when things like high availability, disaster recovery, and backups are taken into account.

    It’s Not Only for Data Centers in Europe

    Many providers, however, don’t know about the law, aren’t aware that it applies to them, or don’t have the time and resources to become compliant.

    For example, data centers don’t have to be located in Europe to be affected by the law, said Benjamin Wright, attorney and instructor at the SANS Institute.

    “One step a data center can take to limit risk is to get assurances from customers that they are not storing or processing EU data and that they indemnify the center from any costs or losses related to GDPR enforcement,” he said.

    GDPR Compliance as a Product Feature

    Or a data center can take the opposite approach, and make GDPR compliance a selling point, he said.

    That means that they will need to appoint a data protection officer, conduct risk assessments, and establish a track record of compliance, he said. Plus, data centers may need to work with customers to be able to identify what data was affected by a data breach within a 72-hour window.

    That won’t be easy.

    Enterprises will need to have granular control over how and where customer data resides and is accessed, said Adam Conway, VP of product management at Bracket Computing, a cloud security vendor based in Mountain View, California.

    That will require a fundamental shift in data center technology, one that will be felt for decades, he said.

    The Data Privacy Part

    And it’s not just about cybersecurity. GDPR covers a broad range of areas related to data privacy.

    For example, it requires that companies delete personal data when requested by customers or employees, said Eric Dieterich, data privacy practice leader at Focal Point Data Risk.

    “Data centers might need to provide functionality to allow their customer to perform the right to erasure for data center storage of personal data,” he said.

    More Physical Security

    Another challenge is that data centers may need to put more robust physical security in place, said Tomáš Honzák, director of security and compliance at GoodData, a San Francisco-based software company.

    “This causes a lot of pressure as changing a data center is a strategic and hard decision, but strengthening data center security practices is likely to be a costly and time-consuming exercise too,” he said.

    In fact, according to Gartner, most companies that fall under GDPR still won’t be in full compliance by the end of 2018.

    But the costs of non-compliance could be staggering. Failing to comply can lead to a fine of up to 20 million euros or 4 percent of annual global revenue, whichever is higher. That means a breach that would have cost a large company millions of dollars in fines before the law took effect could cost billions afterward.
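    The “whichever is higher” rule above is simple to express. A minimal sketch (this computes only the statutory upper bound; actual fines are set case by case and can be far lower):

```python
def max_gdpr_fine(annual_global_revenue_eur: float) -> float:
    """Upper bound of a GDPR administrative fine for the most serious
    infringements: the greater of EUR 20 million or 4 percent of
    annual global revenue."""
    return max(20_000_000.0, 0.04 * annual_global_revenue_eur)

# A company with EUR 50 billion in global revenue faces up to EUR 2 billion;
# a EUR 100 million company still faces the EUR 20 million floor.
print(max_gdpr_fine(50e9))   # prints 2000000000.0
print(max_gdpr_fine(100e6))  # prints 20000000.0
```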

    3:30p
    Alphabet Wants To Fix Clean Energy’s Storage Problem — With Salt

    Mark Bergen (Bloomberg) — Alphabet Inc.’s secretive X skunk works has another idea that could save the world. This one, code-named Malta, involves vats of salt and antifreeze.

    The research lab, which hatched Google’s driverless car almost a decade ago, is developing a system for storing renewable energy that would otherwise be wasted. It can be located almost anywhere, has the potential to last longer than lithium-ion batteries, and could compete on price with new hydroelectric plants and other existing clean energy storage methods, according to X executives and researchers.

    The previously undisclosed initiative is part of a handful of energy projects at X, which has a mixed record with audacious “moonshots” like Google Glass and drone delivery. Venture capitalists, and increasingly governments, have cut funding and support for technology and businesses built around alternatives to fossil fuels. X’s clean-energy projects have yet to become hits like its driverless cars, but the lab isn’t giving up.

    “If the moonshot factory gives up on a big, important problem like climate change, then maybe it will never get solved,” said Obi Felten, a director at X. “If we do start solving it, there are trillions and trillions of dollars in market opportunity.”

    She runs The Foundry, where a Malta team of fewer than 10 researchers is testing a stripped-down prototype. This is the part of X that tries to turn experiments in science labs into full-blown projects with emerging business models, such as its Loon internet-beaming high-altitude balloons. Malta is not yet an official X project, but it has been “de-risked” enough that the team is now looking for partners to build, operate and connect a commercial-sized prototype to the grid, Felten said. That means Alphabet may team up or compete with industrial powerhouses like Siemens AG, ABB Ltd. and General Electric Co.

    X is stepping into a market that could see about $40 billion in investment by 2024, according to Bloomberg New Energy Finance. Roughly 790 megawatts of storage capacity will be added this year, and overall capacity is expected to hit 45 gigawatts in seven years, BNEF estimates. Existing electrical grids struggle with renewable energy, a vexing problem that’s driving demand for new storage methods. Solar panels and wind farms churn out energy around midday and at night, when demand lulls. This forces utilities to discard it in favor of more predictable oil and coal plants and more controllable natural gas “peaker” plants.

    In the first half of this year, California tossed out more than 300,000 megawatt-hours of electricity produced by solar panels and wind farms because there’s no good way to store it. That’s enough to power tens of thousands of homes. About 4 percent of all wind energy from Germany was jettisoned in 2015, according to Bloomberg New Energy Finance. China throws out more than 17 percent.

    Felten is particularly excited about working with companies in China, a voracious energy consumer — and a country where almost all Google web services are banned. Before that happens, the Malta team has to turn what is now an early test prototype in a warehouse in Silicon Valley into a final product that can be manufactured and is big and reliable enough for utilities to plug it into electricity grids.

    In renderings, viewed by Bloomberg News, the system looks like a miniature power plant with four cylindrical tanks connected via pipes to a heat pump. X says it can vary in size from roughly the dimensions of a large garage to a full-scale traditional power plant, providing energy on demand to huge industrial facilities, data centers or storage for small wind farms and solar installations.

    The system mixes an established technique with newly designed components. “Think of this, at a very simple level, as a fridge and a jet,” said Julian Green, the product manager for Malta.

    Two tanks are filled with salt, and two are filled with antifreeze or a hydrocarbon liquid. The system takes in energy in the form of electricity and turns it into separate streams of hot and cold air. The hot air heats up the salt, while the cold air cools the antifreeze, a bit like a refrigerator. The jet engine part: Flip a switch and the process reverses. Hot and cold air rush toward each other, creating powerful gusts that spin a turbine and spit out electricity when the grid needs it. Salt maintains its temperature well, so the system can store energy for many hours, and even days, depending on how much you insulate the tanks.
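    The reason salt works as the storage medium is ordinary sensible heat: energy stored is mass times specific heat times temperature swing (Q = m·c·ΔT). A rough back-of-the-envelope sketch, using illustrative numbers rather than anything X has disclosed about Malta:

```python
# Rough estimate of the thermal energy a salt tank can hold.
# All figures below are assumed, illustrative values, not Malta's specs.
mass_kg = 1_000_000      # 1,000 metric tons of salt (assumed)
specific_heat = 1.5e3    # J/(kg*K), roughly typical for nitrate salts (assumed)
delta_t = 300            # temperature swing in kelvin (assumed)

energy_joules = mass_kg * specific_heat * delta_t  # Q = m * c * dT
energy_mwh = energy_joules / 3.6e9                 # 1 MWh = 3.6e9 J
print(f"{energy_mwh:.0f} MWh of heat stored")      # prints "125 MWh of heat stored"
```

    Even before round-trip losses in the turbine stage, that is why a garage-sized installation can be meaningful to a grid: the medium is cheap and the energy scales with how much of it you pile into insulated tanks.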

    Scientists have already shown this to be a plausible storage technique. Malta’s contribution was to design a system that operates at lower temperatures, so it doesn’t require specialized, expensive ceramics and steels. “The thermodynamic physics are well-known to anyone who studied it enough in college,” Green said. “The trick is doing it at the right temperatures, with cheap materials. That is super compelling.”

    X declined to share exactly how cheap its materials are. Thermal salt-based storage has the potential to be several times cheaper than lithium-ion batteries and other existing grid-scale storage technologies, said Raj Apte, Malta’s head engineer. German engineering firm Siemens is also developing storage systems using salt for its solar-thermal plants.

    But lithium-ion battery prices are falling quickly, according to Bloomberg New Energy Finance. And Malta must contend with low oil and natural gas prices, a market reality that’s wiped out several companies working on alternatives to fossil fuels. “It could potentially compete with lithium-ion,” said Bloomberg New Energy Finance analyst Yayoi Sekine. “But there are a lot of challenges that an emerging technology has to face.”

    One hurdle is convincing energy incumbents to put capital into a project with potential returns many years down the road. Alphabet has the balance sheet to inspire confidence, with $95 billion in cash and equivalents. Yet the tech giant has a recent history of retreating from or shutting experimental projects that stray from its core areas of high-power computing and software.

    Robert Laughlin, a Nobel prize-winning physicist whose research laid the foundation for Malta, is now a consultant on the project. He met X representatives at a conference a few years ago. They discussed the idea, and the lab ultimately decided to fund the project and build a small team to execute it. Laughlin has signed off on the team’s designs, and he said his theories have held up in tests with the prototype.

    Laughlin believes X is more committed than previous potential backers. He first pitched the idea as his own startup, taking it to luminary tech investors including Khosla Ventures and Peter Thiel’s Founders Fund. They passed, according to the scientist, because they didn’t want to deal with the tougher demands of a conservative energy industry that will have to buy and use the system in the end. “What we’re talking about here is engines and oil companies — big dinosaurs with very long teeth,” said Laughlin. That’s “above the pay grade of people out here.” A representative from Founders Fund declined to comment. Khosla didn’t respond to requests for comment.

    X won’t say how much it has invested so far, but it’s enough for Laughlin. “A blessing came out of the sky,” he said. “X came in and took a giant bite out of this problem.”

    4:00p
    Four Survival Tips for the Accidental DBA

    Zack Kendra is Principal Software Engineer at Blue Medora.

    Let’s be honest: most of us have inherited a database administrator gig. Either you’re a dev who deployed a new database platform to support your application and now has to troubleshoot it, or you’re a dedicated database administrator, familiar with a few platforms, who’s been asked to take on one or more of the many new data platforms.

    The number of DBA professionals across the country continues to rise. According to CNN Money, the DBA job market is expected to grow by more than 11 percent over the next 10 years. Many database professionals pursue their careers through a formal college degree or internal development on the job. Increasingly, though, many of us end up in this profession by accident.

    Gartner predicts that by 2018 more than 70 percent of new in-house applications will be developed on an open source database and 50 percent of closed-source databases will be on their way to conversion. These in-house applications, and the growing number of open-source and cloud-based databases that drive them, play a major role in increasing the number of accidental DBAs among us. Nearly anyone can deploy a database on AWS or Azure. However, few know how to fix or manage one beyond just increasing the instance size.

    Demand for more DBAs is also on the rise because digital transformation efforts are promoting cloud adoption and comfort with running production databases in the cloud. Adoption of DevOps and the proliferation of developer tools are contributing as well. Even the role of the traditional DBA is transforming as new, inexperienced DBAs join the ranks from less traditional roles, such as data science.

    Accidental DBAs can quickly find themselves overwhelmed by demands for data and performance SLAs. They might lack the resources, policies, or processes necessary to succeed. Not to mention, many accidental DBAs are left to their own devices when it comes to training. Here are a few suggestions to help new DBAs survive the transition.

    Don’t Solve Everything with Code

    Developers turned accidental DBAs can quickly find themselves in over their heads if they try to solve database issues in application code. It’s only natural for a developer to turn back to code as the most familiar path to resolving an issue, but in most cases the database engine will find a more efficient way to complete a task than you could in code, especially when the results are conditional on operations performed on the data.
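    A small illustration of the point: summing order totals per customer in SQL rather than fetching every row and looping in application code. The table and column names here are invented for the example; SQLite stands in for whatever engine you run.

```python
# Let the database engine do the work: one GROUP BY query instead of
# pulling all rows into the application and aggregating by hand.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("acme", 10.0), ("acme", 5.0), ("globex", 7.5)])

# The engine chooses the execution plan and makes a single pass.
rows = conn.execute(
    "SELECT customer, SUM(amount) FROM orders "
    "GROUP BY customer ORDER BY customer"
).fetchall()
print(rows)  # [('acme', 15.0), ('globex', 7.5)]
```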

    Get to Know Your Databasics

    For DBAs who have come from less traditional roles, or even those working with a new platform, it’s important to establish a good baseline for the KPIs unique to your specific DBMS. This can be incredibly challenging given the rapid pace of change in this space today. Beyond the basics of query execution time, each platform often has its own metrics you’ll want to keep an eye on for performance tuning. Make sure you’re tracking things like Cassandra’s keyspaces or Couchbase’s buckets and nodes.
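    One baseline KPI worth computing on almost any platform is the buffer/cache hit ratio. A minimal sketch with made-up sample counters (on PostgreSQL, for instance, the raw numbers would come from the pg_stat_database view’s blks_hit and blks_read columns):

```python
# Cache hit ratio from two raw counters. The values below are invented
# sample numbers standing in for what your DBMS statistics would report.
blks_hit = 980_000   # reads served from the buffer cache (sample value)
blks_read = 20_000   # reads that had to go to disk (sample value)

hit_ratio = blks_hit / (blks_hit + blks_read)
print(f"cache hit ratio: {hit_ratio:.1%}")  # prints "cache hit ratio: 98.0%"

# A common rule of thumb is to investigate when the ratio drops well
# below ~99% on an OLTP workload; the right threshold is workload-specific.
if hit_ratio < 0.90:
    print("warning: cache hit ratio is low; review memory settings and queries")
```

    Baselining means recording this (and your platform’s other metrics) while the system is healthy, so a deviation is recognizable later.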

    Databases with origins in open source, like PostgreSQL, can be particularly difficult to learn unless the DBA becomes an active member of the community that developed it. Documentation can sometimes be limited. So the new DBA should reach out to those colleagues with long histories of working with the database platform through multiple generations and version updates.

    Find a Way to Visualize the Data

    Managing multiple databases with different data sets can leave the DBA without visibility into key metrics and configuration. When faced with these database blind spots, many DBAs resort to building a library of homegrown scripts. While these scripts can bridge the gap between the monitoring solutions in place and the functionality stakeholders require, they are inevitably difficult to maintain and rarely provide everything that’s needed.

    Blind spots within database infrastructure wreak the most havoc during root cause analysis. If a DBA tracking down an issue’s source has limited visibility into the database infrastructure, they may be unable to discover the cause and fully resolve the issue.

    Stop Hoarding Monitoring Systems

    New database software packages usually come with their own monitoring solution. Add the multiple monitoring applications for cloud and on-premises infrastructure, and before you know it your desk is littered with passwords on sticky notes.

    Consolidating your database and infrastructure monitoring on a single platform provides visibility across your entire data layer. DBAs with a comprehensive view of their IT can pinpoint the cause of problems faster, spending less time chasing down and putting out fires and more time optimizing database performance.

    Challenges for accidental DBAs will only increase as more organizations migrate to the cloud and adopt DevOps initiatives. Professionals who find themselves working with databases in the next few years will be challenged to keep up with multiple platforms, diverse workloads, and impatient internal clients. With any luck, demand and complexity will drive up salaries and prestige enough to make the DBA the next big career sensation.

    Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    4:56p
    Microsoft’s Cloud Market Share Grew More than Anyone Else’s Last Quarter — Analysts

    Now that the top cloud providers have released their quarterly numbers, Synergy Research Group has had a chance to refine the estimates it released a couple of weeks back, and there are no surprises. The future remains rosy for the industry — at least for the top players.

    Amazon Web Services, Microsoft Azure, and Google Cloud Platform continue to grow market share, with IBM holding its own. Quarterly cloud infrastructure service revenues (including IaaS, PaaS, and hosted private cloud services) are at nearly $11 billion and continue to grow at over 40 percent per year. This means worldwide revenues from cloud and SaaS remain on track to surpass $200 billion by 2020, according to Synergy.

    John Dinsdale, chief analyst and research director at Synergy, said in a statement:

    “The increasing dominance of hyper-scale players continues to play out, with all four leading companies having cause to celebrate. While Microsoft Azure and Google Cloud Platform are doubling in size, IBM continues to dominate in hosted private cloud, and AWS is still over three times the size of its nearest competitor. Some of the numbers are actually pretty spectacular.”

    See also: JLL Says Cloud Data Center Hunger Has Subsided

    According to the Q2 figures, Microsoft Azure, with 11 percent of the total public cloud market, showed the largest growth in market share, gaining 3 percentage points over the last four quarters. AWS, which commands 34 percent of the total market, and GCP, with a 5 percent share, both saw their share grow by a single percentage point. IBM saw no growth in its total share but held steady at 8 percent, three points ahead of Google.

    The next 10 top-ranked cloud providers collectively saw their market share drop by 1 percentage point, with Alibaba and Oracle achieving the highest growth rates in that group. Alibaba is now the fourth-ranked IaaS provider worldwide, due to strong growth in China and helped by its aggressive expansion outside its home country.

    All other players in the cloud infrastructure market saw an overall drop in share of 5 percent as a group, a continuation of recent trends.

    Dinsdale:

    “The year-on-year market growth rate is nudging down as we expected in such a large market. But it remains at comfortably over 40 percent and AWS alone generated revenue growth of $1.2 billion over the last four quarters.”

    Synergy pointed out that IBM, Rackspace, and some traditional IT service providers continue to find more strength in hosted private cloud services than in public clouds.

    See also: Here’s What Wall Street Is Saying About Amazon’s Earnings

