Data Center Knowledge | News and analysis for the data center industry
Monday, October 12th, 2015
12:00p
Growth Continues in Secondary North American Data Center Markets

While top data center markets like New York, Silicon Valley, and Dallas get most of the attention, a lot of growth is taking place in markets considered secondary. Markets like Seattle, Portland, Phoenix, and, more recently, Reno, Nevada, are seeing a lot of multi-tenant data center construction and take-up.
There are market-specific characteristics at play, but core demand drivers are the same across the board, according to a recent report on the North American data center market by Jones Lang LaSalle, a commercial real estate firm. Companies want to focus on their core business instead of spending resources on data center management. They want to take advantage of new data center technologies, eliminate overhead, and increase IT operations efficiency.
Activity Picking Up in Pacific Northwest
One particularly active tier-two region this year has been the Pacific Northwest. From Seattle to Portland, including Oregon’s high desert plain, the region has cumulatively absorbed tens of megawatts of multi-tenant data center capacity year to date, according to JLL.
Notable data center deals closed this year include a lease of a 200,000-square-foot building by colocation provider ViaWest from developer Majestic Realty in Hillsboro, Oregon, for an 18 MW data center, according to JLL’s report; Server Farm Realty’s 8 MW deal with CenturyLink in Moses Lake, Washington; and a 1 MW Costco lease at a Sabey data center in Washington state.
Whether the different sub-markets of the region should be treated as a single “mega-market” is controversial in data center real estate circles, but even if taken individually, “all three of those markets are seeing activity in different ways,” Bo Bond, managing director at JLL and one of the report’s authors, said.
Seattle has always been and continues to be an important West Coast market, and there continues to be a lot of demand there, Bond said. Central Washington is also an established market, while Portland is a newer, emerging market, he said.
Reno – a Brand New Data Center Market
Another even newer emerging data center market is Reno, Nevada, where Apple’s data center was followed by a massive build by Las Vegas-based data center provider Switch.
The company is building what it says will be the world’s largest data center by space, power, and cooling capacity. eBay has signed as the anchor tenant.
Managed services provider Rackspace has been exploring Reno as a potential location of its next data center, and Tesla Motors, which is building a battery plant near the Switch site, has included a data center in its construction plans.
Reno has never been known as an active data center market before, but Nevada has a lot to offer to data center operators, including robust internet infrastructure, a reliable electrical grid, a business-friendly political environment, and proximity to California.
Whether Reno’s data center market will continue to grow remains to be seen, Bond warned. It’s a market to watch, but it’s also a market that could, potentially, hit its peak quickly.
One example of a market where this happened is San Antonio, Texas, he said. San Antonio was in vogue several years ago, after several big corporations moved in, but the market eventually slowed down and hasn’t picked up since.
Will Houston’s Healthcare Industry Embrace Colocation?
Another Texas market, Houston, has been growing at a healthy clip this year, according to JLL. Demand there is driven primarily by oil and gas companies, many of which have corporate offices in Houston. Energy companies are leveraging technology in drilling and exploration efforts, which drives take-up of data center capacity in the market.
But Houston also has a big community of healthcare organizations, and Bond expects this industry vertical to start taking multi-tenant data center space much more actively than it has in the past. The healthcare industry is one of the last holdouts, continuing for the most part to operate data centers on hospital campuses and in medical centers. “But it’s going to break sooner or later,” Bond said.
Other notable demand drivers in Houston are banking and financial services, telecommunications, insurance, retail, and e-commerce, according to the JLL report.
Minneapolis-St. Paul Sees Glut of Supply
Also notable is Minneapolis-St. Paul, where there is more multi-tenant data center space available today than ever. New companies recently entered the market, and existing providers have expanded, resulting in a market with about 250,000 square feet of vacant data center space, according to JLL.
The good news for Minneapolis-St. Paul is that demand for data center capacity there is growing quickly. “Absorption has steadily increased with deals ranging from single racks to 1 MW during the last four quarters,” the report’s authors wrote.
JLL expects the region to continue being a tenant’s market throughout 2016, with downward pressure on pricing until absorption rates catch up with supply.
Consolidation Changes Market Shape
Real estate dynamics aside, one of the most important trends shaping the North American multi-tenant data center market this year has been consolidation. Some of the biggest data center providers in the country have been expanding their geographic reach as well as the set of services they offer.
Companies like QTS, CyrusOne, and Digital Realty have been diversifying their product portfolios, often via acquisition. Digital Realty made a big bet on retail colocation and interconnection services with its $1.9B acquisition of Telx, which closed last week; CyrusOne entered the New York market with its acquisition of Cervalis; and QTS bought Carpathia Hosting, gaining a massive managed hosting service and more than doubling the size of its data center portfolio.
“We expect industry consolidation will change the market by shrinking the overall number of providers and positioning those left to offer greater geographic coverage and service options,” JLL researchers wrote.

3:00p
Quanta Pitches OCP-Inspired Hyper Converged Infrastructure

Quanta Cloud Technology, a subsidiary of the Taiwanese hardware manufacturer Quanta Computer, has introduced a hyperconverged infrastructure offering that combines its hardware, including Open Compute servers, storage, networking switches, and racks, with virtualization and cloud software options by VMware, Canonical, Red Hat, Microsoft, and Mirantis, among others.
After years of building hardware for both major IT vendors and web-scale data center operators, such as Facebook, Quanta started its own off-the-shelf data center hardware business in 2012, using designs based on specs available through the Open Compute Project, Facebook’s open source hardware design initiative.
The new converged infrastructure solutions are configurable based on customer requirements, Mike Yang, QCT general manager, said in an interview. Quanta is leveraging hardware capabilities gained through years of supplying web-scale data center customers to provide custom-configured infrastructure with software the customer chooses.
The solutions are targeted primarily at customers who want to set up hybrid cloud infrastructure, using OpenStack distributions by Red Hat, Mirantis, or Canonical, or proprietary cloud technologies by Microsoft or VMware.
The converged infrastructure will come as a rack, with hardware in it configured to support the software stack, Yang said.
QCT has opened a new Cloud Solution Center in San Jose, California, to test and showcase its converged infrastructure solutions. This is the company’s second solution center. It opened the first one in Taiwan in April.
In addition to the two solution centers, QCT has set up a global network of integration centers to be able to deliver systems to customers around the world quickly. Those integration centers are in the US, Europe, and Asia Pacific.

4:39p
Confirming the Rumor: Dell Buying EMC 
This post originally appeared at The Var Guy
As far as rumors go, this one was a whopper: Dell was in talks to buy EMC. And, as it turns out, it was absolutely true: Dell and EMC have signed a definitive agreement under which Dell will acquire EMC. The deal is valued at $67 billion.
Under the terms of the deal, EMC shareholders will receive $24.05 per share in cash in addition to tracking stock linked to a portion of EMC’s economic interest in the VMware business. Dell plans to maintain VMware as a publicly traded company, which could mean it’s leaning on VMware’s success to fund the EMC deal.
According to the release announcing the deal, the combination of Dell and EMC creates “the world’s largest privately controlled, integrated technology company.” For sure, adding EMC technology propels Dell into the storage stratosphere, and combined with Dell’s servers, storage, virtualization, and PCs, the company’s portfolio becomes a force to be reckoned with.
The deal also better positions Dell to be a major player in high-growth technology areas including hybrid cloud, software-defined networking, converged infrastructure, mobile and security.
What this means for VMware ultimately remains to be seen. Should Dell stick with its plan to run VMware as a publicly traded company, VMware could become a cash cow for Dell. Dell also could decide to sell the company, which has a market value of about $33 billion, according to the New York Times, or roughly half the size of the Dell-EMC deal itself. Currently, EMC has an 81 percent stake in VMware.
What the deal means for Dell and EMC solution providers seems obvious at first glance—better access to a complete technology portfolio and a more solid ability to sell into emerging markets. What it means to VMware partners, however, remains to be seen. You can bet we’ll be keeping our eye on the implications of the deal.
This first ran at http://thevarguy.com/information-technology-merger-and-acquistion-news/101215/confirming-rumor-dell-buying-emc

5:00p
Need for Speed: Top Five MySQL Performance Tuning Tips

Janis Griffin is a Database Performance Evangelist for SolarWinds.
With the added complexity of growing data volumes and ever-changing workloads, database performance tuning is now necessary to maximize the use of system resources and perform work as efficiently and effectively as possible. However, for many database administrators (DBAs), performance tuning is easier said than done.
Let’s face it, tuning is difficult for a number of reasons. For one thing, it requires significant expertise to read and understand an execution plan and to write a good SQL statement when a query needs to be updated or changed. On top of that, the nature of tuning demands a lot of time and effort. There will always be a large volume of SQL statements to sort through, which can create uncertainty about which specific statement needs tuning; and because every statement is different, so too is the tuning approach.
However, as data volumes grow and IT becomes increasingly complex, it has become a necessity for DBAs to tune databases properly to save themselves and their organizations time and energy. Performance tuning can help DBAs quickly identify bottlenecks, target inefficient operations through review of query execution plans, and eliminate any guessing games.
Regardless of the complexity or a DBA’s skill level, the following performance tuning tips will serve as a step-by-step guide to solving MySQL performance problems and help alleviate the pain points that often accompany performance tuning.
Gather Baseline Metrics
Effective data collection and analysis is essential for identifying and correcting performance problems. Before performance tuning begins, it’s important to set expectations for how long the process will take, as well as to define an acceptable response time for the query, whether it be 10 seconds, 10 minutes, or one hour. This timeframe should include gathering all of your baseline metrics. From there, it’s critical to collect wait and thread states, such as system lock, sending data, calculating statistics, and writing to the network.
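For a concrete starting point, the statements below show one minimal way to capture both kinds of baseline data in stock MySQL (the Performance Schema query assumes MySQL 5.6 or later, and the 10-row limit is an arbitrary choice):

    -- Snapshot of current thread states; rerun periodically to build a baseline:
    SHOW FULL PROCESSLIST;

    -- Top wait events accumulated since server start (Performance Schema, 5.6+):
    SELECT event_name, count_star, sum_timer_wait
    FROM performance_schema.events_waits_summary_global_by_event_name
    ORDER BY sum_timer_wait DESC
    LIMIT 10;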
Develop an Execution Plan
Developing an execution plan is incredibly important as you work to create a roadmap for query performance. Luckily, MySQL offers many ways to generate an execution plan and simple ways to navigate it. For example, to gain visual or tabular views into the plan, use EXPLAIN, EXPLAIN EXTENDED, Optimizer Trace, or MySQL Workbench. These plans list each step from top to bottom: the select type, table names, possible keys to target, key length, references, and the number of rows to read. The “Extra” column also gives you more information about how the data will be filtered and accessed.
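As a sketch of what this looks like in practice, the example below runs EXPLAIN EXTENDED against a hypothetical orders/customers join (the table and column names are invented for illustration); SHOW WARNINGS then prints the query as the optimizer rewrote it:

    EXPLAIN EXTENDED
    SELECT o.id, o.total
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    WHERE c.region = 'west';

    -- Prints the optimizer's rewritten form of the statement above:
    SHOW WARNINGS;

Reading the plan from top to bottom, a full table scan shows up as type ALL with an empty possible_keys column, which is usually the first thing to investigate.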
Review the Table and Index
Now that the metrics have been gathered and the execution plan is in place, it’s time to review the tables and indexes in the query, as these details will ultimately inform your tuning strategy. To start, it’s important to know where each table resides, to review its keys and constraints, and to understand how the tables are related. Another area to focus on is the size and makeup of the columns, especially those in the WHERE clause. A little trick DBAs can use to get the size of the tables is the mysqlshow utility with the --status option. The SHOW INDEX FROM statement is also helpful for checking the indexes and their cardinality, as this will help drive the execution plan. Notably, identify whether the indexes are multi-column, and what order the columns fall within each index.
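Using the same hypothetical orders table and a placeholder database name, those checks look roughly like this:

    -- From the shell: size, row count, and status of every table in shop_db:
    -- mysqlshow --status shop_db

    -- Index columns, their position within each index, and cardinality:
    SHOW INDEX FROM orders;

    -- Keys, constraints, and column definitions in a single view:
    SHOW CREATE TABLE orders;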
Consider SQL Diagramming
After gathering and reviewing all of this information, it’s time to finally start tuning. Often, there are so many possible execution paths to resolve a poorly performing query that the optimizer cannot examine them all. To circumvent this, a useful technique is SQL diagramming, which provides a mathematical view of the issue to help the tuner find a better execution path than the optimizer does. SQL diagramming can also be applied during tuning to help expose bugs within a complex query. Many times it’s hard to understand why the optimizer is doing what it’s doing, but SQL diagramming creates a clearer path to the issue, which can save businesses from costly mistakes.
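The article leaves the diagramming arithmetic implicit, but one common version of the technique ranks tables by filter ratio: the fraction of rows each WHERE predicate keeps, with the smallest ratio marking the best driving table. A rough sketch against the hypothetical tables above:

    -- Filter ratio per table: rows kept by the predicate / total rows.
    -- The table with the smaller ratio is usually the better driving table.
    SELECT
      (SELECT COUNT(*) FROM customers WHERE region = 'west') /
      (SELECT COUNT(*) FROM customers) AS customers_filter_ratio,
      (SELECT COUNT(*) FROM orders WHERE total > 1000) /
      (SELECT COUNT(*) FROM orders) AS orders_filter_ratio;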
Monitor
Monitoring can easily be forgotten, but it’s an integral step in ensuring the problem within the database is resolved – and stays resolved. After tuning, it’s important to remember to monitor the improvements made. To do this, take new metric measurements and compare them to the initial readings to prove that tuning made a difference. As part of a continuous monitoring process, it’s also critical to watch for the next tuning opportunity, as there’s always room for improvement.
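One way to take those before-and-after measurements without touching the application is the statement digest summary in the Performance Schema (assuming MySQL 5.6 or later); snapshot it before tuning, again afterward, and compare the rows for the tuned query:

    -- Slowest statements by average latency; timer columns are in picoseconds:
    SELECT digest_text,
           count_star AS executions,
           avg_timer_wait / 1e12 AS avg_seconds
    FROM performance_schema.events_statements_summary_by_digest
    ORDER BY avg_timer_wait DESC
    LIMIT 10;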
As with much in IT, database performance tuning is not without its challenges. However, tuning proves to be worthwhile when DBAs are tasked with managing queries to ensure business success on top of performing their daily tasks. Tuning can also give businesses more bang for the buck than simply throwing more hardware at the issue. And since it’s not unusual for a DBA responsible for upwards of 50 databases to handle SQL tuning as well, these tips should help DBAs find the best and most efficient plan of action. Remember: tuning is an iterative process. As data grows and workloads change, there will always be new tuning opportunities.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

7:08p
Study: Sweden Should Lower Taxes for Data Centers

Authors of a government-commissioned study in Sweden have recommended that the government lower the tax burden on data center operators the same way it gives tax breaks to other industries that consume a lot of electricity and compete globally.
Already home to massive data centers by Facebook, data center provider Hydro66, and the bitcoin mining specialist KnC Miner, Sweden boasts low energy rates, a cool climate, and robust network infrastructure – all crucial factors in data center site selection. Tax rates are another big factor companies consider when deciding on a data center location.
Governments have been using data center tax breaks as a way to attract the big high-tech construction projects to rural areas in hopes of boosting local economies. A recent analysis by the Associated Press found that state governments around the US have committed to about $1.5 billion in tax breaks for data centers over the past decade.
Sweden has been providing lower tax rates to industries that are both energy-intensive and exposed to international competition. The rationale is that some industries, such as manufacturing, find it difficult to compete with foreign companies if they’re taxed too heavily at home.
The study found that data centers, a young industry, are worthy of such “protection.”
“This industry is only a couple of decades old, but is growing fast both in Sweden and internationally, due to the increasing use of the internet and of online services. The result is an increased need for data center capacity. The industry is assessed as still having great development potential,” the report’s authors wrote.
Sweden is ranked third on the latest Networked Readiness Index 2015 that’s part of The Global Information Technology Report commissioned by the World Economic Forum. The index rates countries’ potential for ICT industry growth based on regulatory environment, skills, infrastructure, potential for ICT’s economic impact, and other factors.
Singapore ranked first, followed by Finland. The US ranked seventh, and the UK ranked eighth.

8:58p
Digital Realty Closes $1.9B Telx Acquisition

While all eyes are on the biggest tech acquisition in history – Dell’s $67 billion takeover of EMC – the data center services industry recently saw the close of one of the biggest M&A deals in its own history, albeit one much smaller than the expected merger between the two IT giants.
Digital Realty Trust, one of the world’s largest data center providers and the world’s largest provider of wholesale data center space, has completed its $1.89 billion acquisition of Telx, just about doubling the size of its retail colocation business.
The acquisition is symbolic of the changes the business of leasing data center capacity is going through. Digital was an industry pioneer that perfected the art and science of being a type of data center landlord whose involvement with the customer’s infrastructure stops at the power and cooling infrastructure level, and sometimes even lower “down the stack,” where the company would simply provide a building shell and access to utility power feeds, leaving the rest of the build-out to the tenant.
But data center tenants today want more than that out of their provider, and Digital is responding as it competes against smaller players, such as QTS Realty Trust and CyrusOne, and its more seasoned rivals, Equinix being the biggest of them all.
Telx gives Digital a substantial retail colocation and interconnection business – the types of services that have traditionally taken the back seat to the San Francisco-based real estate investment trust’s wholesale business.
Companies in the data center services space are diversifying their product portfolios by developing new capabilities internally, partnering, or making acquisitions. Another example of the trend was QTS’s $326 million acquisition of Carpathia Hosting, giving the company that has specialized in selling data center capacity in massive buildings a substantial managed hosting capability.
The deals show that big customers that take data center space in bulk increasingly need other services and that it’s difficult to make it today as a data center provider with a pure “bulk space play,” Bo Bond, managing director at the commercial real estate brokerage Jones Lang LaSalle, said.
“The combination of Digital Realty’s and Telx’s portfolios of data centers and capabilities gives customers the platform they need to grow and compete in a data-driven world,” Digital CEO William Stein said in a statement.