Data Center Knowledge | News and analysis for the data center industry
Friday, July 25th, 2014
12:00p
Vantage: Changing Along With the Wholesale Market

Wholesale data center providers are increasingly offering smaller chunks of space and are getting more involved with customers and their infrastructure needs. The latest example is Vantage Data Centers, which has been working closely with customers in a variety of ways, from providing remote hands to helping them increase energy and capital efficiency. The company is increasingly working with sub-1-megawatt customers that have the potential to grow into a typical wholesale space. We caught up with Vantage COO Chris Yetman recently to talk about the changing relationship between wholesale providers and their customers.
Other examples of providers finding themselves offering more than just space and power include Digital Realty Trust, which said in May that it was seeing more demand for fuller data center solutions. Both Digital Realty and another major U.S. wholesaler, DuPont Fabros, have also been more actively promoting deals that are smaller than traditional multi-megawatt leases.
Growing into wholesale is easier than migrating from retail
There are some challenges to accepting smaller customers, but Vantage believes it’s worth it in the long run. It’s easier for someone to grow into wholesale than migrate to wholesale from rack-based retail colocation. The company is accepting smaller deals than normal if there are indications that the customer will grow into the space.
Selling 0.5-megawatt and 0.25-megawatt chunks, the company has also been offering other data center services, such as remote-hands support, which it hasn’t done in the past.
Working with smaller customers lets them take advantage of wholesale data center space without becoming “trapped by colocation,” according to Yetman. “We talk to retail colo customers who don’t want to move because they’re scared to death and they don’t want to break it.
“Colocation providers are like drug pushers in that these customers incrementally grow and eventually they realize they’re drawing down 1.5 megawatts and paying big prices per cab. What we’re saying is, you can leave sooner so it doesn’t get big and complex. We’ll give you smart hands. We’ll hook you up with good vendors. We’re good with layout, racks, and have an ecosystem that helps them.”
Pushing the envelope for better TCO
Vantage is also working with customers to push their environmental comfort zones. Operating at temperatures warmer than 70F offers several benefits that offset any potential hardware failures, Yetman said.
He believes IT is often too afraid to let hardware fail, and that IT should in fact plan for hardware failure. The emphasis should be on recovery and resilient software. By pushing the temperature envelope, companies can improve overall data center TCO. “I believe whatever faults you drive into the hardware is minimal in terms of gains of TCO,” he said.
Where it can, Vantage is increasingly leveraging outside-air economization in combination with containment to provide optimal TCO.
“We’re also big on making sure customers put containment in,” said Yetman. “We check containment as part of our rounds, because people make changes. If we see cabs with missing blanking plates, we go talk to the customer and say ‘let’s go ahead and work with you.’” Most customers install really well, but over the course of a year they keep making changes. “I almost think of it as a slow degradation.”
The company uses modeling to optimize space for customer environments. “Cloudera is a recent customer, signed back in May,” Yetman said. “We’re lighting up their space with them; it’s been a fun experience.” The Hadoop distribution vendor said Vantage offered the lowest TCO. Cloudera is said to be taking a significant amount of data center space.
Closer relationships mean providing better data
The building has to be smarter, and Vantage engineers are leveraging data and making it more transparent to the customer. “Lots of folks have data they’re willing to share in a portal,” said Yetman. “We’ll give customers access to the same info (dew point, etc.) that we gather off of our systems ourselves.” The company creates graphs based on that data and uses them to operate the data centers. “We want to have a good conversation with customers to help stretch their envelope. Helping them operate at warmer temperatures with good containment drastically lowers their TCO.”
Vantage built an overarching Java-based SCADA (supervisory control and data acquisition) system that sits like an umbrella across all of its buildings. The company also employs Ignition to calculate overall equipment effectiveness, collecting data from building management systems and putting it into a much larger MySQL database.
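The collection pattern described here – pulling readings such as dew point off building systems and landing them in a central SQL store for graphing – can be sketched roughly as follows. This is an illustrative Python sketch, not Vantage's actual code: the sensor names and schema are invented, and SQLite stands in for the much larger MySQL database so the example is self-contained.

```python
import sqlite3
from datetime import datetime, timezone

def poll_sensors():
    """Hypothetical BMS poll; a real system would query the SCADA layer."""
    return [
        {"site": "VC2", "metric": "dew_point_f", "value": 52.4},
        {"site": "VC2", "metric": "supply_temp_f", "value": 72.1},
    ]

def store_readings(conn, readings):
    """Append timestamped readings to a central table for graphing."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings "
        "(ts TEXT, site TEXT, metric TEXT, value REAL)"
    )
    ts = datetime.now(timezone.utc).isoformat()
    conn.executemany(
        "INSERT INTO readings VALUES (?, ?, ?, ?)",
        [(ts, r["site"], r["metric"], r["value"]) for r in readings],
    )
    conn.commit()

# SQLite in place of MySQL, purely to keep the sketch runnable as-is
conn = sqlite3.connect(":memory:")
store_readings(conn, poll_sensors())
rows = conn.execute("SELECT metric, value FROM readings").fetchall()
```

Exposing the same rows to customers that the operator graphs internally is the kind of transparency Yetman describes.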
Outside-air challenges in Santa Clara
The company’s VC2 facility is built to take advantage of outside air. Cold air is supplied from the top, taking advantage of the simple principle that hot air likes to rise and cold air likes to drop.
Temperature and humidity swings make use of outside air challenging in Santa Clara. “Humidity in particular is the more challenging problem,” said Yetman. If temperatures and humidity fluctuate, the company does partial economization to dry out that humidity. “It’s not operating at 80 percent humidity that is dangerous so much as a quick fluctuation,” said Yetman. “The answer is to slow it down – get to 80 but not as fast. If that weather comes in fast, we’ll start to bleed it in and eventually go back to full outside.”
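Yetman’s “get to 80 but not as fast” tactic is, at bottom, rate-limiting: cap how far indoor humidity may move toward outside conditions in any one control cycle, bleeding outside air in gradually. A toy Python sketch of that idea (the function name, step size, and setpoints are illustrative, not Vantage’s actual controls):

```python
def blend_ratio(current_rh, outside_rh, max_step=2.0):
    """Fraction of outside air to admit this cycle, capping the
    per-cycle change in relative humidity at max_step points."""
    delta = outside_rh - current_rh
    if abs(delta) <= max_step:
        return 1.0  # close enough: run full outside-air economization
    return max_step / abs(delta)  # partial economization: bleed it in

# Simulate a fast-moving weather front pushing the room toward 80% RH
rh = 60.0
steps = []
while abs(80.0 - rh) > 0.1:
    rh += blend_ratio(rh, 80.0) * (80.0 - rh)
    steps.append(round(rh, 1))
```

Each cycle moves at most two points, so the room still reaches 80 percent but never sees the quick fluctuation Yetman calls dangerous.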
“In one of our more efficient facilities, a customer saw more … failures,” said Yetman. “We re-sealed the building and worked with them to dial in the temperature and humidity swings. Once we had that on a tight band, it didn’t make a material difference in terms of failures.”

12:30p
QTS Reports Strong Sales Momentum in Upcoming Dallas Data Center

QTS Realty Trust has pre-sold about 26,000 square feet of space in its Dallas-Fort Worth data center ahead of launch, expected this quarter. Construction began in late 2013.
QTS has been on a mission to diversify its clientele beyond its sizable Atlanta position, and early success in Dallas indicates it has hit the ground running there.
Its 2013 annual SEC filing indicated plans to invest up to $277 million to add more than 300,000 square feet of data center space – most of it at the Dallas facility and in Richmond, Virginia.
QTS has a strategy of buying massive properties at a discount. In Irving, Texas (a Dallas suburb), it acquired a 700,000 square foot former semiconductor plant. In other markets, it acquired the former Sun-Times plant in Chicago and McGraw Hill’s former New Jersey data center to grow presence in those markets.
The building in Irving was previously used by several companies. It sits on a 40-acre campus in the Las Colinas area and has potential to double the size of the footprint to 1.4 million square feet. The campus is powered by an on-site 140MW dual-fed substation and has diverse fiber connections, allowing QTS to offer carrier-neutral space. It’s expected to house up to eight custom data center rooms ranging from 25,000 to 27,000 square feet.
The company offers three lines of service, dubbed the “3Cs”: custom, colocation, and cloud and managed services. All three will be available at the new facility.
The Dallas data center market continues to be a hotbed of activity, given its central location in the country, good connectivity and friendly business environment. The area is also where several major hosting providers, such as Rackspace and SoftLayer (now an IBM company), got their start.
In terms of recent activity, Equinix held a grand opening for its sixth Dallas data center last June. CyrusOne has been doing well in the area, adding both data halls and customers, including Wikimedia. zColo, the colocation division of Zayo, acquired Dallas provider CoreXchange in March. T5, in nearby Plano, has been expanding at a significant clip as well.
Despite all of this activity and new space coming online, QTS was able to pre-sell a sizable chunk of space in its facility before opening, which bodes well for both the market and the data center provider.
“We’re proud to announce the signing of the first customers at our Dallas-Fort Worth facility, particularly those that utilize more than one of our mega data centers and a combination of QTS’ 3Cs,” QTS COO Dan Bennewitz said. “This site’s central location, ample power capacity and rich networking connections provide a strong foundation for our unique platform.”

1:00p
IBM Brings Government Cloud to California

The California Department of Technology and IBM today announced CalCloud, a statewide cloud services platform that will serve more than 400 state departments and local government entities.
Instead of maintaining separate IT systems for each department, CalCloud lets government entities share a common pool of computing resources – a much more efficient and centralized approach than individual islands of cloud usage across departments.
CalCloud is designed to allow around-the-clock access to a shared pool of easily configurable resources, including compute, storage, network and disaster recovery services. It meets the state’s security standards, which are based on National Institute of Standards and Technology (NIST) guidelines for cloud-based services and FedRAMP, the federal security standard.
There is immediate access to back-end services. CalCloud is available to all municipalities and state and local government agencies on a subscription basis. The California Department of Technology is providing tools that offer access to IT services. So far, more than 20 state departments have requested IT services through CalCloud.
IBM is supplying and managing the infrastructure while the California Department of Technology manages all other aspects of the service offering. IBM is working closely with the state to transfer essential knowledge and best practices in security and systems integration to the Department of Technology.
“CalCloud is an important step towards providing faster and more cost effective IT services to California state departments and ultimately to the citizens of California,” said Marybel Batjer, secretary of the Government Operations Agency.
In addition to IBM, CalCloud partners include AT&T for network services, IT and consulting firm Alexan International and KPMG to drive CalCloud’s adoption rate and migration to the new service.
There has been a strategic imperative to unify and consolidate IT spread across disparate departments in government. CalCloud offers a consolidation point and might spur more states to provide something similar to their local departments and entities.
“Transforming how the State of California delivers technology services is not only more efficient and cost effective, it will spur innovation with cloud capabilities that are open and secure,” said Erich Clementi, senior vice president, IBM Global Technology Services. “California is setting an example for other states on how to use cloud technology to improve coordination across agencies and municipalities while reducing the barriers and duplication that can impede the delivery of government services.”

1:30p
Teradata Builds Hadoop Strength With Revelytix and Hadapt Acquisitions

Teradata has expanded its Big Data capabilities for Hadoop with acquisitions of startups Revelytix and Hadapt, bringing it both talent and intellectual property.
The Hadoop tools from these two companies will complement Teradata’s data warehouse offerings and help it embrace the Hadoop technology and community. The employees and intellectual property from these acquisitions will become part of Teradata Labs, the company’s research and development arm.
Revelytix provides Hadoop data management and data preparation tools that join integrated metadata, lineage and data wrangling in a single software solution. After developing software for the U.S. Department of Defense, the company built Loom, an application that manages Hadoop data complexity by automatically discovering datasets, generating metadata on datasets and tracking lineage of operations in Hadoop. The Loom Hadoop Data Management and Agile Data Preparation tools were made available for the Teradata Appliance for Hadoop over a year ago.
Hadapt founder and CEO Justin Borgman said in a blog post that his startup looked at Teradata as a role model for developing software that enables analysts to work with new types of data, all in one unified environment, through a familiar SQL interface. The Hadapt engineering team will give Teradata deep Big Data knowledge that will be leveraged to enhance and move forward the Teradata Unified Data Architecture.
“We welcome Revelytix and Hadapt employees to Teradata and we look forward to their contributions in helping Teradata Labs accelerate delivery of Big Data solutions for our customers,” said Scott Gnau, president, Teradata Labs. “The addition of the key assets of these companies underscores Teradata’s commitment to continued innovation and customer value, extends our big data portfolio and enhances the Teradata Unified Data Architecture.”
The Hadoop tools from Revelytix and Hadapt may not be directly integrated into the Teradata warehouse but rather leveraged as complementary pieces of its Unified Data Architecture. Gnau has said before that Hadoop itself is complementary to a Teradata system, not a replacement for it.
Enterprises have data in many places, and the applications used to extract value from it increasingly need to bridge all of those platforms. In a recent response to this trend, Oracle last week launched its Big Data SQL software to span Hadoop, NoSQL and Oracle database data.

2:00p
How DCIM Improves Planning and Cuts Operational Costs

Your data center has become a critical component of your business – so much so that many business initiatives are actually planned around the capabilities of IT. The emergence of cloud, IT consumerization, and ever-growing data volumes have forced the data center to evolve and support new demands. Through it all, data center management and control sit at the top of an administrator’s task list, because power, energy consumption, and efficiency are all critical to keeping a business running and a data center healthy.
IT and business executives have realized that hundreds of thousands of dollars in energy and operational costs can be saved by improved physical infrastructure planning, minor system reconfiguration, and small process changes. The systems that allow management to capture these savings are modern data center physical infrastructure (i.e., power and cooling) management software tools. Legacy reporting systems, designed to support traditional data centers, are no longer adequate for new “agile” data centers that need to manage constant capacity changes and dynamic loads.
In this whitepaper from Schneider Electric we learn how data center infrastructure management (DCIM) software tools can simplify operational processes, cut costs, and speed up information delivery. Throughout the paper, a variety of key data center management topics are covered. Those topics include:
- Planning: Effect/Impact of decisions
- Operations: Completing more tasks in less time
- Analysis: Identifying operational strengths and weaknesses
Download this whitepaper to learn how holistic management capabilities can enable data center professionals to maximize capacity, control energy costs and advise the business on how to effectively utilize IT assets. By sharing key data points, historical data, and asset tracking information, and by developing the ability to charge back users, Nlyte’s Planning & Implementation tools allow users to take actions based on data center business intelligence. Effective use of today’s data center infrastructure management software will help make your data center more reliable and efficient while increasing its overall business value.

4:52p
Equinix Buys Rest of Brazil Data Center Provider ALOG

Equinix has bought the remaining half of ALOG Data Centers do Brasil for $225 million.
The U.S. colocation giant acquired a majority stake in ALOG together with Riverwood Capital in April 2011, with Equinix holding 53 percent of the company. Today Equinix announced it owns 100 percent, which gives it an instant home base for a growing play in the Brazilian data center market and Latin America in general.
ALOG is now part of Platform Equinix, a global footprint of 101 International Business Exchange (IBX) data centers across 32 markets. Equinix gains the ability to satisfy strong demand from its network, content, cloud, enterprise and financial services customers looking to establish a presence in the rapidly growing Brazilian market.
ALOG has four data centers in Brazil and serves about 1,500 customers. The facilities are in São Paulo and Rio de Janeiro, the two largest markets in the country. ALOG was key in providing infrastructure behind this year’s World Cup.
The Brazilian data center market, and the Latin American market in general, are key for Equinix. Following the acquisition of its position in ALOG, it opened an IBX in Boca Raton, Florida, in 2012 to act as a bridge. The data center serves as a key hub for domestic and international routes. Adjacent to several cable landing stations, the MI3 facility offers the lowest-latency route to Brazil via GlobeNet, according to Equinix.
Brazil is the world’s seventh-largest economy, according to a 2014 World Bank report. It’s also the second-largest emerging IT market, behind only China.
Since the original investment in 2011, Equinix has seen several customers across verticals extend to Brazil via ALOG, including Cloudsigma, GlobeNet, Level 3, Orange Business Services and Telefonica.
“ALOG’s strong position in Brazil and complementary business model provided Equinix the opportunity to establish a presence in an important emerging market and meet growing demand for data center services in Latin America,” said Karl Strohmeyer, president of the Americas for Equinix. “The ALOG team has done an outstanding job of leveraging the company’s strength in Brazil – specifically in cloud and mobility – and integrating it into Equinix’s global footprint to extend the world’s leading data center platform into Brazil.”

5:37p
Chinese Search Giant Baidu Buys Pre-fab Modular Data Center from Schneider

Baidu, the largest Chinese search engine and Web services company (China’s equivalent of Google, essentially), has bought two pre-fabricated data center modules from Schneider Electric, the French energy management, infrastructure and automation multinational.
The containers are a full-package deal, including everything from electrical and cooling infrastructure equipment to data center infrastructure management (DCIM) software. This is the first time Baidu has deployed pre-fabricated data center modules, according to Schneider.
Pre-fab modular data centers are a niche solution, generally used for quick capacity expansion, for deployments where it is difficult to build traditional brick-and-mortar facilities, or for mobile deployments, such as in combat zones or oil fields. Some data center providers, such as IO and Colt, also use pre-fabricated modules to build out capacity for their customers.
The modular data centers Schneider has built for Baidu are called M1. They are tailored to the customer’s specifications and include in-row cooling, IT cabinets, UPS, cable management, fire suppression and access security.
Insulated wall panels make the containers – which are deployed in open air – weatherproof.
Baidu has shown in the past that it is willing to experiment with unorthodox technologies in building out its IT infrastructure. It has collaborated with Facebook and the Open Compute Project to redesign the data center rack and deployed servers powered by processors based on the ARM architecture – a low-power architecture used primarily for mobile devices.
This is not the first time Baidu has worked with Schneider. The vendor supplied IT cabinets and UPS systems for its data center in Yangquan and manufactured the aforementioned custom racks – a design effort codenamed the “Scorpio Plan.”
Chenhong Huang, senior vice president at Schneider, said, “Today’s successful bid is a vote of confidence in Schneider Electric’s innovative development in the field of data management.”
Schneider has been ramping up its pre-fabricated modular data center play quite aggressively over the past several years.
In November 2013, it announced 15 different modules and 14 modular data center reference designs. Earlier this year, Schneider acquired AST Modular, a modular data center maker based in Barcelona.

6:30p
Friday Funny Caption Contest: Chicken Coop

We have a two-day vacation upon us again and we’re in a celebratory mood. Let’s kick this summer weekend off right with a brand new Friday Funny!
Diane Alber, the Arizona artist who created Kip and Gary, has a new cartoon for Data Center Knowledge’s cartoon caption contest. We challenge you to submit a humorous and clever caption that fits the comedic situation. Please add your entry in the comments below. Then, next week, our readers will vote for the best submission.
Here’s what Diane had to say about this week’s cartoon: “I’m sure you’ve heard of the chicken coop data center?! Kip and Gary wanted to try the design out in their new white space!”
Congratulations to the last cartoon winner, Don, who won with, “Norton virus removal team seems to mean business …”
For more cartoons on DCK, see our Humor Channel. For more of Diane’s work, visit Kip and Gary’s website.