Data Center Knowledge | News and analysis for the data center industry
Tuesday, February 19th, 2013
| 12:30p |
Data Center Jobs: Lectrus Corporation At the Data Center Jobs Board, we have a new job listing from Lectrus Corporation, which is seeking a Strategic Account Manager in Chattanooga, Tennessee.
The Strategic Account Manager is responsible for developing, leading and executing strategic account sales plans: defining sales goals within North America, including planning for sales by region, market segment and end-user type in coordination with Regional Account Managers, Business Development Managers, the EVP of Sales and the Marketing Department; creating and maintaining relationships with executive-level decision makers at assigned accounts; managing and coordinating relationships at all levels within those accounts; and identifying account growth obstacles, generating plans to overcome them and leading execution to achieve growth. To view full details and apply, see job listing details.
Are you hiring for your data center? You can list your company’s job openings on the Data Center Jobs Board, and also track new openings via our jobs RSS feed. | | 1:00p |
365 Main: Embracing the Server Hugger  A look at the national footprint for 365 Main, which has re-entered the data center market by acquiring 16 facilities from Equinix.
Local markets are different from national markets, and require a different approach. That’s the philosophy behind 365 Main’s acquisition of 16 data centers from Equinix, which was completed late last year.
The transaction marked a re-entry of sorts for 365 Main principals Chris Dolan and James McGrath, who had been tracking the industry from the sidelines since the original 365 Main portfolio was acquired by Digital Realty Trust in 2010. After weighing a number of deals and strategies, Dolan and McGrath saw an opportunity in the growing demand for data center services in second-tier markets.
Their chance came when colocation provider Equinix looked to divest many of the data centers it acquired in its 2009 acquisition of Switch and Data, a provider that built its strategy around second-tier markets. Equinix, on the other hand, is one of the dominant players in the largest and most connected colocation markets. The facilities in the smaller Switch and Data markets weren’t an ideal fit for Equinix, and by mid-2012 represented just 2 percent of the company’s revenue.
“Some of these assets didn’t fit the Equinix profile,” said Dolan. “The opportunity I saw, with 50 percent utilization across the portfolio, was to build relationships with the customers and enhance the services.”
National Player, Local Focus
“We’re a national player with local focus,” said Dolan. “There’s demand in the local market. People want to be close to their equipment. Our brand is strong in this industry, and we thought this was an ideal entry point with a cash flow portfolio.”
Dolan says 365 Main can bring new energy to these properties. Cities like Indianapolis, Detroit, Nashville, Buffalo and Cleveland were never going to be high priorities when compared with the huge financial interconnection ecosystems Equinix served in its big-market data hubs. But 365 Main finds these geographies “intriguing” and sees a real opportunity for a provider that takes these markets as seriously as its customers do.
“We’re really finding these people need communication,” said Dolan. “Now we’re investing in these facilities and they can expand in these data centers. It really comes down to this customer service component that was missing.”
“What we like about this portfolio is that it takes us into markets like Buffalo,” he added. “We can have a local sales presence in small markets. The competition becomes geography-based. The customer usually has a preference, and competition is localized.”
Growth in Second-Tier Cities
365 Main was formed to operate a San Francisco data center after AboveNet filed for bankruptcy in 2003. Over the next seven years, the company expanded to operate a national network of five data centers with 200 customers and 919,000 square feet of data center space. The portfolio was bought by Digital Realty in 2010 for $725 million.
Dolan and McGrath continued to monitor the industry, looking for opportunities to begin building again. Two other original 365 Main partners, J.P. Balajadia and Kevin Louie, left to form Rubicon Data Centers, which is building a new data center in Reno, Nevada.
The new 365 Main isn’t the only provider focused on building a network of facilities in smaller cities. Companies like Peak 10 (southeast), ViaWest (western U.S.) and Colospace (New England) are long-time players in this market. More recently, Xand has been building a network of facilities in the Northeast through acquisitions, and Compass Datacenters and ByteGrid have been targeting second-tier markets with wholesale data center offerings.
The interest in regional markets is driven, at least in part, by supply issues in both investor capital and data center space in first-tier markets. Investors are keen to put money to work in the data center business, which has outperformed other sectors of the real estate market in recent years. That liquidity has contributed to increased competition in markets like Silicon Valley, northern Virginia and New Jersey. That, in turn, has prompted some investors to focus on smaller markets, where local businesses face growing data management challenges and competition is less intense.
“I spent time looking for areas where there were unmet needs,” said Dolan. “I didn’t want to compete in (major markets) on wholesale, with private equity players moving. So our focus shifted.”
More Expansion Ahead for 365 Main
Dolan said Equinix had made improvements to a few of the properties, but some were still identified as Switch & Data buildings. “Part of our plan is to do a lobby and data center freshen up,” said Dolan. “Each of these data centers has a unique story. I’m pleasantly surprised with the amount of customer activity. The carriers are probably the most sticky customers. We have 10-plus carriers in each center.”
365 Main is focused on these 16 markets, but will likely be adding more.
“The strategy is to continue to grow through acquisition and perhaps greenfield builds,” said Dolan. “We are looking at the broader map, for sure. There is a lot of interest in the industry from a private equity perspective, and everyone wants to get in. We’re very connected in the investment banking community, so a lot of opportunities are crossing our desks.” | | 2:08p |
Tilera Targets Data Bottlenecks With 72-Core Chip 
Tilera continues to develop new many-core processors built for the massive data processing needs of “hyper-connected” technology companies. Tilera’s newest offering is the GX-72, which boosts the number of cores it can harness to address bottlenecks in networks and applications. The new chip, which is being released today, can be used to power servers or as an accelerator and offload engine for x86-based hardware, an approach that broadens the potential uses for the technology.
The GX-72 is optimized for busting bottlenecks in moving large amounts of data. It builds upon Tilera’s architecture, which eliminates the on-chip bus interconnect, a centralized intersection where information flows between processor cores or between cores and the memory and I/O. Instead, Tilera employs an on-chip mesh network to interconnect cores.
The GX-72 is a 64-bit system-on-chip (SoC) equipped with 72 processing cores, 4 DDR memory controllers and a big-time emphasis on I/O. This includes 8 ports for 10 Gb Ethernet, 32 for 1Gb Ethernet, 24 lanes of PCIe 2.0 and Tilera’s MICA acceleration engine.
Tilera CEO: “Unprecedented” Compute
“Customers demand ever-increasing levels of performance and performance-per-watt to stay competitive and they simultaneously want to reuse their software and hardware investments across their product portfolio,” said Devesh Garg, president and CEO of Tilera. “The TILE-Gx72 brings an unprecedented amount of compute to customer designs, and leverages thousands of open source libraries and the growing Linux ecosystem. The TILE-Gx72 rounds out our processor portfolio, complementing our 9, 16 and 36-core TILE-Gx processors and offering a remarkable range of processing performance.”
Tilera is part of an emerging ecosystem of companies seeking to harness thousands of low-power cores that work together on computing tasks. Tilera’s processors have proven to be particularly effective in web-scale operations like load balancing, image compression and especially caching, where Facebook has cited Tilera’s performance-per-watt in memcached applications.
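To ground the caching example: memcached is a simple in-memory key-value store that web tiers query before hitting a database. The sketch below is a generic illustration of that read-through pattern using the pymemcache client library, not Tilera-specific code; the server address, key format and expiry time are assumptions.

```python
# Generic illustration of a memcached-style read-through cache, the kind of
# web-scale caching workload described above. Not Tilera-specific code.
# Assumes a memcached server on localhost:11211 and the pymemcache client
# library (pip install pymemcache); key names and the expiry are made up.
from pymemcache.client.base import Client

cache = Client(("localhost", 11211))

def get_profile(user_id, load_from_db):
    """Serve hot keys from memcached; fall back to the database on a miss."""
    key = "user:%s" % user_id
    value = cache.get(key)
    if value is not None:
        return value                      # cache hit: no database round trip
    value = load_from_db(user_id)         # cache miss: fetch from the origin
    # pymemcache stores bytes/str values unless a serializer is configured.
    cache.set(key, value, expire=300)     # keep it warm for five minutes
    return value
```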
With the GX-72, Tilera is also touting its capabilities as an “offload engine” – a network interface card (NIC) that can plug into an x86 server using a PCIe connection, accelerating data handling by using both the many-core and x86 architectures to process data. One application for this is network monitoring and intrusion detection, allowing rapid inspection of data as it enters a system.
Security Use Cases
“We have a pretty high bar on what we can put on an accelerator,” said Bob Doud, the Director of Processor Strategy at Tilera. “It can be used for network intelligence: packet monitoring, visibility into fast-moving traffic, capturing packets and filtering at higher speed.”
With more companies moving large volumes of encrypted traffic, that’s an important capability. “If 100 percent of your traffic is encrypted, that’s a crushing load,” said Doud.
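Tilera’s own offload toolchain is not shown here, but as a rough host-side illustration of the packet monitoring and filtering workload Doud describes, the sketch below uses the scapy library to count TCP SYNs per source address and flag unusually noisy senders; the interface name and the threshold are assumptions.

```python
# Rough host-side sketch of the packet monitoring / filtering workload an
# offload engine would accelerate: count TCP SYNs per source and flag noisy
# senders. Requires the scapy library (pip install scapy) and root privileges;
# the interface name and the threshold of 100 SYNs are illustrative only.
from collections import Counter
from scapy.all import sniff, IP, TCP

syn_counts = Counter()

def inspect(pkt):
    """Flag sources sending an unusual number of TCP SYN packets."""
    if IP in pkt and TCP in pkt and (pkt[TCP].flags & 0x02):  # SYN bit set
        src = pkt[IP].src
        syn_counts[src] += 1
        if syn_counts[src] == 100:
            print("possible scan or flood from %s" % src)

# Capture 1,000 TCP packets on eth0 and run the inspector on each one.
sniff(iface="eth0", filter="tcp", prn=inspect, count=1000)
```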
“We continue to be impressed with the scalability of the TILE-Gx family with its seamless software compatibility from 9 cores to 72 cores,” said Ofer Raz, head of platforms and architecture at security specialist Check Point Software Technologies. “The TILE-Gx72 processor brings the right mix of compute, low-latency I/O, memory bandwidth, and accelerators for the needs of our intelligent, integrated security appliances.”
The GX-72 is also proving effective in managing large volumes of video content, either in HTTP streaming or video conferencing. An example: a law enforcement agency using Tilera chips to accelerate processing of incoming video that may need to be quickly redeployed back out to the field.
[Slide: an overview of Tilera’s approach to offload processing and some of its use cases.] | | 3:29p |
Making the Case for DDoS Protection The rise of cloud computing has changed the threat landscape that organizations must face. With greater reliance on the Internet and WAN services, more companies are at risk of some type of attack. In particular, security administrators must now concern themselves with distributed denial of service (DDoS) attacks. As technology advances, new ways of attacking an organization over the Internet are emerging as well. Gartner, in conjunction with Arbor Networks, created this white paper to show organizations the vital process of protecting the environment from potential DDoS attacks.
In designing a solid solution, administrators must analyze the industry and have a clear understanding of their own infrastructure. This white paper not only defines a DDoS attack, it also illustrates important best practices for DDoS defense. These practices include:
• Understanding that today’s attacker uses a combination of the following to carry out a DDoS attack:
1. High-bandwidth (volumetric) attacks
2. Application-layer attacks
• The best place to stop high-bandwidth DDoS attacks is in the ISP’s cloud (via network-based DDoS protection).
• The best place to perform application-layer DDoS detection and mitigation is at the network perimeter (a simple illustration follows below).
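As a toy illustration of perimeter-side, application-layer mitigation (a generic sketch, not the white paper’s methodology), the following per-client token-bucket limiter shows the basic idea of throttling abusive request rates; the rate and burst values are arbitrary assumptions.

```python
# Toy per-client token-bucket rate limiter, illustrating application-layer
# DDoS mitigation at the network perimeter. The rate and burst values are
# arbitrary assumptions; real deployments tune these per application.
import time
from collections import defaultdict

RATE = 10.0    # tokens refilled per second, per client
BURST = 20.0   # maximum bucket size (largest allowed burst)

_buckets = defaultdict(lambda: {"tokens": BURST, "last": time.monotonic()})

def allow_request(client_ip):
    """Return True if the request may proceed, False if it should be dropped."""
    bucket = _buckets[client_ip]
    now = time.monotonic()
    # Refill tokens for the time elapsed since the last request, up to BURST.
    bucket["tokens"] = min(BURST, bucket["tokens"] + (now - bucket["last"]) * RATE)
    bucket["last"] = now
    if bucket["tokens"] >= 1.0:
        bucket["tokens"] -= 1.0
        return True
    return False  # over the limit: drop, delay or challenge the request

# A front-end proxy would call allow_request(ip) for every incoming HTTP request.
```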
Aside from establishing a solid DDoS attack prevention plan, administrators must know how to budget for solutions and communicate these changes. Much of that communication will be with non-IT people, so to get the message across, IT administrators must learn how to speak “business.” This means outlining dollars lost during outages, the possibility of lost data and how that can affect the company’s image, and the amount of productivity lost to poorly performing, DDoS-affected systems. To help illustrate the point, this white paper outlines the Four-I methodology. Click here to download this white paper. | | 3:39p |
Navigating Data Center Performance Challenges With KPIs (Part 2) David Appelbaum is vice president of marketing at Sentilla Corporation, and has worked in software marketing roles at Borland, Oracle, Autonomy, Salesforce.com, BigFix, and Act-On.
 DAVID APPELBAUM
Sentilla
In Part 1 of this article on data center Key Performance Indicators (KPIs), I compared managing a data center to piloting a jumbo jet. Although an experienced pilot can fly a small plane by sight alone in good weather, the same pilot needs a flight plan, cockpit instrumentation, and air traffic control support in order to fly in foggy weather. Data centers today are akin to that foggy day, with new technologies and applications adding layers of abstraction and complexity to data center management.
In Part 1, I also described results from a 2012 survey of 5,000 data center professionals highlighting the metrics they use to provide visibility into data center infrastructure to support intelligent decision-making. Relevant metrics show not only what’s happening in your data center today, but also how it relates to other parts of the data center and to your costs. The following commentary references the survey findings.
Good Data is Hard to Find
If you are not getting the visibility and metrics to make informed decisions about your data center utilization and capacity, you are not alone. The 2012 survey asked respondents about KPIs for cost and capacity.
Many metrics that respondents indicated were most important were among the most difficult to get from existing tools.

This is evidence of a serious visibility gap – data center management tools do not make it easy to find important information around power capacity and costs, as well as overall operations and maintenance costs.
Puzzled about Power? Join the Crowd.
If nothing else, both Super Bowl 2013 and the widespread outages caused by Hurricane Sandy and the subsequent reliance on fuel-powered generators caused many to rethink their relationship with power. For one Manhattan-based data center, people formed a bucket brigade hauling diesel fuel up 17 flights of stairs to keep the data center’s backup generator running. Suddenly, excessive power consumption has a real, human cost.
The survey included questions about two important power-related metrics:
- Power density, expressed in watts/square foot
- Power Usage Effectiveness (PUE) – the ratio of total power delivered to the data center to the power actually used for computing (versus cooling, lighting and other overhead consumption); a quick worked example follows this list
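To make those two metrics concrete, here is a short worked example with hypothetical numbers (not figures from the survey):

```python
# Hypothetical worked example of the two power metrics above (not survey data).
it_power_kw = 800.0          # power drawn by servers, storage and network gear
total_facility_kw = 1200.0   # IT load plus cooling, lighting and other overhead
raised_floor_sqft = 10000.0  # data center white space

pue = total_facility_kw / it_power_kw                                 # 1200 / 800 = 1.5
density_w_per_sqft = (total_facility_kw * 1000) / raised_floor_sqft   # 120 W/sq ft

# A PUE of 1.5 means every watt of computing carries half a watt of overhead.
print("PUE = %.2f, power density = %.0f W/sq ft" % (pue, density_w_per_sqft))
```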
Of all the metrics-related questions in the survey, the two power-related metrics had the highest number of “Don’t Know” responses.


More than a third of respondents didn’t know their PUE.
This lack of knowledge might signal an organizational gap that is hindering visibility into necessary metrics.
Finding the Metrics Through Management Tools
The first wave of Data Center Infrastructure Management (DCIM) solutions showed only basic facilities and IT metrics, such as PUE, energy consumption and temperature fluctuation. More recently, a new wave of DCIM solutions is emerging to fill the gap – aggregating and correlating information from the other categories of data center management solutions using a “manager of managers” approach.
While relatively few respondents were currently using DCIM tools, a full 38% were planning or considering implementing DCIM.

The new generation of solutions will deliver not only better insights through global visibility, but also the ability to perform proactive project planning and evolve to KPI-driven data center management.
Flying with Instrumentation: Data Center Management Maturity
Having better visibility is the essential first step in moving towards a more mature approach to infrastructure management. With visibility in place, you can start proactively planning based on predictions and forecasts, and eventually move towards KPI-driven data center management.
If you manage your data center as if you were piloting in the fog, implementing a unified data center performance management solution can help you move up the maturity cycle.

- Start by monitoring and analyzing capacity and utilization of all of the data center assets (including virtual, physical and facility layers). This insight will help you cut costs, reclaim under-used capacity and avoid over-provisioning.
- With a unified data center performance management solution in place, move to a proactive planning approach: use predictive analysis and what-if scenarios to make smart decisions. This approach helps you optimize your IT spending and availability while alerting you to potential constraints before they cause availability or performance problems.
- With a firm understanding of business needs and data center performance, adopt a KPI-driven data center approach, enforcing service levels and understanding the true cost of services. Data centers in this stage of maturity are best positioned to align resource spending with business needs.
You Don’t Have to Fly Blindly
Just as you don’t want to fly that plane without instrumentation, you don’t want to plan your data center capacity based on incomplete data and best guesses. But, getting these metrics means moving beyond the isolated enterprise IT management, virtualization management and facilities monitoring tools that you may have in place today.
By deploying a unified Data Center management solution, you can get the KPIs that put context around your planning decisions – integrating operating cost and power metrics with IT assets, and supporting predictive and what-if analysis. With this insight, you can reduce the cost of over-provisioning while mitigating the risks of unanticipated capacity problems.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library. | | 3:46p |
UK Provider Accepts BitCoin for Hosting Accounts UK provider Hosting.co.uk is accepting a very non-traditional payment method, BitCoin, as payment for hosting accounts. BitCoin is a startup digital currency that eschews traditional banking. It’s somewhat unusual for a web host to accept it as currency, but doing so potentially opens the company up to a wider international audience, as it simplifies international sales. BitCoin also adds a level of security and privacy to the transaction, so it might attract customers who prefer to remain private as well.
On one hand, it makes it easier for customers of all sorts to pay. On the other hand, payment quality and fraud control have always been a meaningful cost of business for hosts, and they would seem to be hard to police and manage with BitCoin.
“We understand that many potential customers still shy away from digital payment methods, fearful of identity theft or fraud,” said Hosting.co.uk’s Frederick Schiwek. “That’s why Hosting.co.uk has moved forward with BitCoin as an alternate payment method. Not only is it faster and less costly than most bank-mediated transactions, it’s more secure as well. We expect to see a surge in clients from around the world thanks to the globalizing effects of digital currency.”
What’s a BitCoin?
BitCoin is a decentralized, person-to-person economy of digital coins. The all-digital network is not backed by any government or bank and is not tied to any national currency. The BitCoin network is built on open source code, so anyone can take a gander at how it functions. From the user’s perspective it works much like most online banking and payment systems, with the big difference being that it’s a virtual currency.
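As a generic sketch of what accepting BitCoin might look like on the back end (an illustration, not Hosting.co.uk’s actual billing integration), a merchant running its own Bitcoin node could poll the node’s JSON-RPC interface to check whether a customer’s deposit address has been paid; the credentials and address below are placeholders.

```python
# Generic sketch of checking whether a BitCoin deposit address has been paid,
# by querying a merchant's own Bitcoin node over JSON-RPC. Illustration only:
# the RPC credentials and the address are placeholders, and real billing code
# would also handle errors, exchange-rate conversion and order state.
import requests

RPC_URL = "http://127.0.0.1:8332"
RPC_AUTH = ("rpcuser", "rpcpassword")   # placeholder node credentials

def btc_received(address, min_confirmations=6):
    """Total BTC received by a wallet address, with at least N confirmations."""
    payload = {
        "jsonrpc": "1.0",
        "id": "billing",
        "method": "getreceivedbyaddress",
        "params": [address, min_confirmations],
    }
    resp = requests.post(RPC_URL, json=payload, auth=RPC_AUTH, timeout=10)
    resp.raise_for_status()
    return resp.json()["result"]

# Example: mark an invoice paid once the customer's deposit address holds 0.5 BTC.
if btc_received("1ExampleDepositAddressXXXXXXXXXXXX") >= 0.5:
    print("invoice paid")
```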
Hosting.co.uk’s main data center is a Tier IV facility in the Midlands of England with multiple redundant systems. Hosting.co.uk’s target market includes everyone from individuals to small- and medium-sized businesses. The company says BitCoin will be most appealing to individual users, especially those located outside of England or continental Europe.
Recently, Hosting.co.uk partnered with the MEGA cloud storage service as an authorized reseller. MEGA is the somewhat controversial turn-key data storage provider founded by Megaupload creator Kim Dotcom. Given those roots, MEGA’s emphasis on security appears high, and it has garnered quite a bit of interest. Hosting.co.uk clearly has a comfort level with non-traditional offerings here. | | 4:00p |
Data Center Jobs: Opengate Data Systems At the Data Center Jobs Board, we have a new job listing from Opengate Data Systems, which is seeking a Regional Sales Manager in Sacramento, California.
The Regional Sales Manager is responsible for developing relationships with key technical staff and decision makers within a customer’s organization, including data center and facility managers, engineers, project managers, management, executives, construction, financial and purchasing personnel; gaining the organization’s trust by demonstrating and constantly improving technical consultation and service; maintaining a high level of sales activity through initiative, persistence, ingenuity, creativity, and personal and business relationships; and influencing the design of integrated data center cooling and airflow systems along with data center management and automation systems. To view full details and apply, see job listing details.
Are you hiring for your data center? You can list your company’s job openings on the Data Center Jobs Board, and also track new openings via our jobs RSS feed. | | 7:32p |
MasterCard Takes Stake in Data Analytics Firm As big data continues to be collected and stored, more commercial businesses are seeing the high value of analyzing that data, and are choosing partners who specialize in large data set analytics to assist with extracting business insight from petabytes of data. Here are some recent examples of big data partnerships.
MasterCard takes stake in Mu Sigma
MasterCard Advisors, a division of MasterCard (MA), and analytics firm Mu Sigma announced a new partnership that combines MasterCard Advisors’ aggregated and anonymous purchase behavior insights with Mu Sigma’s advanced analytics and expertise. As part of the partnership, MasterCard has acquired an equity stake in Mu Sigma. The two companies will jointly develop innovative analytic products to enable companies of all sizes to solve their business challenges.
“The data analytics market is rapidly growing as customers seek real time insight allowing them to better connect with their consumers through highly relevant products, offers and services,” said Gary Kearns, Executive Vice President, Information Services for MasterCard Advisors. “We went through an extensive process to choose the right partner and Mu Sigma’s innovation labs and capabilities make them stand out as best-in-class in this field. By combining MasterCard Advisors’ purchase behavior insights with Mu Sigma’s expertise we will be able to drive faster innovations in data analytics solutions and deliver them on a broader scale, globally.”
Northbrook, Illinois-based Mu Sigma is a decision science and analytics company that serves many large global organizations in health care, insurance, finance and retail, as well as technology companies such as Microsoft and Dell. It pairs innovation in analytics with an interdisciplinary approach that brings together business, math and technology, drawing on proprietary products, assets, methodologies and people. According to research by Wikibon, the big data analytics market is expected to be a $50 billion market opportunity within five years.
“Big Data analytics is growing at a tremendous pace,” said Dhiraj Rajaram, CEO of Mu Sigma. “We have deep experience in this area, probably more than any other provider, and MasterCard Advisors has been leading in data analytics for a number of years among payments companies. We believe the combination of MasterCard’s deep data and information insights expertise when exposed to Mu Sigma’s Big Data analytics ecosystem will add tremendous value to various businesses. We’re looking forward to working closely with MasterCard to help drive innovation across multiple industries and help businesses succeed in new and smarter ways.”
Yahoo! Japan Upgrades Teradata Analytics platform
Teradata (TDC) announced that Yahoo Japan Corporation has renewed and updated its analytical data systems with the Teradata Active Enterprise Data Warehouse Platform 6690, Teradata’s latest high-end model, for its data warehouse (DWH), the largest DWH analytics platform in Japan.
Yahoo JAPAN has also implemented a new application known as “Access-Navigator Web,” a web-based data search tool. This software enables an employee without training in SQL to perform data analysis – by freely manipulating and integrating data on the new Teradata system. This lowers employee learning curves for SQL training and increases the number of users who can perform analytics.
Yahoo Japan manages about 120,000 queries daily. The deployment of the newest Teradata technology, completed at the end of November 2012, provides high efficiency and speed – with response times now running 200 to 250 percent faster than the prior systems. | | 8:03p |
Cologix Opens Second Site at Dallas INFOMART  The distinctive facade of the Dallas INFOMART, where colocation provider Cologix now operates two data centers. (Photo: Cologix)
Interconnection and colocation company Cologix has been rapidly expanding since it acquired its first data center at the Dallas INFOMART from NaviSite back in 2010. Now, after expanding across North America, the company comes full circle, launching its second data center at the INFOMART.
Cologix today announced the successful commissioning and launch of the new facility at the INFOMART, also known as 1950 North Stemmons Freeway. The 12,000 square foot data center holds over 300 cabinets and is supported by 3.2 megawatts of power from three diverse substations. The new Dallas facility is Cologix’s 12th North American data center, including key carrier hotel locations in Toronto, Montreal, Vancouver, Minneapolis and Dallas. Cologix bought its first data center at the INFOMART in late 2010 from NaviSite. The company first announced the addition of a second facility there in April 2012.
The new data center includes hot aisle containment pods, modular power distribution units (PDUs) and in-row cooling technology, which collectively provide for rapid deployments and the ability to dynamically cool equipment specific to the needs of individual cabinets. “Our hot aisle containment and in-row cooling systems are unique in the Dallas market and provide for enhanced customer experience and efficiency,” said Rob DeVita, General Manager of Cologix Texas. “We are excited to introduce this technology to the Dallas community and look forward to supporting our customers’ growth.”
Key Internet Data Hub
Dallas-Fort Worth is the fourth largest metro market in the US and host to twenty Fortune 500 company headquarters. Its central location and network density make it a primary Internet peering point and natural location for regional and national network nodes.
“The continued rapid adoption of the cloud by all customer segments is dramatically elevating traffic and network performance requirements, which continues to heighten the value of colocation and interconnection options in the downtown INFOMART,” said Cologix CEO Grant van Rooyen. “Cologix is focused on providing our customers network neutral access to broad connectivity options, represented today in this new Dallas inventory.”
Customers at the new facility have the ability to directly interconnect with existing customers and 25+ networks in the existing Cologix meet-me room (MMR), as well as to access other tenants in the carrier hotel.
The Dallas INFOMART is a 1.2 million square foot technology hub with tenants including SoftLayer, ViaWest and Equinix, as well as network providers including MCI, Allegiance Telecom and Level 3. The Infomart was built by Trammell Crow in 1985, and was initially envisioned as a hub for computer industry trade shows. The building’s glass facade was designed to be a replica of the Crystal Palace, built in London in 1851 as part of the first World’s Fair. | | 8:21p |
OnRamp Will Build Second Austin Data Center  The interior of the OnRamp data center in Raleigh as it was preparing to open. The company is also building a new data center in Austin, Texas. (Photo: OnRamp).
Data center operations company OnRamp announced it is building a 42,000 square foot data center in Austin which will open early in the fourth quarter of this year. This will be the second data center for the company in Austin. The announcement of OnRamp’s Austin II project comes just a week after the company announced the opening of a data center in the heart of Research Triangle Park in Raleigh, NC.
OnRamp says the additional facility was necessitated by demand. “We are excited to open a second, enterprise-class Data Center in Austin,” said OnRamp CEO Lucas Braun. “We’re an Austin-based company, and a large percentage of our managed and cloud hosting and HIPAA compliant hosting services are delivered by our teams in Austin.” The facility is being designed for industry-leading levels of high-density computing, with the capability of delivering upwards of 30kW per rack, contiguously. In addition, the SSAE 16, SOC 1 Type II, HIPAA and PCI compliant data center will feature a separate high-security area for HIPAA hosting. OnRamp’s Redundant Isolated Path Power Architecture delivers true 2N power to customers, from the utility to the rack.
OnRamp is working with Square One Consultants to oversee the design, development and construction of the facility.
OnRamp was founded as an ISP in 1994 in Austin, Texas. Its first colo customer came a year later, and its first managed server came about in 2000. It built its first data center with 2N power in 2003. Private cloud came in 2007, and an investment from Brown Robin Capital followed in 2009.
The company offers colocation, cloud computing, high security hosting and disaster recovery services backed by what it calls Full7Layer support, which is, of course, support across all 7 layers including all the way to the application layer. |