Data Center Knowledge | News and analysis for the data center industry
Friday, October 17th, 2014
4:00p |
Friday Funny Caption Contest: Mystery Cabinet Another weekend is (practically) here and we’re in a Friday Funny mood. Help us complete our Kip and Gary cartoon by submitting your caption below!
Diane Alber, the Arizona artist who created Kip and Gary, has a new cartoon for Data Center Knowledge’s cartoon caption contest. We challenge you to submit a humorous and clever caption that fits the comedic situation. Please add your entry in the comments below. Then, next week, our readers will vote for the best submission.
Here’s what Diane had to say about this week’s cartoon, “Well, Kip is in the cabinet again! What could he possibly be doing in there now?”
Congratulations to the last cartoon winner, Ben, who won with, “There’s no PC’s in here for you!! I’m a Linux guy.”
For more cartoons on DCK, see our Humor Channel. For more of Diane’s work, visit Kip and Gary’s website. | 4:59p |
Cloudera and Red Hat Partner Around Hadoop-on-OpenStack Solutions Cloudera and Red Hat have partnered on joint enterprise software solutions for big data with a key integration focused on Sahara, the Hadoop- or Spark-on-OpenStack component in Juno, the latest release of the open source cloud software suite.
In another Hadoop-OpenStack marriage, the partners plan to provide cloud-ready big data platforms that combine both companies’ development tools with Apache Hadoop at the core. They will integrate Red Hat’s OpenStack distribution, including Sahara, with Cloudera Director and Cloudera Enterprise, all managed by Red Hat CloudForms.
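Neither company has published the integrated API yet, but Sahara itself exposes a REST interface for cluster provisioning. As a rough illustration (field names follow Sahara's v1.1 API as shipped in Juno; the plugin choice, version, and UUIDs below are placeholders, not real resources), a Hadoop cluster-creation request body looks something like this:

```python
import json

# Illustrative Sahara v1.1 cluster-creation request body.
# The template and image IDs are placeholders, not real resources.
cluster_request = {
    "name": "analytics-cluster",
    "plugin_name": "vanilla",          # Sahara's reference Hadoop plugin
    "hadoop_version": "2.4.1",
    "cluster_template_id": "00000000-0000-0000-0000-000000000000",
    "default_image_id": "11111111-1111-1111-1111-111111111111",
}

# A client would POST this JSON to /v1.1/{project_id}/clusters;
# here we just serialize it to show the shape of the request.
payload = json.dumps(cluster_request, indent=2)
print(payload)
```

A management layer like Cloudera Director or Red Hat CloudForms would generate and submit requests of this kind on the operator's behalf rather than requiring them to be written by hand.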
Director is a new Cloudera product for deploying Hadoop in the cloud. Integrated into Cloudera Enterprise, it is fully supported on Amazon Web Services in its first release.
It allows customers to quickly deploy, elastically scale, and terminate Hadoop clusters in cloud environments.
Cloudera Vice President of Business and Corporate Development Tim Stevens said the alliance “allows for Hadoop workloads to be deployed and managed with the same confidence as other mission-critical workloads to deliver the next wave of big data-based innovation for the enterprise.”
Intel-backed Cloudera already has a vast partner ecosystem, but developing solutions with Red Hat opens additional avenues to enterprise opportunities. Red Hat has another Hadoop-OpenStack alliance with Cloudera’s competitor Hortonworks, with whom it partnered earlier this year to integrate its Hadoop version with Red Hat OpenStack, as well as offer a Hortonworks Data Platform plugin for Red Hat Storage. | 5:58p |
Ericsson Buys Data Center Management Software Firm Sentilla Swedish telco-equipment giant Ericsson has acquired Sentilla, a Redwood City, California-based company that develops data center management software for the IT side of the house.
As telcos add cloud infrastructure services to their portfolios and increase their use of software-based network management tools, there is an opportunity for vendors like Ericsson to provide them with new types of technology products. The acquisition brings additional technological capabilities to Ericsson’s existing products for infrastructure management as well as a team of experts in the field.
Sentilla has a robust data center management platform that provides visibility into current state and historical information about VMs, physical hosts, private and public cloud infrastructure. It monitors things like CPU usage, power consumption and cost and presents data through visualizations.
Ericsson said Sentilla’s platform was complementary to its Cloud Manager and Expert Analytics software products, although there is some overlap in functionality between Cloud Manager and Sentilla. Ericsson’s product also monitors resource usage, but it also has features like configuration management, security and self-service portals, among others, that Sentilla does not.
Expert Analytics is a whole different animal. Like the name implies, it is a data analytics system. It is geared toward network operators to help them manage quality of service for customers. It helps with things like identifying network congestion, device issues, incident analysis and overall network performance monitoring.
Mike Kaul, Sentilla CEO, said dynamic infrastructure management was an important capability for service providers to have. “By combining our capabilities with key Ericsson offerings, we can support dynamic optimization of workloads across physical, virtual and cloud infrastructures, including constantly changing data center environments,” he said in a statement. | 6:51p |
CyrusOne Brings Second Phoenix Data Center Online CyrusOne announced that the second data center on its massive campus outside of Phoenix is operational.
The company said it had broken ground on the 120,000-square-foot data center construction project only in May, saying then that it would house 60,000 square feet of data center space and provide up to 12 megawatts of power. Now, 30,000 square feet of that space has been commissioned and pre-leased.
Phoenix turned out to be a hot market for Carrollton, Texas-based CyrusOne, which entered the market in 2012. The company said in May it was accelerating data center construction at the campus in Chandler because it had closed a 40,000-square-foot deal at its first facility there with an unnamed technology company.
The property is master-planned for seven buildings. At full build-out, it has the potential to house about 1 million square feet of data center space, the company said.
“We’ve seen strong demand in our Phoenix market,” CyrusOne CEO Gary Wojtaszek said in a statement. “We completely leased our first facility in 18 months, built our second facility on the campus, and pre-leased 100 percent of what we just commissioned, all in less than two years.” | 7:30p |
Rackspace Launches OnMetal Cloud Big Data Platform 
This article originally appeared at The WHIR
Rackspace announced the launch of the OnMetal Cloud Big Data Platform at the Strata + Hadoop World conference this week. The new offering provides big data analytics by running Apache Hadoop with Spark on bare metal servers.
Previous Hadoop solutions offered by Rackspace were virtualized, while bare metal is more conducive to consistent performance for big data applications. Rackspace progeny OpenStack released its new version, “Juno,” this week with a big data processing service that also manages clusters using Hadoop and Spark.
“This solution breaks new ground for the world of big data,” Rackspace CTO John Engates said, according to Forbes. “For the first time, Hadoop and Spark can have the best of both worlds: bare metal performance with cloud agility, all backed by Fanatical Support.”
With the hardware and software infrastructure already fully integrated, the offering is meant to be a convenient, turn-key service. According to Rackspace, it can be provisioned in three clicks. The company also says that in preliminary DFSIO and Terasort performance benchmark testing, the new OnMetal Cloud Big Data Platform beat Rackspace’s legacy Cloud Big Data Platform by an average of between 50 and 100 percent.
OnMetal Cloud Big Data Platform is available for a free trial, and Rackspace is offering a $250 credit for new OnMetal Cloud Servers customers as a promotion that ends with the 2014 calendar year.
Forbes reports that Rackspace demonstrated the new platform at Strata + Hadoop World with sentiment analysis of all tweets carrying a certain hashtag. The service delivered snapshots every five seconds, almost in real time, whereas previous virtualized Hadoop installations would take anywhere from 2 or 3 minutes to as long as 10 to 15 minutes to deliver the analysis, according to Forbes.
While running Apache’s big data platforms on bare metal is part of the new product’s value proposition, Spark is becoming a valued tool in its own right since becoming a top-level Apache project in February. A Wired article on the then-incubating Spark noted that it is “about 100 times faster than the mighty Hadoop — and could very well replace Hadoop as the stuff that fuels the modern web.”
For now Spark complements Hadoop with operations like classification and stream processing, and as such competes with tools like MapReduce and Storm.
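The near-real-time snapshots described above are essentially micro-batch aggregation: the incoming stream is chopped into short fixed-length windows, and each window is reduced to a summary. The toy sketch below uses plain Python standing in for a Spark Streaming job, and the tweets and their sentiment labels are entirely made up, but it shows the idea of a five-second batch producing a sentiment snapshot:

```python
from collections import Counter

# Fabricated (timestamp_seconds, hashtag, sentiment) tweet stream.
# In Spark Streaming this would arrive as a DStream; here it is a list.
tweets = [
    (0.5, "#strata", "pos"), (1.2, "#strata", "neg"),
    (3.9, "#strata", "pos"), (6.1, "#strata", "pos"),
    (7.4, "#strata", "neg"), (9.8, "#strata", "neg"),
]

BATCH_SECONDS = 5  # the five-second snapshot interval from the demo

def snapshot(stream, batch_seconds):
    """Group tweets into fixed-size time windows and count sentiment per window."""
    batches = {}
    for ts, tag, sentiment in stream:
        window = int(ts // batch_seconds)  # which batch this tweet lands in
        batches.setdefault(window, Counter())[sentiment] += 1
    return batches

for window, counts in sorted(snapshot(tweets, BATCH_SECONDS).items()):
    start = window * BATCH_SECONDS
    print(f"[{start}s-{start + BATCH_SECONDS}s] {dict(counts)}")
```

A real deployment would replace the list with a live stream and run the reduction on a cluster; the speedup Rackspace claims comes from how quickly each window's reduction completes on bare metal, not from a different aggregation model.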
Rackspace expanded its enterprise offerings earlier this month with managed Google Apps for Work, a few weeks after Taylor Rhodes was named CEO and said the company had been repositioned and had “many levers to pull.”
This article originally appeared at: http://www.thewhir.com/web-hosting-news/rackspace-launches-onmetal-cloud-big-data-platform | 9:31p |
Future Facilities and Partner to Productize ACE Data Center Metric Future Facilities, whose software adapts 3D modeling concepts used in electronics design to data center management and design, has teamed up with data center infrastructure solutions company DCIM Solutions on a joint solution meant to help companies optimize efficiency of their data centers.
The solution is based on ACE, an approach to measuring data center efficiency that Future Facilities created and has been actively promoting. ACE stands for Availability, Capacity and Efficiency. You need to take the pulse of these three interdependent dimensions, the company proposes, if you want a true assessment of your data center’s performance.
The ACE model can also be used to make investment decisions in data center management with awareness of trade-offs. If you optimize for energy efficiency, for example, availability may suffer, since using redundant infrastructure components is by definition inefficient.
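Future Facilities has not published a single formula for combining the three dimensions, so the following is purely a hypothetical illustration of the trade-off the model describes: the function, the weights, and every number in it are invented for this sketch. It shows how raising efficiency by stripping redundancy can drag availability, and potentially the overall score, in the other direction.

```python
# Hypothetical ACE scoring - Future Facilities does not publish a formula;
# the weights and all input numbers here are invented for illustration.
def ace_score(availability, capacity_used, efficiency, weights=(0.5, 0.25, 0.25)):
    """Weighted blend of the three ACE dimensions, each on a 0-100 scale."""
    wa, wc, we = weights
    return wa * availability + wc * capacity_used + we * efficiency

# A redundant (2N) design: high availability, lower energy efficiency.
redundant = ace_score(availability=99.0, capacity_used=60.0, efficiency=55.0)

# Stripping redundancy raises efficiency but lowers availability.
lean = ace_score(availability=90.0, capacity_used=60.0, efficiency=80.0)

print(f"2N design ACE:   {redundant:.2f}")
print(f"lean design ACE: {lean:.2f}")
```

With these invented weights the lean design edges out the redundant one, but a weighting that prizes availability more heavily would flip the ranking, which is exactly the investment trade-off the ACE model is meant to surface.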
DCIM Solutions will now provide a service called ACE Jumpstart. Using Future Facilities’ software, the company will create a 3D computational fluid dynamics model of a customer’s data center that will visualize things like airflow distribution, temperature and hardware performance, among other parameters. The vendors will use the model to assess the facility’s state across the three dimensions of ACE and help the customer set ACE goals to work toward.
The service comes with a 90-day subscription to Future Facilities’ modeling software, called 6SigmaDC.
DCIM Solutions Managing Partner Dan McDougal said the service will be helpful to companies that want to assess business value of their data center assets. “Using the ACE methodology, DCIM Solutions will be well-equipped to help data centers of all sizes plan for capacity changes and prevent negative trends before they begin,” he said in a statement.
Learn more about the product directly from Future Facilities and DCIM Solutions at next week’s Data Center World conference in Orlando.
Disclosure: Data Center World is organized by AFCOM, a Data Center Knowledge sister company | 10:33p |
OVH Builds Big Data Cloud on IBM Power8 Chips and OpenStack OVH, one of Europe’s largest hosting companies, has built a new big data cloud infrastructure service using IBM servers powered by Big Blue’s Power8 processors – an alternative to the commonplace x86 processor architecture Intel’s chips are based on.
Getting high-profile customers on board with Power8 is important today for IBM, which has been trying to expand its processor market share aggressively. The company recently sold its commodity x86 server business to Lenovo, and has been focusing resources on Power8.
Along with announcing OVH’s new cloud services, IBM also said the Roubaix, France-based company has become the latest member to join its OpenPower Foundation, an effort it started together with Google, Nvidia, Mellanox and Tyan in 2013. The foundation, now about 60 members strong, is promoting use of the Power architecture, which IBM licenses to others.
The big data cloud services are not in production yet, available only as a lab preview. Called RunAbove, they were designed as cloud services for big data, high performance computing and database workloads – in other words, performance-hungry applications.
There will be two flavors of RunAbove: S for testing how well applications do on the Power8 architecture and 2XL for prime time. S provides VMs that share a physical host, while 2XL offers a single VM per box.
IBM Power Systems General Manager Doug Balog said Power8 was designed specifically for big data. “As the world’s first processor designed for big data, Power8 is a natural choice for any service provider looking to offer their clients high performance capabilities to analyze, move and manage significant amounts of data,” he said in a statement.
The big data cloud infrastructure consists of IBM Power Systems servers and a lot of open source software. The cloud is built using PowerKVM (IBM’s distribution of the open source hypervisor KVM), Fedora Linux operating system and OpenStack, the open source cloud architecture.
OVH is deploying RunAbove nodes in Europe and North America. The company has 15 data centers: two in Canada and 13 in France. | 11:00p |
QTS Expands Service Offerings With New Facility Management Service EAST WINDSOR - As it enters the facility management business, QTS Realty sees its new Princeton campus as an ideal showcase for its approach to data center services. The 200-acre property provides QTS with a lease from an existing tenant, revenue from its facility management services, room to expand its infrastructure, and a partner to help fill that new space with customers.
The $75 million deal reflects the way QTS approaches the data center business. It likes buying large facilities at an affordable price, building out space at a thrifty $7 million per megawatt, and offering a range of service options to its 850 customers.
The Princeton site is the first implementation of the new Critical Facilities Management (CFM) program, in which QTS manages data centers for corporate users. QTS will provide its wholesale data center offering (known as C1) to Atos, which will then package these services for McGraw Hill Financial, the end user and previous owner of the facility.
QTS expects the 10-year lease to provide a return on capital of better than 10 percent.
“What was interesting for us is the ability to go into a transaction with great visibility and long-term double digit return, and then still have the capacity of almost 180,000 square feet of fully improved shell,” said Chad Williams, CEO of QTS Realty, in the company’s recent earnings call.
The building was initially used as a book warehouse by McGraw Hill, which has had a campus in East Windsor since the early 1960s. The current data center, built in 2008, spans about 180,000 square feet. It also features more than 180,000 square feet of expansion space.
The large footprint of the QTS Princeton building offers generous room for future expansion of the technical space. (Photo: Rich Miller)
“The long-term plan is to build that out for multi-tenant use,” said Danny Crocker, the Vice President of Operations and Global Critical Facility Management for QTS. Crocker estimated that construction would probably start in six to 12 months, with an initial buildout of 30,000 to 60,000 square feet of space.
It may get some help marketing the space from its new tenant. “Atos has ambitious plans for that facility beyond McGraw Hill Financial, and (wants) to bring other clients in,” said Williams.
Williams said the data center management service provides enterprise customers a way to monetize underused data center assets.
“We can utilize our CFM services in enterprise-owned facilities or in connection with QTS taking ownership of the data center in a sale/leaseback structure where QTS can increase the value by using our services to broaden the customer base in a multi-tenant environment,” he said.
The QTS product portfolio is built atop the “three Cs” approach. The C1 offering is wholesale data center space, while C2 is colocation and C3 is a managed cloud offering. The CFM offering adds the option to outsource not just services, but facilities management.
A key benefit is unleashing the power of the sale-leaseback model, which has been a popular growth strategy in recent years for real estate investment trusts like QTS Realty. A sale-leaseback option typically involves a property owner selling their building to a second party, while agreeing to continue to lease space in the building. The transaction generates cash for the former owner (now the tenant), and provides the new owner steady rent from the lease. These deals are particularly attractive when the initial owner is a blue-chip company with a strong credit rating.
Leveraging the Sale-Leaseback Model
The sale-leaseback model has provided a popular way for companies to reduce their focus on real estate and data center management. The model has been widely used by Digital Realty Trust and Carter Validus as they have built large portfolios of data centers and technology buildings.
“We’re excited about the new customer relationships with both McGraw Hill and the broader partnership with Atos,” said Williams. “The opportunity to be 50 miles from Philly and 50 miles from New York really kind of makes this location near Princeton a very optimal location for us. We feel like we have a cost advantage mega scale campus with unique infrastructure and a great client roster to start.”