Data Center Knowledge | News and analysis for the data center industry
Friday, May 2nd, 2014
11:30a |
Red Hat Acquires Ceph Open Storage Provider Inktank for $175 Million
Open source software company Red Hat is acquiring Ceph open storage systems provider Inktank for $175 million. In a move targeted at the enterprise storage market, Inktank’s core offering, Inktank Ceph Enterprise, will be integrated with Red Hat’s GlusterFS-based storage server software.
Ceph, developed by Inktank founder and CTO Sage Weil, is an open source, software-defined storage system that runs on commodity hardware. Designed to replace legacy storage systems, it provides a unified storage solution for cloud computing environments.
“We’re thrilled to welcome Inktank to the Red Hat family,” Red Hat CTO Brian Stevens said. “They have built an incredibly vibrant community that will continue to be nurtured as we work together to make open the de facto choice for software-defined storage.
“Inktank has done a brilliant job assembling a strong ecosystem around Ceph and we look forward to expanding on this success together. The strength of these world-class open storage technologies will offer compelling capability as customers move to software-based scale-out storage systems.”
Ceph competes with Amazon’s S3 cloud storage, allowing enterprises to build out their own storage offerings at exabyte scale. Inktank wraps the object and block storage functionality of Ceph with a visual interface and enterprise support.
Inktank was launched in 2012 and is based in San Francisco. The company’s developers are behind Ceph, a software-defined storage option for the open source cloud architecture OpenStack.
Inktank provides commercial support for Ceph and has a solid list of customers, including Cisco, Deutsche Telekom and CERN.
“We believe our open storage technologies will be critical in the management of data in the coming era of cloud computing,” Weil said. “Joining Red Hat will no doubt lead to tremendous innovation that will ultimately serve the industry well and answer the demand for open storage solutions fully integrated with existing and emerging data center architectures such as OpenStack.”
Red Hat entered the NAS market by acquiring the traditional file storage capabilities of Gluster’s GlusterFS platform for $136 million in 2011. It also purchased middleware vendor JBoss in 2006 for $420 million.
As part of the most recent transaction, Red Hat will assume unvested Inktank equity outstanding on the closing date and issue certain equity retention incentives. The transaction is expected to close in May 2014, subject to customary closing conditions. | 12:30p |
Data Center World Expo Hall Highlights
Data Center World Global Conference wraps up its spring conference today in Las Vegas. The expo hall contained a wide variety of vendors and exhibits to help attendees with their unique problems and day-to-day challenges. Many product and services companies were on hand, covering power, cooling, security and facilities management areas such as fire suppression and cabling management. See our photo highlights from the Expo Hall — From the Expo Hall: More Scenes at Data Center World. Enjoy! | 12:30p |
How to Create a Next Gen Organization With a Next Gen Data Center
There’s a new reality in the business world: the next gen data center is critical to your business. Organizations are now planning their business models around their IT environments. The data center is now a direct part of your organization. It controls your delivery process, optimizes user experiences, and helps you tackle demands both today and in the future.
Your data center has become your strategic center. In this white paper from Equinix, we quickly learn that preparation for your future business model starts at your data center.
The businesses that thrive amid fluctuating technology demands are not only keeping a finger on the pulse of current trends, they have the infrastructure in place to handle whatever changes might come.
And they’re doing so by treating their data centers as a strategic asset—a hub from which providers and performance can all stem.
So what does the next-generation data center look like? What will this platform house and how will it directly impact your organization and the end-user? As many analysts have outlined, there are five key trends that will impact the modern organization and infrastructure.
- Mobility
- Consumer technology
- Cloud services
- Hyper-digitization
- Globalization
As you plan and design your data center model, it will be critical to factor the above trends into the design process. Consider this: by 2017, 5.2 billion people will be connected through mobile devices. That’s a majority of the world’s population who will be streaming, sharing, downloading, sending, receiving and opening up a world of potential customers for the prepared enterprise. This also means that a majority of the population will be tapping into your networks all day, every day, from a myriad of different devices.
All of these new trends around IT consumerization, cloud delivery models, and a greater influx of data in the cloud are generating new kinds of demands around the next-gen data center ecosystem. Download this white paper today to learn about these five key trends and how your future business model will directly revolve around preparations for your next-generation data center.
As cloud customers begin to make the cloud a vital part of their IT strategy, rather than reserving it for select cases, they are becoming savvier and, in turn, more demanding—expecting to interact, work and play whenever, wherever, and however they choose. As we head toward worldwide cloud adoption, it’s estimated that the volume of workloads migrating to the cloud will increase by 500 percent by 2017.
| 1:30p |
TIBCO Acquires Business Intelligence Software Firm Jaspersoft for $185 Million
Infrastructure software company TIBCO (TIBX) has acquired embedded business intelligence software company Jaspersoft for about $185 million. Jaspersoft’s commercial open source business model and embedded business intelligence suite allow application developers to embed highly interactive reports, dashboards and analytics into their applications.
The acquisition accelerates TIBCO’s expansion into embedded business intelligence and reporting. The current Jaspersoft management team and employees will join TIBCO and continue to pursue Jaspersoft’s commercial open source strategy, product roadmap and low-cost subscription-pricing approach, according to the buyer.
“TIBCO’s broader business strategy is to continue to expand our go-to-market model with a wider range of pricing, packaging and deployment options,” Murray Rode, chief operating officer at TIBCO, said. “The acquisition of Jaspersoft accelerates this strategy as it brings a subscription pricing approach for its commercial products along with a set of open source offerings.
“We believe that the combination of our companies’ offerings will allow us to cover the whole of the market opportunity for analytics while providing tremendous value to the combined set of customers and prospects.”
Brian Gentile, CEO of Jaspersoft, said, “TIBCO has a vision of providing the right information at the right time in the right context to deliver a competitive advantage for customers. The continuation of Jaspersoft’s open source mission, business model, technology, market presence and brand – now as part of TIBCO – will allow us to deliver an even better business analytics toolset for the largest customer audience.” | 1:51p |
Friday Funny: The Server Desk
So we are sliding into home with the end of the work week! And that means it’s time for a cartoon by our fav data center artist, Diane Alber.
Diane writes, “Have you ever seen this??? It just started off as one server, but over time it just kept growing!!!” Please add your entry for a funny caption in the comments below.
Here’s how it works: Diane Alber, the Arizona artist who created Kip and Gary, creates a cartoon and we challenge our readers to submit a humorous and clever caption that fits the comedic situation. Then, next week, our readers will vote for the best submission. The winner gets a signed cartoon print.
Congratulations to both our winners from last time. (Our first tie!) Darrell entered, “Perhaps it’s time to take Alan Turing, Robert Noyce, and Jon Postel off of the badge access list,” and Dan entered, “So they said the colo is covering lunch for the team?” Hearty congratulations to both.
For more cartoons on DCK, see our Humor Channel. For more of Diane’s work visit Kip and Gary’s website. | 2:00p |
Storage News: Violin Sells Flash Array, NetApp Out With New Software, Avere and Amplidata Link Up
In this week’s storage news, Avere and Amplidata are partnering on high-performance scalable storage, NetApp has introduced OnCommand Performance Manager software for ONTAP environments, and Violin Memory is helping Echo Health reduce report processing time.
Echo Health buys Violin Memory flash array. Violin Memory (VMEM) announced that Echo Health, a provider of healthcare benefit payment consolidation services, has bought its all-flash storage array to enhance its implementation of SQL Server 2008.
Echo’s healthcare payment systems required long overnight batch processing that impacted next-day business operations. Switching to a Violin All Flash Array reduced its nightly report runtimes to about one-third — from ten to three hours — improving efficiency to support the firm’s projected 40-percent business growth.
“We needed a storage solution that would help us maintain our SLAs and grow our business,” Chad Davis, vice president and chief information officer at Echo Health, said. “With the performance of the Violin All Flash Array, we spend less time on report generation, which allows us to focus on more strategic business initiatives.”
Avere and Amplidata partner. Avere Systems and object-based software-defined storage provider Amplidata announced a partnership to deliver a high-performance storage solution for cloud-enabled data centers. The solution is designed for companies in life sciences, oil and gas, media and entertainment, R&D and service provider verticals.
It combines Amplidata’s AmpliStor with Avere FlashCloud on FXT Series Edge filers. Using the SPECsfs2008 benchmark, a three-node FXT 3800 cluster paired with an AmpliStor system achieved 180,229 ops/sec throughput with a minimal overall response time (ORT) of 0.95ms.
The AmpliStor system can be deployed across three geographically dispersed sites.
“We are thrilled to partner with Avere to offer customers a cost effective cloud-scale storage solution that combines high performance file access with massive scalability,” Mike Wall, chairman and CEO of Amplidata, said. “This solution is ideally suited for customers with big data demands who are looking to keep more data online in order to extract value and use it as a competitive advantage.”
NetApp introduces OnCommand Performance Manager. A new piece of software from NetApp is designed for monitoring and troubleshooting clustered Data ONTAP environments. In addition, the company introduced technology updates to NetApp OnCommand Unified Manager, OnCommand Workflow Automation and OnCommand Insight, providing further storage management, automation and protection for faster IT service delivery.
OnCommand products include Performance Manager for monitoring and troubleshooting with analytics, Workflow Automation 2.2 for defining and scheduling storage processes, Unified Manager 6.1 for an overarching view of the health of storage environments and Insight 7.0 for managing virtualized and cloud environments. “As organizations begin the transition to software-defined service delivery, optimal storage and data management will be critical to their success,” Glenn Rhodes, senior director of product marketing at NetApp, said.
“Organizations must have the ability to monitor trends in their data center, understand performance goals and the attainment of those goals and improve control over their storage environment. NetApp’s new OnCommand product offerings directly enable our customers to focus on rich, integrated, automated IT service delivery.” | 5:00p |
Data Center Jobs: ViaWest
At the Data Center Jobs Board, we have a new job listing from ViaWest, which is seeking a Data Center Engineer in Chaska, Minnesota.
The Data Center Engineer is responsible for monitoring the building’s HVAC, mechanical and electrical systems; performing preventive maintenance, site surveys and replacement of electrical and mechanical equipment; reading and interpreting blueprints, engineering specifications, project plans and other technical documents; performing operation, installation and servicing of peripheral devices; assisting with equipment start-ups, repairs and overhauls; preparing reports on facility performance; overseeing vendor facility maintenance; and performing emergency equipment repair. To view full details and apply, see the job listing details.
Are you hiring for your data center? You can list your company’s job openings on the Data Center Jobs Board, and also track new openings via our jobs RSS feed. | 7:00p |
Greater Chicago & Midwest Data Center Summit
CapRate Events’ Third Annual Greater Chicago & Midwest Data Center Summit has added new speakers, new agenda topics and attendees from around the nation.
The one-day event will be held on June 12 in downtown Chicago.
The day will be kicked off by Jim Kerrigan, Principal and Director – Data Centres, Avison Young, who will speak about Comparing and Contrasting National, Chicago and Midwest Markets. See full agenda and registration information on the Summit website.
Venue
The Holiday Inn Mart Plaza Hotel
350 West Mart Center Drive, Chicago IL 60654
See hotel website for more information.
For more events, return to the Data Center Knowledge Events Calendar. | 11:25p |
Level 3 and Digital Realty Connect Clouds
In data center news this week, Level 3 and Digital Realty said they will partner to give customers cloud connectivity solutions, Juniper is collaborating with Gainspeed on a converged virtual cable access platform, and Ciena was selected by Unitymedia and Integra to advance network performance and functionality.
Level 3 and Digital Realty connect clouds. Level 3 Communications (LVLT) announced that it will connect Digital Realty Trust (DLR) customers in 14 major markets in the U.S. and Europe to its growing cloud ecosystem, which includes Amazon Web Services (AWS Direct Connect) and Microsoft Azure (ExpressRoute). With Level 3 Cloud Connect solutions and virtual private network services, Digital Realty customers can dynamically scale their bandwidth and pay only for what they consume as demand for computing capacity spikes. In a hybrid environment, transferring large data sets is expedited with greater bandwidth capacity, which also improves the performance of real-time voice and data feeds. “This relationship with Level 3 enables us to facilitate connections for our clients to AWS or Azure globally with Level 3 Cloud Connect Solutions in a seamless and highly secure manner,” said Digital Realty Interim CEO Bill Stein. “As our clients’ capacity and connectivity needs grow, it’s our goal to facilitate their easy access to a broad range of solution providers. Our alliance with Level 3 and their growing cloud ecosystem play a key role in the breadth of support we can extend to our clients.”
Juniper collaborates with Gainspeed on cable platform. Juniper Networks (JNPR) announced the Virtual Converged Cable Access Platform (Virtual CCAP), a new solution from its collaboration with Gainspeed that will enable cable operators to readily address growing consumer demand for data services over cable networks. The new Virtual CCAP solution combines Gainspeed’s technology with Juniper’s MX Series 3D Universal Edge Routers and EX Series Switches to allow cable operators to better automate and scale their edge/access network infrastructure while creating a platform for new cloud-based services for their residential and commercial subscribers. Additionally, the current access network protocol for IP transport, the Data Over Cable Service Interface Specification (DOCSIS), will no longer be used just for high-speed data but will be the foundation for all services delivered to homes and businesses. “The world of the MSO is undergoing a radical shift due to advances in technology, changes in consumer behaviors and the insatiable demand for bandwidth and data services,” said Mike Marcellin, senior vice president of strategy and marketing at Juniper Networks. “As a result, they are seeking ways to maximize their network resources while priming their infrastructures to deliver new, high-value customizable services to their customers. Juniper Networks is excited to be partnering with Gainspeed to enable cable MSOs to cost-effectively deploy High-IQ networks that can quickly adapt and scale to deliver the services and applications cable MSOs need when they want them, without a forklift approach.”
Ciena selected by Unitymedia and Integra. Ciena (CIEN) announced that Unitymedia KabelBW is deploying its 100G coherent optical transport platform to upgrade its backbone network. Unitymedia is deploying the 6500 Packet-Optical Platform, part of Ciena’s network architecture. The platform, equipped with third generation WaveLogic Coherent Optical Processors, leverages an agile ROADM (Reconfigurable Optical Add/Drop Multiplexer) optical layer, which enables fast and efficient 100G connections between central locations. “It is important for cable operators such as Unitymedia KabelBW to extend their own infrastructure to meet growing demands. This is made possible by our converged Packet Optical Platform,” explains Eugen Gebhard, Regional Director, Germany & Pan-European Carrier Accounts at Ciena. “It also enables higher data speeds and new applications so that customer expectations can also be met in the future.” Ciena also announced that networking company Integra is deploying Ciena’s converged packet optical portfolio to enhance its route that connects Salt Lake City to Sacramento. With this, Integra can increase transport efficiency, dramatically reduce network latency and better support wide area networks, private and hybrid cloud infrastructures. The new degrees of software programmability enable Integra to go beyond traditional offerings by tailoring its services, such as diverse path protection in addition to low latency, for datacenter, enterprise, government and wholesale carrier application demands. | 11:26p |
NSA Data Center’s Water Use Pattern Indicates Economization
The fact that the National Security Agency’s Utah data center uses much less water than the spy agency pays for most likely indicates that the facility is not running at full capacity and that it is taking advantage of economization, also known as “free cooling,” a data center engineering expert told Data Center Knowledge.
Mark Monroe, CTO at DLB Associates, an engineering firm that designs data centers among many other types of buildings, said water consumption levels the site was reportedly designed for were probably for worst-case days – or “design days” in engineering lingo – and that the pattern of monthly water usage was normal for a data center.
Data center engineers “calculate the usage on the hottest hour of the hottest day, with the data center fully loaded, and size the system for that peak,” Monroe wrote in an email. “Average usage could be 20-30 percent of that peak, and that still requires the data center to be fully loaded.”
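Monroe's sizing logic is easy to put in numbers. The sketch below is a back-of-the-envelope illustration only: the 1.7 million gallons/day figure is Bluffdale's reported delivery capacity from this article, while the 30-day month and the 25 percent utilization (the midpoint of Monroe's 20-30 percent range) are illustrative assumptions.

```python
# Hypothetical sketch of design-day sizing vs. average water use.
# Systems are sized for the hottest hour of the hottest "design day,"
# so average consumption lands well below the contracted peak.

def monthly_peak_gallons(design_day_gallons: int, days: int = 30) -> int:
    """Water a fully loaded site could draw if every day were a design day."""
    return design_day_gallons * days

def expected_average_gallons(peak_monthly: int, utilization: float = 0.25) -> float:
    """Average use at 20-30% of peak per Monroe; 25% chosen as the midpoint."""
    return peak_monthly * utilization

# Bluffdale can deliver up to 1.7 million gallons a day (per the article).
peak = monthly_peak_gallons(1_700_000)   # 51 million gallons/month ceiling
avg = expected_average_gallons(peak)     # ~12.75 million gallons/month
```

Against that ceiling, the reported monthly consumption figures of 2.8 to 6.2 million gallons look entirely consistent with a partially loaded site that economizes whenever the weather allows.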
The NSA’s monthly water bills, released recently by the City of Bluffdale, which sells water to the data center, revealed that the agency was paying for a set amount of water every month but never came close to consuming the amount it had contracted for, The Salt Lake Tribune reported. The paper had pushed reluctant Bluffdale officials to release the documents, and the state’s records committee ordered the city to do so in March.
The Tribune published the data center’s monthly water consumption and payment figures from January 2012 through February 2014 earlier this week, drawing attention to the fact that the NSA is paying the city for a lot more water than it actually consumes.
The agency paid about $32,000 for water in each of January and February of this year, but consumed only about 4.9 million gallons and 2.8 million gallons during those two months, respectively. Bluffdale can deliver up to 1.7 million gallons a day to the facility, The Tribune reported, citing city council meeting minutes.
Its monthly water bill from July through December 2013 was about $28,000. Monthly consumption during that period, however, ranged widely, with months of 6.2 million gallons, about 2.6 million gallons and 3.8 million gallons.
Monroe also said the wide fluctuations in water consumption from month to month indicate use of evaporative cooling and economization. “Water will only be used when the air [temperature] is too high for direct air or dry cooler use,” he wrote in an email.
“Since air [temperatures] in Utah are cool most of the year, they would only be required to use water evaporation in the summer months. Even in summer, air [temperatures] are cool enough at night to allow several hours of no-water cooling each night.”
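The economizer behavior Monroe describes boils down to a simple hourly decision, sketched below. The 18 °C changeover threshold and the sample temperatures are illustrative assumptions, not figures from the article; real changeover points depend on the facility's cooling design and allowed supply-air range.

```python
# Hypothetical sketch of the economization decision: water-consuming
# evaporative cooling runs only when outside air is too warm for
# free cooling. Threshold and temperatures are illustrative assumptions.

FREE_COOLING_MAX_C = 18.0  # assumed changeover temperature

def uses_water(outside_air_temp_c: float) -> bool:
    """True when an hour is warm enough to require evaporative cooling."""
    return outside_air_temp_c > FREE_COOLING_MAX_C

# Cool desert nights need no water; hot afternoons do.
hourly_temps = [12.0, 14.5, 21.0, 33.0, 27.5, 16.0]
water_hours = sum(uses_water(t) for t in hourly_temps)  # 3 of 6 hours draw water
```

Summed over a year of hourly weather data, a model like this reproduces exactly the pattern in the Bluffdale bills: heavy water draw in summer months, very little in winter.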