Data Center Knowledge | News and analysis for the data center industry
Monday, April 14th, 2014
12:30p | Keeping Money in the Bank: Managing Data Protection in Today’s Data-Driven World

Tom O’Brien is the product manager for IBM Tivoli Storage Manager. Tom is focused on partnering with customers, sales and development to bring new functional enhancements and new product offerings to market.
The sheer volume of data continues to grow. That isn’t new; data has been growing for quite some time. What is new is that the complexity, volume and value of that data have changed dramatically. Data is becoming more complex, with more capacity generated from an increasing number of sources (cloud, social, mobile and so on).
The growing number of sources has driven data volumes and data growth faster than ever before. And as the complexity and volume of data increase, so does its value to the business. Data is now the lifeblood of the business, its most valuable asset, and the expectation is that it is highly available and accessible.
New Data Trends
Global Data Availability, where data is highly available, accessible and resilient, is the new expectation. The challenge is how to provide that availability efficiently in the face of all this data growth.
In many ways, stored data is the best currency an organization has to offer. But like any other investment, it comes at a price: organizations must devote money and time to protecting their data, and an inefficient data protection plan quickly becomes costly. Alongside rapid data growth, inefficient data protection systems are a drag on both efficiency and budgets.
Data Protection is Significant
Modernizing data protection technology can be an excellent way to save money and free up funds to invest in new ideas. But to take advantage of these savings, organizations have to look beyond traditional methods of data protection.
At its core, data protection is an IT process, and small improvements can have a big impact on the value IT brings to the organization. Just as today’s businesses have to be more interconnected and intelligent to collect, process, use and store more data than ever before, there are now cost-effective ways to protect data that leverage cloud, security and open standards.
Despite this more connected world, cost is still king. In our experience with enterprise clients managing their data protection needs, there are eight best practices that can help businesses reassess their approach to data protection and reap the rewards.
- Data deduplication – One way organizations can save money on data protection is by implementing cost-effective, high-performance and easy-to-use data deduplication for all data protection workloads. There are many choices to consider: deduplication can operate on backup servers, on source systems, or on dedicated deduplication gateways or appliances. When evaluating deduplication in backup software, control your costs by looking for solutions that include it as a base feature rather than at an additional charge. (A minimal sketch of the idea appears after this list.)
- Incremental “forever” backups – Most backup software requires periodic full backups to maintain restore performance, creating unnecessary work for backup infrastructures. One way to reduce costs is to implement incremental forever backups, which take less time and require less infrastructure. Incremental forever means no more full backups for fast-growing VMware environments and file systems. (See the second sketch after this list.)
- Flexible deployment options – Another way organizations can save money is through flexible deployment options which offer choices regarding how much of their data protection infrastructure they want to own, and how they want to pay for it. For example, cloud and appliance-based solutions can reduce the cost and complexity of data protection while enabling users to switch systems quickly, sometimes in less than a day.
- Simplified administration – Implementing a solution to automate and simplify data protection administration increases efficiency and reduces reliance on IT experts, all while ensuring data security is not at risk. Organizations should look for a solution which enables nontechnical users to see — at a glance — whether data is protected, in turn helping administrators resolve problems faster.
- Cloud data protection – Organizations are quickly moving to cloud-based storage. By implementing a cloud data protection solution, organizations can take full advantage of cloud environments to improve staff efficiency and support multiple service classes, ranging from daily backups to frequent snapshots to remote mirroring, as well as accommodate flexible billing plans.
- Massive scalability – You want a solution that can grow as your data grows, making scalability key. By implementing a comprehensive backup and recovery solution to support your organization at any size, you’ll have more efficiency and save on costs, as backups will be consolidated and there will be no need to purchase more servers to account for data growth.
- Open standards – Look for vendors that adopt and contribute to open standards. Collaboration through open standards is known to reduce costs and speed innovation, providing direct value to organizations. Open-standards interoperability between software and hardware vendors enables heterogeneous snapshots, prevents vendor lock-in, and allows remote mirroring and other storage management functions to work across all storage systems. Additionally, as many countries have data protection regulations, adopting open standards makes compliance easier.
- Single-vendor solutions – Consider implementing a complete set of solutions to meet data protection needs. Purchasing all technology components from one vendor enables IT staff to spend less time managing multiple vendors, integrating multiple products and managing support calls between vendors.
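To make the first practice concrete, here is a minimal Python sketch of block-level deduplication, assuming fixed-size 4 MiB blocks and an in-memory SHA-256 index; production deduplication engines (whether running on backup servers, source systems or dedicated appliances) add variable-size chunking, compression and persistent indexes on top of the same principle.

```python
"""Minimal block-level deduplication sketch (illustration only)."""
import hashlib
import io
from typing import BinaryIO

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MiB blocks, an arbitrary choice for the sketch


class DedupStore:
    """Stores each unique block once; files become ordered lists of block hashes."""

    def __init__(self) -> None:
        self.blocks: dict[str, bytes] = {}      # hash -> block payload
        self.files: dict[str, list[str]] = {}   # file name -> block recipe

    def ingest(self, name: str, stream: BinaryIO) -> None:
        recipe = []
        while True:
            block = stream.read(BLOCK_SIZE)
            if not block:
                break
            digest = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(digest, block)  # store each payload only once
            recipe.append(digest)
        self.files[name] = recipe

    def restore(self, name: str) -> bytes:
        # Reassemble the original byte stream from its block recipe.
        return b"".join(self.blocks[h] for h in self.files[name])


if __name__ == "__main__":
    store = DedupStore()
    payload = b"A" * BLOCK_SIZE + b"B" * BLOCK_SIZE
    store.ingest("backup-monday.img", io.BytesIO(payload))
    store.ingest("backup-tuesday.img", io.BytesIO(payload))  # identical data
    # Two backups ingested, but only two unique blocks are actually stored.
    print(len(store.blocks), "unique blocks for", len(store.files), "backups")
```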
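In the same spirit, below is a minimal sketch of a single incremental-forever pass, assuming change detection by file modification time against a simple catalog; shipping backup products typically rely on changed-block tracking and synthesize restore points from metadata rather than ever running another full backup, but the principle of never re-copying unchanged data is the same. The source and target paths in the usage example are hypothetical.

```python
"""Minimal incremental-forever backup pass (illustration only)."""
import shutil
from pathlib import Path


def incremental_backup(source: Path, target: Path, catalog: dict[str, float]) -> int:
    """Copy only files created or changed since the previous pass.

    `catalog` maps relative paths to the mtime recorded last time; it stands in
    for the backup server's metadata, so no periodic full backup is needed.
    """
    copied = 0
    for path in source.rglob("*"):
        if not path.is_file():
            continue
        rel = str(path.relative_to(source))
        mtime = path.stat().st_mtime
        if catalog.get(rel) == mtime:
            continue  # unchanged since the last pass, so skip it
        dest = target / rel
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(path, dest)  # copy data and preserve timestamps
        catalog[rel] = mtime
        copied += 1
    return copied


if __name__ == "__main__":
    state: dict[str, float] = {}
    # First pass copies everything; later passes copy only what has changed.
    print(incremental_backup(Path("/data"), Path("/backup"), state), "files copied")
```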
By leveraging these best practices, organizations can spend more time on innovation and less on managing inefficient infrastructure. Evolving with and embracing technologies such as cloud and open standards can be the small changes that enable a smarter data protection program. With the door now open to new possibilities and resources, data protection challenges can be a thing of the past.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

2:36p | VMware Launches Horizon 6

With a variety of laptops, tablets, smartphones and an array of other employee-owned devices putting pressure on IT departments, VMware (VMW) comes to the rescue with the release of VMware Horizon 6, an integrated solution that delivers published applications and desktops on a single platform.
The new release is a comprehensive desktop solution with centralized management of any type of enterprise application and desktop, including physical desktops and laptops, virtual desktops and applications, and employee-owned PCs.
“Customers want to transform their applications and enterprise desktops for the ‘Mobile Cloud Era’ — extending access to employees on any device, from anywhere via a comprehensive solution that is simple, secure and cost effective,” said Sumit Dhawan, vice president and general manager, Desktop Products, End-User Computing, VMware. “VMware Horizon 6 addresses these issues and delivers amazing new capabilities to our customers at nearly the same cost as a traditional, physical desktop.”
Horizon 6 enables entire desktops, or just applications, to be delivered in a flexible manner to end-users. It allows access virtually from multiple devices and locations, physically by syncing the entire desktop image to end-user laptops, and securely by delivering applications and content in a managed, secure container. New in version 6, Horizon offers streamlined management, end-user entitlement, and quick delivery of published Windows applications, RDS-based desktops and virtual desktops across devices and locations. End-users can access all applications and desktops from a single unified workspace, which supports the delivery of virtualized applications hosted in the datacenter or locally on the device.
VMware Horizon 6 is optimized for the Software-Defined Data Center. The solution provides integrated management of VMware Virtual SAN that can significantly reduce the cost of storage for virtual desktops by using local storage. With this innovation, the capital cost of virtual desktops with Horizon 6 can be similar to physical desktops. The new VMware vCenter Operations for View provides health and risk monitoring, proactive end-user experience monitoring and deep diagnostics from datacenter-to-device all within a single console. Using the updated VMware Mirage, IT administrators can design a single desktop with the required operating system and applications, and deliver it to end-users in a department or entire organization based on end-user needs.
VMware Horizon 6 introduces a new client that seamlessly connects to virtual desktops and applications running in an on-premise cloud, a service provider partner cloud, or through VMware vCloud Hybrid Service, with the same high-performance end-user experience. This flexibility gives customers the ability to deploy Horizon 6 via the hybrid cloud, balancing between business-owned and public cloud-based infrastructure to best satisfy their needs.
“Governance and compliance can only work if end-users stay within the confines of IT, but end-users are savvy, with more options than ever before to work outside the purview of IT,” said Brett Waldman, research manager, End-User Computing, IDC. “If IT can provide the resources, capabilities and support end-users need, they will be less likely to stray, so IT needs vendors, such as VMware, to provide simpler, more agile solutions. With VMware Horizon 6’s new ability to deliver published applications in addition to virtual desktops, IT can deliver just what end-users need, or more importantly want.”

3:25p | IBM Brings New Disaster Recovery and Managed Security Services to SoftLayer Cloud Customers

This article originally appeared at The WHIR.
IBM added new disaster recovery and managed security services for SoftLayer cloud customers on Monday. The new services are available through IBM’s cloud resilience portfolio.
The new disaster recovery services will give SoftLayer customers access to IBM’s Cloud Virtualized Server Recovery managed service, which helps customers recover applications, servers and cloud-based data in the event of an outage. VSR allows SoftLayer customers to replicate entire systems in real-time, which helps customers minimize the impact of downtime.
Disaster recovery is still relevant despite the increasing use of cloud storage and multi-cloud approaches for data backup. Disaster recovery helps automate the process of recovering crucial business information, ensuring customers can go back to business as usual as quickly as possible after an outage or other event.
SoftLayer customers will also have access to IBM’s Resiliency Consulting Services, which help assess resiliency and offer planning, design, implementation and testing services. The services include Resiliency Consulting for Cloud, Cloud Managed Backup, Cloud Data Virtualization, Cloud Application Resiliency and the aforementioned Cloud Virtualized Server Recovery.
As part of the announcement, IBM is also opening two new cloud-based resiliency centers, one in Raleigh, North Carolina, and one in Mumbai, India.
These facilities are in addition to 15 global centers planned by SoftLayer and the 50 BCRS Resiliency Centers, which will speed recovery times by eliminating network latency. The centers also offer customers the ability to meet local and federal data residency compliance regulations, IBM says.
At the beginning of the year, IBM announced that it would invest over $1.2 billion to expand its global cloud footprint, adding 13 SoftLayer data centers. The first of the new SoftLayer data centers opened last month in Hong Kong.
SoftLayer customers will also have access to new security services, which deliver threat management through firewall and intrusion detection and prevention management and monitoring services.
By leveraging IBM’s security operations and intelligence analysts, SoftLayer clients will be able to quickly identify threats and potential vulnerabilities, according to IBM. The services can also be integrated with on-premise security equipment.
Later this year, IBM will introduce enhanced DDoS protection, web and email protection, as well as managed endpoint protection services for SoftLayer cloud customers.
This article originally appeared at: http://www.thewhir.com/web-hosting-news/ibm-brings-new-disaster-recovery-managed-security-services-softlayer-cloud-customers

5:23p | Red Hat Announces OpenShift Marketplace

Red Hat, Inc. (RHT), a leading provider of open source solutions, announced at its conference this week the OpenShift Marketplace, a one-stop shop that will enable customers of all sizes to find and try solutions for their cloud applications.
OpenShift is Red Hat’s Platform-as-a-Service (PaaS) that allows developers to quickly develop, host, and scale applications in a cloud environment. The OpenShift Marketplace will bring Red Hat’s OpenShift PaaS partner ecosystem directly to OpenShift Online customers, enabling them to leverage enterprise PaaS with tightly integrated, complementary solutions developed for the public cloud.
The marketplace will be launching in all availability regions of the OpenShift Online public PaaS service in the coming weeks.
With the advent of the marketplace, Red Hat aims to reduce the time and cost of finding the right solution for customers seeking value-added OpenShift partner add-ons. Customers will be able to easily find the information, tools, and community they need to discover and procure the right solution. They will be able to securely access and manage leading OpenShift application technologies from a single location.
Customers and developers will be able to find third-party OpenShift solutions and add-on productivity offerings, including database, email delivery services, messaging queues, application performance monitoring and more, all managed from a central location. Several OpenShift partners have already signed on to add their solutions to the marketplace, including BlazeMeter, ClearDB, Iron.io, MongoLab, New Relic, Redis Labs, SendGrid, and Shippable.
“The OpenShift Marketplace is our next step towards our goal of providing customers the widest variety of choice when it comes to technologies that complement their OpenShift experience. As the OpenShift partner ecosystem continues to expand, we expect the Marketplace to provide developers and customers a more streamlined, secure experience to choose the best third-party solutions for their productivity and business enablement needs,” said Julio Tapia, director, OpenShift ecosystem, Red Hat.
Enabling SaaS ISVs to Reach a Growing Network of OpenShift Customers and Developers
As more developers use enterprise PaaS for an increasing array of applications, a key to their success is a comprehensive partner ecosystem. OpenShift’s current partner ecosystem uses the OpenShift Cartridge specification to link key technologies and services into applications built on OpenShift, giving customers access to a variety of offerings from cloud industry leaders.

8:00p | CoreSite Powers Up With Double-Stacked Generators

SECAUCUS, N.J. - What does reliability look like? On a windy afternoon in northern New Jersey, reliability is vertical, extending skyward in an equipment yard framed by snow and clouds.
At its NY2 data center, CoreSite has stacked the massive generators that provide emergency backup power for the huge new facility. The 2 megawatt Caterpillar engines are there to keep customer servers humming within the 280,000 square foot data center in the event both of its two utility feeds go dark.
They’re part of a $100 million investment CoreSite has made in the Secaucus facility, which provides expansion space for the company to grow beyond its original New York data center at 32 Avenue of the Americas in Manhattan.
NY2 represents the newest wrinkles in CoreSite’s approach to data center design. Data Center Knowledge had a tour of the facility at its opening a few weeks back. Here’s a look at the data center and its features:
The lobby is the first stop, and is where visitors begin to encounter the multiple measures CoreSite takes to protect customer equipment: key card access, biometric scanners, mantraps to limit access, security cameras watching the entire interior and exterior of the building, round-the-clock guards, and perimeter fencing. (Photo: CoreSite)
This is the first of 11 planned data halls within NY2. Each provides about 12,000 square feet of space and 1.5 megawatts of power capacity, and is capable of supporting 185 to 200 watts per square foot of power density. The space can be customized to support retail colocation or wholesale data center requirements. CoreSite uses 47U customer cabinets, which are slightly taller than the traditional 42U rack. (Photo: CoreSite)
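As a quick back-of-the-envelope check on those figures, the short calculation below works out the average gross density and the capacity at full build-out; treating the quoted 185 to 200 watts per square foot as applying to the occupied cabinet footprint rather than the gross hall area is our assumption, not something CoreSite states.

```python
"""Back-of-the-envelope check on the NY2 data hall figures (illustration only)."""
HALLS = 11                   # planned data halls
HALL_AREA_SQFT = 12_000      # approximate area per hall
HALL_POWER_W = 1_500_000     # 1.5 MW of power capacity per hall

gross_density = HALL_POWER_W / HALL_AREA_SQFT          # watts per gross square foot
total_capacity_mw = HALLS * HALL_POWER_W / 1_000_000   # capacity at full build-out

print(f"Average gross density: {gross_density:.0f} W/sq ft")
print(f"Capacity across {HALLS} halls: {total_capacity_mw:.1f} MW")
```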