Data Center Knowledge | News and analysis for the data center industry
 

Friday, July 5th, 2013

    12:30p
    More Points to Consider Before Buying a Data Protection Solution

    Jarrett Potts is director of strategic marketing for STORServer, a provider of data backup solutions for the mid-market. Before joining the STORServer team, Potts spent 15 years working in various capacities at IBM, including Tivoli Storage Manager (TSM) marketing and technical sales. He has been the evangelist for the TSM family of products since 2000.

    JARRETT POTTS
    STORServer

    In part one of our series on the major items to consider before investing in a data protection solution, we covered three areas: 1) ensuring the solution offers more than just backup and recovery, 2) finding a vendor that offers superior customer support as well as subscription and support (maintenance) contracts, and 3) using reliability as a key measure of a solution’s ROI.

    In the second part of our series, I will discuss the importance of finding a solution that’s easy to use, why different data should be treated differently, how to eliminate the burden of virtual machine backups and why all the talk shouldn’t focus on deduplication.

    Ease of Use

    There are many data protection products on the market today, and all of them have features and functions that make them stand out. One of the major items to consider is how easy the solution is to use. When it comes down to brass tacks, ease of use is one of the most important criteria. After all, the person responsible for data protection may not have deep expertise or the time to manage the solution on a day-to-day basis.

    When choosing a data protection solution, look for the ability to manage the system from a single pane of glass. The user interface needs to be simple enough that all daily tasks can be completed within a few minutes. As a bonus, the solution should send alerts to your inbox and mobile devices in an easy-to-understand report that includes important information about the previous night’s activity and the status of those activities.

    This ease-of-use requirement dovetails with historical reporting. If users know what the solution is doing on a day-to-day basis, they will also be able to tell what the system has been doing over the last few days, weeks, months or longer, which allows for planning ahead with little to no hands-on work. For example, a report showing weekly growth for the last 26 weeks lets users predict when they will run out of space in the solution or when to purchase more tapes. It is a very simple example, but it shows that the solution should help plan for the future as well as operate today. Following this example, users can budget for growth six months or more in advance, a great advantage when budgets are tight.
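
    To make that capacity-planning example concrete, the short sketch below fits a straight line to historical weekly usage and extrapolates when the backup pool would fill up. The 26-week history, the 60 TB pool size and the function name are invented for illustration; a real solution would generate this projection from its own growth reports.

        # Illustrative only: forecast when backup storage will fill up, given
        # weekly capacity samples like those a 26-week growth report provides.
        # All figures here are made up for the example.

        def weeks_until_full(weekly_used_tb, capacity_tb):
            """Fit a straight line to historical usage and extrapolate forward."""
            n = len(weekly_used_tb)
            xs = range(n)
            mean_x = sum(xs) / n
            mean_y = sum(weekly_used_tb) / n
            slope = (sum((x - mean_x) * (y - mean_y)
                         for x, y in zip(xs, weekly_used_tb))
                     / sum((x - mean_x) ** 2 for x in xs))  # average growth in TB per week
            if slope <= 0:
                return None                                 # usage is flat or shrinking
            return (capacity_tb - weekly_used_tb[-1]) / slope

        # Example: 26 weeks of usage growing roughly 0.5 TB per week toward a 60 TB pool.
        history = [30 + 0.5 * week for week in range(26)]
        print(f"Weeks of headroom left: {weeks_until_full(history, 60):.1f}")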

    A single-server footprint versus a master/media-server footprint can also make the solution easier or harder to manage, as can automatic client software updates, which keep IT administrators from spending valuable time manually updating systems across the infrastructure.

    With business-wide administration, monitoring and reporting, plus the flexibility enabled by automation, a new solution should create administrative time-savings that can measurably reduce the cost of operations.

    Data Life-Cycle Management: From Birth to Death

    IT organizations can drive up the cost of storage unnecessarily by treating all data the same and storing it all on the same media. All data is not created equal. When looking for a data protection solution, the ability to manage data from the time it is created to the time it can be deleted is important.

    Long-term archive and hierarchical storage management (HSM) allow organizations to store data on different tiers based on specific policies, enabling administrators to migrate and store data on the most appropriate tier. For example, older and less frequently accessed data can be moved to a slower, less-expensive storage platform, such as tape, leaving more expensive disk storage available for more high-value data. Automated data archiving also helps organizations ensure compliance with data retention policies and reduces the costs associated with compliance.
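
    As a rough illustration of what such a policy might look like, the sketch below maps a file's last-access age to a storage tier. The tier names, thresholds and record layout are assumptions made up for this example; real HSM and archiving products express their policies in their own terms.

        # Illustrative tiering policy: route data to the cheapest acceptable tier
        # based on how long ago it was last accessed. Tier names and thresholds
        # are hypothetical examples, not any product's defaults.
        from datetime import datetime, timedelta

        TIER_POLICY = [
            (timedelta(days=30),  "primary-disk"),   # hot data stays on fast disk
            (timedelta(days=365), "nearline-disk"),  # cooler data moves to cheaper disk
        ]
        ARCHIVE_TIER = "tape-archive"                # everything older goes to tape

        def choose_tier(last_accessed, now=None):
            """Map a file's last-access time to a storage tier under the policy above."""
            age = (now or datetime.utcnow()) - last_accessed
            for max_age, tier in TIER_POLICY:
                if age <= max_age:
                    return tier
            return ARCHIVE_TIER

        # Example: a file untouched for two years lands on the tape tier.
        print(choose_tier(datetime.utcnow() - timedelta(days=730)))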

    When making a decision about data protection, look only for solutions that reduce costs by providing automated, policy-based data life-cycle management, moving data to the most cost-effective tier of storage while still meeting service level requirements. This helps ensure recovery objectives are met while enabling transparent data access.

    Support for Virtualized Environments is Key

    Virtualization technology has helped IT organizations of all sizes reduce costs by improving server utilization and reducing application provisioning times. However, it also introduces two problems that most people do not account for when choosing a data protection solution. First, the cost savings offered by virtualization can disappear quickly in the face of virtual machine sprawl. Second, the link between physical and logical devices becomes harder to map and track, making the virtual environment far more complex to manage than most expect.

    Data protection can become a unique challenge in these environments. For example, backing up and restoring data for a dozen or more virtual machines residing on one physical server can bring all other operations on that server to a complete halt.

    When searching for a data protection solution, investigate whether the product provides an effective solution to this challenge by eliminating the burden of running backups on a virtual machine, and instead, off-loading backup workloads from a VMware ESX or ESXi-based server to a centralized vStorage backup server. The solution must improve the frequency of protection and enable faster recovery of data, helping increase the business value of virtualization. The solution should also help reduce license management costs by removing agents from the individual virtual machines.
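
    The sketch below shows the general shape of that off-host approach: a central backup server snapshots each virtual machine and copies its changed blocks, so no backup agent runs inside the guests. The classes and helpers are simplified stand-ins invented for the illustration, not real VMware or vendor APIs.

        # Conceptual sketch of centralized, agentless VM backup. The hypervisor-side
        # details (quiescing, snapshot transport) are abstracted away; everything
        # here is a hypothetical stand-in for illustration only.

        class VirtualMachine:
            def __init__(self, name, changed_blocks):
                self.name = name
                self.changed_blocks = changed_blocks   # blocks modified since the last backup

        def snapshot(vm):
            # A real implementation would ask the hypervisor for a quiesced snapshot
            # and read blocks over the storage network, not through an in-guest agent.
            return list(vm.changed_blocks)

        def backup_all(vms, store):
            """Back up every VM from one central backup server."""
            for vm in vms:
                store.setdefault(vm.name, []).extend(snapshot(vm))
            return store

        vms = [VirtualMachine("web-01", ["b1", "b2"]), VirtualMachine("db-01", ["b7"])]
        print(backup_all(vms, {}))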

    Data Reduction, Not Just Data Deduplication

    Data protection is not just about data deduplication. Many data protection products and providers talk about data deduplication as if it will save the world. In fact, data deduplication is only a small part of the solution. What needs to be talked about is across-the-board data reduction.

    Data reduction technologies are the first line of defense against rapidly expanding data volumes and costs. A solution that provides built-in data reduction technologies, such as progressive-incremental backup, data deduplication and data compression, can enable organizations to reduce backup storage capacity by as much as 95 percent. Advanced tape management and efficient tape utilization capabilities can further reduce data storage capacity requirements.

    While some solutions create massive amounts of duplicate data through repetitive full backups, necessitating expensive data deduplication solutions, others provide progressive-incremental backup technology that avoids the duplicate data in the first place by creating only an initial full backup and then capturing only new and changed data. Built-in data compression and data deduplication operate at multiple storage layers to minimize the amount of data being retained for operations and disaster recovery.
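
    A back-of-the-envelope comparison makes the point. The figures below (50 TB of primary data, a 2 percent daily change rate, four weeks of retention) are invented for illustration, and the calculation ignores deduplication and compression, which would shrink both totals further.

        # Rough comparison of retained backup data: weekly fulls plus daily
        # incrementals versus progressive-incremental (one initial full, then
        # only changed data). All figures are illustrative assumptions.

        PRIMARY_TB  = 50      # protected primary data
        CHANGE_RATE = 0.02    # share of data that changes each day
        WEEKS_KEPT  = 4       # retention window

        daily_change_tb = PRIMARY_TB * CHANGE_RATE

        # Traditional: each retained week holds one full plus six incrementals.
        traditional_tb = WEEKS_KEPT * (PRIMARY_TB + 6 * daily_change_tb)

        # Progressive-incremental: one full, then only changed data every day.
        progressive_tb = PRIMARY_TB + WEEKS_KEPT * 7 * daily_change_tb

        print(f"Traditional retention:              {traditional_tb:.0f} TB")
        print(f"Progressive-incremental retention:  {progressive_tb:.0f} TB")
        print(f"Reduction before dedup/compression: "
              f"{100 * (1 - progressive_tb / traditional_tb):.0f}%")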

    Dramatically reducing backup storage requirements not only helps cut capital expenses, but can also decrease network bandwidth requirements and shrink backup windows. This results in reduced operational impact of backups and helps ensure high levels of application uptime.

    In the next part of this three-part series, I will discuss how making the right licensing decision can save you money, how to scale data protection, setting different policies for different data and the role of unified recovery management.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    1:00p
    Multi-Service and Multi-Cloud Providers on the Rise


    AUSTIN - The rise of the multi-service, multi-cloud provider is upon us. While there will always be companies that focus on a single core competency such as colo or managed services, the market is demanding a mix of services. Providers need to enable everything from physical space to virtual machines, whether through partnerships, ecosystems or direct offerings.

    That’s why service providers up and down the stack have been expanding their data center offerings. Companies like OnRamp have moved beyond colo and managed services and begun offering cloud. VAZATA, which started off as a managed hosting provider, bought its way into the colocation business. Larger data center services players like Telx are remaining cloud-neutral, focusing on building ecosystems of cloud providers that customers can leverage.

    What’s happening with services is now happening with cloud. Companies like Peer 1 Hosting are recognizing that there’s a distinct middle ground between public and private: an offering that combines the security of a private cloud with the flexibility of a public cloud. The company dubs this “Mission Critical Cloud.” Peer 1 appears to have hit a rich vein: it took Mission Critical Cloud less than three months to reach the revenue its public cloud took nine months to generate.

    The industry is evolving, and a number of companies commented on how this is changing their businesses at the HostingCon conference last week in Austin, both through panels and individual interviews.  

    Is Colo Evolving?

    VAZATA’s Lance Black hits the nail on the head. “Customers want to take advantage of economies of scale,” said Black. “There will always be big colo, while smaller and mid-sized companies will offer a higher level of high touch, managed services and virtualization.”

    At a time when many are converging on the cloud world, VAZATA has moved into colo, realizing that customers are demanding a mix of services. The statement that there will always be big colo is applicable to Telx, which has chosen to forgo offering its own cloud. There are several reasons for this. Offering its own cloud would put Telx in competition with its customers. It’s about offering a range of services, and Telx enables this by pointing customers to other customers and connecting them. “Half of CIOs said they would not rearchitect their apps for cloud,” said Telx’s John Freimuth. “Enterprise IT is adopting colo. We need to build a virtual exchange where customers can get these services from a number of providers.”

    For providers like OnRamp, “the key message is options,” said Chad Kissinger, the company’s founder. “Certain segments are choosing to forgo colo and go a different route.” Several companies at the HostingCon event mentioned that customer deployments are becoming more complex, and providers must adapt to meet this challenge.

    Evolving Isn’t Easy

    “Colo and cloud involve two completely different sets of people,” said Black of VAZATA. “A colo salesperson is a real estate salesperson. In the world of cloud, it’s more of an IT guru – an architect.” In addition to having the technology, there are personnel considerations. While there are an increasing number of multi-service providers, there’s no denying that getting there requires a slew of personnel resources a company might not have. Cloud specialists are in short supply and command far higher salaries. TierPoint’s first colo deal was in 2010, and in less than three years it has built a growing cloud business. “Now we have a couple million in cloud revenue,” said TierPoint’s Andy Stewart. “The biggest change is the people needed to sell and deploy cloud.”

    John Freimuth of Telx also discussed the very different business models, including the advantages of a pure real estate play through REIT status. “REITs have access to higher rates of capital at lower interest rates,” said Freimuth. “We’ll always need data centers.” For some of the big builders like Digital Realty Trust, going cloud doesn’t make sense given their scale and business model. The big players will make sure to remain neutral when it comes to cloud, focusing on building up rich customer ecosystems to attract more customers rather than competing with, and potentially angering, existing customers.

    4:00p
    Deutsche Börse To Launch Cloud Exchange


    Will cloud computing capacity soon be traded like pork bellies? That’s the intriguing possibility raised by this week’s news that Deutsche Börse will launch a “cloud exchange,” a trading venue for outsourced storage and computing capacity that is slated to open early next year.

    Cloud Exchange AG is a joint venture formed with Berlin-based Zimory to create a neutral, secure and transparent trading venue for cloud resources. The exchange is based on the belief that underlying, bare-bones compute is becoming a commodity like wheat or oil: CPU, RAM and storage will be bought and sold in a liquid market.

    This isn’t the first attempt to create a market for cloud capacity. The notion of a commodity-style exchange for cloud capacity emerged in 2010, shortly after Amazon Web Services began offering spot pricing for cloud instances. SpotCloud was created by Enomaly in 2011 to pursue this vision and was later acquired by Virtustream.

    High Finance Meets Cloudy Markets

    The entry of a financial exchange into the cloud computing market takes the concept to the next level, potentially bringing new levels of sophistication.

    “With its great expertise in operating markets, Deutsche Börse is making it possible for the first time to standardize and trade fully electronically IT capacity in the same way as securities, energy and commodities,” said Michael Osterloh, Member of the Board of Deutsche Börse Cloud Exchange.

    An exchange along the lines of what Deutsche Börse is planning could work well for customers that need a presence in multiple jurisdictions. European companies in particular will see value in such an exchange, as they often need capacity within a particular jurisdiction to serve a particular region (e.g., German compute to serve a German clientele).

    With a liquid market for compute and storage on one of the leading financial exchanges, participants can easily buy capacity for ad hoc needs, while suppliers can sell capacity directly or trade it among themselves. Cloud customers that lack capacity in a specific location can purchase it there.
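
    As a purely hypothetical illustration of how such a marketplace could match a buyer's constraints, including a jurisdiction requirement, consider the sketch below. The field names, units and matching rule are assumptions for the example and do not describe the actual design of Deutsche Börse Cloud Exchange.

        # Hypothetical matching of a standardized capacity order against provider
        # offers, with a jurisdiction constraint. Illustrative only.
        from dataclasses import dataclass

        @dataclass
        class Offer:
            provider: str
            jurisdiction: str      # where the data would legally reside
            vcpus: int
            ram_gb: int
            storage_tb: int
            price_per_hour: float

        def cheapest_match(offers, jurisdiction, vcpus, ram_gb, storage_tb):
            """Return the cheapest offer that satisfies the buyer's constraints."""
            eligible = [o for o in offers
                        if o.jurisdiction == jurisdiction
                        and o.vcpus >= vcpus
                        and o.ram_gb >= ram_gb
                        and o.storage_tb >= storage_tb]
            return min(eligible, key=lambda o: o.price_per_hour, default=None)

        offers = [
            Offer("provider-a", "DE", 16, 64, 10, 1.20),
            Offer("provider-b", "DE", 16, 64, 10, 0.95),
            Offer("provider-c", "US", 32, 128, 20, 0.80),
        ]
        print(cheapest_match(offers, jurisdiction="DE", vcpus=8, ram_gb=32, storage_tb=5))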

    The primary users for the new trading venue will be companies, public sector agencies and also organizations such as research institutes that need additional storage and computing resources, or have excess capacity that they want to offer on the market.

    Global, Vendor-Neutral Market Envisioned

    The marketplace will be international and vendor-neutral. Deutsche Börse Cloud Exchange will set and monitor standards regarding the product offering, admission procedure, and changes of providers and guaranteed purchased capacity. Clients will be able to choose capacity providers freely, as well as select the jurisdiction that will apply to the outsourced data. The product offering will initially include outsourced storage capacity and computing power.

    Product standards and technical provision will be developed in close cooperation with potential marketplace participants and related parties. These parties include representatives from the traditional IT environment, national and international SMEs, and large corporations from a wide range of industries, such as CloudSigma, Devoteam, Equinix, Host Europe, Leibniz-Rechenzentrum, PROFI AG, T-Systems and TÜV-Rheinland.

    “Participating in Deutsche Börse’s vendor-neutral platform for IaaS cloud computing products was a no-brainer,” said Robert Jenkins, CEO of CloudSigma. “CloudSigma was founded on the idea that computing power should become ubiquitous, convenient and shaped by user requirements. Trading compute on one of the leading financial exchanges is really a great market validation of what we’ve already been working toward and will help to further drive innovation in computing.”

    “The cloud marketplace of Deutsche Börse offers companies a further choice to purchase top secure and tested cloud services from T-Systems,” said Frank Strecker, who is responsible for the cloud business of T-Systems. “Due to the high level of standardization this will all be done with great speed and efficiency and with up-to-date prices.”

    4:30p
    Protesters Target NSA Data Center

    A data center may seem like an unlikely location for a protest. But the NSA’s Utah Data Center in Bluffdale has become a focal point for opposition to the agency’s data collection, which has emerged as a topic of global controversy in the wake of disclosures of previously secret documents by former NSA contractor Edward Snowden.

    Yesterday, a group of about 150 people gathered near the Bluffdale facility to protest the NSA’s information gathering activities. A smaller group of about 65 protesters marched to the data center and tied red, white and blue ribbons to the fence surrounding the secure facility, according to the Salt Lake Tribune.

    The march was one of dozens of events around the country organized under Restore the Fourth, a grassroots movement backed by several major Internet communities seeking to focus attention on the Fourth Amendment and its relevance to digital surveillance by the U.S. government.

    For more details, read the account in the Salt Lake Tribune.

