Data Center Knowledge | News and analysis for the data center industry
 

Friday, April 21st, 2017

    4:03p
    Dell EMC Sells SaaS Backup Firm Spanning to Private Equity

    Brought to You by Talkin’ Cloud

    SaaS data protection provider Spanning Cloud Apps announced on Thursday that it has been acquired by affiliates of Insight Venture Partners from Dell EMC, which acquired Spanning in October 2014. Terms of the deal have not been disclosed.

    With the transaction Spanning Cloud Apps will become an independent company operating under the Spanning brand in Austin, Texas. According to a statement, Dell EMC will remain a strategic partner and continue to sell Spanning Backup.

    Spanning offers backup for SaaS applications like G Suite, Office 365 and Salesforce applications.

    The independent company will be led by Jeff Erramouspe, who previously served as CEO of Spanning before it was acquired by EMC. All 56 current employees will remain with the company.

    “As we look towards the future for Spanning, we see an incredible opportunity to bring our solution for SaaS data backup and recovery to organizations around the world,” Erramouspe said in a statement. “The support from Insight gives us the freedom and fuel to continue to provide the best in SaaS data protection solutions and continue to grow stronger as we enter this new stage in our company’s journey. At the same time, Dell EMC maintains a strategic partner that helps their customers protect critical SaaS data.”

    Spanning has seen 70 percent year-over-year revenue growth and serves more than 7,000 customers, according to a press release. It restored around 18 million items for customers in 2016, and expects continued growth from its global data center expansion and distribution agreements with major channel partners.

    Insight has invested in other SaaS companies including New Relic and Shopify, Fortune reports.

    “We’ve seen a marked increase in SaaS adoption in the enterprise, especially in large and mid-market organizations,” Philip Vorobeychik, Vice President at Insight said. “With that comes a greater awareness of the fact that companies are responsible for recovery of data lost due to user error or malicious activity. We believe Spanning has all the ingredients for continued success and market leadership – a great team devoted to customer success, innovative products, committed customers and a strong global presence.”

    This article originally appeared on Talkin’ Cloud.

    4:30p
    Microsoft Launches IoT Central to Simplify Internet of Things Management

    This week Microsoft’s Sam George, Partner Director for Microsoft Azure IoT, announced on the company’s Internet of Things (IoT) blog that the company is now offering a new Software-as-a-Service (SaaS) product to help companies manage their IoT efforts.

    According to George, IoT is now a big part of many companies’ infrastructure and growing quickly. That means they need a service that helps them not only manage the IoT devices they are starting to use but also analyze the vast amounts of data those devices generate.

    Microsoft IoT Central is built on the company’s proven Microsoft Azure cloud platform and complements the existing Azure IoT Suite, enhancing the control and customization options of the services.

    Right now the service is not available to the general public, but you can sign up to get pre-release information and to be notified when access opens up.

    In addition, if you are an IoT company looking to partner with Microsoft in this area, you can also contact the company to open that dialog.


    4:49p
    Friday Funny: Kip and Gary Go to a Data Center Conference

     — When are they opening up the bar?

    Here’s the cartoon for this month’s Data Center Knowledge caption contest.

    Here’s how it works: Diane Alber, the Arizona artist who created Kip and Gary, draws a cartoon, and we challenge our readers to submit the funniest, most clever caption they think fits. Then we ask readers to vote for the best submission, and the winner receives a signed print of the cartoon. Submit your caption in the comments below.

    Congratulations to Mark Breeze, who won February’s contest for the Valentine’s Day cartoon. His caption was: “Finally, our VP’s vision for the Cloud is becoming reality.”

    Some good submissions came in for last month’s White Space edition; all we need now is a winner. Help us out by submitting your vote below:

    What should the bubble say?


    Stay current on Data Center Knowledge’s data center news by subscribing to our RSS feed and daily e-mail updates, or by following us on Twitter or Facebook or join our LinkedIn Group – Data Center Knowledge.

    5:00p
    What’s Next in Storage: Managing Data by Objectives

    David Flynn is Primary Data CTO & Co-founder.

    Today, storage systems offer a wide range of capabilities across performance, protection, and price. This diversity gives IT unprecedented choice in selecting the ideal storage to meet application needs. However, even with these options, it is clear the “right resource” often doesn’t stay that way for long: an estimated 80 percent of an enterprise’s data is cold, yet admins must still work hard to ensure applications don’t suffer from performance problems.

    We have systems that are fast, systems that can scale, and systems that provide low-cost capacity, but what we do not have is a way to ensure all of those different attributes are used effectively to serve evolving application demands while reducing costs in the face of rapidly increasing data volume. Fixing the challenge of aligning the right data to the right resource is the next big step taking shape in storage industry innovation. It requires an automated approach that manages data by objectives, which becomes possible when data is virtualized through an enterprise metadata engine, as I wrote in a previous post on Data Center Knowledge.

    A metadata engine separates the data path from the control path so that storage can be unified within a single global namespace. This allows data to be managed across all storage according to objectives set by IT or application owners. A metadata engine can then automatically and non-disruptively move data to the right storage to meet these objectives, ensuring desired service levels are always met, which is a crucial step toward the Software Defined Datacenter. Let’s examine how managing by objectives can ensure the right data is in the right place at the right time.

    Modern Storage Offers Unique Capabilities

    Flash, cloud, and shared storage systems each deliver varying levels of performance, protection, price, and capacity, which in turn provide different benefits to the business. Server-side flash storage is very fast, providing low latency and high IOPS, but it is generally considered less reliable and is more expensive than shared storage. Shared storage, like NAS filers and SAN arrays, delivers lower performance and is less expensive than server-side flash, but offers higher levels of data reliability. Cloud and object storage options feature high capacity density and lower costs for storing cold data.

    Set Your Target, Meet it Automatically

    IT or application owners can automate data management by setting target objectives for data across the following attributes:

    • Performance: IOPS, latency, and bandwidth objectives ensure ideal application performance.
    • Protection: Availability, reliability, durability, and security are set to meet application protection requirements.
    • Time: Objectives can be based on file activity or inactivity, often in combination with other objectives. For example, objectives can ensure that all active files in a share that have been accessed in the last day are placed on storage that can deliver 10,000 IOPS, 100MB/s of bandwidth, and 0.5ms latency, and that all files that have not been accessed in the last 30 days are moved to a preservation tier, such as object or cloud storage.
    • Pattern Matching: Objectives can be based on regular expression pattern matching. For example, admins could set objectives that files matching “.tmp” be stored on the local storage tier, while all other files are kept on the shared storage tier.
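    To make the idea concrete, here is a minimal sketch of how objective records like the ones above could be evaluated against file metadata. All of the names, fields, and thresholds are illustrative assumptions, not Primary Data’s actual API:

    ```python
    import re
    from dataclasses import dataclass

    @dataclass
    class FileInfo:
        path: str
        days_since_access: int

    @dataclass
    class Objective:
        name: str
        tier: str                 # target storage tier if the rule matches
        max_idle_days: int = None # time-based rule (None = not used)
        path_pattern: str = None  # regex-based rule (None = not used)

        def matches(self, f: FileInfo) -> bool:
            # A file satisfies an objective only if every configured rule holds.
            if self.path_pattern is not None and not re.search(self.path_pattern, f.path):
                return False
            if self.max_idle_days is not None and f.days_since_access > self.max_idle_days:
                return False
            return True

    def place(f: FileInfo, objectives) -> str:
        """Return the tier of the first objective the file satisfies."""
        for obj in objectives:
            if obj.matches(f):
                return obj.tier
        return "preservation"     # default cold tier (object or cloud storage)

    objectives = [
        Objective("tmp-local", tier="local", path_pattern=r"\.tmp$"),
        Objective("hot-flash", tier="flash", max_idle_days=1),
    ]

    print(place(FileInfo("report.tmp", 0), objectives))  # -> local
    print(place(FileInfo("db.dat", 0), objectives))      # -> flash
    print(place(FileInfo("old.log", 45), objectives))    # -> preservation
    ```

    A real metadata engine would evaluate such rules continuously and move data non-disruptively; this sketch only shows the placement decision itself.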

    Tiering Data By Objectives

    With objective-based management, enterprises can tier data across different storage resources according to the different storage capabilities that the data requires. For example, many companies have data that goes cold quickly as it ages, as is the case with cell phone billing data that typically goes cold after a 30- to 45-day billing cycle. At many telecommunications companies, that data is rarely accessed again.

    In addition, many applications have cyclic demands. Payroll might need higher performance once a month, but because IT can’t easily move data from a capacity tier to one with higher performance, they typically leave this data on storage that meets its peak needs.

    With a system that can automate data placement, enterprises can “set and forget” data management objectives and be assured that data will meet its business requirements. If business needs change, a few clicks can realign data to the best resource for the evolving requirements. Storage administrators can even create a service catalog that application administrators can use to assign their own service levels, with set costs per unit of data that reflect actual consumed capacity rather than the high costs of overprovisioned storage.
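    The cost argument behind such a service catalog can be sketched in a few lines. The tier names and per-GB prices below are made up for illustration; the point is only the gap between billing on allocation and billing on real usage:

    ```python
    # Hypothetical service catalog: $/GB-month per service level.
    catalog = {
        "gold":   0.30,  # flash-backed, high performance
        "silver": 0.10,  # shared NAS/SAN
        "bronze": 0.02,  # object/cloud preservation tier
    }

    def monthly_cost(level: str, consumed_gb: float) -> float:
        """Charge only for capacity actually consumed at the given service level."""
        return catalog[level] * consumed_gb

    # An application provisioned 10 TB of gold storage but actually uses 1.5 TB:
    provisioned = monthly_cost("gold", 10_000)  # billed on allocation
    consumed    = monthly_cost("gold", 1_500)   # billed on real usage
    print(provisioned, consumed)                # 3000.0 450.0
    ```

    Billing against consumed capacity rather than provisioned capacity is what lets application owners pick a service level without paying for headroom they never use.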

    Objective: Save with the Cloud

    When it comes to cloud archival, the challenge is determining what data can be safely archived, and how to move that data once it is identified. Managing data by objectives allows IT to automatically identify data that meets the enterprise’s criteria for cloud archival and move it between the cloud and on-premises storage, as needed.

    Many archiving solutions move data using simple rules based on attributes like file creation date. These solutions are error prone, can impact productivity, and require IT intervention to fix. Objective-based management instead makes decisions based on actual client, application, and user access, and can retrieve files automatically if they are needed again.
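    The difference between the two approaches can be shown with a small sketch. The thresholds and function names are illustrative assumptions; the point is that a creation-date rule would archive a 90-day-old file that was read yesterday, while an access-based objective would not:

    ```python
    import time

    DAY = 86400  # seconds per day

    def archive_by_creation(created_at: float, now: float, max_age_days: int = 30) -> bool:
        # Naive rule: archive anything created more than N days ago,
        # even if it is still being read every day.
        return now - created_at > max_age_days * DAY

    def archive_by_access(last_access: float, now: float, max_idle_days: int = 30) -> bool:
        # Objective-based rule: archive only data that is actually cold.
        return now - last_access > max_idle_days * DAY

    now = time.time()
    created = now - 90 * DAY   # file created 90 days ago...
    accessed = now - 1 * DAY   # ...but read yesterday

    print(archive_by_creation(created, now))  # True  -- would wrongly archive a hot file
    print(archive_by_access(accessed, now))   # False -- file stays on primary storage
    ```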

    Companies are beginning to use the cloud as a store for their backup data, but restores can be costly due to the bandwidth charges associated with retrieving data from the cloud provider. Objective-based management can retrieve data granularly, at the file level, making it possible to restore just the file that is needed, without IT intervention, and minimizing cloud bandwidth charges.

    Automate Agility and Response to Changing Business Needs

    Managing data by objectives gives petabyte-scale enterprises the ability to automate the movement of data according to business objectives, from creation to archival, including the integration of public clouds as an active archive. It also automates core management tasks, making it easy for companies to maximize storage efficiency and cost savings while maintaining the performance and protection needed to meet required service levels.

    Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

