Data Center Knowledge | News and analysis for the data center industry
 

Tuesday, August 19th, 2014

    Time Event
    6:35a
    EqualLogic Founders’ Startup DataGravity Launches First Data-Aware Storage Product

    DataGravity wants to expand the role of storage. The startup has a rich pedigree and is releasing its first products, called the Discovery Series, after a successful beta period. The line is initially offered in two models, the DG2200 and the DG2400, with 48TB and 96TB of capacity, respectively.

    DataGravity offers what it calls a data-aware storage solution. It is an early-stage company with the mission of turning data into information: not just enabling data discovery, but making it simple enough to use that it proliferates across an organization.

    According to a Gartner report, more than 80 percent of the data being produced today is unstructured. DataGravity says companies are struggling to maintain, much less benefit from, their rapidly growing stores of human-generated data. Companies are paying a premium for storage infrastructure, layered management applications and siloed processes that only introduce greater complexities without offering real-time insights.

    This is the challenge DataGravity is trying to address with a unified storage platform that offers insights with the same richness of intelligence, regardless of whether the data is block or file. It supports NFS, CIFS/SMB and iSCSI LUNs, with the additional capability to manage virtual machines natively.

    DataGravity is targeting the mid-size market, using a 100 percent channel model. The IT department is the entry point for the company. Based on the beta period, the company says that security and audit personnel get involved next, followed by end users themselves. The company says that for a customer to build out similar functionality, it would require cobbling multiple pieces together, with a price tag beyond the mid-size market’s reach.

    John Joseph and Paula Long were co-founders and original architects of storage startup EqualLogic, which Dell acquired in 2008 for $1.4 billion, at the time the largest cash-only payout ever for a venture-backed firm. DataGravity is privately funded by Andreessen Horowitz, Charles River Ventures and General Catalyst Partners and has raised $42 million since it was founded in 2012.

    The company argues that storage needs to evolve beyond being a simple container. “We believe that storage, provisioning, deduplication and in-line compression have become standard features [in storage],” Long, the startup’s CEO, said. “The features we’re talking about will be the next thing people ask for in their RFPs. Storage can’t be just a container for much longer.”

    The DataGravity Discovery Series delivers storage, protection, data governance, search and discovery. “It is a unified, flash-optimized system, with enterprise-class high availability performance and features,” Long’s co-founder John Joseph, the company’s president, said. “Coupled with that is the DataGravity engine. On ingest it is looking at people, content and time. It exposes that intelligence in a single user interface.”

    Atop an all-flash array, DataGravity’s software catalogs data using a variety of parameters, presenting a historical look at how that data is used. Customers can use this to find out what’s taking up space, how and which users access data and what they contributed over time.

    The DataGravity platform can be used for compliance, for finding subject matter experts within the company based on their activity, and for gaining a general understanding of how and why storage is being used. This enables mid-market companies to glean new insights and make better business decisions.
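    The subject-matter-expert use case can be illustrated with a toy inverted index that tags each write with people, content and time, as described above. All names, fields and logic here are hypothetical illustrations, not DataGravity's actual engine:

```python
# Toy sketch of "data-aware" cataloging on ingest: index each write
# by owner, content words and time so a later query can answer
# "who has contributed content on this topic?"
# Everything here is a hypothetical illustration.
from collections import defaultdict
from datetime import datetime, timezone

catalog = defaultdict(list)  # word -> list of ingest records

def ingest(path, owner, text, when=None):
    """Record who wrote what and when, and index the content words."""
    when = when or datetime.now(timezone.utc)
    entry = {"path": path, "owner": owner, "time": when}
    for word in set(text.lower().split()):
        catalog[word].append(entry)

ingest("/vm1/report.txt", "alice", "Q3 revenue forecast")
ingest("/vm2/notes.txt", "bob", "revenue model draft")

# Who has touched content mentioning "revenue"?
experts = sorted({e["owner"] for e in catalog["revenue"]})
print(experts)  # ['alice', 'bob']
```

A real implementation would do this at the storage layer on ingest rather than as an application-level pass, but the query pattern is the same.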

    IDC program vice president of storage Laura Dubois said the unstructured data dilemma is growing. “IDC has been predicting technology would catch up to provide an answer to the market demand,” she said. “The DataGravity approach is transformational in an industry where innovation has been mostly incremental. DataGravity data-aware storage can tell you about the data it’s holding, making the embedded value of stored information accessible to customers who cannot otherwise support the cost and complexity of solutions available today.”

    12:30p
    Hitachi Data Systems Acquires Backup Appliance Heavyweight Sepaton

    Hitachi Data Systems boosted its data protection strategy recently by acquiring longtime storage partner Sepaton. Besides spelling ‘no tapes’ backwards, Sepaton provides data protection solutions for data-intensive enterprise environments in the form of a purpose-built backup appliance (PBBA). Its appliance boasts 80 TB/hour backup speeds on 64-bit processing nodes and the ability to scale capacity with 3.8 petabyte disk shelves (before deduplication).
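    For a sense of scale, a quick back-of-envelope check of the figures quoted above (the throughput and capacity numbers are from the article; the arithmetic is illustrative):

```python
# At the quoted 80 TB/hour, how long would a full backup of one
# 3.8 PB (pre-deduplication) disk shelf take?
shelf_pb = 3.8           # one disk shelf, before deduplication
rate_tb_per_hour = 80.0  # quoted backup throughput

hours_to_fill = shelf_pb * 1000 / rate_tb_per_hour
print(hours_to_fill)  # 47.5 -> roughly two days of continuous backup
```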

    Sean Moser, vice president of software platforms product management at HDS explained in a blog post that the acquisition was a good fit with its approach to “a service-level-oriented protection scheme that balances the cost of protection with the business needs for data availability – whether it’s number of copies, location of copies, or frequency of copies. This acquisition better enables us to help our customers reduce the cost of protection, enable more data to be protected against disaster and offer greater flexibility in where or how it is protected.”

    Sepaton has partnered with HDS for many years and is the latest of several acquisitions HDS has made as it continues to build out its vision for data protection. Besides broadening its portfolio of offerings, the acquisition puts HDS in more direct competition with EMC, IBM and others in the PBBA market. In the past, HP sold branded versions of Sepaton appliances before phasing them out in favor of its own StoreOnce products.

    Headquartered in Marlborough, Massachusetts, Sepaton will operate as an independent subsidiary of HDS. The deal brings many large customers, who have previously had combined solutions with Sepaton and HDS, under HDS’ wing. The company also carries a handful of patents, including a 2014 one for Dynamic Deduplication.

    The acquisition price was not disclosed. As a venture-backed company, Sepaton has been around a long time, raising $98 million in funding over the past 14 years.

    3:06p
    Data Center Design: Customization vs. Personalization

    Chris Crosby is CEO of Compass Datacenters.

    Although many enterprise end users view their data center requirements as unique to them, this is often a costly and time-consuming misconception.

    In fact, if we eliminate considerations like vendor specificity for major components (e.g., a preference for Caterpillar generators as opposed to Cummins), Tier certification, cooling methodology and LEED qualifications, the most unique aspects of a data center are typically found in areas related to the physical size or layout of the facility itself.

    Unfortunately, the limited designs of most wholesale data center offerings, coupled with providers’ eagerness to be all things to all comers, actually facilitate poor decision making on the part of customers. The result is “custom” solutions that force customers to unwittingly pay for the inadequacies of their own providers.

    The evolutionary leap: industrialization to productization

    In the late 2000s the data center industry, led by companies like Digital Realty in the MTDC space, and to a lesser extent, I/O in the area of pre-fabricated modular offerings, began to incorporate into its own operations fundamental industrialization concepts that have characterized other industries for years. In some instances these operational enhancements can trace their antecedents to the days of Henry Ford and are characterized by the following elements:

    • Semi-standard design
    • Accelerated schedules via simultaneous activity, in this case the off-site development of power room components
    • Cost control via volume purchase agreements derived via large quantity purchases of major materials

    Under these structures basic functionality (power and often cooling) is standard but other elements such as the physical size of the raised floor, component suppliers and reliability configuration are often left to the discretion of the end user.

    The next step in this evolution is productization. In a productized environment, the customer offering has been developed to include the most commonly required site attributes within a standard product. The structure is analogous to automobile offerings in which attributes like air conditioning and power steering are included in the basic vehicle itself. By incorporating elements such as Tier III and LEED certification, a hardened shell and cooling methodology into a replicable design, the provider can purchase required components and materials precisely, and its manufacturing partners can plan their labor requirements accurately, which reduces the customer’s costs. A standard product with an easily replicable design and feature set also lets the provider offer end users a more accurate delivery date coupled with an accelerated delivery schedule. In short, productization is defined by:

    • Standard product design
    • Comprehensive standard feature set
    • Integrated supply chain
    • Cost control via ability to order materials based on known quantities

    The issue facing many wholesale providers at this point in time is that their business models are inhibitors to moving from industrialization to productization.

    Customization vs. personalization

    Customization and personalization are functions of industrialization and productization, respectively. Due to the non-standard nature of industrialized offerings, the customer is typically encouraged to work with the provider to incorporate their desired attributes into the facility, often at additional cost and time to them.

    Under the productized paradigm, the majority of major customer requirements (typically those of 50-60 percent of the marketplace) are already incorporated into the product, freeing the customer to focus on the elements of the data center design that reflect their unique requirements, such as security system type, branding and color scheme. The model is similar to purchasing a car while only having to choose elements such as the color of the vehicle and whether to include a single or multi-CD changer.

    As data center requirements continue to grow, so will the corresponding need for quickly implementable, reliable solutions. Due to its focus on standard feature sets and replicability, productization aligns closely with these next-generation requirements in a way its industrialization predecessor does not.

    From a historical perspective, one of the major questions facing the industry moving forward will be how providers adapt to the need to make this evolutionary leap, and who will remain standing in the future.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    3:41p
    Datacenter Dynamics Converged Seattle

    Datacenter Dynamics Converged Seattle will be held Thursday, September 4, at the Hyatt Regency Bellevue in Seattle, Washington.

    This conference and expo brings together the people, processes and technologies that help develop world class data center strategy. The multi-track conference program – headlined by a host of international experts – looks at “what’s new” in our industry, and debates “what should change.”

    From disposable data centers to software-defined infrastructure and everything in between, DCD Converged is sure to keep North America’s most senior data center professionals up-to-date with an industry changing at lightning speed.

    For more information – including speakers, sponsors, registration and more – follow this link.

    To view additional events, return to the Data Center Knowledge Events Calendar.

    4:23p
    Motivated To Influence Data Center New York 2014

    Motivated To Influence Data Center New York 2014 will be held Tuesday, October 7, at the Metropolitan Pavilion in New York, New York.

    This is a new management and technology conference where data center industry experts share insights and best practices to support the delivery of digital services and drive data center transformation. Bringing together business leaders, IT professionals and data center practitioners, the event offers a comprehensive agenda of data center topics along with networking and social opportunities.

    Participants will leave with the knowledge and know-how to optimize their organizations’ data centers for efficiency, growth and leading-edge innovation.

    For more information – including speakers, sponsors, registration and more – follow this link.

    To view additional events, return to the Data Center Knowledge Event Calendar.

    5:00p
    Datapipe Expands into Government Cloud Hosting Sector with Layered Tech Acquisition


    This article originally appeared at The WHIR

    Datapipe is expanding into the government sector with its acquisition of managed hosting and cloud provider Layered Tech on Tuesday. The terms of the acquisition have not been disclosed.

    Datapipe said that the acquisition will help grow its expertise in offering compliant managed cloud services and introduce solutions for the government sector with FISMA and FedRAMP compliant cloud offerings.

    Founded in 2004, Layered Tech has focused on providing compliant hosting solutions, introducing its cloud hosting offerings in 2006. The company recently launched a program where it provides free cloud hosting to startups in the payment processing and healthcare industries. In July, Layered Tech spoke about its new federal cloud platform-as-a-service solution at Microsoft’s Worldwide Partner conference.

    As organizations in the healthcare and government sectors are urged to adopt cloud services to cut costs and improve efficiency, hosting providers that can not only deliver compliant services but also offer a more consultative approach stand to win big; the healthcare cloud market in particular is expected to reach $6.5 billion by 2018.

    “We have seen increased client value in delivering secure and compliant managed cloud solutions that directly connect public cloud platforms, such as AWS, to our own hosted private cloud and traditional hosted infrastructure,” Robb Allen, CEO of Datapipe said. “With the acquisition of Layered Tech we are looking forward to providing this unique combination of security, choice and control to Government clients as they tackle the complexities of leveraging cloud-based services.”

    Datapipe said that Layered Tech will help strengthen its Enterprise Cloud Risk Management Solution and deliver a hybrid cloud solution to federal customers. The acquisition expands its data center presence to Kansas City, Missouri; Denver, Colorado; and Dallas, Texas.

    As part of the acquisition, Datapipe will establish the Datapipe Government Solutions business unit. Former Layered Tech Government Solutions president and general manager Dan Tudahl will serve in the same role in this unit.

    This article originally appeared at: http://www.thewhir.com/web-hosting-news/datapipe-expands-government-cloud-hosting-sector-layered-tech-acquisition

     

    6:00p
    DataBank Names Arsalon Co-Founder Bryan Porter as CTO

    Colocation and managed services provider DataBank named Bryan Porter as CTO. Porter will direct all aspects of the company’s engineering and product development.

    Porter came to DataBank via the company’s acquisition of Kansas City’s Arsalon Technologies, a hosting and cloud provider. He co-founded Arsalon in 2001 and served as the company’s president.

    Overall, Porter has 15 years of senior leadership experience in networking and product engineering, according to DataBank.

    DataBank continues to launch products complementary to its colocation and customer data center offerings. Last year the company moved up the stack to offer managed services, and it has continued to expand geographically.

    “We are extremely happy Bryan has joined the DataBank management team,” said Tim Moore, CEO of DataBank. “His experience is a great asset for us as we expand our service offerings and develop new customer solutions.  The depth and breadth of his knowledge in both delivery and management of products and services will be pivotal in our ability to continue the execution of our growth plan.”

    DataBank is known for turning the former Federal Reserve Bank building in Dallas, Texas, into a highly secure data center. The company expanded from there, discussing its growth strategy in 2013. It acquired VeriSpace in March 2013 to establish a presence in Minneapolis and continues to expand in Minnesota, recently announcing plans for a 20-megawatt data center in Eagan.

    7:00p
    Rackspace Expanding Data Center Capacity Globally

    As the cloud infrastructure services market grows, so does the amount of data center space that hosts the systems that support cloud services. While the largest cloud providers, the Amazons and Microsofts of the world, mostly build their own data centers to support their services, mid-size players rely to a large extent on wholesale data center landlords – the likes of DuPont Fabros Technology and Digital Realty Trust.

    One such mid-size provider is Rackspace, which, despite having a tough time in the stock market since early 2013, has been growing its data center footprint worldwide. DuPont Fabros is its primary data center landlord, and Digital Realty is its second-biggest provider.

    Its capacity build-out indicates healthy demand for its services. The company’s rule of thumb is to add capacity in a market before utilization exceeds 80 percent of the current footprint, Rackspace chief operating officer Mark Roenigk said in an interview.
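    The 80 percent rule of thumb amounts to a simple utilization trigger. A minimal sketch (the threshold comes from the article; the function and figures are illustrative):

```python
# Expand in a market before utilization crosses 80 percent of the
# built-out footprint. Numbers below are made up for illustration.
def needs_expansion(used_mw, built_mw, threshold=0.80):
    """Return True once utilization reaches the expansion trigger."""
    return used_mw / built_mw >= threshold

print(needs_expansion(1.5, 2.0))  # 75% utilized -> False
print(needs_expansion(1.7, 2.0))  # 85% utilized -> True
```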

    The company has been adding data center space across major markets in the U.S., as well as in Europe and Asia-Pacific. In the latter region, its focus has been on Australia and Hong Kong.

    Company plans to double Australia capacity

    One of the hottest markets for Rackspace today is Australia, a market the company entered only about two years ago. Roenigk said business was strong there, and the provider is expanding capacity to support the demand.

    Rackspace has about 2 megawatts in its Sydney data center and is eyeing the addition of another 2 megawatts on the same campus.

    Demand drivers in Sydney are similar to those the company is observing in the U.S. There are many new businesses getting off the ground, using a cloud-only infrastructure strategy. There are also lots of enterprises with headquarters in Australia’s most populous city that opt for hybrid infrastructure, putting the “crown jewels” on dedicated servers and using Rackspace’s public cloud to burst capacity.

    In Asia, Rackspace finished an expansion project at its Hong Kong site about one year ago and is eyeing another location, potentially another one in Hong Kong or, alternatively, one in Singapore.

    A cooling pilot (and a huge capacity expansion) in U.K.

    Europe is another new market for Rackspace. Construction is in full swing on the 130,000-square-foot project in Crawley, U.K., which Rackspace is co-developing with Digital Realty. The first 10-megawatt building is not even finished, but the project has already received two sustainability awards, Roenigk said, sounding amused.

    It will be the first Rackspace data center that will use indirect evaporative cooling and act as sort of a pilot project for the relatively new technology. If it works as expected, Rackspace will implement it in other facilities it builds in the future.

    A direct evaporative cooling system adds moisture to the warm outside air stream: as the water evaporates, it absorbs heat, and the cooled, humidified air is supplied to the data hall. An indirect evaporative cooling system does the same, except that instead of pushing the evaporatively cooled outside air into the data hall, it uses that air to cool a separate air stream via a heat exchanger.

    Many web-scale data center operators, such as Facebook, use direct evaporative cooling in their facilities. The indirect approach is better because it prevents impurities in outside air from getting into the air supplied to the IT gear, Roenigk explained.
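    The difference between the two approaches can be sketched numerically. In the toy model below, direct evaporative cooling drives outside air toward its wet-bulb temperature, while the indirect system uses that cooled scavenger stream to chill recirculated data hall air through a heat exchanger. The efficiency figures and temperatures are illustrative assumptions, not design numbers for the Crawley site:

```python
# Toy comparison of direct vs. indirect evaporative cooling outlet
# temperatures (degrees C). Saturation and heat exchanger
# effectiveness values are illustrative assumptions.
def direct_evap_outlet(t_dry, t_wet, sat_eff=0.90):
    """Direct: outside air approaches its own wet-bulb temperature
    as water evaporates into it; this humid air feeds the data hall."""
    return t_dry - sat_eff * (t_dry - t_wet)

def indirect_evap_outlet(t_return, t_dry, t_wet,
                         sat_eff=0.90, hx_eff=0.75):
    """Indirect: the evaporatively cooled outside-air stream cools
    recirculated data hall air via a heat exchanger, so outside
    moisture and impurities never reach the IT gear."""
    scavenger = direct_evap_outlet(t_dry, t_wet, sat_eff)
    return t_return - hx_eff * (t_return - scavenger)

# Warm summer day: 28C dry-bulb, 19C wet-bulb, 35C hot-aisle return.
print(round(direct_evap_outlet(28.0, 19.0), 1))          # direct supply air
print(round(indirect_evap_outlet(35.0, 28.0, 19.0), 1))  # indirect supply air
```

The indirect supply air ends up a few degrees warmer than the direct approach for the same conditions, which is the trade made to keep outside air, and whatever it carries, out of the data hall.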

    West coast hungry for cloud data centers

    Rackspace is also growing its data center footprint across multiple major markets on its home turf in the U.S. It has recently taken down an additional 8 megawatts of capacity at the DuPont Fabros campus in Ashburn, Virginia, as it prepares to bring online between 3 megawatts and 6 megawatts in the Dallas market and weighs expansion options on the west coast.

    Rackspace is considering two primary options for the west coast expansion: a greenfield development in Boardman, Oregon, where it bought a 200-acre property about three years ago, and acquisition of a fully built data center in northern Nevada. Roenigk declined to provide any details on the Nevada facility, saying only that the company was currently going through a due diligence process for the potential deal.

    One of the considerations playing a role in the decision is availability of renewable energy. The Boardman site has access to hydroelectric energy, which is abundant in Oregon, while in Nevada the company would have to pay for renewable energy that will be fed into the common grid and use power from the grid, which has a mix of sources.

    If there is urgent need to expand west coast capacity, Rackspace may also simply take more space with one of its existing providers.

    The west coast plays an increasingly important role for Rackspace, whose business for the first decade of its existence was driven primarily by east coast and Midwest customers, Roenigk said. He attributes the trend to “mostly just the number of new startups on the west coast.”

    Strategic changes yet to play out

    The number of servers Rackspace deploys to serve its customer base grows every quarter. Its server count at the end of the second quarter was about 108,000 – up from about 106,000 at the end of the first quarter. The company has recently adjusted its business strategy to a certain extent, placing an emphasis on providing managed services along with cloud infrastructure. It has also recently overhauled its bare-metal cloud offering. OnMetal, the new bare-metal cloud service, was originally rolled out in northern Virginia, with Dallas next up. If the changes prove successful – the OnMetal business is growing nicely, according to director of infrastructure strategy Aaron Sullivan – its server count will only grow, and so will its data center footprint.

