Data Center Knowledge | News and analysis for the data center industry
 

Wednesday, October 2nd, 2013

    11:30a
    SGI Acquires FileTek for Storage Virtualization

    SGI announced that it has acquired big data storage virtualization provider FileTek for an undisclosed amount. Adding storage virtualization, large-scale data management and active archive capabilities will expand SGI’s storage portfolio and enable customers to manage data assets efficiently.

    “We’re thrilled to welcome FileTek and its customers to SGI,” said Jorge Titinger, president and CEO of SGI. “With the addition of FileTek solutions, SGI enables existing and new customers to align both unstructured and structured data with the most cost effective storage throughout its lifecycle, with seamless user access and reliable petascale protection. This acquisition also reflects our strategy to build on SGI’s leadership in High Performance Computing, expertise in Big Data, and experience delivering over 600 petabytes of storage capacity annually, to become a global leader in petascale storage solutions.”

    SGI said it will continue to support FileTek customers. FileTek’s assets include its StorHouse lifecycle management and active archive solution and Trusted Edge intelligent analysis software. SGI will sell and support Trusted Edge and StorHouse products under the SGI brand, and will integrate technologies over the near and long term while protecting SGI and FileTek customer investments.

    “StorHouse and Trusted Edge solutions have experienced wonderful acceptance in the market,” said Bill Loomis, CEO of FileTek. “We are excited that this acquisition will continue to allow both products to fully achieve their potential through wider global distribution with SGI. Our customers will benefit from resources, expertise and solutions under the SGI brand that enable government and commercial enterprise environments to lower the cost of Big Data and high-volume storage.”

    12:30p
    Automation: The Key to Unlocking IT Innovation and Creativity

    Dustin Snell is CEO, Network Automation.


    Many theories of human development, notably Maslow’s hierarchy, use a pyramid to describe the stages of growth, theorizing that basic requirements must be met before people can move on to higher developmental stages. People who are struggling to meet basic needs (food, shelter, security, etc.) generally can’t operate at full capacity, but once their basic requirements are met, they can focus on solving problems and expressing themselves creatively.

    Something similar is at work within organizations: Before working professionals can fully explore opportunities for innovation, they have to expend resources to take care of the day-to-day operations that are essential to the organization’s basic functioning. In real-world terms, this can mean that talented IT professionals are spending their days writing code and memorizing syntax instead of innovating to solve problems and support critical business strategies.

    The issue of how to best allocate IT resources is becoming more pressing in part because of the rate of growth in data storage and transfer activities across multiple streams. As the sheer volume of data generated grows at an astonishing pace, IT budgets have generally remained flat. IT strategists who are seeking new ways to capitalize on data and leverage expanded computing capacities often turn to virtualization to improve organizational performance.

    The Pros and Cons of Virtualization

    Virtualization can be an excellent solution to extend computing capacity, allowing companies to fully utilize their existing hardware assets, improve redundancy and scale up to accommodate demand spikes in real time. That’s why products like VMware are so widely used by savvy CIOs, including 100 percent of the Fortune 100 companies’ IT departments. But while the benefits of virtualization are considerable, it can be a challenge for businesses to integrate virtual machine functions with their business process automation tools.

    This means IT professionals have to find new ways to manage business process conditions and events when using virtual assets. Too often, this requires IT departments to devote extensive time and effort to creating and maintaining complex code to ensure that virtual machine performance parameters interact successfully with the company’s business process conditions and events. And that means IT professionals who are engaged in writing code and memorizing syntax don’t have time to develop innovative new business solutions.

    No-Code Automation Means More Time for Creativity

    A better approach is to find a solution that gives IT professionals a way to integrate virtual assets with business process applications without having to develop and maintain code. An ideal strategy would enable integration across multiple applications and systems, including popular assets like VMware Server, VMware Workstation, ESX/ESXi, vCenter and VMware Player. By empowering IT professionals to manage physical and virtual assets and multiple applications within a unified workflow, a no-code automation solution can free up their time.

    And more time means that automation can remove the obstacle to innovation within IT organizations. By eliminating the need to write code and memorize syntax to perform basic functions, automation can liberate IT talent to explore higher-level problems and think creatively about new business solutions. An automation solution that provides no-code development capabilities can streamline the development and maintenance processes that consume IT resources, leaving time and space for innovation.

    Instead of spending hours each week meticulously creating and maintaining code to automate standard business processes and facilitate the integration of virtual asset protocols, IT professionals could engage in more creative work. A talented IT team would be free to consult with cross-functional groups to identify new and emerging business requirements and market demands, develop creative solutions and deliver competitive advantages.
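
    To make the trade-off concrete, the short Python sketch below shows the kind of hand-written glue code a no-code automation tool aims to replace: connecting to vCenter and powering on any stopped virtual machines. It is a minimal sketch assuming the open-source pyVmomi bindings; the host name and credentials are placeholders, and certificate verification is disabled for brevity.

    import ssl
    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim

    # Lab-only shortcut: skip certificate verification for a self-signed vCenter.
    context = ssl._create_unverified_context()
    si = SmartConnect(host="vcenter.example.com", user="automation",
                      pwd="secret", sslContext=context)
    try:
        content = si.RetrieveContent()
        # Build a view of every virtual machine in the inventory.
        view = content.viewManager.CreateContainerView(
            content.rootFolder, [vim.VirtualMachine], True)
        for vm in view.view:
            if vm.runtime.powerState == vim.VirtualMachinePowerState.poweredOff:
                print("Powering on", vm.name)
                vm.PowerOnVM_Task()  # returns a Task; a real script would poll it
    finally:
        Disconnect(si)

    Every environment-specific detail in a script like this has to be written, tested and maintained by hand, which is exactly the overhead the no-code approach argues should be abstracted away.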

    By unlocking IT creativity and innovation through automation, companies can move beyond the basics to achieve higher levels of organizational development – and yield greater marketplace success.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    1:00p
    The Network is the Bottleneck: Stacks and Flows Are the Answer

    ORLANDO, Fla. - It’s the age of DevOps. Rapid iteration is all the rage. Developers are writing and deploying new applications faster than ever, rolling out new features in real time. And then …

    “What happens is that they run into the network,” says Lori MacVittie. “The network is the roadblock. It’s slowing things down.”

    MacVittie, the Senior Product Manager for Emerging Technologies at F5 Networks, discussed new strategies for automating network management Tuesday in a session at Data Center World Fall. Cloud software stacks and software-defined networking (SDN) are two approaches to making networks easier to manage and configure.

    In a presentation titled “Stacks and Flows: Decoupling Hype from Reality,” MacVittie familiarized data center managers with the technologies that can help automate networks – a critical step in making data centers more agile, which has been a key theme of this week’s conference.

    Wanted: Network Automation Tools

    “The network is the bottleneck in the data center,” said MacVittie. “It’s the last of the three groups (servers and storage are the others) that needs to be automated.”

    Many companies are sorting out how to scale their server and storage infrastructures. Meanwhile, MacVittie noted that 54 percent of enterprises report adopting some level of DevOps, in which apps and updates are shipped more frequently. The network is critical to everything, but it is not as flexible in adapting to frequent changes.

    “We need to make the network as elastic as the rest of the infrastructure,” said MacVittie. “Everybody’s pushing on you. So you want to be more agile and flexible and create new services.”

    Opening Up the Network

    The networking sector has long been dominated by a handful of vendors focused on switches running on proprietary software. SDN seeks to use OpenFlow and related technologies to separate the hardware and software, typically by shifting management functions to a commodity server running open source software that can manage the switch.

    That would give data centers the option of using more commodity hardware and open source software in their network infrastructure.

    “If OpenFlow takes off, it’s going to supplant a lot of your existing network management systems,” said MacVittie.
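
    As a rough illustration of what shifting management functions to a commodity server looks like in practice, here is a minimal controller application written against the open-source Ryu OpenFlow framework (a sketch added for context, not something shown in the session). It installs a table-miss rule on every switch that connects, sending unmatched packets up to the controller, where software decides how to handle them.

    from ryu.base import app_manager
    from ryu.controller import ofp_event
    from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
    from ryu.ofproto import ofproto_v1_3


    class TableMissController(app_manager.RyuApp):
        """Push a table-miss flow to each OpenFlow 1.3 switch that connects."""
        OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

        @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
        def on_switch_features(self, ev):
            datapath = ev.msg.datapath
            ofproto = datapath.ofproto
            parser = datapath.ofproto_parser
            # Match everything; send unmatched packets to the controller.
            match = parser.OFPMatch()
            actions = [parser.OFPActionOutput(ofproto.OFPP_CONTROLLER,
                                              ofproto.OFPCML_NO_BUFFER)]
            instructions = [parser.OFPInstructionActions(
                ofproto.OFPIT_APPLY_ACTIONS, actions)]
            datapath.send_msg(parser.OFPFlowMod(datapath=datapath, priority=0,
                                                match=match,
                                                instructions=instructions))

    An app like this is launched with Ryu's ryu-manager command, with the switches pointed at the controller's IP address.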

    ‘A Nascent Market’

    The reality check? For all the buzz, SDN technologies are emerging and immature. “Pretty much the entire market is still in flux,” said MacVittie. “Expect to see that for another couple of years. It’s a nascent market.”

    The good news: that gives data center managers time to get up to speed on SDN and what it can mean to their efforts to transform their facilities.

    While a growing number of vendors are rolling out gear that can use OpenFlow, MacVittie urged the audience to pay attention to the OpenDaylight Project, which may further expand the options for network management. OpenDaylight still uses OpenFlow, but has added extensions to incorporate hardware that isn’t OpenFlow-enabled, which can now be automated and included in OpenDaylight architectures.

    “You can still get SDN for your equipment that might be proprietary and closed,” said MacVittie.

    What About Stacks?

    Then there are the stacks – cloud software platforms like OpenStack, CloudStack, OpenNebula and Eucalyptus. These platforms tie together hardware, a control layer and a presentation layer, and are extensible.

    “The goal of the stack is to automate the data center,” said MacVittie. “What the stacks try to do is use the APIs the vendors are providing today. You can create a plugin that goes into the stack and runs the equipment.”
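
    As a hedged illustration of using the APIs the vendors provide, the Python sketch below authenticates against an OpenStack Keystone v2.0 endpoint of that era and then lists compute instances through the Nova API. The controller URL, tenant and credentials are placeholder values for a hypothetical private cloud.

    import requests

    KEYSTONE = "http://controller.example.com:5000/v2.0"

    # Request a scoped token from Keystone (v2.0-style authentication).
    auth_body = {"auth": {"tenantName": "demo",
                          "passwordCredentials": {"username": "admin",
                                                  "password": "secret"}}}
    resp = requests.post(KEYSTONE + "/tokens", json=auth_body)
    resp.raise_for_status()
    access = resp.json()["access"]
    token = access["token"]["id"]

    # Locate the Nova (compute) endpoint in the service catalog Keystone returns.
    compute = next(svc for svc in access["serviceCatalog"]
                   if svc["type"] == "compute")
    nova_url = compute["endpoints"][0]["publicURL"]

    # List the tenant's instances through the Nova API.
    servers = requests.get(nova_url + "/servers",
                           headers={"X-Auth-Token": token}).json()
    for server in servers["servers"]:
        print(server["id"], server["name"])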

    For all the automation, some networking challenges will remain.

    “You can’t automate taking a plug out and putting it back in,” said MacVittie.

    1:30p
    A Closer Look at eBay’s Bloom-Powered Data Center

    Last week eBay went live with a new data center power design that abandons diesel generators and UPS units in favor of Bloom Energy fuel cells powered by natural gas. The Utah facility is using 30 Bloom Energy Servers as the primary power source for its entire IT load, using the utility grid for backup power should the Bloom units fail. Until now, data centers using Bloom boxes (like Apple) have used them for supplemental energy while retaining traditional UPS units and generators for emergency power. In this video, eBay VP of Global Foundation Services Dean Nelson and Bloom VP of Mission Critical Services Peter Gross discuss the new data center, how it came about and what it means. This video runs about 3 minutes.

    For additional video, check out our DCK video archive and the Data Center Videos channel on YouTube.

    2:00p
    Splunk Cloud Unlocks Machine Data Analytics To Broader Audience

    At its annual Worldwide Users’ Conference in Las Vegas this week, machine data analytics company Splunk advanced its portfolio of products with a new version of its enterprise edition, a cloud service, and a free service. The .conf2013 event can be followed via the Twitter hashtag #splunkconf.

    Splunk Cloud

    Splunk (SPLK) announced the general availability of Splunk Cloud, a new service that delivers Splunk Enterprise in the cloud. Available as a cloud service to gather insights into machine-generated big data, the service is powered by Amazon Web Services. It includes access to Splunk Enterprise apps, APIs, alerting and role-based access controls. Splunk Cloud also integrates with on-premises deployments of Splunk.

    “We are expanding our offering because we heard our customers loud and clear – they want Splunk Enterprise as a cloud service,” said Dejan Deklich, vice president of cloud engineering, Splunk. “Delivering the enterprise-class Splunk Cloud, based on award-winning, patented technology, enabled us to also make Splunk Storm free. This is exciting because we are now giving a free service to developers who have to pay for the same results from other vendors.”

    White Ops, a digital advertising and security provider, uses Splunk Cloud as a security analytics platform to detect digital advertising impressions that are fraudulently made by bots, delivering actionable intelligence they use to rationalize media spending on behalf of their customers.

    “Splunk Cloud is critical to our bot detection operations,” said Tamer Hassan, chief technical officer of White Ops. “It helps us fight crime by catching malicious traffic, both online and on enterprise networks. Splunk Cloud lets us use the full potential of Splunk Enterprise, and helps us do the number-crunching and deep analysis of massive traffic flows that we need to do to catch the bad guys, without having to install or manage any infrastructure.”
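
    For a sense of what the API access mentioned above looks like, here is a minimal Python sketch (not taken from the announcement) that drives a search through the management REST interface exposed by Splunk Enterprise, and by extension by Splunk Cloud deployments that grant API access. The host, credentials and search string are placeholders; port 8089 is the conventional management port.

    import time
    import requests

    BASE = "https://splunk.example.com:8089"
    AUTH = ("admin", "changeme")

    # Create a search job over the REST API.
    job = requests.post(BASE + "/services/search/jobs", auth=AUTH, verify=False,
                        data={"search": "search index=_internal | head 5",
                              "output_mode": "json"}).json()
    sid = job["sid"]

    # Poll the job until it finishes.
    while True:
        status = requests.get(BASE + "/services/search/jobs/" + sid, auth=AUTH,
                              verify=False, params={"output_mode": "json"}).json()
        if status["entry"][0]["content"]["isDone"]:
            break
        time.sleep(1)

    # Fetch the results as JSON rows.
    results = requests.get(BASE + "/services/search/jobs/" + sid + "/results",
                           auth=AUTH, verify=False,
                           params={"output_mode": "json"}).json()
    for row in results.get("results", []):
        print(row)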

    Splunk Enterprise 6

    Splunk also announced the general availability of Splunk Enterprise 6. The new release delivers powerful and fast analytics, the company said. Splunk Enterprise 6 introduces three innovations to make analytics dramatically faster: Pivot opens up the power of analytics to non-technical business users and analysts, Data Models provide a meaningful representation of underlying machine data, and a high-performance analytics store provides transparent acceleration.

    “Splunk Enterprise 6 is the platform for machine data for everyone, with powerful analytics and performance that unlock machine data insights to an entirely new set of users,” said Guido Schroeder, senior vice president of products, Splunk. “With an enhanced user experience, simple management of enterprise deployments and a rich developer environment, Splunk Enterprise 6 gives technical users the ability to define the meaningful relationships in the underlying data, enabling business users and analysts to easily manipulate and visualize data in a simple drag-and-drop interface. All of this, with amazing performance on low-cost commodity hardware.”

    Numerous Splunk customers and partners offered accolades for the company and its new enterprise edition. “Security analysts at Oak Ridge National Laboratory utilize Splunk Enterprise to analyze large volumes of diverse machine data streaming in real time,” said Jesse Trucks, cyber security engineer, Oak Ridge National Laboratory. “It is vital these analysts be able to directly manipulate and interact with the data to quickly obtain operational security intelligence. Splunk Enterprise 6 will enable many more analysts to discover patterns and generate information from our data with the new visualization capabilities.”

    2:30p
    BMC Launches Big Data Management Solution for Hadoop

    BMC launches a big data management solution for Hadoop, Dataguise raises $13 million for expansion, and IBM acquires big data analytics company The Now Factory.

    BMC launches big data management solution for Hadoop

    BMC announced the availability of BMC Control-M for Hadoop, a new big data management solution that dramatically reduces the batch processing time for extremely large collections of data sets, simplifying and automating Hadoop batch processing and connected enterprise workflows. The new solution is a purpose-built version of the company’s Control-M workload automation offering. BMC Software specifically designed Control-M for Hadoop to improve service delivery by detecting slowdowns and failures with predictive analytics and intelligent monitoring of Hadoop application workflows. According to BMC customer MetaScale: “BMC Control-M provides MetaScale with a control point that allows us to integrate all those big spokes to our hub which is big data. By processing and analyzing large data on Hadoop, we are able to quickly react to changing market conditions and better serve our members through improved competitive pricing and new product launches. And by performing intensive calculations on very large and complex data sets very quickly using Hadoop, we can now reliably meet our production schedule and largely eliminate use of traditional ETL tools.”
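
    Control-M job definitions are proprietary, but the generic Python sketch below illustrates the kind of Hadoop batch step, with basic failure detection, that a workload automation product of this sort is designed to schedule, monitor and retry at scale. The jar path, class name and HDFS paths are hypothetical.

    import subprocess
    import sys
    import time

    def run_hadoop_step(jar, main_class, *args):
        """Run one MapReduce step via the standard `hadoop jar` CLI and time it."""
        cmd = ["hadoop", "jar", jar, main_class] + list(args)
        started = time.time()
        result = subprocess.run(cmd)
        elapsed = time.time() - started
        if result.returncode != 0:
            # This is where a workload automation tool would alert, retry or reroute.
            print("step failed after %.0fs: %s" % (elapsed, " ".join(cmd)),
                  file=sys.stderr)
            sys.exit(result.returncode)
        print("step finished in %.0fs" % elapsed)

    run_hadoop_step("/opt/jobs/daily-aggregation.jar",
                    "com.example.DailyAggregation",
                    "/data/raw/2013-10-02", "/data/aggregated/2013-10-02")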

    Dataguise closes $13 million Series B funding

    Data security intelligence and protection solution provider Dataguise announced that it has closed a $13 million Series B funding round led by Toba Capital, with additional capital coming from the investment arm of a leading electronics conglomerate. The new funds will finance global sales, marketing and channel development, as well as support innovation in its DgSecure suite of products, which allows businesses to achieve a 360-degree view of their sensitive data assets across big data and traditional data repositories. Additionally, industry veterans Vinny Smith and Paul Sallaberry have been appointed to the Dataguise Board of Directors. “Having Toba Capital and other investors participate in our success will be instrumental as we enter the next phase of rapid growth and continued product innovation that is changing the dynamics of securing privacy data in Hadoop,” said Manmeet Singh, CEO, Dataguise. “This round of financing gives us additional runway to execute further with our growing family of channel partners and end customers, while extending our technical capabilities and aggressively expanding our footprint and market reach worldwide.”

    IBM to acquire The Now Factory

    IBM announced an agreement to acquire The Now Factory, a privately held Dublin-based provider of analytics software that helps communications service providers (CSPs) deliver better customer experiences and drive new revenue opportunities. As part of IBM’s MobileFirst Analytics portfolio, CSPs using The Now Factory’s software can gain real-time insights into their customers by analyzing massive quantities of network and business data. With this type of insight, CSPs can provide an enhanced quality of service to their customers by better managing negative experiences and network outages. “Today’s announcement is part of IBM’s strategy to continually establish leadership in the era of big data and capitalize on the opportunity to analyze data in real time,” said Bob Picciano, General Manager, Information Management, IBM Software Group. “The Now Factory’s software enhances IBM’s Big Data and Analytics portfolio by improving the speed, development and implementation of big data solutions, and gives communications service providers the ability to better service their customers.”

    3:00p
    Video: TelecityGroup Opens Helsinki Data Center

    TelecityGroup has opened Hansa, the latest of the Group’s new data centres located in the Helsinki metropolitan area of Finland. TelecityGroup selected the Vantaa area of Helsinki as the ideal location for the new data centre due to its extensive connectivity options, its secure setting and its proximity to a nearby 195MW power plant. “Finland is a major strategic European internet hub with excellent operational conditions, in terms of both location and premium infrastructure,” said Marko Vanninen, Managing Director, TelecityGroup Finland. “We believe that by increasing customer capacity in the region we will bring more international customers to the Finnish market. This will be further boosted by the government’s planned reform of electricity tax, as well as the plans for a direct marine data cable to Germany.” The data centre will be completed in stages and, when complete, will be one of the largest carrier-neutral data centres in Finland. Here’s a video overview of the Helsinki data center:

    For additional video, check out our DCK video archive and the Data Center Videos channel on YouTube.

    5:00p
    How to Upgrade a Data Center for Efficiency and Density

    Whether it’s cloud, virtualization, big data or IT consumerization, the increasing use of these platforms is driving data center demand. For many enterprise data centers today, a lack of power and cooling capacity for expansion limits their ability to handle those technologies.

    Consider this: statistics from analyst firm Gartner indicate the following:

    • 82.4 percent of total operating system deployments will be virtualized by 2016
    • The global public cloud services market will grow a projected 18.5 percent in 2013 to $131 billion. Furthermore, over 75 percent of enterprises worldwide plan to pursue a private cloud strategy by 2014.
    • 42 percent of IT leaders globally have either invested in big data or plan to do so within a year

    In this white paper, Eaton outlines the clear disadvantages of using packaged power and typical cooling solutions. Furthermore, it goes on to describe the key components behind a best-of-breed efficiency and power density upgrade strategy. The conversation revolves around:

    • Utilizing sophisticated UPS hardware
    • Intelligent, compact power distribution
    • Flexible, high-efficiency containment
    • Intelligent, logical and complete management functionality

    Finally, when working with the modern data center platform, it’s important to understand the benefits of best-of-breed technologies that enhance both efficiency and power density across the entire facility.

    Utilizing best-of-breed power and cooling systems that maximize capacity and minimize waste, without locking companies into a limited set of deployment options and vendors, is a far more effective approach. Download this white paper today to learn why many data centers are ill-equipped to support today’s most important new technologies, and to see why packaged power and cooling solutions can be a flawed way to upgrade existing facilities. The paper also describes the core components of a data center upgrade strategy capable of enhancing efficiency and power density more completely and cost-effectively.
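
    As a back-of-the-envelope illustration (not drawn from the Eaton paper) of why efficiency upgrades translate directly into usable capacity, the short calculation below shows how lowering PUE frees up a fixed facility power budget for additional IT load. The figures are hypothetical.

    facility_power_kw = 2000.0   # total power available to the facility
    pue_before = 2.0             # typical legacy packaged power and cooling
    pue_after = 1.5              # after a targeted efficiency upgrade

    # PUE = total facility power / IT power, so IT capacity = facility power / PUE.
    it_capacity_before = facility_power_kw / pue_before   # 1000 kW of IT load
    it_capacity_after = facility_power_kw / pue_after     # about 1333 kW of IT load

    gain_kw = it_capacity_after - it_capacity_before
    print("Additional IT capacity: %.0f kW (%.0f%% more)"
          % (gain_kw, 100 * gain_kw / it_capacity_before))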

    9:00p
    More Scenes from Data Center World, Fall 2013

    Whether it was products to manage power systems or fire protection for the data center, participants at Data Center World had the opportunity to learn about and demonstrate a wide array of products in the exhibit hall space. (Photo by Colleen Miller.)

    ORLANDO – AFCOM’s Data Center World kept rocking and rolling this week, with more sessions and an exhibit hall full of vendors ready to meet with participants. Many attendees were seeking solutions for the needs of their data centers and looking to learn more about innovations in technology. Visit our photo highlights page to see the action from Monday and Tuesday – Exhibit Hall Buzzes at Data Center World.

