Data Center Knowledge | News and analysis for the data center industry
 

Wednesday, October 21st, 2015

    12:00p
    Who is Winning in the DCIM Software Market?

    Despite continued skepticism in the industry, the data center infrastructure management software market is growing. DCIM software is expensive, complicated, difficult to implement, and difficult to define, which are all good and justified reasons for skepticism. But at least the first three characteristics can be applied to any major enterprise software product. There’s no “easy button” for DCIM – at least today – and that’s something more and more data center operators are learning to come to terms with.

    IDC this week released its latest MarketScape report on the DCIM software market, finding that while the very top players remain the same, competition for their market share is heating up as new vendors enter the market and established ones step up their game.

    Gaining on the Leaders

    The three leaders in the DCIM software market are still Schneider Electric, Emerson Network Power, and Nlyte Software. IDC ranks market leadership based on both technological capabilities and business strategy.

    But a group of other vendors the market research firm considers “major players” has moved much closer to the top three. Within this group are also some newer entrants with highly competitive solutions.

    Two of those newer entrants are the Swiss industrial automation giant ABB and the German conglomerate Siemens. The two companies have a special advantage if the data center market continues to move in the direction of greater automation.

    Both have significant industrial automation experience, which they are using as a “calling card” in the DCIM software market, Jennifer Koppy, research director at IDC and author of the report, said. Both Schneider and Emerson have automation capabilities, but ABB and Siemens are emphasizing them to “make their mark.”

    There are also smaller players that have the potential to become a big headache for the bigger ones. Companies like Device42 and Cormant, instead of trying to address every data center management functionality under the sun, have chosen a few areas that are significant pain points for customers and focused on making them cheaper and easier to deploy.

    “They’re very low-cost ways to get started with DCIM,” Koppy said of Device42 and Cormant. Unlike most of their competitors, they make their software easy to download and quick to get up and running, and the strategy is working.

    “They’re growing in leaps and bounds,” Koppy said. “Device42 is growing exponentially faster than the rest of the market. It will keep the market interesting over the next couple of years.”

    What will determine their success going forward, however, is the extent to which they can expand the usefulness of their solutions, she warned.

    Major DCIM software players other than the three leaders, according to IDC:

    • Panduit
    • CommScope
    • Sunbird Software
    • FNT
    • ABB
    • Siemens
    • Cormant
    • Device42
    • FieldView
    • CA Technologies
    • RF Code

    Modius is a lone vendor in the “contenders” category.

    Market Expected to Get Close to $1B by 2019

    Compared to revenues raked in by vendors selling other kinds of data center technology, DCIM software isn’t a huge market, but it is growing steadily.

    IDC’s forecast is that it will reach about $576 million in size this year – up from $475 million in 2014. Koppy projects that the market will grow at a compound annual rate of about 16 percent between 2014 and 2019, at which point it will reach $988 million.
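    As a rough sanity check on those figures, the implied compound annual growth rate follows directly from the 2014 and 2019 numbers. The sketch below is illustrative only; the $475 million, $988 million, and five-year window come from the report, and the year-by-year path need not follow a constant rate.

```python
# Back-of-the-envelope check of the growth figures cited above.
start_2014 = 475e6   # 2014 market size, per IDC
end_2019 = 988e6     # 2019 forecast, per IDC
years = 5

cagr = (end_2019 / start_2014) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~15.8%, i.e. "about 16 percent"
```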

    Today, Emerson has the largest share of the DCIM software market, and Schneider has the second largest, according to IDC.

    The Importance of Services

    Providing services around DCIM planning and deployment is a big part of the revenue calculation for DCIM software vendors. Of the $576 million in revenue IDC expects vendors to make this year, $184 million is revenue from services. The proportion remains similar over the next five years in IDC’s projections.

    Domenic Alcaro, VP of mission critical services and software at Schneider, told us in an earlier interview that companies were increasingly realizing just how costly large-scale DCIM implementation can be and coming to terms with those costs. The fact that companies now view a DCIM deployment much like the rollout of any other major enterprise application – an Oracle or SAP implementation, for example, which traditionally requires hiring an entire implementation team – is a sign that the market is maturing, he said.

    Being able to provide a high level of services around DCIM has become necessary for a DCIM software vendor to stay competitive, Koppy said.

    DCIM to Become Key Differentiator for Colos

    One of the most promising market segments for DCIM vendors is data center colocation providers. Koppy expects take-up of DCIM solutions by colo companies to accelerate as they look for ways to stand out in the market.

    Today, some colocation providers are using off-the-shelf DCIM software, and some are taking the “white label” approach, buying commercial solutions but giving them their own brand names and offering DCIM capabilities to customers as a service.

    “Colocation providers are choosing their dance partners now in the DCIM space,” Koppy said. She expects DCIM to become a major differentiator for colos next year.

    Another growing opportunity for DCIM vendors is the expansion of data center capacity into places not traditionally known as major data center markets. These facilities, referred to as “edge” data centers, are built primarily to store digital media content and host cloud services closer to customers in those areas for faster delivery.

    Data center providers want to manage those facilities remotely, and DCIM software that offers such capabilities is finding its way into those sites.

    Pressure on IT Drives DCIM Maturation

    After a period of vagueness and a lot of frustration with DCIM, the market is showing signs of maturation. Data center operators in general have a better understanding of DCIM technologies, Koppy said.

    The market is maturing, and it’s pressure to provide “better, faster, cheaper delivery of IT resources that is really driving that,” she said.

    IT shops recognize that they need better visibility into their data center infrastructure and, ultimately, more automation, and DCIM software, in combination with other management systems, such as IT service management software and building management systems, offers a way to get there.
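    As a loose illustration of the kind of facility-level visibility DCIM tools aggregate, the sketch below computes Power Usage Effectiveness (PUE), a standard data center efficiency metric, from two meter readings. The readings and names are hypothetical, not output from any particular DCIM product.

```python
# Illustrative sketch: a facility-efficiency calculation of the sort a DCIM
# dashboard surfaces. Meter names and values below are hypothetical.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness = total facility power / IT equipment power."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

readings = {"utility_feed_kw": 1200.0, "it_load_kw": 750.0}  # hypothetical samples
print(f"PUE: {pue(readings['utility_feed_kw'], readings['it_load_kw']):.2f}")  # 1.60
```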

    1:00p
    Docker Buys Tutum, a Cloud Service for Container Management

    Docker, the startup behind the popular Linux container format, has acquired Tutum, which provides an easy way to manage containerized applications and deploy them on any cloud infrastructure service or in companies’ own data centers. Terms of the transaction were not disclosed.

    Tutum adds an infrastructure management capability to Docker’s existing tool set that helps developers take an application from development to production. Docker, a company built around open source technology, also expects the cloud service to become a significant new revenue stream, in addition to its existing money-making product called Docker Hub Enterprise.

    Providing Tutum capabilities as a cloud service will be a “central part of our commercial offering,” David Messina, VP of enterprise marketing, said.

    The value of Tutum is its integrated set of capabilities, including networking, storage, monitoring, and scheduling, under one umbrella, Messina explained. While container services by cloud infrastructure giants Amazon, Microsoft, and Google focus on lower-level infrastructure tasks like clustering and scheduling, Tutum’s platform “covers the whole application lifecycle,” he said.

    Besides application management tools for developers, Tutum enables IT operations teams to deploy applications across distributed infrastructure that can consist of multiple cloud services or data centers and move applications from one set of infrastructure to another.
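    For readers unfamiliar with the underlying mechanics, the sketch below shows the single-host container lifecycle that a service like Tutum layers multi-cloud management on top of. It uses Docker's Python SDK (the docker package) rather than Tutum's own API, and the image, container name, and port mapping are illustrative.

```python
# Minimal single-host sketch (not Tutum's API): pull, run, inspect, and tear
# down a containerized app with Docker's Python SDK. Requires a running
# Docker Engine and `pip install docker`.
import docker

client = docker.from_env()

# Pull the image and start a container; image and port mapping are illustrative.
container = client.containers.run(
    "nginx:alpine",
    name="demo-web",
    detach=True,
    ports={"80/tcp": 8080},
)

print(container.status)          # lifecycle state, e.g. "created" or "running"
print(client.containers.list())  # what a per-host ops view would enumerate

# Clean up.
container.stop()
container.remove()
```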

    The service has been running in beta since the second half of 2013 and boasts more than 24,000 beta users today, according to Messina. It will continue to run in beta as Docker integrates it with its products and services.

    The plan is to roll out an integrated offering that covers the entire spectrum, from Docker Hub to Tutum, he said. Once that offering is in general availability, the company will also announce pricing of the service.

    3:00p
    Why Cloud Security is Now Part of the MSP Job Description

    Ellen Rubin is CEO of ClearSky Data.

    Here’s a scene you don’t want to be in: Your client was breached, and it’s too early to know exactly what happened. However, you know the client’s cloud-hosted data is now at risk, and its habit of transferring unencrypted data, despite your warnings against the practice, played a role in the situation. It’s unlikely that this instance was your fault, but you’re not going to be able to stick your head in the sand and claim that protecting this data wasn’t part of your job.

    Today’s managed service providers (MSPs) are adding cloud services to their portfolios to meet their clients’ needs, but despite technological advancements in recent years, every cloud consumption model can expose companies to new risks. To keep your team and your customers safe, consider some of the most common sources of risk and implement these four ways to help clients support safe cloud environments:

    Encryption, Key Management and Operational Controls

    Every enterprise customer operates under some level of compliance as it manages software, conducts operational processes, and physically manages data and infrastructure. These requirements may be industry-specific regulations, mandates from a partner organization, or the company’s own policies. In any case, you must adhere to your clients’ individual preferences and standards as you handle their data and confirm that customer organizations will remain fully in compliance as they use your cloud services. Your MSP team has to remain one step ahead of your customers in upholding best practices and educating itself on industry threats.

    For example, even if your customer doesn’t typically encrypt the majority of its data, encrypting data in transit and at rest should be table stakes for your team. If a client isn’t running an encryption-key management system, he may request that you set one up, and you must be ready to respond to his request in a way that simultaneously protects your team from liability issues and protects your customer from security risks. In other words, clients should maintain exclusive control of their encryption keys – you should never see a customer’s private data. Finally, if metadata and configuration or management data apply to your service, consider how these elements will be encrypted for the customer side and on your team’s behalf.
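    A minimal sketch of what keeping encryption keys exclusively on the client side can look like, using the Fernet recipe from Python's cryptography package. The data and variable names are illustrative, not any particular MSP's workflow.

```python
# Sketch of client-side encryption where the client alone holds the key.
# Requires `pip install cryptography`.
from cryptography.fernet import Fernet

# The client generates and stores this key; the service provider never sees it.
client_key = Fernet.generate_key()
fernet = Fernet(client_key)

plaintext = b"customer record: account 4711, balance 1200.00"  # illustrative data
ciphertext = fernet.encrypt(plaintext)  # all the provider should ever store

# Only the key holder can recover the data.
assert fernet.decrypt(ciphertext) == plaintext
```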

    Invest in Third-Party Audits of Your Services

    It’s impossible to predict when a malicious party will launch an attack on a customer’s data, but this lack of foresight doesn’t mean you can’t identify likely sources of threats and help clients secure private information before attacks wreak havoc. According to Pew Research Center, Internet-connected systems are inviting targets for cyberattacks, but data protection isn’t necessarily top of mind when online applications are being designed. In other instances, hackers frequently gain access to private systems through undetected weak points in security architecture, or additional copies of data that were created for backup or testing.

    An MSP team can locate potential exposures and entry points in clients’ cloud infrastructure by enlisting a third-party organization to run penetration-testing services and regular, independent security audits. Then, use information from the audit to ensure that there are no back doors in the servers, storage, applications or other systems that handle data. Those insights can also inform the firewalls you build, the intellectual property (IP) you add to your service portfolio, and the cloud and network security practices you enlist.

    Build and Nurture an Atmosphere of Trust With Your Customers

    Earning your clients’ trust as you handle their data is a major factor in ensuring security. From development to archives, MSPs should be transparent about data retention and how it’s encrypted. It’s also critical to enact security controls for your staff members, which may include access-permission policies, highly secure access environments and protection processes that follow an employee’s termination. These controls should also be communicated to customers, especially if they’ll play a role in managing the service you provide, or if you’ll need to tie up loose ends when a customer ceases to do business with your team.

    Be Transparent About the Location of Data at All Times

    Throughout the data lifecycle, it’s crucial to be aware of your exposure to liability issues, especially if a situation occurs where a customer will need increased control of the location of its data. For example, when the U.S. government ordered Microsoft to disclose Microsoft Exchange emails hosted in a regional data center in Ireland in 2014, the case raised serious questions in the industry about data ownership. Not only should you offer customers the ability to control which regions and countries host their data, but your MSP team should avoid gaining or retaining access to encryption keys that may cause issues if that data is seized.

    In some cases, enterprises are placing higher expectations on the security practices of their MSPs than they enforce for their internal IT teams. To maintain security as you help customers leverage the cloud, don’t assume your tactics are safe if you’ve never gone through a breach, and don’t wait for news of exposure to motivate your team and your customers into frantic action. Instead, educate yourself about the risks you’re most likely to face, test your strategies and show your clients you can fully protect their data.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    5:19p
    US to Deploy Open Compute-based Supercomputers for Nuclear Security

    In the first publicly disclosed deployment by a government agency of computing hardware based on specs that came out of the Open Compute Project, the Facebook-led open source hardware and data center design initiative, the US Department of Energy has contracted Penguin Computing to install supercomputing systems at three national labs.

    The systems, called Tundra Extreme Scale, will support the work of the National Nuclear Security Administration, whose mission is “enhancing national security through the military application of nuclear science.” Among other things, the NNSA is charged with maintaining safety and reliability of the US nuclear weapons stockpile and drives nuclear propulsion technology for the US Navy.

    Not only will the $39 million deployment of supercomputers at Los Alamos, Sandia, and Lawrence Livermore National Labs be the first publicly known deployment of OCP gear in the public sector, it will also be one of the largest deployments of OCP gear in the world, according to Penguin. The largest deployment is at Facebook’s data centers.

    Another major deployment is the hardware in Rackspace data centers designed by Rackspace to support its cloud services.

    Penguin, headquartered in Silicon Valley, occupies a niche within the OCP hardware market, selling OCP-based high-performance computing systems rather than the simpler “commodity” gear most other vendors in the space, such as Quanta and Hyve, sell.

    The deal shows that companies like IBM and Cray, which have been mainstays in the government HPC market for many years, are facing a major new competitive threat.

    Penguin expects the supercomputers at the national labs to achieve peak performance between 7 and 9 petaflops. One petaflop represents a quadrillion calculations per second.
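    For context, peak figures like these come from simple multiplication: nodes × cores per node × clock rate × floating-point operations per core per cycle. The sketch below uses entirely hypothetical hardware numbers, not the Tundra systems' actual configuration; they merely happen to land in the quoted range.

```python
# Back-of-the-envelope peak-performance arithmetic. All hardware figures are
# hypothetical, not the actual Tundra Extreme Scale configuration.
nodes = 7000
cores_per_node = 36
clock_hz = 2.1e9
flops_per_cycle = 16  # e.g. two 256-bit FMA units per core

peak_flops = nodes * cores_per_node * clock_hz * flops_per_cycle
print(f"Peak: {peak_flops / 1e15:.1f} petaflops")  # ~8.5 petaflops on these assumptions
```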

    The Tianhe-2 supercomputer in China, currently considered the world’s fastest supercomputer, is capable of 33.86 petaflops, according to Top500, the organization that ranks HPC systems biannually. An IBM Blue Gene at DoE’s Argonne National Lab, at 8.59 petaflops, is comparable in performance to the Penguin systems and ranks fifth on the most recent Top500 list.

    Penguin’s Tundra Extreme Scale systems at the three national labs will be powered by Intel Xeon E5-2695 v4 processors.

    6:51p
    Trend Micro Acquires HP Tipping Point to Boost Network Defense


    This post originally appeared at The Var Guy

    Security software vendor Trend Micro has acquired HP’s TippingPoint business in a bid to strengthen its position as a leading security solution provider.

    The $300 million agreement will allow Trend Micro to combine its security services with HP’s network security solutions, which include the company’s next-generation intrusion prevention systems. The deal is expected to help Trend develop a “multi-faceted” approach to security and to cement the company as a leading name in the growing cybersecurity market.

    “Organizations need a layered threat defense working seamlessly across the enterprise to address threats before, during and after an attack,” said Eva Chen, CEO of Trend Micro, in a statement. “This new next-generation network defense solution combines our best-in-class network breach detection system, with proven intrusion prevention and response capabilities from TippingPoint.”

    The acquisition will also give Trend Micro access to HP TippingPoint’s Digital Vaccine LABS (DVLABS) solution for real-time threat intelligence.

    TippingPoint’s Zero Day Initiative will be used in tandem with the Trend Micro Smart Protection Network to speed malware detection times.

    The acquisition is the latest development in Trend Micro and HP’s partnership, with the two companies having worked together since 2014, according to the announcement. Following the acquisition, both companies said they will continue to work together and will form a strategic partnership with TippingPoint around several key business areas, including resale, managed services and OEM activities.

    The deal is expected to close later this year, but no official date has been announced.

    This first ran at http://thevarguy.com/information-technology-merger-and-acquistion-news/102115/trend-micro-acquires-hp-tipping-point-boost-networ

