Data Center Knowledge | News and analysis for the data center industry
 

Wednesday, June 22nd, 2016

    12:00p
    Cisco’s Tetration Brings Data Center Automation to Legacy Apps

    The bet is this: Not every data center will modernize the way it automates and orchestrates applications by completely renovating its infrastructure. Many enterprises don’t appear to have that option, regardless of whether they have thousands of employees or billions in revenue. Last Wednesday, Cisco announced what is being marketed, at least in the early going, as an analytics platform that leverages artificial intelligence to optimize the interactions between applications at all levels of the data center by smoothing out their data flows.

    It’s called Tetration, and it will be made available in July. It’s an odd name, to say the least. Behind it is a concept that takes a bit of extra concentration to digest: Most applications today are legacy software running in slightly more modern contexts, usually virtual machines. Keeping those applications running smoothly in data centers requires network monitoring, automation, and security, and the dynamics behind all three are beyond the application’s comprehension and control, and to some extent beyond the ability of the infrastructure itself.

    Tetration applies a concept that telcos already use to keep their voice and data traffic flowing — real-time network analytics — to oversee network operations at very large scale and to apply remediation techniques that optimize applications to a level in the same ballpark as full-scale containerization, Tetration’s creator, Cisco Fellow and network engineer Navindra Yadav, told Data Center Knowledge.

    Detour to Nirvana Land

    According to a Cisco promotional video, Tetration will apply machine learning to analyze, in real-time, two principal categories: the data flow of IP packets between switches and the dependencies between software components running in virtual machines. They’re two very different aspects of the data center — enough to make one wonder whether Tetration is supposed to be an infrastructure monitor or a services monitor.

    But don’t wonder about it too long, warns Cisco’s first professionally produced commercial for Tetration. “Changing the infrastructure or adding a new security policy might just break everything,” states the commercial’s pitchman, before adding a CGI cube to a symbolic data center that then collapses before him like dominoes in a windstorm.

    It’s a message that plays to customers’ fears. If those fears are indeed genuine, they may arise from what Cisco’s Yadav described to us as a lack of comprehension. “If you look at the marketplace, 90 to 95 percent [of data center administrators] don’t know the dependencies in their applications,” said Yadav.

    Organizations that have shifted to a containerized infrastructure, such as Docker; that have in turn adopted infrastructure automation such as Chef, Puppet, or Ansible; and that have made the move to an orchestration platform such as Kubernetes or Mesosphere’s DC/OS, are what Yadav calls “the bleeding edge.” These are data centers whose developers are building their own Docker packages, so they know what their software dependencies are and where they lie. And their data centers now run in what Yadav unabashedly describes as “Nirvana land.”

    But that’s the five to ten percent that fall outside Cisco’s target market for Tetration. “For a majority of legacy applications that have been deployed in the enterprise, they don’t have that mapping,” he said.

    Crawling in the Dark

    Here is where Cisco’s acquisition last March of orchestration platform maker CliQr Technologies comes into play. As Yadav described it, there are two types of dependency maps that reveal the interrelationships between software components: policy mapping at Layer 2 of the network and inter-application communication higher up the stack. These relationships are managed by security policies, and although tight control over those policies may result in secure networks, it may also lead to slower data centers.

    In a company blog post Wednesday, Cisco senior manager for enterprise solutions marketing Craig Huitema acknowledged the weight of this dilemma — is it an infrastructure monitor or a services monitor? — by citing the lyrics of “Crawling in the Dark,” a song by a band called Hoobastank (here we go again with the strange names): “Show me what it’s for | Make me understand it | I’ve been crawling in the dark | Looking for the answer | Is there something more | Than what I’ve been handed?”

    A software dependency map and a network policy map, as Cisco’s Yadav described them, end up being the same map. So addressing the security issue is effectively the same thing as addressing the optimization issue. This unified map is the pattern which Tetration learns: The behavior of interdependent software components is directly correlated to the performance of servers, he said.
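
    Conceptually, the overlap Yadav describes can be sketched in a few lines of code: the graph learned from observed traffic can be read either as a dependency map or as an allow-list policy. The sketch below only illustrates that idea and is not Cisco’s implementation; the service names, ports, and function names are all hypothetical.

    # Minimal sketch: a dependency map learned from flow telemetry doubles as
    # a network policy map. Hypothetical names; not Cisco's code.
    from collections import defaultdict

    # Invented flow records: (source service, destination service, destination port)
    observed_flows = [
        ("web-frontend", "order-service", 8443),
        ("order-service", "postgres", 5432),
        ("web-frontend", "order-service", 8443),
    ]

    def learn_dependency_map(flows):
        """Collapse raw flow records into a service-to-service dependency graph."""
        deps = defaultdict(set)
        for src, dst, port in flows:
            deps[src].add((dst, port))
        return deps

    def policy_from_dependencies(deps):
        """Read the same graph as an allow-list: permit only the learned edges."""
        return [
            {"allow": {"from": src, "to": dst, "port": port}}
            for src, edges in deps.items()
            for dst, port in sorted(edges)
        ]

    dependency_map = learn_dependency_map(observed_flows)
    print(policy_from_dependencies(dependency_map))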

    So does the machine learning model need to keep re-learning such performance patterns once an application has been deployed?

    “We usually do not,” he admitted. “Once you’ve deployed that application, unless you’re trying to do an upgrade, the behavior changes based on the workload — which usually, once we’ve stabilized the model, becomes more static. Then we’re transitioning into the security space, where we are seeing whether the behavior is evolving over its baseline. That’s the security signal that we’re using.”

    As a static pattern, Yadav said, the application behavior becomes a learned configuration that CliQr can put to use in automating that application going forward. That includes a provisioning system that would, presumably, give users an Amazon-like way of dialing up applications on demand, though he declined to provide specifics here.

    Next, this pattern is transferred to the Tetration appliance, which can lead the application on an alternate route to Yadav’s Nirvana land. But when the pattern strays outside its learned boundaries, even though the application is not being upgraded, Tetration treats that breach as a “security signal,” to which it responds first by ascertaining whether the application, or parts of it, have “morphed” into something else, perhaps by way of malware injection. Again, Yadav declined to go into further detail about what happens at this point, citing Cisco proprietary concerns.
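
    The “security signal” idea can also be illustrated with a hedged sketch: once behavior has stabilized into a baseline, any flow that falls outside it gets flagged. Again, the flows and names below are invented, and this is not Tetration’s actual algorithm.

    # Hypothetical sketch of using a learned baseline as a security signal.
    baseline = {
        ("web-frontend", "order-service", 8443),
        ("order-service", "postgres", 5432),
    }

    def security_signals(live_flows, baseline):
        """Return any flows that stray outside the learned pattern."""
        return [flow for flow in live_flows if flow not in baseline]

    # A previously unseen outbound connection from the database tier is flagged,
    # prompting a check on whether the application has "morphed" somehow.
    print(security_signals([("postgres", "203.0.113.9", 443)], baseline))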

    Why wouldn’t it be in Cisco’s best interest, we asked, to lead data centers into Nirvana land by giving them the tools and expertise they need to automate and orchestrate applications on newer, more flexible infrastructure, involving SDN and containerization, for example, rather than selling them an appliance that helps them cope with not yet being in Nirvana land?

    “The metadata that we learn? We feed it to CliQr,” he responded. “And CliQr is the one that’s doing this transformation, taking these guys into containers and all the configs — this is what CliQr really does. And even these apps go into nirvana-land. We’re not holding them back; we’re doing automatic discovery. Rather than using humans to go through each line of code and figure out dependencies, Tetration does that automatic discovery, and then gives it to CliQr. And then CliQr turns around and automates provisioning and orchestration.”

    De-mystification

    So Cisco’s value proposition is this: Tetration can ascertain the proper infrastructure configuration data from any application that has already settled into a regular rhythm. It learns this data from various agents, including software injected into virtual machines, as well as hardware sensors embedded in Cisco switches, including its Nexus 9200-X and Nexus 9300-EX. Tetration then feeds that data into CliQr, which responds by generating the automation scripts necessary for that application to continue to run properly, or perhaps even more optimally, in the data center. That same data, meanwhile, is used by Tetration to generate security alerts when an application behaves outside its pattern for reasons that can’t be easily explained away.
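
    Read as a pipeline, that value proposition might be outlined roughly as follows. This is a hedged sketch with invented function names, showing only the shape of the flow from sensors to automation and alerting, not how Tetration or CliQr are actually built.

    # Hedged outline of the telemetry-to-automation pipeline; all names are hypothetical.
    def collect_telemetry(vm_agents, switch_sensors):
        """Merge flow records from software agents in VMs and from switch sensors."""
        records = []
        for source in list(vm_agents) + list(switch_sensors):
            records.extend(source())
        return records

    def run_cycle(vm_agents, switch_sensors, learn, orchestrate, alert, baseline=None):
        telemetry = collect_telemetry(vm_agents, switch_sensors)
        app_map = learn(telemetry)       # the Tetration role: learn the application map
        orchestrate(app_map)             # the CliQr role: generate automation scripts
        if baseline is not None and app_map != baseline:
            alert("application behaving outside its learned pattern")
        return app_map

    # Toy usage with stubbed inputs:
    run_cycle(
        vm_agents=[lambda: [("web-frontend", "order-service", 8443)]],
        switch_sensors=[lambda: [("order-service", "postgres", 5432)]],
        learn=lambda records: sorted(set(records)),
        orchestrate=lambda app_map: None,
        alert=print,
    )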

    It’s a complex proposition, which may be the reason why Cisco’s marketing softens the focus considerably, referring to threats everywhere that might just break everything, and casting its solution as a “time machine” straight out of H. G. Wells. But if Cisco has learned anything about its customers’ behavior in the last decade — even if algorithms were used in the process — it knows that buyers are responding less and less to fuzzy, ubiquitous threats, and more to specific strategies with definitive payoffs. We’ll know whether Tetration is reaching these customers if we see the fog starting to lift, after it’s made generally available next month.

    6:10p
    Slow Waning of the Enterprise Data Center, in Numbers

    While enterprise data centers aren’t going away completely any time soon, the amount of money companies are investing in these facilities and the IT gear they house is declining quickly. More and more workloads are moving to the cloud or into facilities operated by data center providers of various kinds, while corporate IT budgets, with a few exceptions, are flat or declining.

    The latest data center industry survey by the Uptime Institute shows that the shift of enterprise IT workloads from corporate data centers to various versions of outsourced infrastructure is happening faster than previously expected. Survey data from previous years “suggested that the shift to cloud computing would be gradual for conservative enterprise IT organizations,” authors of the report on the latest survey wrote. “However, this year’s data indicate that those assumptions may be incorrect.”

    See also: Top Cloud Providers Made $11B on IaaS in 2015, but It’s Only the Beginning

    About 1,000 data center operators and IT practitioners responded to the survey. They were split about equally between executives, IT management, and IT facilities staff.

    Here are the key numbers from the Uptime Institute Data Center Industry Survey 2016:

    71%: Estimated percentage of all IT assets currently sitting in enterprise data centers.

    20%: Estimated percentage of all IT assets currently sitting in colocation data centers.

    9%: Estimated percentage of all IT assets currently deployed in the cloud.

    50%: Portion of enterprise IT budgets that have been either flat or shrinking over the last five years. Only about 10 percent have seen their budgets increase meaningfully, while the rest have seen modest increases.

    55%: Portion of enterprise server footprints that have been either flat or shrinking over the last five years.

    ~50%: Portion of enterprise IT shops that said they were planning to cut spending on Hewlett Packard Enterprise and Dell equipment this year. HPE ProLiant and Dell PowerEdge are the two most critical server platforms in enterprise IT. Some of the spending will be redirected to converged hardware platforms. (451 Research, Uptime’s sister company)

    30%: Portion of companies that said they would cut spending on HP Enterprise servers by more than 50% this year. (451 Research)

    ~50%: Percentage of senior executives who expect that the majority of their IT workloads will be hosted in colocation data centers or in the cloud. Of those, 70 percent say this will happen over the next four years, and 23 percent say it will happen by next year.

    See also: How Long Will the Cloud Data Center Land Grab Last?

    Changes in enterprise IT spending patterns are having an effect on data center providers. Here’s how colocation providers’ budgets have changed since 2014, and how the number of colocation and enterprise data center builds has changed:

    [Chart: colocation provider budget changes and data center builds since 2014. Source: Uptime Institute Data Center Industry Survey 2016]

    The numbers on builds are especially telling. They are based on responses to a question about whether the company had built a new data center in the previous 12 months.

    7:53p
    Why Keep the Enterprise Data Center?

    Big cloud providers like Amazon Web Services continue forecasting doom and gloom for the enterprise data center. The most recent example was AWS CEO Andy Jassy’s keynote at the cloud provider’s conference in Washington yesterday, in which he said few companies will own data centers in the future.

    That message is of course self-serving, and it’s a message AWS has been hammering for years. The reality is that while the amount of investment companies are making in their own internal enterprise data centers is either flat or shrinking, no one really knows whether that on-premises infrastructure model will go away completely. In fact, many signs today indicate that the hybrid infrastructure model, where companies use a mix of on-premises, cloud, and outsourced data center services, is sticking around for the foreseeable future.

    As Apple’s former data center network manager, Jason Forrester, recently put it in an interview with Fortune, companies need “complete command and control” of the infrastructure that supports their core strategic applications, such as Apple’s Siri, Uber’s mapping service, and company accounting or inventory tracking systems. “It’s hard to have that command and control if you can’t even tour your public cloud data center,” he said.

    The reality is that most companies are keeping at least some data center capacity in-house, while outsourcing the complicated and expensive task of managing infrastructure for non-core applications, such as email and web apps, to the cloud. “Most enterprise applications are highly customized for the company’s needs, which means they don’t fit neatly into the public cloud mold,” Forrester said.

    See also: How Juniper IT Went from 18 Data Centers to One

    Recent survey data illustrates clearly that while a lot of capacity is being outsourced, a lot of it is also staying put. Results of this year’s data center industry survey by the Uptime Institute, published today, show that 50 percent of enterprise IT budgets have been either flat or shrinking over the last five years, and that 55 percent of enterprise server footprints have been either flat or shrinking as well.

    At the moment, however, more than 70 percent of enterprise workloads are still running in corporate data centers, according to Uptime. Colocation data centers host 20 percent, and only nine percent run in the cloud.

    Read more: Slow Waning of the Enterprise Data Center, in Numbers

    Enterprise data center construction is slowing, but it’s not disappearing. In 2014, 18 percent of the enterprise IT respondents to the survey said their companies had built a new data center within the previous 12 months. About 15 percent said so in 2015, and roughly the same share, about 15 percent, said so this year.

    It’s also important to consider that a lot of the corporate data center requirements are going into facilities leased from wholesale data center providers. This option gives companies full control of their infrastructure without having to own additional real-estate assets.

    It is into this type of data center that a lot of the core application workloads of the likes of Uber and Apple have been going. Apple, for example, signed at least two 6MW data center leases – in the Northern Virginia and Chicago markets – with wholesale provider DuPont Fabros Technology last year. The same year, Uber leased a 6MW data center from Digital Realty Trust in Dallas.

    Read more: Who Leased the Most Data Center Space in 2015?

    In another example, enterprise software giant SAP recently bought a piece of land in Colorado from T5 Data Centers where it plans to build a data center of its own.

    It’s undeniable that the total portion of the enterprise IT budget that goes to the corporate data center is shrinking. Whether it will eventually reach ‘zero,’ however, remains a huge unknown.

    See also: Top Cloud Providers Made $11B on IaaS in 2015, but It’s Only the Beginning

    Corrected: A previous version of this article incorrectly stated that SAP had leased data center space from T5 Data Centers. The company actually bought land on a T5 data center campus where it plans to build a data center.

