Data Center Knowledge | News and analysis for the data center industry

Tuesday, June 28th, 2016

    12:00p
    Yahoo Wants to Sell Its ‘Chicken Coop’ Data Center Designs

    Yahoo garnered a lot of attention in 2009, when it announced the unusual design of a data center it was building in Lockport, New York. Shaped like a chicken coop, the facility would rely primarily on outside air for cooling, use a flywheel-based energy storage system, and have an annualized PUE (Power Usage Effectiveness) under 1.1, which was better than the average data center PUE Google was reporting at the time.
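
    For context, PUE is the ratio of total facility energy to the energy delivered to the IT equipment, so a value near 1.0 means nearly all power goes to servers rather than to cooling and power distribution. The short sketch below illustrates the arithmetic with made-up numbers; the figures are assumptions for illustration, not Yahoo's actual consumption data.

        # Illustrative PUE arithmetic (hypothetical numbers, not Yahoo's data).
        # PUE = total facility energy / energy delivered to IT equipment.
        it_energy_kwh = 10_000_000        # annual energy used by servers, storage, network
        overhead_kwh = 900_000            # cooling, power distribution, lighting
        total_facility_kwh = it_energy_kwh + overhead_kwh

        pue = total_facility_kwh / it_energy_kwh
        print(f"Annualized PUE: {pue:.2f}")  # -> 1.09, i.e. under 1.1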

    Now, the patents and patent applications describing the design of the Yahoo Computing Coop data center, also known as the Yahoo Chicken Coop, are for sale. The innovations are part of the trove of thousands of patents and applications the troubled internet giant is hoping to sell. Yahoo expects to make more than $1 billion from the sale, the Wall Street Journal reported, citing anonymous sources.

    The trove of patents is only one of the assets Yahoo is selling as it continues to wrestle with shrinking revenue. It has been soliciting bids for many of its assets, including its core internet business, according to news reports, with Verizon Communications and AT&T reportedly the top contenders.

    See also: Yahoo Data Center Team Staying “Heads-Down” Amid Business Turmoil

    ‘Non-Core’ Patents

    Technologies described in the portfolio of intellectual property that’s being marketed are what the company considers “non-core,” a Yahoo spokesperson told Data Center Knowledge via email. “Yahoo is exploring the divestiture of a portfolio of more than 3,000 of our non-core patents and pending applications covering a wide range of technology, including early-filed internet search, advertising, and cloud technology.”

    Also on the list are social networking, messaging, mobile, video, and data center cooling patents and patent applications, she said.

    Yahoo plans to structure the potential transaction in a way that will allow it to continue using innovations in the portfolio, including its data center designs, by licensing them from the future buyer.

    The Yahoo Computing Coop has been a key part of the company’s data center strategy in the US, and it plans to continue using it and iterating on it in the future.

    “We’ll continue to have access to the Chicken Coop design through our license-back and will look for opportunities to continue to leverage that incredibly efficient design going forward,” the spokesperson said. “Equally, we see value in sharing our data center cooling technology patents as part of the portfolio that we’re divesting, so architectural design and construction firms can leverage that patented technology.”

    Patent applications filed by Yahoo describe key elements of the Yahoo Computing Coop data center design.

    Among inventors listed on the applications are Mozan Totani, Yahoo’s manager of infrastructure strategy and data center development, and Scott Noteboom, who led the company’s global data center operations when it kicked off the Computing Coop project. Noteboom, who left Yahoo in 2011 to lead Apple’s infrastructure strategy, now runs his own Internet of Things startup called LitBit.

    Read more: This IoT Startup Wants to Break Down Data Center Silos

    Minimalist Data Center Design

    The Computing Coop design maximizes use of outside air for cooling and supplements it with evaporative cooling when necessary instead of relying on power-hungry mechanical chillers. Cold air enters the building through louvers on one of the walls and fills a plenum, where it is brought to appropriate temperature and humidity.

    The design reduces energy use further by omitting air handlers, relying instead on server fans and a minimal number of facility fans to direct airflow through the servers and into a central “chimney,” through which warm air rises by natural convection into the penthouse that gives the building its chicken-coop look. Once at the top, the air is either drawn back in for recirculation or pushed outside.
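
    As a rough illustration of the economizer behavior described above, the sketch below picks a cooling mode from outside-air conditions. The thresholds and function name are assumptions made for this example and are not taken from Yahoo's patent filings.

        # Hypothetical sketch of the free-cooling decision described above.
        # Temperature and humidity thresholds are illustrative, not Yahoo's.
        def choose_cooling_mode(outside_temp_c: float, outside_rh_pct: float) -> str:
            if outside_temp_c <= 24 and outside_rh_pct <= 80:
                return "outside-air"         # louvers open, air conditioned in the plenum
            if outside_temp_c <= 32:
                return "evaporative-assist"  # evaporative cooling tempers the intake air
            return "recirculate-and-trim"    # recirculate warm air and add trim cooling

        print(choose_cooling_mode(18.0, 55.0))  # -> outside-air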

    Yahoo has used the design to build data centers in Lockport, New York, and Quincy, Washington. The company also owns and operates data centers in La Vista, Nebraska (near Omaha), and Singapore, and uses leased data center facilities elsewhere around the world.

    3:00p
    Cybersecurity Mania Subsides and Ushers in a Shakeout: Gadfly

    (Bloomberg) — There’s been no better magic word than “cybersecurity” to crack open technology investors’ wallets.

    Corporate executives feared their companies could be the next victim of embarrassing digital incursions like the ones at Target, Sony and the US Office of Personnel Management, so they earmarked big budgets for technology defenses. The rising tide of cyberattacks lifted all the companies selling cyber life preservers.

    But now the cybersecurity industry has reached a critical crossroads. The industry’s relative newcomers such as FireEye and Bromium are under pressure to prove they can be lasting winners or sell out as the fragmented industry gives way to consolidation. Members of the computer security old guard including Symantec, meanwhile, are scrambling to remake themselves and reduce reliance on technologies like antivirus software that are out of step with modern hacking threats.

    Both trends have spurred a flurry of sale activity, both realized and potential. This month, the private equity owners of security firm Blue Coat sold the company on the eve of an IPO at a price that roughly doubled their money. Intel is pursuing a sale of its security division formerly known as McAfee, the Financial Times reported Monday. One of the most widely publicized of the new crop of cybersecurity startups, FireEye, has fielded takeover offers, according to Bloomberg News.

    See also: Number of Costly DoS-Related Data Center Outages Rising

    This flurry is a sign the cybersecurity mania has peaked. If Intel sells McAfee at roughly the same price it paid six years ago — a possibility floated by the FT — the chip maker must be hoping it can ditch the onetime computer-security pioneer before the cybersecurity bubble completely deflates.

    For a time, though, the money flowed nearly indiscriminately in the hunt for solutions to increasingly sophisticated cyberattacks. Investments in private cybersecurity startups more than tripled over five years, from $1.1 billion in 2011 to $3.8 billion in 2015, according to research firm CB Insights.

    Members of the cybersecurity youth movement such as FireEye, Tanium and Skybox Security appealed to investors and customers because they took a different approach to dueling the cyber bad guys. The newer philosophy is that it’s nearly impossible to prevent wily attackers from breaching corporate computing networks, as antivirus software sought to do for years. Instead, the younger companies emphasize technology that can quickly identify where the breaches are and limit the damage before it spreads to essential corporate information and causes Sony-level havoc.

    But it got tough to tell the legitimate technologies from similar-sounding pitches made by “me too” types. Investors have started to rethink their indiscriminate bets.

    At a recent Bloomberg technology conference, Roelof Botha, a partner at prominent Silicon Valley investment firm Sequoia, lamented the excessive hype and funding of young cybersecurity firms. “It’s an area where there are just too many companies, too many niche products that don’t deserve to be standalone companies,” he said.

    He compared cybersecurity to the markets for disk drives and e-commerce, both onetime fragmented industries where only a handful of companies thrived. (Not all panelists at the Bloomberg conference agreed with Botha’s pessimism.)

    In the public markets, cybersecurity is showing signs of strain. An index of cybersecurity companies has declined about 18 percent in the last year, according to Bloomberg data, compared with a 3 percent decline for the S&P 500. Dell had a tough time selling investors on the IPO of its cybersecurity spinoff, SecureWorks, although the firm’s lack of profits may be the biggest factor in its market struggles. FireEye has shed about 81 percent of its stock market value since its peak of more than $13 billion in early 2014.

    This is the stark new reality for cybersecurity firms: Companies with last-generation technology such as Blue Coat will flip to new owners content with steady cash flow rather than hyperbolic growth. Members of the new guard will need to grit their teeth and sell to Cisco, IBM or other tech industry consolidators, and maybe accept prices below their lofty peaks. Hackers aren’t going away, and companies still need to fight off cyberattacks. But for the cybersecurity industry, the moment of reckoning is here.

    This column does not necessarily reflect the opinion of Bloomberg LP and its owners or the opinion of Data Center Knowledge and its owners.

    5:01p
    HPE Wants to Give Data Center Automation a Fresh Start

    When Hewlett Packard Enterprise announced its Data Center Automation Suite [PDF] a little over a year ago, it was with the promise of providing tools for automating provisioning, patching, and compliance across “the full stack.” On Tuesday, the company gave the idea another try, indicating that it’s learned some things about heterogeneous data centers over the past year.

    HPE appears to be very mindful that data centers are already deploying open source automation tools such as Chef and Puppet. Now that more data centers are moving containerized environments into production, the tools used by IT or DevOps professionals and those used by software developers suddenly find themselves alongside one another.

    According to Nimish Shelat, HPE’s marketing manager for Data Center Automation, the service will now work to absorb the automation scripts that both departments are using — and may still continue to use — into a single environment under a newly reinforced, unified portal. Mind you, DCA has been integrating Chef recipes and Puppet scripts already, but HPE wants to give the platform a fresh start, beginning with how it integrates into existing data center environments.

    See also: Why HPE Chose to Ship Docker in All Its Servers

    “Despite the fact that [data center operators] have investments in place,” Shelat told Data Center Knowledge, “they are realizing that, as the complexity and scale of their environment grows, the tools they have invested in are not enough. Some of them are not heterogeneous or multi-vendor in nature, and as a result, they end up with multiple tools they have to deal with to manage their environments.”

    Same Tasks, Different Tools

    Indeed, there are configuration management platforms such as Chef and Puppet, container orchestration tools such as Kubernetes and Mesosphere DC/OS, and application performance monitoring tools from New Relic and Dynatrace, all of which claim to provide some aspect of that “single pane of glass” for data center management and automation. There are enough of these single panes of glass, it seems, that stacked end-to-end they could form their own skyscraper.

    See also: Cisco Tetration Brings Data Center Automation to Legacy Apps

    But as subcultures form within organizations around the use and maintenance of these individual tools, HPE argues, the job of integrating tasks across departments in an organization ends up being done manually. Carrying workloads across silos, remarked Shelat, introduces innumerable opportunities for human error.

    “We have realized there is a common pattern,” he said. “Server folks tend to do provisioning, patching, and compliance; network folks tend to do provisioning, patching, and compliance; database and middleware folks are doing the same. When you talk to all of them, they want to bring automation into their lifecycles, so they can do things more quickly; and they all desire a standardized, consistent way of doing things.”

    See also: HPE Rethinks Enterprise Computing

    Above and Beyond

    In framing the present objectives for Data Center Automation, Shelat painted a mental picture of an automation layer sitting above the level of task-oriented automation, where he placed Chef and Puppet. This upper layer holds the “standardized, consistent” method to which he referred: an oversight process composed of flowcharts that can be assembled visually in a drag-and-drop development environment. Each flowchart represents a broad automation function, such as provisioning a service, patching an application currently in production, or validating a process against compliance rules.

    Within each of these flowcharts, an automation process — which may include a Chef or Puppet script — may be incorporated as what he called an “atomic element.”

    “We are not saying it will be one or the other; there will be certain lines of business and certain areas of IT that are adopting open source technologies like Chef and Puppet,” conceded Shelat, “or that are trying them out in a certain area. In which case, despite the diversity of investments that might exist in the environment, we can package up the automation that is created by Chef and Puppet, through their scripts and recipes, and adopt it and integrate it into the automation that’s delivered through Data Center Automation. You can do the design-level work at the Chef and Puppet level, but the execution can be triggered through DCA.”
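
    HPE has not published the details of those interfaces here, so the sketch below only models the idea Shelat describes: a flow assembled from “atomic elements,” one of which wraps an existing Chef recipe (or Puppet script) and triggers it as a single step. All class and function names are hypothetical, not HPE's API.

        # Hypothetical model of a DCA-style flow built from "atomic elements".
        # Flow, AtomicElement, and chef_recipe_element are illustrative names, not HPE's API.
        from typing import Callable, List

        class AtomicElement:
            def __init__(self, name: str, action: Callable[[], None]):
                self.name, self.action = name, action

            def run(self) -> None:
                print(f"running element: {self.name}")
                self.action()

        def chef_recipe_element(recipe: str) -> AtomicElement:
            # Wrap an existing Chef recipe so the flow can trigger it as one step.
            return AtomicElement(f"chef:{recipe}", lambda: print(f"  converging {recipe}"))

        class Flow:
            def __init__(self, name: str, elements: List[AtomicElement]):
                self.name, self.elements = name, elements

            def execute(self) -> None:
                for element in self.elements:  # the flowchart runs element by element
                    element.run()

        provision = Flow("provision-web-tier", [
            AtomicElement("allocate-vm", lambda: print("  allocating VM")),
            chef_recipe_element("webserver::default"),
            AtomicElement("compliance-check", lambda: print("  validating against policy")),
        ])
        provision.execute()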

    Compared with Jenkins

    Puppet and Chef are configuration management tools. A Chef recipe or a Puppet manifest specifies the infrastructure resources an application needs to run as intended, whether in the client's data center or on a cloud platform. A CI/CD platform such as Jenkins can stage these items as units in a pipeline of continuous integration, which is automation one layer above configuration management. So HPE is evidently positioning Data Center Automation as an alternative to Jenkins. Instead of pipelines, DCA offers flowcharts, which may be more familiar, or easier to digest, for IT admins who see automation as something less intricate than a mammoth cluster of pipelines.

    Jenkins employs a three-stage pipelining process; DCA now employs a four-stage process, categorized as “Design,” “Deploy,” “Run,” and “Report.” An automation task, or “flow,” can be tested and simulated in the “Design” environment. Once it is promoted to the “Deploy” stage, it can be triggered by another flow, or in response to a service request or an incident report, Shelat said. An audit log is generated continuously, and end-of-day analytics let admins see how flows are being triggered and how they perform in response.
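
    A minimal sketch of that four-stage lifecycle follows, assuming a simple state machine. Only the stage names and the triggering behavior come from the article; the class, its methods, and the example identifiers are hypothetical.

        # Sketch of the Design -> Deploy -> Run -> Report lifecycle described above.
        # Stage names come from the article; everything else is assumed for illustration.
        STAGES = ["Design", "Deploy", "Run", "Report"]

        class FlowLifecycle:
            def __init__(self, name: str):
                self.name, self.stage, self.audit_log = name, "Design", []

            def promote(self) -> None:
                next_stage = STAGES[STAGES.index(self.stage) + 1]
                self.audit_log.append(f"{self.name}: {self.stage} -> {next_stage}")
                self.stage = next_stage

            def trigger(self, source: str) -> None:
                # A deployed flow can be fired by another flow, a service request,
                # or an incident report.
                assert self.stage in ("Deploy", "Run"), "promote the flow out of Design first"
                self.stage = "Run"
                self.audit_log.append(f"{self.name}: triggered by {source}")

        flow = FlowLifecycle("patch-app-servers")
        flow.promote()                   # Design -> Deploy after testing and simulation
        flow.trigger("service request")  # hypothetical trigger source
        flow.promote()                   # Run -> Report for end-of-day analytics
        print("\n".join(flow.audit_log))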

    One of these analytics reports, he said, is the Return on Investment report, which estimates the amount of human work time reclaimed through automated responses over a given period. Those reports can be aligned with goals declared beforehand, he said, much as a Google Analytics report shows how the wording and placement of an online ad campaign measures up against revenue and viewership goals.

    “When they prioritize the top things they want to automate,” said Shelat, “they are most likely the time-bound things, or the areas they tend to slow down the most. Provisioning of servers could take several days. Then they have an idea of how much time they will save, when they have this entire thing automated and integrated end-to-end.”
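
    As a rough illustration of that ROI arithmetic, under assumed numbers: if a task that took hours by hand now runs as an automated flow many times a month, the reclaimed hours accumulate quickly. All figures below are hypothetical.

        # Hypothetical ROI arithmetic: human time reclaimed by an automated flow.
        manual_minutes_per_run = 240       # assumed time to provision a server by hand
        automated_minutes_per_run = 15     # assumed remaining human effort per automated run
        runs_per_month = 50

        hours_saved = (manual_minutes_per_run - automated_minutes_per_run) * runs_per_month / 60
        print(f"Estimated hours reclaimed per month: {hours_saved:.0f}")  # about 188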

    5:30p
    After Brexit Vote, US Tech Giants Face Splintered Digital Future in Europe

    (Bloomberg) — US technology businesses like Alphabet and Facebook toil to make the real world as borderless and global as the digital worlds they create. The physical version just got a lot messier in Europe, the second-largest market for these giant companies.

    Britain voted on Thursday to leave the European Union, fracturing what was slowly becoming a single digital market into two, and possibly more, jurisdictions for technology issues ranging from data privacy and competition to tax and recruiting.

    “After the vote in the UK, we are obviously entering a moment of some uncertainty and concern,” Facebook public policy executive Joel Kaplan said at the Computer and Communications Industry Association’s Transatlantic Internet Policy Reception in Washington on Monday.

    Exhibit one in the new European reality for US tech giants is data privacy, an area where Facebook has already sparred with regulators. These companies worked hard to get a single set of rules for data protection and privacy across the region. After the UK leaves, the country will likely have to create its own set of privacy regulations, potentially driving up compliance costs associated with moving consumer information across borders and complicating the operation of data centers.

    See also: Digital Realty Buys Eight Equinix Data Centers in Europe

    That could influence consumer data collection used in advertising on Alphabet’s Google search engine and Facebook’s social media pages, the shipment of a book ordered from a retailer on Amazon’s site in Germany to a shopper in the UK, or the management of data centers that power Microsoft’s cloud computing services.

    Take the EU’s new General Data Protection Regulation, a sweeping legal framework approved in April and set to take effect in May 2018. When the UK leaves, GDPR will no longer apply to the country. Experts expect the UK to adopt similar rules to GDPR in its stead, though a separate set of rules—no matter how similar—will cause headaches for US tech companies seeking simplicity as they operate across Europe.

    “They want one law, one framework, one consistent approach,” said Eduardo Ustaran, a London-based lawyer at Hogan Lovells specializing in privacy law. “The lack of harmonization makes more cost than regulation itself.”

    See also: Dutch Data Center Group Says Draft Privacy Shield Weak

    Ustaran said he’s been trying to calm business clients down since the vote. “There is very much a sense of panic or at least concern,” he said.

    If the UK wants to take part in the free flow of data across European borders after leaving the Union, it will have to adopt data-protection standards that the EU deems “adequate” in meeting the same standards as GDPR. The UK Information Commissioner’s Office, which oversees the country’s data processing, said Friday that “international consistency around data protection laws and rights is crucial, both to businesses and organisations, and to consumers and citizens.”

    Unless the UK harmonizes with the new EU rules, US companies will lose the ability to process European consumer data in the UK, said Jane Finlayson-Brown, a partner at the law firm Allen & Overy in London. This could impact companies that want to use data centers in the UK—even as backups if their data centers in other EU countries go down.

    See also: Interxion Expands as European Data Center Market Heats Up

    Cloud-computing businesses, such as Amazon Web Services and Microsoft Azure, are particularly vulnerable to the complications of a split regulatory region. Cloud companies function more efficiently when they can easily shift loads from one data center to another. Restricting the types of data that can be stored in specific locations hampers that flexibility.
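
    A minimal sketch of the kind of placement check a cloud scheduler might need once UK and EU data rules diverge appears below; the region names and the adequacy rule are assumptions made purely for illustration, not any provider's actual policy.

        # Hypothetical data-residency check for shifting a workload between regions.
        EU_REGIONS = {"eu-frankfurt", "eu-dublin"}
        UK_REGIONS = {"uk-london"}

        def allowed_regions(data_origin: str, uk_deemed_adequate: bool) -> set:
            # EU personal data may be placed in UK data centers only if the EU deems
            # the UK's post-Brexit rules "adequate".
            if data_origin == "EU":
                return EU_REGIONS | (UK_REGIONS if uk_deemed_adequate else set())
            return EU_REGIONS | UK_REGIONS

        print(allowed_regions("EU", uk_deemed_adequate=False))  # UK sites excluded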

    “A lot of companies are saying, ‘Help. We don’t know what to do. We’re completely dependent on free flow of data,'” said Todd Ruback, chief privacy officer at Ghostery, which helps companies navigate privacy laws.

    However, Brexit could be an opportunity for the UK to position itself as an alternative to the rest of Europe, especially since GDPR is “very complicated, with a ton of legalese,” said Dana Simberkoff, chief compliance officer at compliance and governance software company AvePoint. If the UK adopts its own regulations that meet GDPR standards but make compliance simpler, the country “could have an amazing opportunity” to be more business-friendly. “What businesses want is clarity, and to a certain extent, that is not what the GDPR delivers,” Simberkoff said.

    Brexit is less worrying, or its specific impact is so far unknown, when it comes to other European issues close to US technology companies’ hearts, such as antitrust enforcement, a tax crackdown, and the quest for engineering talent across the region. But one thing is clear: The UK has been more favorable to US technology interests, and without that influence, nations like France and Germany, which are less aligned with the US, are likely to take greater control of EU policy.

    “That will shift the balance of power more towards countries where the state plays a bigger role,” said James Waterworth, European vice president for the CCIA, which lobbies for Amazon, Google, Facebook, Microsoft and other US tech companies. “That is not the ideal approach for fast-moving sectors like the high-technology and internet sectors.”

    Google will be especially hard-hit, said Gary Reback, a Silicon Valley attorney who has represented Google rivals that have complained about the company to EU antitrust authorities. Google has invested heavily in UK operations and has tried to build relationships with the government of outgoing UK Prime Minister David Cameron to increase its political clout in the region, he noted. “Their ploy to control the European Union through influence in Britain has been wiped out,” Reback said.

    Google declined to comment on the impact of Brexit, as did Facebook, Microsoft and Amazon.

    Google recently constructed a new London headquarters, in part because of the British capital’s status as a hub for ambitious professionals from across Europe. When the UK leaves the free trade bloc, companies may lose the ability to move employees freely around the region without separate visas. “That would harm firms that need to move talent around if it came to pass,” Waterworth said, while noting it’s too early to tell how harmful this will be.

    For now, Google’s biggest advertising clients continue to have a major presence in London, which supports the need for the search giant’s presence there, said Andy McLoughlin, a partner at SoftTech VC in California, who is originally from Leicester, England. “The question mark is 15 to 20 years from now: Has London been unseated as the financial and startup center for Europe?” he said.

    —With Bradley Saacks, Jeremy Kahn, and Spencer Soper

    7:19p
    Sabey, Q9 Partner on Cross-Border US-Canada Data Center Offering

    Sabey Data Centers has partnered with Canadian data center provider Q9 Networks, with both companies hoping to make their services attractive to customers that need a data center footprint in both the US and Canada.

    Broad geographic reach has become an important factor for data center providers, all of whom are competing with giants like Equinix and Digital Realty Trust, whose global presence makes them natural go-to players for international infrastructure deployments. Neither Seattle-based Sabey nor Q9, which is partially owned by Bell Canada, currently has data centers outside its home country.

    In 2012, Canadian telco Bell (BCE Inc.) partnered with the Ontario Teachers’ Pension Plan and two US private equity firms, Providence Equity Partners and Madison Dearborn Partners, to acquire Q9 for C$1.1 billion from its previous owner, ABRY Partners.

    The two data center providers will now market each other’s services to their existing customers and prospects, “effectively creating an international offering of data center locations and services that spans the US and Canada from coast to coast,” they said in a joint statement issued Tuesday.

    Sabey operates data centers in Washington State, Virginia, and New York City, while Q9’s footprint is in Toronto, Calgary, and Kamloops, British Columbia.

    John Sabey, the US company’s president, said the partners expect strong demand for their pan-North American offering from the financial services and energy sectors, “especially with oil and gas producers who operate in both countries.”

    See also: Sabey Earmarks Half of Manhattan Skyscraper for Office Space

