Data Center Knowledge | News and analysis for the data center industry
 

Monday, January 9th, 2017

    1:00p
    Equinix Eyeing Expansion in Emerging Markets with Cloud Giants

    As it continues to pursue the opportunity to provide data centers for enterprise IT shops moving their on-premises infrastructure off-site, Equinix also has its eye on the cloud data center needs of the biggest hyperscale players.

    In the near future, we may see the Redwood City, California-based data center provider help one or more web giants, such as Amazon Web Services, Microsoft Azure, or Google Cloud Platform, bring big chunks of cloud data center capacity to emerging markets. Speaking at a conference last week, Equinix CEO Steve Smith said he could foresee working with a global-scale anchor tenant to enter South Africa, South Korea, and Taiwan, broaden Equinix’s Latin American footprint, go deeper into China, and potentially tackle India.

    The implication was that the retail colocation and interconnection specialist would consider signing a large-scale wholesale lease with a strategic partner to act as a magnet to attract regional and local colocation and interconnection customers in new markets. It’s not unheard of for Equinix to do the occasional wholesale deal with a strategically significant customer.

    When asked if public cloud was “friend or foe,” Smith replied that “colo and cloud are natural allies.” He expects enterprise customers to utilize small colocation deployments to take advantage of secure access to the Big Three and other cloud providers at Equinix IBX data centers.

    Speaking at the Citi Internet, Media, and Telecom conference in Las Vegas, Smith reviewed a busy 2016 and went over highlights of the company’s near-term strategy, which included catching “the next wave of cloud deployments,” a focus on access to submarine cables, catering to the growing Internet of Things market, enabling new use cases for enterprise ecosystems, developing Software Defined Networking technology for WANs, and managing growth and culture of a company that’s rapidly increasing in size. Over the last two years, Equinix acquired TelecityGroup in Europe, Bit-isle in Japan, and finished 2016 with the announcement of its blockbuster $3.6 billion acquisition of a broad portfolio of Verizon data centers in the Americas.

    Preserving Company Culture

    A consistent theme throughout a Q&A discussion with Smith at the conference was what he saw as the need to preserve the Equinix culture, with “the house growing taller and wider.” Over the past two decades, the company has invested $19 billion to build out its global network and grown through 13 M&A transactions.

    Post-Verizon, it will operate 175 data centers in 42 metros spanning 22 countries across Asia-Pacific, EMEA, and the Americas. Equinix today finds itself at an inflection point where the company is getting more complex to run, Smith said. While wanting to keep bureaucracy out and the organizational chart flat in hopes of retaining flexibility and speed, he also expects to continue to run selling, general, and administrative expenses at 22 percent — the highest in the industry — acknowledging that it was “part of the model.”

    The Enterprise Opportunity

    Last year at the Citi conference, Equinix announced a goal of tripling its customer base over the following five to 10 years, with the majority of new customers expected to come from a dozen or so different industries. During the third quarter of 2016, for example, Equinix saw record bookings with financial services firms, partially due to adding the insurance vertical to that category.

    The company says it has over 9,000 global customers, including one-third of the Fortune 500 and one-quarter of the Global 2000. Notably, 2,700 of them are companies that provide various technology infrastructure services, such as network and IT services, private and public cloud hosting, and managed services.

    According to Smith, 55 percent of Equinix’s revenue comes from large enterprises with infrastructure deployed in its data centers across all three regions: Americas, EMEA, and Asia-Pacific.

    On average, the company adds 150 to 200 new logos per quarter with 60 to 70 percent of those falling into the enterprise bucket. However, there are between 300,000 and 500,000 small and medium enterprise customers who collectively spend more than $1 trillion each year, representing an enormous opportunity for a company like Equinix.

    Read more: Equinix CEO Unveils Aggressive Plan to Court Enterprises

    Top Metros Deliver in 2016

    Bookings in the top urban markets around the world led the way for Equinix in 2016. In the US they were New York, Northern Virginia, Chicago, and Silicon Valley; in Europe they were Paris, Frankfurt, and London; and in Asia-Pacific they were Sydney, Singapore, Hong Kong, and Shanghai.

    The Verizon deal gives the company entry into three new markets: Bogotá, Houston, and Culpeper, Virginia.

    Read more: Why Equinix is Buying Verizon Data Centers for $3.6B


    While acknowledging that there is certainly “competition around the edges,” Smith believes that Equinix data centers, located at key control points, “can cover almost any enterprise requirement from its existing footprint.”

    4:00p
    Microsoft Wants to Patent an Underwater Data Center Reef

    One idea Microsoft researchers voiced when they announced their underwater data center experiment in 2015 was having its enclosure act as an artificial reef, a habitat for marine life. Now, the company wants to include the idea in the body of intellectual property that’s come out of Project Natick.

    Microsoft has filed an application to patent an Artificial Reef Datacenter, adding it to the list of applications describing other elements of its underwater cloud, such as a cooling system that uses the ocean as a giant heat exchanger and intrusion detection for submerged data centers.

    If it graduates from the experimentation stage to production, Project Natick will represent an enormous shift in the way the data center industry operates. According to Microsoft, not only can you put servers closer to more people if you submerge them off the coasts of major population centers, you will also have an easier time securing the required permits and a more predictable environment than on dry land. About half the world’s population lives in coastal areas, and the ocean floor is a relatively stable environment, with nearly constant water temperature and no disturbances from storms, currents, or politics.

    After the team submerged a one-rack data center capsule off the coast of California for the first time in 2015 and saw encouraging results, Project Natick manager Ben Cutler announced last year that the next step would be to deploy a data center about the size of a shipping container on the ocean floor.

    This diagram shows an entire server farm of containers submerged and anchored above the ocean floor (Source: Microsoft’s Artificial Reef Datacenter patent application)

    The artificial-reef application, first spotted by Patent Yogi, describes multiple potential design approaches to encouraging marine life to occupy the enclosure. They include designing a structure that appears inviting to sea creatures, providing warmth, dispersing nutrients, and minimizing acoustic energy emanating from the machines inside. Of course, the data center being a stable structure with stable environmental conditions around it is already inviting enough to many ocean inhabitants.

    The application describes data center enclosures that can sit directly on the ocean floor or be anchored to float several feet above it when the floor is uneven. Microsoft is not limiting the patent to oceans, however, also listing lakes, rivers, and even flooded quarries as bodies of water that may become home to Azure, Office 365, or Xbox servers in the future.

    4:30p
    Microsoft to Reorganize Partner, Service Teams, Promises No Job Cuts

    Brought to You by Talkin’ Cloud

    Microsoft is implementing some changes to its partner and services teams that will take effect Feb. 1, according to a report by ZDNet on Wednesday.

    The moves affect Microsoft’s sales, partner, and services teams under the Worldwide Commercial Business group, which is led by executive vice president Judson Althoff, the report says. A Microsoft spokesperson told ZDNet that the reorg would not result in any job losses.

    The changes come more than three years after Steve Ballmer outlined his vision for “One Microsoft” where instead of siloed teams working on individual products, the company operates as “one, big integrated company, where all the products work with each other, and all the teams work together.”

    As part of the reorganization, Microsoft is combining its Enterprise & Partner Group (EPG) and Small and Mid-Market Solutions and Partners (SMS&P). The combined businesses will be led by Chris Weber, current corporate VP of midmarket solutions and partners, ZDNet reports.

    Various partner teams inside the company will come together under the One Commercial Partner business, which will be led by Ron Huddleston, corporate VP of Enterprise Partner Ecosystem for Microsoft.

    Huddleston joined the company last year from Salesforce where he held a significant role in building Salesforce AppExchange and its cloud-based channel, OEM and ISV program, ZDNet says.

    As part of the changes, Gavriella Schuster and the WPG team will be moving into One Commercial Partner. Schuster was named corporate vice president for Microsoft’s Worldwide Partner Group in June 2016, succeeding Phil Sorgen.

    On the cloud side, Microsoft is creating a new unit called Microsoft Digital which will be led by Anand Eswaran, corporate VP of Microsoft Services. This unit will focus on getting partners and customers to use Microsoft’s cloud services, and will include evangelists, developers, and digital advisors and architects.

    The changes come as Microsoft has consolidated some of its events, and renamed its Worldwide Partner Conference to Microsoft Inspire, which will be held in July in Washington, D.C.

    This article originally appeared here, at Talkin’ Cloud.

    6:12p
    IBM Breaks Record in U.S. Patents With Cloud, AI and Health Bets

    (Bloomberg) — IBM received a record-breaking number of U.S. patents in 2016, topping the list for the 24th consecutive year, showing the efforts the company is making to expand its business into products that process and analyze vast amounts of health-care data.

    International Business Machines Corp. was awarded 8,088 patents in 2016, it said Monday. The top five winners remained the same as the previous year, with Samsung Electronics Co., Canon Inc., Qualcomm Inc. and Alphabet Inc.’s Google nabbing those spots, according to data compiled by IFI Claims Patent Services, a unit of Fairview Research LLC.

    IBM has been working to turn around its business by putting more resources into cognitive computing, which adds a layer of data analytics and machine learning to software and information in order to pull out computer-generated insights and automate processes. Chief Executive Officer Ginni Rometty has been targeting the health-care industry specifically, which she often refers to as the company’s “moonshot.”

    “What drives the company’s focus on it is the underlying fact that health care is an instance of people drowning in data,” Chief Innovation Officer Bernie Meyerson said, adding that emerging technologies such as cheaper genomic sequencing are going to create even more health care-related information. “If you don’t have a system that can deal with that data in volume, then the physician has no chance.”

    Some of IBM’s patents last year involved using cardiac images to characterize the shape and motion of the heart and help detect cardiac disease; a drone that would be able to measure contamination in places like hospitals and on manufacturing floors; and a method to plan routes based on a traveler’s mood.

    More than 2,700 of the 2016 patents involved cognitive computing or cloud technology, the company said in a statement. While about 100 patents directly involved health care, many of the other cognitive computing patents could also be applied to the industry, Meyerson said. IBM consistently spends about 6 percent of its annual revenue on research and development.

    See also: New IBM Cloud Data Centers in UK ‘Infused with Intelligence’

    Among the top 50 recipients of patents, U.S. companies got 41 percent of the total, according to IFI. Amazon.com Inc., Apple Inc., Boeing Co. and Cisco Systems Inc. were among the big gainers. In contrast, Japanese tech companies saw decreases in patent awards from a year ago.

    “The Japanese companies are always very strong,” said Larry Cady, a senior analyst with IFI. “They remain the No. 2 country, but the total number of grants went down across the board.”

    Korean, Taiwanese and Chinese companies are getting an increasing number of patents and may eventually reach the levels of Japanese companies, he said.

    A record total of 304,126 patents were issued in 2016, about 2 percent more than in 2015. Computer, telecommunications and semiconductor patents dominated among the top recipients, Cady said.

    See also: Microsoft Wants to Patent an Underwater Data Center Reef

    6:39p
    Even in the Digital Age, Businesses are Compelled to Physically Move Data

    Arvind Venugopal is Senior Product Manager at Cleo.

    Amazon recently launched a service to literally drive a truck to your data center, load it up with all of your data, and drive it back to an Amazon server farm to plug it in and push it to the cloud. The rationale behind this offering stems from the idea that businesses looking to move massive amounts of data – terabytes and petabytes of information – to Amazon’s cloud don’t have a fast, affordable option to do so over the internet. But what if they did?

    It’s hard to believe that we’ve advanced so far technologically that we now have to rely on “old school” methods to move and maintain digital data. But that’s what it’s come to, given most software companies’ inability to quickly and securely pipe massive amounts of information into and out of the enterprise.

    Given these limitations, mega-companies like Amazon recognize the need to bring their own massive data centers – those comprising Amazon Web Services (AWS) – closer to the point of data generation. But this particular truck service, with its obvious security and risk issues, might be used only by the largest companies and just once or twice a year. So what’s a business to do about the high-volume, large data set transfers it must facilitate on a daily basis?

    Transferring petabytes of data anywhere – physically or digitally – still takes a considerable amount of time, and not every business can afford to summon an Amazon truck to back up its databases or move log files to the cloud. But an organization looking to replace unsecured physical data movement and affordably move large volumes of information while maintaining control of the endpoints will benefit from accelerated data transfer and governance capabilities, especially when those capabilities already fully integrate with its B2B systems, cloud solutions, and internal applications.
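    To put that “considerable amount of time” in perspective, a back-of-the-envelope calculation shows why trucks can beat wires at petabyte scale. The 80 percent link utilization below is an illustrative assumption, not a measured figure:

```python
# Rough transfer-time math for moving 1 PB over common WAN link speeds.
# The 80% effective utilization is an illustrative assumption.
PETABYTE_BITS = 8 * 10**15  # 1 PB = 10^15 bytes = 8 * 10^15 bits

def transfer_days(link_gbps: float, utilization: float = 0.8) -> float:
    """Days needed to move 1 PB at the given line rate and utilization."""
    seconds = PETABYTE_BITS / (link_gbps * 10**9 * utilization)
    return seconds / 86400  # seconds per day

for gbps in (1, 10, 100):
    print(f"{gbps:>3} Gbps: {transfer_days(gbps):7.1f} days")
```

    Even a dedicated 10 Gbps line needs well over a week per petabyte – the gap that both Amazon’s truck and accelerated transfer protocols try to close.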

    The Data Deluge

    In today’s business ecosystem, as the size of data increases for all communications with customers, partners, distributors, and suppliers, as well as with internal applications and backup systems, the smooth exchange of data becomes much more difficult using traditional legacy file transfer tools. By 2020, experts estimate that 1.7 MB of new digital information will be generated every second for every person on Earth, and that number increases exponentially for businesses.

    Organizations currently generate, capture, move, and store things like:

    • Customer information
    • Banking and other financial data
    • Personal medical and genetic data
    • Social media interactions and customer support logs
    • Shipping records, customs documents, and logistics information

    But businesses are using traditional protocols over high-latency networks – or, on a smaller non-Amazon-truck scale, are still shipping physical devices – to transfer these growing data sets internally and externally. For all of this digital innovation, data centers and organizations still rely far too much on dated methods simply to get the job done.

    A high-speed data transfer protocol harnesses lean file transfer and integration technology to accelerate the flow of large files and data sets, specifically over long distances between servers, data centers, and other destinations. Accelerated file transfer enables businesses to move data efficiently while maintaining control and governance when:

    • Moving large data sets into and out of data lakes
    • Copying databases between data centers to create redundancy and backup
    • Transferring and receiving huge data sets from partners

    Recognizing the need for a high-speed file transfer solution now positions your business for a future of ever-growing data volumes, but knowing which capabilities such a solution should include can be less clear.

    Key Functionality

    A next-generation accelerated file transfer solution should quickly, easily, and securely move extremely large files while also supporting blazing-fast transfers of smaller files, all while optimally using existing network bandwidth and resources.

    While most high-speed data transfer solutions in the market are either hardware-based or based on lesser-used network technologies, an advanced software-based solution can be deployed as part of an enterprise shared-architecture model powered by the same engine that drives your routine business processes, all on a single platform.

    Transfer speed obviously will be the priority, but an advanced high-speed solution also must enable organizations to monitor and track performance metrics from both an IT and a business perspective, as well as act on data via reports and dashboards based on business objectives.

    In addition to tracking, alerting, and authentication, a leading solution supports:

    • External accelerated file transfer for files of all sizes with support for other protocols and big data connectors
    • SSL encryption to secure data in motion
    • Guaranteed delivery of transferred packets
    • Feeding data directly to cloud architectures or big data storage mechanisms
    • Metadata tagging to extend the technology into other applications
    • Automatic checkpoint restart and data integrity checks
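    Two of the capabilities above – checkpoint restart and data integrity checks – can be sketched in a few lines. This is a hypothetical illustration, not any vendor’s API; the `send_chunk` callback stands in for whatever transport a real product would use:

```python
# Hypothetical sketch: chunked file transfer with per-chunk SHA-256
# integrity checks and checkpoint restart. `send_chunk` stands in for
# the real transport layer; it receives each chunk plus its digest.
import hashlib
import json
import os

CHUNK = 4 * 1024 * 1024  # 4 MiB per chunk

def send_file(path, send_chunk, state_file="transfer.ckpt"):
    """Send a file in chunks, checkpointing progress so an interrupted
    transfer resumes from the last acknowledged chunk."""
    state = {"offset": 0}
    if os.path.exists(state_file):  # resume a previous attempt
        with open(state_file) as f:
            state = json.load(f)
    with open(path, "rb") as f:
        f.seek(state["offset"])
        while chunk := f.read(CHUNK):
            digest = hashlib.sha256(chunk).hexdigest()
            send_chunk(chunk, digest)  # receiver re-hashes and acknowledges
            state["offset"] += len(chunk)
            with open(state_file, "w") as ckpt:  # checkpoint after each ack
                json.dump(state, ckpt)
    if os.path.exists(state_file):
        os.remove(state_file)  # clean up once the whole file completes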

    Summary

    Amazon recognized that the way most companies currently move massive amounts of information isn’t good enough, and with data volumes only continuing to increase across the consumer and business spectrums, its customers need solutions. Vendors extending high-speed software solutions to digitally exchange information will be ahead of the curve in serving their customers.

    For the business that can’t simply order up a high-tech truck to take away its data, an advanced high-speed file transfer solution from one of these vendors provides an efficient and affordable way to securely move its mission-critical data.

    Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    8:11p
    CenturyLink Acquires SAP Solutions Specialist SEAL

    CenturyLink has acquired SEAL Consulting, an Edison, New Jersey-based SAP solutions provider, for an undisclosed sum.

    The deal underscores the Monroe, Louisiana-based telco’s intent to continue investing in its already broad portfolio of IT services after having sold its global data center fleet to Medina Capital and BC Partners. The company’s executives have said in the past that the divestment of nearly 60 data centers didn’t mean it would begin winding down its IT services business.

    SEAL helps enterprises with implementation, post-implementation support, and outsourcing for a variety of SAP solutions. CenturyLink and SEAL expect the telco’s existing infrastructure and services will strengthen the combined company’s SAP business.

    CenturyLink is already an SAP supplier, providing the German enterprise software giant’s cloud-based in-memory database HANA, and the acquisition brings an additional team of SAP experts into its fold.

    Read more: CenturyLink Sells Its Colo Business to Fund Level 3 Deal

    8:48p
    Secondary Market Specialist Peak 10 Buys Louisville Data Center

    Data center provider Peak 10 has acquired a data center in Louisville, Kentucky, its second in that market, the company announced today.

    The Charlotte, North Carolina-based service provider specializes in colocation and managed services in secondary US markets, and the deal expands its presence in a secondary market with heavy enterprise presence. Louisville is home to headquarters of numerous Fortune 500 companies, including Yum! Brands, KFC, Kindred Healthcare, and Humana.

    The 33,000-square-foot data center already has an anchor tenant. Peak 10 did not name the tenant, describing it as a “communications and data services provider.” Companies that fit that description and have data centers in Louisville include Cogent, tw telecom, and Level 3 (in the process of being acquired by CenturyLink).

    Peak 10, owned by the San Francisco-based private equity firm GI Partners, now operates 16 data centers across 10 markets, primarily in the Southeast and Mid-Atlantic regions. The company recently appointed Chris Downie, former CEO of data center provider Telx, to replace its outgoing chief executive and founder David Jones.

    Read more: New CEO Charged With Taking Peak 10 to the Next Level

    Corrected: A previous version of this article said Peak 10 operated about 30 data centers. A company spokesperson has clarified that 14 of them are “pods” rather than full-fledged data centers, so the total number of data centers has been changed to 16.

    9:36p
    Report: Apple to Consolidate Server Production in Arizona

    In a move that may somewhat appease President-elect Donald Trump – who has criticized Apple for outsourcing iPhone manufacturing to China – the company is reportedly planning to consolidate production of servers that populate its data centers around the world in Mesa, Arizona.

    The company has applied for permission to produce servers and data center cabinets at the former 1.3 million-square-foot manufacturing plant in Mesa it is converting into a data center and, apparently, a computer assembly facility. The application to the federal Foreign-Trade Zones Board seeks permission to produce “finished products” at the site and to ship materials and components from overseas without paying customs duty.

    An anonymous source familiar with Apple’s data center operations told Business Insider that the company has been producing servers for its data centers at individual data center sites but plans to consolidate production at the Mesa facility.

    While it’s not unusual for companies that operate data centers at Apple’s scale to design their own IT gear, it is unusual for them to also assemble it in-house. Companies like Facebook, Microsoft, and Google usually outsource actual production of their custom servers and network switches to design manufacturers in Asia.

    Apple, like the three companies mentioned above, is a member of the Open Compute Project, the open source data center hardware design community founded by Facebook. Unlike its peers, however, Apple has yet to make an official design contribution to OCP.

    One possible reason Apple is keeping hardware production in-house is that it considers its server designs a competitive advantage and doesn’t want to expose them to contract manufacturers.

