Data Center Knowledge | News and analysis for the data center industry

Monday, October 17th, 2016

    6:40p
    Space: the Ultimate Network Edge

    A breakthrough technology developed initially for defense purposes and later commercialized for civilian use is a familiar pattern. One of the most recent examples is free-space optics – a way to use lasers instead of fiber-optic cables to transmit data.

    Wide adoption of self-driving cars is on the horizon; industrial manufacturers are quickly moving toward a world where not only every machine on the factory floor communicates with a server somewhere through a network, but every piece of equipment they sell does the same, for the duration of its useful life. From connected homes to cheap smartphones in everybody’s pockets, skyrocketing growth in the number of network-connected devices, coupled with advances in optical technology, is driving fundamental changes in the way global network infrastructure is designed.

    A whole new internet backbone is being built, consisting of intercontinental submarine cables capable of handling unprecedented amounts of bandwidth, 5G wireless networks, and satellites that beam data down to earth using lasers. This backbone is being designed to bring connectivity to places that either didn’t have it before or didn’t have anywhere close to the amount of bandwidth they will soon need. It is designed to carry large amounts of data inward from the edge of the network rather than outward from centralized computing hubs.

    Ihab Tarazi, CTO at Equinix, the world’s largest data center provider, says we’re witnessing the beginning of the next wave of investment in global connectivity driven by the need to collect data at the edge. The company has been talking about this trend for some time, but it hasn’t been clear, until now, how exactly this new global backbone will take shape. “The pieces are finally becoming clear,” he says.

    Enabling the Internet of Things from Space

    Tarazi believes a company Equinix recently partnered with is implementing one of those pieces. Laser Light Communications is planning to launch a “constellation” of eight to 12 laser-enabled satellites – called HALO – that will circle the planet and together with terrestrial networks create a hybrid high-capacity network capable of bringing connectivity literally anywhere in the world. “This one is about massive global coverage,” he says.

    Equinix data centers, places where the bulk of the world’s networks interconnect, will be the primary hubs for distributing data coming from space onto terrestrial networks and vice versa. Each of those hubs will be equipped with three “ground nodes” – 4 feet in diameter, 6 feet tall, and weighing 1,000 pounds – and each ground node will have three laser heads, so it can “see” three different satellites, Robert Brumley, CEO of Laser Light, explains.

    The interconnection hubs will not be the only buildings equipped with ground nodes. Laser Light will also deploy them directly in places where data originates, such as corporate campuses. That’s an example of the edge Tarazi is talking about. A financial services firm, for example, or a Hollywood studio, will be able to beam data directly to Medium Earth Orbit – that’s where Laser Light’s satellites will reside – for instant transfer to a bank in Singapore or a video editing outsourcer in Bangalore.

    Major cloud and content companies, some of which recently started investing directly in new submarine cable construction projects to improve global reach, are also potential customers. These are companies like Microsoft, Google, Amazon, Facebook, and Netflix. Traditional telcos, such as Telefonica or Vodafone, could use Laser Light’s services as another way to connect to regional long-haul networks.

    Another important type of ground-node location will be submarine cable landing stations. “Cable landing sites are points of aggregation and disaggregation,” Brumley says, meaning landmass networks converge at these points to transfer data across the oceans and pick up data traveling the other way to distribute it to its countless destinations on dry land.

    Data will travel at 100 gigabits per second between ground nodes and satellites and at 200 gigabits per second from satellite to satellite – about 100 times faster than the radio links used in satellite communication today.
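
    To put those rates in perspective, here is a back-of-the-envelope calculation. Only the 100 and 200 Gbps link rates come from Laser Light’s stated specs; the 10 TB payload and the 1 Gbps radio baseline are illustrative assumptions.

        # Rough transfer times at Laser Light's stated link rates (Python).
        # Only the 100/200 Gbps figures come from the article; the payload
        # size and the radio baseline are illustrative assumptions.

        def transfer_minutes(payload_bytes: float, rate_gbps: float) -> float:
            """Time to move a payload, ignoring protocol overhead and latency."""
            return payload_bytes * 8 / (rate_gbps * 1e9) / 60

        payload = 10e12  # 10 TB, e.g. a studio's raw footage

        for label, gbps in [("ground to satellite (100 Gbps)", 100),
                            ("satellite to satellite (200 Gbps)", 200),
                            ("typical RF satellite link (~1 Gbps, assumed)", 1)]:
            print(f"{label}: {transfer_minutes(payload, gbps):,.1f} minutes")

    At 100 Gbps the 10 TB payload moves in about 13 minutes; over the assumed 1 Gbps radio link it would take nearly a day.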

    [Diagram] An example of how Laser Light’s satellites will be interconnected with ground nodes and with each other. (Source: Laser Light slide deck)

    AI to Route Data Packets Around Clouds

    The US Department of Defense, along with the American, European, and Japanese space agencies, has been developing free-space optics technology for decades, and Laser Light is making what is probably the first effort to commercialize it at global scale. “This is the first company we know of in that category,” Tarazi says.

    The Reston, Virginia-based startup’s parent company is Marble Arch Partners (formerly Pegasus Holdings), which specializes in commercializing military technologies for global markets and vice versa, adapting commercial tech for military use.

    On the ground, Laser Light’s network will include dozens of SD-WANs located in major metros throughout the world. SD-WANs, or Software-Defined Wide Area Networks, are enterprise-grade WANs enabled by software that is disaggregated from the hardware it runs on, making them more agile and easier to automate. They are an emerging alternative to WANs that rely on the proprietary, expensive, tightly integrated hardware-software bundles networking vendors have sold for decades.
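
    To illustrate what “software disaggregated from hardware” buys, here is a minimal, hypothetical sketch of policy-based path selection of the kind an SD-WAN controller performs; the path names and cost model are invented for illustration, not taken from Laser Light or any vendor.

        # Minimal SD-WAN-style path selection: the routing policy is ordinary
        # software, not firmware baked into a proprietary appliance. All
        # names and numbers here are hypothetical.

        from dataclasses import dataclass

        @dataclass
        class Path:
            name: str            # e.g. "mpls", "broadband", "satellite"
            latency_ms: float
            monthly_cost: float
            available: bool

        def pick_path(paths: list[Path], latency_budget_ms: float) -> Path:
            """Choose the cheapest available path that meets the latency budget."""
            candidates = [p for p in paths
                          if p.available and p.latency_ms <= latency_budget_ms]
            if not candidates:
                raise RuntimeError("no path satisfies the policy")
            return min(candidates, key=lambda p: p.monthly_cost)

        paths = [Path("mpls", 30, 5000, True),
                 Path("broadband", 60, 500, True),
                 Path("satellite", 250, 2000, True)]

        print(pick_path(paths, latency_budget_ms=100).name)  # -> "broadband"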

    With free-space optics technology around for some time now, and with carriers already starting to deploy SD-WANs, the real technological innovation Laser Light is bringing to the table is the software that will manage its global network. “It’s not only the coolest thing that we’re doing, but it’s also going to be the most important thing that we will have done,” Brumley says.

    One of the biggest barriers to implementing free-space optics is weather, and Laser Light’s plan is to build a network operating system that will literally route traffic around clouds, which interfere with beaming data between Earth’s surface and its orbit. “Lasers have challenges when it comes to atmosphere,” he explains.

    If you are located in Emeryville, California, for example, and it’s a cloudy day in the Bay Area, the system will not send the signal directly to Emeryville. Instead, it may drop it down further inland, say in Sacramento, where the sky is clear and from where the data will be routed along terrestrial fiber to its intended recipient by the SD-WAN.

    Rather than simply tracking the weather in real time, the system will use a machine learning model, trained over several years on weather data, to predict the best routes automatically. Laser Light recently received a US patent for the concept and is currently talking to vendors that may be interested in writing the software. It will be “probably the most disruptive part of our program, because it’s really converging predictive analytics with software-defined networks and ever-changing atmospheric conditions,” Brumley says.
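
    Conceptually, the routing decision Brumley describes might look like the sketch below. The predictor interface, node names, and cloud-cover threshold are hypothetical stand-ins; the patented system itself is not public.

        # Conceptual sketch: route an optical downlink around cloud cover,
        # falling back to a clearer ground node plus terrestrial fiber.
        # The forecast values and the threshold are toy assumptions.

        def predicted_cloud_cover(node: str) -> float:
            """Stand-in for a model trained on years of weather data;
            returns the expected cloud-cover fraction (0.0 = clear)."""
            forecasts = {"Emeryville": 0.9, "Sacramento": 0.1}
            return forecasts.get(node, 1.0)

        def pick_ground_node(destination: str, alternates: list[str],
                             max_cover: float = 0.3) -> str:
            """Use the destination's own node if the sky is clear enough;
            otherwise downlink at the clearest alternate and let the
            terrestrial SD-WAN carry the traffic the rest of the way."""
            if predicted_cloud_cover(destination) <= max_cover:
                return destination
            return min(alternates, key=predicted_cloud_cover)

        print(pick_ground_node("Emeryville", ["Sacramento"]))  # -> "Sacramento"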

    New Breed of Enterprise Networks Emerging

    Building the network OS, the SD-WANs, and the satellites – and, crucially, raising money to fund it all – are parts of the execution phase Laser Light has now entered, following about three years of design and development. To date, the startup has been funded by its parent company.

    Some of the elements are already under contract, including satellite payloads and ground nodes at network interconnection points, Brumley says, with the company waiting for the funding to execute those contracts.

    Equinix’s Tarazi thinks there’s little doubt that the market for the type of service Laser Light is planning to provide is there. The question, he says, is how big that market will end up being. Will it be limited to places that are extremely underserved by terrestrial fiber, or will there be broader use cases? “It’s not a question of if people will use it,” he says.

    There are potentially convincing use cases for companies that are now investing in the new breed of enterprise networks to enable distributed infrastructure for the Internet of Things. They are combining their own backbones with their own wireless spectrum for last-mile device connectivity and lots of edge computing nodes that aggregate device data. Any company that needs to push lots of data from lots of connected devices through its network will benefit from a service like Laser Light’s, which would provide many more network access options.

    It is clear that new technologies will be needed to enable more data to come in from network edges, and, while he believes Laser Light is the first to attempt commercial free-space optics at global scale, Tarazi thinks there will be more players using this and other kinds of tech that will change the way global networks are architected. “I think there will be more than one company,” he says. “It’s just the beginning of it.”

    6:53p
    Symantec Vows to Become ‘New Force’ in Cybersecurity
    Brought to you by MSPmentor


    Eleven weeks after acquiring web defense software maker Blue Coat Inc., Symantec’s still-integrating leadership laid out a vision for working closely with channel partners to dominate the cybersecurity market.

    The merger, which became official on Aug. 1, created a firm with more than 3,000 engineers, 385,000 worldwide customers, 175 million endpoints, and $4.65 billion in annual revenue.

    A company news release at the time described the new entity as “the industry’s largest pure play cybersecurity company.”

    During an opening keynote at last week’s Symantec Partner Engage 2016 event in Los Angeles, CEO Greg Clark told principals from hundreds of channel firms that the new Symantec has the financial and technological wherewithal to become a top player in the space.

    “I can tell you…that we are going to emerge as a new force in cybersecurity,” he told the gathering.

    Clark said that, excluding Cisco Systems, Symantec’s remaining competitors combined are losing money.

    “We have the healthiest balance sheet in cybersecurity,” the CEO said.

    Partners from both Blue Coat and Symantec learned they would continue to operate as separate partner programs through the end of Symantec’s fiscal year, on March 31, 2017.

    After that, the groups will be brought together under a unified program that will attempt to leverage the best aspects of the previous component programs.

    Symantec intends to use its depth of engineering talent as a key differentiator, helping to produce and improve cybersecurity technologies that actually work well, Clark said.

    He also reiterated the company’s commitment to open source development.

    “Platforms matter,” Clark said. “Open platforms to integrate security technologies really matter.”

    “We integrate with our competitors,” he went on. “We’re trying to solve problems for our customers.”

    Symantec technologies are already in use by 80 percent of the Global 1000, and its solutions are white-labeled by corporate giants like AT&T. The new larger company expects continued success with large enterprise and government customers.

    The software maker will soon release SEP 14, the successor to Symantec Endpoint Protection version 12, and a new Integrated Cyberdefense Platform, designed to seamlessly and cost-effectively provide policy-based protection across web, email, and network.

    “Selling into the high end, we will mow down our competitors here,” Clark said. “We think we are the first truly integrated cybersecurity platform.”

    But the company also aspires to dominate the midmarket cybersecurity space, and ensuring profitable opportunities for partners is key to that strategy. Symantec officials promised robust partner training and ubiquitous product awareness campaigns.

    “We have a really substantial marketing spend; the biggest in the industry,” Clark said. “Eighteen months from now…we’re going to be really hard to compete against.”

    This first ran at http://mspmentor.net/msp-mentor/symantec-vows-become-new-force-cybersecurity

    7:03p
    Private Cloud vs. Public Cloud: Which Option is Cheaper?
    Brought to You by The WHIR


    Even though commercial private cloud offerings can offer a lower total cost of ownership in many cases, thanks to the prevalence of qualified administrators, 451 Research says TCO is just one factor in selecting a particular cloud model.

    According to 451 Research’s latest Cloud Price Index, in many cases, security and control of private clouds outweigh any financial considerations when managing mission-critical apps.

    While commercial private cloud offerings such as those from VMware and Microsoft currently offer a lower total cost of ownership when labor efficiency is below 400 VMs managed per engineer, OpenStack becomes the better financial option when labor efficiency is greater than that. The report says that past this tipping point, all private cloud options are cheaper than both public cloud and managed private cloud options.
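
    The tipping point is simple arithmetic: the labor cost of each VM shrinks as one engineer manages more of them. The sketch below is a toy cost model built around the 400-VMs-per-engineer threshold; the salary and per-VM platform figures are illustrative assumptions, not numbers from the report.

        # Toy TCO model around 451 Research's 400-VMs-per-engineer tipping
        # point. Salary and platform costs are assumptions, not report data.

        def monthly_tco_per_vm(vms_per_engineer: float,
                               engineer_salary_monthly: float = 10_000,
                               platform_cost_per_vm: float = 50) -> float:
            """Spread one engineer's cost across the VMs they manage."""
            return platform_cost_per_vm + engineer_salary_monthly / vms_per_engineer

        for efficiency in (100, 400, 1000):
            print(f"{efficiency} VMs/engineer: "
                  f"${monthly_tco_per_vm(efficiency):.2f} per VM per month")

    Labor dominates at low efficiency, which is why the commercially supported distributions win below the threshold and leaner OpenStack operations win above it.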

    “Salaries and labor efficiency have a disproportionately large impact on pricing, so our analysis provides a true picture of total cost of ownership, beyond the technology costs,” Dr. Owen Rogers, Research Director of the Digital Economics Unit at 451 Research, said in a statement. “But as with any IT purchasing decision, cloud buyers need to look beyond the pricing and evaluate all the risks, such as the impact of vendor lock-in over the long term.”

    “While the CPI provides a basis for assessing options, we suggest buyers consider a hybrid or multi-cloud strategy so they can determine the best execution venue for each workload based on cost, management, technology and location requirements.”

    While private clouds may be cheaper past the so-called tipping point, analysts said that public clouds are the least wasteful option because of on-demand provisioning.

    451 Research said that “organizations that fail to meet utilization and labor efficiency thresholds can quickly reach a point where they are wasting thousands of dollars each month compared with a public cloud solution.”

    According to the Cloud Price Index released in March, public cloud users in the U.S. pay the lowest rates, followed by those in Europe, who pay a 7 to 19 percent premium for the same application.

    This first ran at http://www.thewhir.com/web-hosting-news/private-cloud-vs-public-cloud-which-offers-a-lower-total-cost-of-ownership

    10:02p
    Amazon Launches Three Cloud Data Centers in Ohio

    Amazon has launched three data centers in Ohio, adding a second US East availability region for the Amazon Web Services cloud.

    Each of the new Amazon data centers is in a different location within the state, representing a separate availability zone within the region. That means users can design their applications to switch from one data center to another in case of an outage.
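
    As a concrete illustration of that design pattern, the sketch below checks a primary zone’s endpoint and fails over to a replica in another zone; the endpoints and health-check convention are hypothetical, not part of any AWS API.

        # Hypothetical multi-AZ failover: try the primary zone's endpoint,
        # fall back to a standby in another zone. Endpoints are made up.

        import urllib.request

        ENDPOINTS = [
            "https://app-zone-a.example.com/health",  # primary availability zone
            "https://app-zone-b.example.com/health",  # standby availability zone
        ]

        def first_healthy(endpoints: list[str], timeout: float = 2.0) -> str:
            """Return the first endpoint that answers its health check."""
            for url in endpoints:
                try:
                    with urllib.request.urlopen(url, timeout=timeout) as resp:
                        if resp.status == 200:
                            return url
                except OSError:
                    continue  # zone unreachable; try the next one
            raise RuntimeError("no availability zone is reachable")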

    This is the latest major cloud data center location to come online in a recent burst of expansion by internet giants vying for share of the cloud services market, where AWS is far ahead of its competitors. Amazon, as well as Microsoft and Google, are spending billions of dollars every quarter as they build out the global infrastructure to support these services.

    “We’ll continue to add new infrastructure to grow our footprint and make AWS as useful as possible for all of our customers around the world,” Amazon CTO Werner Vogels wrote in a blog post announcing the Ohio region.

    There are now 38 Amazon cloud data centers across 14 regions, 16 of them in five regions in the US, and the company has already announced it is building nine more in four additional regions – Canada, UK, France, and China – expected to come online “in the coming months,” according to a news release.

    Microsoft, which currently has 30 operating cloud availability regions, has announced six upcoming ones. Google said last month it had eight locations in the works to add to the five it has today.

    Read more: Amazon, Google Detail Next Round of Cloud Data Center Launches

    Like most of its rivals, Amazon has a mixed data center sourcing strategy, both leasing data center facilities and building its own. In Ohio, it used the latter approach. Vadata, the Amazon subsidiary that oversees its data center construction projects, negotiated tax breaks and land deals in the towns of Dublin, Hilliard, and New Albany, according to earlier news reports.

    Not only will its cloud customers be able to set up redundancy within the Ohio region, they’ll also have the option to extend application topology between Ohio and the first and largest AWS region in Northern Virginia. The company is offering low data transfer rates between the two regions to enable this, according to Vogels.
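
    For a customer, extending a topology across the two US East regions is largely a matter of addressing them by their region codes: us-east-1 for Northern Virginia and us-east-2 for the new Ohio region. Below is a minimal sketch using the boto3 SDK, assuming AWS credentials are already configured and with hypothetical bucket names.

        # Write the same object to a bucket in each US East region so the
        # application can read from whichever region is healthy.
        # Bucket names are hypothetical; credentials are assumed configured.

        import boto3

        virginia = boto3.client("s3", region_name="us-east-1")  # first US East region
        ohio = boto3.client("s3", region_name="us-east-2")      # new Ohio region

        for client, bucket in [(virginia, "myapp-state-use1"),
                               (ohio, "myapp-state-use2")]:
            client.put_object(Bucket=bucket, Key="state/checkpoint",
                              Body=b"application state snapshot")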

    By enabling redundant, high-availability topologies across Amazon data centers, the company is courting more enterprise customers for the cloud service, from both the private and public sectors. All cloud giants are aggressively courting enterprises, which are reportedly in the early innings of deploying large critical applications in the cloud instead of their own data centers.

    Last week, Amazon announced its biggest step yet in pursuit of the enterprise cloud market, partnering with VMware, whose infrastructure management software is present in most of the world’s enterprise data centers. VMware will soon sell its software as a service available on AWS, promising seamless integration between the world’s biggest cloud and customers’ on-premises VMware environments.

    Read more: VMware Gives AWS Keys to Its Enterprise Data Center Kingdom

    To offset carbon emissions associated with the grid energy its data centers consume, Amazon has been making utility-scale power purchase agreements with developers of renewable energy projects. In Ohio, it contracted with EDP Renewables, which is building a 100MW wind farm in Paulding County, to buy the project’s entire energy output.

    10:11p
    IBM Profit Margins Shrink Again From Shift to Cloud Computing

    (Bloomberg) — International Business Machines Corp. said profit margins shrank for the fourth quarter in a row, underscoring the technology company’s challenge in shifting to more subscription-based software and cloud services.

    Key Points

    • Operating gross profit margin in the third quarter was 48 percent, down 2.1 percentage points from the same period a year earlier. This missed the average analyst estimate of 50.1 percent.
    • Sales were $19.23 billion, down 0.3 percent from the previous year, beating the average analyst estimate of $19 billion. While this is the closest IBM has gotten to revenue growth in more than four years, it is still the company’s 18th consecutive quarter of sales declines.
    • Profit, adjusted for certain items, was $3.29 a share, better than the average analyst estimate of $3.23, according to estimates compiled by Bloomberg.
    • The strategic imperatives group — which includes businesses such as artificial intelligence, cloud computing and data analytics — reported revenue of $8 billion in the quarter, up 16 percent from a year earlier.
    • IBM shares fell 3.5 percent in extended trading to $149.31. They closed little changed at $154.77.

    See also: Cloud by the Megawatt: Inside IBM’s Cloud Data Center Strategy

    The Big Picture

    Since Chief Executive Officer Ginni Rometty took the top post in 2012, investors have been waiting for her to turn the computer services company around and find growth in the newer businesses — like cloud and artificial intelligence — to offset declines in the legacy hardware, software and services units. The move is taking longer than some investors would like. During the transition, IBM has had to spend more on infrastructure and development, which Chief Financial Officer Martin Schroeter said dented profitability this quarter.

    “We have some pretty heavy investment levels going into the cloud and cognitive businesses, and we’ll continue to make those,” Schroeter said in an interview. “Between mix and investments, that covers the bulk of more than three-quarters of the margin impact in the quarter.”

