Data Center Knowledge | News and analysis for the data center industry
 

Friday, July 21st, 2017

    3:00p
    Big Data Technology: In-House vs Outsource

    Eldar Sadikov is the Co-founder and CEO of Jetlore.

    For retailers, the specter of big data looms constantly. Companies are delving hard into the omni-channel arms race as they try to fend off behemoths like Amazon. Some are going so far as to pour massive resources into developing their own big data solutions in an attempt to go toe-to-toe with the retail giant.

    The natural question that retailers face is what exactly they need to build in-house vs. what they can, and probably should, outsource to vendors.

    With the proliferation of the software-as-a-service (SaaS) model, it’s becoming ever simpler and faster to deploy new solutions in an enterprise setting. This naturally drives ever-increasing innovation in the industry, as old solutions are replaced with newer, more effective ones in mere weeks.

    At the same time, large retailers have a natural desire to build a lot of things in house, in the same way Amazon made heavy investments in internal technology to power everything from automated fulfillment to on-site user recommendations. However, it’s important to realize that not everything can or should be built in house. Retailers should think of their infrastructure as the data platform on top of which vendors innovate, in the same way the Apple and Android platforms allow individual developers to innovate with apps.

    We believe that algorithms in the cloud will become the most common SaaS applications in the next few years. Retailers who treat algorithms as a “core competency” and limit their development to internal teams will only stifle innovation and fall behind in the long term. Below, we outline the reasons why.

    Cost

    Great algorithmic solutions require immense talent, and the war for talent these days is fierce, especially in data science. Data scientists typically hold PhDs in computer science, statistics or mathematics and command salaries of over $150,000.

    With the limited supply of qualified engineers and data scientists in the market, they are more frequently lured away by startups or by tech giants like Amazon, Google and Facebook. Unfortunately, most brick-and-mortar and online retailers aren’t the sexiest destinations for top-notch engineers. As a result, retailers have to compensate by paying more than double the already exorbitant salaries.

    Simple math shows that a team of 20 data scientists and engineers can easily cost a retailer in excess of $4 million a year. That’s before factoring in the cost of recruiting, on-boarding and retaining this talent, and of acquiring and supporting the infrastructure needed to develop the solution. By comparison, a typical SaaS solution will carry a price tag of less than $1 million a year (and this is probably an absolute upper bound: typical fees will be lower than $500,000). It’s not hard to see the massive savings retailers achieve by working with vendors.
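    As a rough back-of-the-envelope sketch, the gap looks like this. The salary figure below is an assumption extrapolated from the article's numbers ($150,000+ base, inflated for retailers), not market data:

    ```python
    # Rough annual cost comparison using the article's own estimates.
    # AVG_FULLY_LOADED_SALARY is an illustrative assumption.

    TEAM_SIZE = 20
    AVG_FULLY_LOADED_SALARY = 200_000  # assumed per-head cost, before recruiting/infrastructure

    in_house = TEAM_SIZE * AVG_FULLY_LOADED_SALARY
    saas_upper_bound = 1_000_000   # article's stated upper bound
    saas_typical = 500_000         # article's typical-fee estimate

    print(f"In-house team:     ${in_house:,}/yr")      # $4,000,000/yr
    print(f"SaaS upper bound:  ${saas_upper_bound:,}/yr")
    print(f"Typical savings:   ${in_house - saas_typical:,}/yr")
    ```

    Even at the SaaS upper bound, the in-house team costs roughly four times as much per year before hiring and infrastructure overhead.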

    Speed-to-Market and Flexibility

    For any technological venture, speed to market is key to determining overall success. This includes the development of internal technology. From project inception to launch, creating a big data solution can take as much as two to three full years. That’s two-plus years for a solution you need today. And while the need for an immediate solution is sizable, the shelf life of technology isn’t. A two-year wait can create one of two problems: either your newly developed solution is nearly outdated at launch, or you become caught in an unending cycle of redesign in an attempt to get ahead of a rapidly progressing technological landscape.

    Meanwhile, with the wide adoption of the cloud-based SaaS model, speed of integration and deployment for third-party solutions has never been faster. Some can be integrated and deployed in a mere 20 days, meaning an immediate need is satisfied quickly with cutting-edge technology that is constantly being improved (algorithms are continuously optimized and tuned across the largest retailers in the world). More importantly, third-party vendors also provide a level of flexibility not available with an internally built system: removing and replacing a third-party SaaS solution is extremely simple, without the fear of exorbitant costs or internal politics.

    Innovation

    Technological and algorithmic advances occur extremely fast, and history has shown that competition plays a vital role in innovation. The SaaS model makes a solution both easy to deploy and easy to replace. As a result, vendors are under constant pressure to innovate and improve. With an internal team, the choice has already been made, so there is no competition: once a solution is built and deployed, the team’s goal is to maintain and improve it. But you will never really know whether the solution your internal team has built is competitive on the market.

    By working with third-party SaaS vendors, retailers are able to evaluate and deploy many cutting-edge solutions in a short time frame with little investment on their side. These solutions are in use by many other retailers, and vendors are under constant scrutiny by their customers to innovate and improve. Trying to build these things in-house is not only cost-prohibitive and slow; most importantly, it limits innovation, making your business less flexible and agile in the long run.

    This doesn’t mean that retailers should fully outsource all their technology to vendors. When people talk about technology in the context of big data, they refer to both the infrastructure to store and process data, as well as the algorithms to interpret data and make predictions. Infrastructure includes storing omni-channel customer data like purchases or claimed coupons in a secure, privacy-preserving way and making this data accessible to supporting applications.

    Algorithms are effectively applications on top of the infrastructure that leverage the data to do demand forecasting, churn prediction, dynamic pricing, or product personalization and targeting. They are built on top of the data infrastructure the same way apps are built on top of the operating system. Hence, it is imperative for retailers to invest internal resources and time to build secure, efficient and scalable infrastructure.

    The right infrastructure, with external APIs and security (encryption of sensitive data), will enable your company to leverage cutting-edge technology from vendors and continuously innovate. It will also allow your company to focus its attention and expertise on core business functions instead of attempting to become expert in unrelated fields. For any business, resources like money, time and brain space are finite. Winning businesses point those resources in the right places.

    Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    5:20p
    Report: Lithium-Ion to Gain One-Third of Data Center UPS Market by 2025

    Lithium-ion batteries, widely used in consumer electronics such as smartphones and in electric cars, have only just started being adopted for energy storage in data center backup systems. While vendors that sell this equipment claim its long-run total cost of ownership is lower than that of traditional lead-acid-based energy storage, the upfront cost is high enough to prevent the technology from taking off in the data center market today.

    That’s not going to be the case in a few years, according to a new report by the research firm Bloomberg New Energy Finance, which says lithium-ion batteries will command a quickly growing portion of the data center UPS market in North America and Europe.

    BNEF expects overall demand for battery backup in data centers in the two regions to grow from 3.5 gigawatt-hours in 2016 to 14 gigawatt-hours in 2025. During the same period, lithium-ion’s share of the market will rise from 1 percent to 35 percent, the analysts predict.
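    A quick back-of-the-envelope calculation shows what those two endpoints imply for lithium-ion volumes (the report's year-by-year path isn't given here, so the growth rate below is only an implied average):

    ```python
    # Implied lithium-ion volumes from the BNEF figures cited above.
    total_2016, total_2025 = 3.5, 14.0   # GWh, North America + Europe combined
    share_2016, share_2025 = 0.01, 0.35  # lithium-ion share of the UPS battery market

    li_2016 = total_2016 * share_2016    # 0.035 GWh in 2016
    li_2025 = total_2025 * share_2025    # ~4.9 GWh in 2025

    years = 2025 - 2016
    implied_cagr = (li_2025 / li_2016) ** (1 / years) - 1  # ~73% per year
    print(f"Lithium-ion: {li_2016:.3f} GWh -> {li_2025:.1f} GWh "
          f"(~{implied_cagr:.0%} implied annual growth)")
    ```

    In other words, lithium-ion capacity would grow roughly 140-fold while the overall market merely quadruples.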

    Annual data center lithium-ion penetration in North America and Europe, 2016-25 (GWh); (Source: Bloomberg New Energy Finance)

    According to Schneider Electric, which sells data center UPS systems of both types, lithium-ion’s key advantages over the incumbent technology include:

    • Fewer battery replacements (perhaps none) required over the life of the UPS, eliminating the downtime risk that battery replacement poses
    • About three times less weight for the same amount of energy
    • Up to ten times more discharge cycles depending on chemistry, technology, temperature, and depth of discharge
    • About four times less self-discharge (i.e. slow discharge of a battery while not in use)
    • Four or more times faster charging, key in multiple outage scenarios

    (Source: Schneider Electric whitepaper)

    The major drawbacks of lithium-ion, again according to Schneider, are:

    • About two to three times more capex for the same amount of energy due to higher manufacturing cost and cost of required battery management system
    • Stricter transportation regulations
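    The trade-off between higher upfront capex and fewer replacements is what drives the lower-TCO claim. A minimal sketch, using the ratios above but with purely hypothetical dollar figures and replacement intervals (not Schneider Electric pricing):

    ```python
    # Hypothetical 10-year battery TCO comparison based on the trade-offs above.
    # All dollar amounts and replacement counts are illustrative assumptions.

    VRLA_CAPEX = 100_000             # assumed baseline cost for a given energy capacity
    LI_ION_CAPEX = VRLA_CAPEX * 2.5  # "two to three times more capex" (midpoint)

    VRLA_REPLACEMENTS = 2    # assumed: lead-acid strings replaced every ~4-5 years
    LI_ION_REPLACEMENTS = 0  # "perhaps none" over the life of the UPS

    vrla_tco = VRLA_CAPEX * (1 + VRLA_REPLACEMENTS)
    li_ion_tco = LI_ION_CAPEX * (1 + LI_ION_REPLACEMENTS)

    print(f"VRLA 10-yr TCO:   ${vrla_tco:,.0f}")
    print(f"Li-ion 10-yr TCO: ${li_ion_tco:,.0f}")
    ```

    Under these assumptions lithium-ion comes out cheaper over a decade despite the higher initial price; with cheaper replacements or a three-times capex multiple the conclusion flips, which is exactly why the upfront cost still holds the technology back today.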

    Simon Zhang, a marketing manager at Schneider, wrote in a blog post that lithium-ion battery cost will go down as a result of greater and greater adoption, driven primarily by growth in the electric-car industry:

    “Li-ion technology will snatch 40% of market share in just 8 years. In a market that’s not exactly known for making rapid technological shifts, that is nothing short of remarkable. While VRLA has much lower upfront cost today, lithium-ion batteries will experience significant cost reductions, driven by sizeable ramp-up in demand in the coming decade, much of which will be for electric vehicles. This cost reduction and increasing familiarity with the use of lithium-ion batteries for back-up in data centers will help ramp up adoption in this timeframe.”

    6:07p
    Webinar: Finding Your Data Center Strategy Sweet Spot

    Title: Finding the Sweet Spot for your Data Center Strategy

    Date: Wednesday, July 26, 2017

    Time: 02:00 PM Eastern Daylight Time

    Duration: 1 hour

    Produced by: AFCOM

    Demand on existing technologies and demand for new ones are increasing at a rapid pace, leaving a growing number of companies facing critical decisions about their data center requirements. From On-Prem to Off-Prem, Micro to Macro, Capex or Opex, this session will focus on the spectrum of data center options currently available to businesses and the essential drivers that influence which solution is the best fit for them.

    Learning Objectives:

    •    Obtain a better understanding of current data center offerings
    •    Learn why capital preference should be one of the first items considered
    •    Find out how decision driving factors vary across industry

    Speaker:

    Laura Cunningham
    Data Center Consultant HPE Technology Consulting

    AFCOM webinars are an exclusive member-only benefit, but for this limited occasion you’re welcome to attend. Register for free.

    6:59p
    IBM Is Worst Performer on Dow as Cloud Services Unit Falters

    Gerrit De Vynck (Bloomberg) — IBM fell the most in three months after reporting revenue that missed estimates, with sales in a key unit declining for the second consecutive period.

    The quarterly results, released Tuesday after the close of trading, further extend Chief Executive Officer Ginni Rometty’s turnaround plan into its fifth year without significant progress. The company, once considered a bellwether for the tech industry, was the worst performer on the Dow Jones Industrial Average Wednesday.

    Revenue in the technology services and cloud platforms segment dropped 5.1 percent from the same period a year earlier, even though executives had said in April that they expected key contracts to come through in the quarter. The unit is a marker for the strength of the company’s push into newer technologies. Total revenue fell to $19.3 billion, the 21st straight quarter of year-over-year declines.

    The stock tumbled as much as 4.7 percent in intraday trading Wednesday in New York, the most since April, to $146.71. The shares have lost 7.2 percent this year through the close Tuesday, and have missed out on the technology stock rally that propelled companies like Amazon.com Inc. and Alphabet to records.

    International Business Machines Corp. has been working since before Rometty took over in 2012 to steer the company toward services and software, and she has pushed it deeper into businesses such as artificial intelligence and the cloud. Still, legacy products like computers and operating system software have been a drag on overall growth. Some investors are getting tired of waiting for the turnaround to catch on. Warren Buffett’s Berkshire Hathaway Inc. sold about a third of its investment in IBM during the first half of this year.

    Several analysts cut their price targets on the company.

    James Kisner, an analyst at Jefferies, said the “poor earnings quality aims to mask ongoing secular headwinds” in the software business and competitive pressures in services that may result in more investor disappointment. He rates the stock underperform and cut the price target to $125 from $154.

    Better Margins

    Gross margins in the second quarter were 47.2 percent, slightly beating the average analyst estimate of 47 percent. That’s better than last quarter, when a surprise miss on margins sent the stock tumbling the most in a year.

    “We will continue to see, on a sequential basis, margin improvement from the first half to second half,” Chief Financial Officer Martin Schroeter said in an interview.

    Operating profit, excluding some items, was $2.97 a share, compared with the average analyst estimate of $2.74 a share. That measure got a boost from tax benefits, which added 18 cents to the per-share number, IBM said.
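    Backing the stated tax benefit out of the headline number shows the quarter still beat the consensus estimate, though by a much narrower margin:

    ```python
    # Operating EPS with the one-time tax benefit stripped out,
    # using only the figures reported above.
    eps_reported = 2.97       # operating EPS, excluding some items
    tax_benefit = 0.18        # per-share boost from tax benefits, per IBM
    analyst_estimate = 2.74   # average analyst estimate

    eps_ex_tax = round(eps_reported - tax_benefit, 2)  # 2.79
    verdict = "above" if eps_ex_tax > analyst_estimate else "below"
    print(f"EPS ex-tax benefit: ${eps_ex_tax:.2f} ({verdict} the ${analyst_estimate:.2f} estimate)")
    ```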

    The company’s cognitive solutions segment, which houses much of the software and services in the newer businesses and includes the Watson artificial intelligence platform, has shown the most promise in recent periods, growing in each of the previous four quarters. Yet sales in the unit fell 2.5 percent in the second quarter.

    AI Competition

    Watson, for which the company doesn’t specifically break out revenue, might never contribute a significant amount to the company, Jefferies’ Kisner said in a July 12 note.

    Competition in the artificial intelligence market is heating up, with major investments from the world’s biggest technology companies, including Microsoft Corp., Alphabet Inc. and Amazon.com Inc. On top of that, hundreds of startups are jumping in.

    “IBM appears outgunned in the ‘war’ for AI talent,” Kisner said. “In our base case, IBM barely re-coups its cost of capital from AI investments.”

    The company’s total revenue fell 4.7 percent from the same period a year ago and missed analysts’ average estimate of $19.5 billion.

    Oppenheimer & Co. managing director Ittai Kidron said the results show IBM “isn’t out of the woods yet.”

