Data Center Knowledge | News and analysis for the data center industry

Monday, January 30th, 2017

    1:00p
    Cloud Boom Buoys Microsoft, Intel, Alphabet Results

    By Alistair Barr and Jillian Ward (Bloomberg) — Alphabet Inc., Microsoft Corp. and Intel Corp., which all posted quarterly results last Thursday, reinforced what’s become a truism in technology: the biggest growth is in businesses that deliver computing over the internet.

    Microsoft topped projections on the strength of rising customer sign-ups for its cloud offerings like Azure, which saw revenue almost double. Intel sales rose more than expected, helped by orders for processors that power data-center servers — the machines at the heart of cloud computing.

    While profit at Google parent Alphabet disappointed, the numbers also signaled that heavy spending to catch cloud leaders Amazon.com Inc. and Microsoft is paying off. Google’s Other Revenue line, which includes cloud computing, jumped 62 percent to $3.4 billion in the fourth quarter.

    “Our cloud business is on a terrific upswing,” Google Chief Executive Officer Sundar Pichai said Thursday during a conference call with analysts. “I definitely think we’re going to have a great year.”

    What started a decade ago as an easy way for startups to run websites has turned into an increasingly popular way for companies of all sizes to access the software needed to run their operations. Many big technology companies are in a strong position to provide this because they already have huge data centers to support their own web services.

    Related: Who Leased the Most Data Center Space in 2016?

    ‘Disruptive Forces’

    More than $1 trillion in IT spending will be directly or indirectly affected by the shift to cloud during the next five years, Gartner estimated in July. “This will make cloud computing one of the most disruptive forces of IT spending since the early days of the digital age,” the research firm said at the time.

    Stock prices reflect this optimism. Microsoft and Alphabet shares both hit records this week. Microsoft gained 2.2 percent, Google was down about 1 percent, and Intel rose 1.5 percent at 1:57 p.m. Friday in New York.

    Microsoft Chief Executive Officer Satya Nadella has been working to reposition the company around such internet services. In addition to robust demand for Azure, consumers and corporations continue to purchase Office 365, a cloud-based version of the company’s productivity software that includes Word and Excel. Almost 25 million consumers are now subscribed to Office 365, Microsoft said, and sales increased 47 percent in the fiscal second quarter.

    “As long as cloud is growing, people are happy,” said Mark Moerdler, an analyst at Sanford C. Bernstein & Co., who rates Microsoft shares outperform. “If margins are growing, people are even happier.”

    Microsoft’s Effort

    Redmond, Washington-based Microsoft has been spending on data centers and adding products to win new cloud customers. Chief Financial Officer Amy Hood said in July that gross margins, a measure of profitability, for the commercial cloud business would “materially improve” in the current year. That’s because previous years of investment are starting to pay off as those data centers support more customers.

    Microsoft has pledged to reach annualized revenue of $20 billion in its corporate cloud business by the fiscal year that ends in June 2018. That metric stood at more than $14 billion at the end of the second quarter.

    Intel’s fourth-quarter sales of server chips to cloud service providers jumped 30 percent from a year earlier. Offsetting that, companies and government agencies spent 7 percent less than a year earlier on equipping their in-house computer rooms, the company said.

    “It’s moving to the public cloud, it’s moving to those areas at a faster rate than I think we expected,” Intel Chief Executive Officer Brian Krzanich said on a conference call.

    Amazon reports earnings next week and analysts at RBC Capital Markets expect the company’s AWS cloud business to generate $3.6 billion in fourth-quarter revenue, up 50 percent from a year earlier.

    “Microsoft is the plumbing in the cloud,” Moerdler said. “Amazon is much bigger, but still Amazon and Microsoft are pulling away from the pack.”

    5:24p
    Delta Cancels 280 Flights Due to IT Outage

    Delta Air Lines canceled about 170 flights Sunday and 110 flights today as a result of a major IT systems outage, the company said in a statement.

    This was the second system-wide IT outage for Delta in six months and the second major airline outage within a week. More than 200 United Airlines flights were affected by an IT outage on January 29.

    People in the industry have sounded alarms about the outdated legacy IT infrastructure big airlines rely on, predicting increasingly frequent technology-related incidents. But the companies have been reluctant to undertake major upgrades, which are costly and difficult to implement because the systems have to remain online around the clock.

    Read more: Delta System Failure Marks Wake-Up Call for Airline Industry

    “I want to apologize to all of our customers who have been impacted by this frustrating situation,” Delta CEO Ed Bastian said in a statement. “This type of disruption is not acceptable to the Delta family, which prides itself on reliability and customer service. I also want to thank our employees who are working tirelessly to accommodate our customers.”

    The company has not provided details about the cause of the outage, saying only that Delta’s “essential IT systems” went down around 6:30 pm Eastern on Sunday. The systems were restored “a few hours later” and were back to normal shortly after midnight.

    In an update posted on its website today at 7 am, the company said it was operating the “vast majority” of the flights scheduled for the day but a few additional cancellations were possible.

    Airline outages are extremely costly: in addition to the expense of repairs and general reputational damage, airlines have to issue refunds and reschedule flights for affected customers. Delta estimated that its data center outage last August cost the company about $150 million.

    An outage at Southwest in July was estimated to have cost the airline at least $177 million.

    6:11p
    Vertiv Uses Machine Learning to Automate Data Center Cooling

    Vertiv, formerly Emerson Network Power, has introduced a software system that, according to the company, uses machine learning to automate management of data center cooling systems and improve their efficiency.

    Data center facilities managers normally have to manage each individual component of the cooling system (e.g., chillers, air handlers, economizers) to fine-tune the overall system. Change the setting on one, and the entire system is affected.

    The idea with iCOM Autotuning, Vertiv’s new software feature, is to use machine learning techniques to control all of the elements automatically, the company said in a statement.

    In a direct-expansion data center cooling system, that means compressors, fans, and condensers are harmonized to eliminate short cycling, which is when cool air returns into the cooling system without going through IT hardware. In chilled-water systems, the autotuning feature avoids rapid fluctuations in valve positions to balance fan speeds, water temperature, and flow rates.
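
    Vertiv has not published the internals of iCOM Autotuning, but the coordination problem it addresses can be sketched in a few lines. The toy controller below is a minimal illustration, not Vertiv’s algorithm: it enforces a minimum compressor cycle time (the classic guard against short cycling) and clamps per-step valve movement (the guard against rapid valve fluctuations). Every signal name and threshold is hypothetical.

    ```python
    # Minimal sketch of coordinated cooling control. NOT Vertiv's
    # iCOM Autotuning; every signal name and threshold is hypothetical.

    MIN_COMPRESSOR_CYCLE_S = 300   # minimum seconds between compressor switches
    MAX_VALVE_STEP = 0.05          # cap per-tick valve movement to avoid hunting

    class CoolingController:
        def __init__(self, setpoint_c: float):
            self.setpoint_c = setpoint_c
            self.compressor_on = False
            self.last_switch_t = 0.0
            self.valve_pos = 0.5       # 0.0 = closed, 1.0 = fully open

        def step(self, return_temp_c: float, now: float) -> None:
            # Compressor: only toggle after the minimum cycle time has
            # elapsed, which prevents short cycling.
            want_on = return_temp_c > self.setpoint_c
            if want_on != self.compressor_on and \
                    now - self.last_switch_t >= MIN_COMPRESSOR_CYCLE_S:
                self.compressor_on = want_on
                self.last_switch_t = now

            # Valve: move proportionally to the temperature error, but clamp
            # the step so the position changes gradually instead of oscillating.
            error = return_temp_c - self.setpoint_c
            step = max(-MAX_VALVE_STEP, min(MAX_VALVE_STEP, 0.02 * error))
            self.valve_pos = max(0.0, min(1.0, self.valve_pos + step))
    ```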

    The feature is part of Vertiv’s Liebert iCOM-S thermal system control. It is available for select Liebert cooling systems installed in North America, the company said.

    While machine learning algorithms run in production to improve cloud services at companies like Google and Facebook, they are seldom applied to data center management. The rare examples of companies that have done it include Google, which uses machine learning to improve data center infrastructure efficiency; Coolan, a startup acquired by Salesforce last year that used machine learning to optimize the cost of data center hardware; and Romonet, whose software analyzes the cost of customers’ data center assets and traces the impact of infrastructure decisions on their bottom line.

    6:25p
    How an Operational Data Hub Becomes the Data Detective

    Ken Krupa is CTO at MarkLogic.

    Data sleuths inclined to use logic and the science of deduction to ferret out patterns and establish links between clues enjoy solving data problems. However, with the amount of data created and stored every day growing exponentially, discovering the relationships that exist among your data requires more than a keen mind. It requires the right technology, too.

    The Covert Culprit: Data Silos

    For many organizations, data lies in multiple, disconnected silos left over from earlier departmental initiatives. The inability to integrate data across those silos means organizations miss out on a full view of their data: a view that can surface revenue-generating insights and customer preferences, and reduce the risk of fines for failing to comply with regulations.

    Mergers and acquisitions create even more of these silos, and multiple copies of data spread across disjointed systems present serious data integrity issues. Compounding the complexity, today’s businesses, from financial services firms to multinational pharmaceutical companies, are under mounting pressure to comply with ever-changing regulations, all while looking over their shoulders for the next disruptive business model that will invariably come along to challenge their established positions.

    Solve the Disparate Data Conundrum with an Operational Data Hub

    The good news is that the right technology implementation, an operational data hub (ODH), can help organizations solve their data integration challenges and extract more value from their data. An operational data hub is an active, integrated data store and interchange that can hold a single, unified, 360-degree view of all of your data. Up to 80 percent of today’s enterprise data is unstructured or multi-structured: office documents, PDFs, message exchange formats, mobile data, and digital metadata, alongside silos built on varying RDBMS models. Because of that variety, it makes sense to build the operational data hub on a database designed to handle all types of data. This is where an Enterprise NoSQL database fits the bill: it can ingest any type of data and eliminate the time-consuming data wrangling and ETL (extraction, transformation and loading) processes that can take years to implement and cost millions to maintain, weaknesses that are part and parcel of traditional relational databases.
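
    To make “ingest any type of data” concrete, here is a minimal sketch of schema-on-read ingestion into a document store. The in-memory hub, field names, and sample records are all invented for illustration; this is not MarkLogic’s API.

    ```python
    # Minimal sketch of schema-on-read ingestion into a document store.
    # The in-memory "hub" stands in for an Enterprise NoSQL database;
    # names and sample data are invented, not MarkLogic's API.
    import json
    from datetime import datetime, timezone

    hub = {}   # document URI -> envelope

    def ingest(uri, payload, source_system):
        """Store any payload as-is inside a metadata envelope instead of
        forcing it through an up-front relational schema (schema-on-read)."""
        hub[uri] = {
            "source": source_system,
            "ingested_at": datetime.now(timezone.utc).isoformat(),
            "content": payload,   # JSON dict, raw text, parsed XML, etc.
        }

    # Documents from three silos, three different shapes, one hub:
    ingest("/crm/cust-42.json", {"name": "Acme", "tier": "gold"}, "crm")
    ingest("/email/msg-9.txt", "Subject: contract renewal ...", "mail-archive")
    ingest("/erp/order-7.json", {"cust": 42, "total": 1200.0}, "erp")

    print(json.dumps(hub["/crm/cust-42.json"], indent=2))
    ```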

    To choose the right Enterprise NoSQL database, look for integrated search and semantic capabilities as well as full enterprise-grade ACID compliance. Semantics makes it easier to discover inferred facts and relationships, integrating concepts and categories and providing context. When a database has ACID capability, even the largest datasets are processed consistently and reliably, so data is never corrupted or lost. Importantly, thanks to the scalability and agility of NoSQL, the system can also be quickly adapted, extended and enhanced to meet changing business and regulatory requirements.
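
    “Inferred facts” are links that were never stated directly but follow from stored ones. The toy example below, with invented data, stores facts as subject-predicate-object triples and infers corporate ownership transitively; it illustrates the general idea of semantic inference, not any particular product’s implementation.

    ```python
    # Toy illustration of semantic inference over triples: the link between
    # "Acme" and "ParentCo" is never stated directly, but falls out of
    # following ownership facts transitively. All data is invented.
    triples = {
        ("Acme", "subsidiaryOf", "MidCo"),
        ("MidCo", "subsidiaryOf", "ParentCo"),
        ("ParentCo", "locatedIn", "Delaware"),
    }

    def infer_owners(entity):
        """Follow 'subsidiaryOf' edges transitively to find all parents."""
        owners, frontier = set(), {entity}
        while frontier:
            nxt = {o for (s, p, o) in triples
                   if p == "subsidiaryOf" and s in frontier}
            frontier = nxt - owners
            owners |= nxt
        return owners

    print(infer_owners("Acme"))   # MidCo and ParentCo: one is an inferred fact
    ```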

    Here are real-world examples of operational data hubs solving complex data integration challenges:

    The Case of Rampant Risk in the Investment Banking Industry

    Investment banks need to ensure their trade data is high quality, accessible and searchable to mitigate risk and maintain compliance with regulatory imperatives, such as MiFID II, FRTB and Dodd-Frank.

    A leading investment bank built an operational data hub using an Enterprise NoSQL database. The technology enables a single view of derivatives trades and provides a full audit trail for auditors and regulators. It also replaced 20 RDBMS database instances with a single Enterprise NoSQL cluster, making trade information retrievable as well as actionable in real time. Not only has the database enhanced the bank’s compliance reporting, it has dramatically reduced maintenance costs, since the system is built on a commodity scale-out architecture. The result is not only a lower cost per trade but an orders-of-magnitude faster time to delivery whenever business needs change. The bank can now develop and deploy new software, and therefore launch new products in response to the market, much more quickly.

    Another leading investment bank needed to maintain and demonstrate compliance with Dodd-Frank data analysis requirements. Enterprise NoSQL’s bitemporal data management feature allowed the bank to minimize risk through “tech time travel”: time-stamping and “rewinding” the state of its data to identify changes, looking at the data as it was at any point in time without having to reload data backups.
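
    A bitemporal record carries two time dimensions: when a fact was true in the world and when the system recorded it. The minimal sketch below, with invented trade data and field names, shows the system-time half of that idea: reconstructing what the database believed on a past date without restoring a backup.

    ```python
    # Minimal sketch of bitemporal "time travel": each version of a trade
    # carries a system-time interval, so we can reconstruct what the
    # database believed at any past moment. All trade data is invented.
    from dataclasses import dataclass

    @dataclass
    class Version:
        trade_id: str
        notional: float
        sys_from: str   # when this version was recorded (ISO date)
        sys_to: str     # when it was superseded ("9999-12-31" = current)

    history = [
        Version("T1", 1_000_000.0, "2017-01-03", "2017-01-20"),  # as booked
        Version("T1", 1_250_000.0, "2017-01-20", "9999-12-31"),  # correction
    ]

    def as_of(trade_id, sys_time):
        """Return the version of the trade the system believed at sys_time."""
        for v in history:
            if v.trade_id == trade_id and v.sys_from <= sys_time < v.sys_to:
                return v
        return None

    # What did we believe on Jan 10, before the correction was recorded?
    print(as_of("T1", "2017-01-10").notional)   # 1000000.0
    ```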

    The Case of Ineffective Fraud Detection in the Insurance Industry

    Fraud detection rates in the insurance industry remain frustratingly low. Most fraud is noticed only after the crime has been committed, when it has become too expensive to recoup the money lost.

    Current fraud detection that relies solely on human expertise catches, at best, 10 percent of the fraud committed, and frequently less. By using an operational data hub, insurers can take advantage of the combined power of big data, semantics and inference to detect previously unknown fraudulent behavior. What’s more, instead of a passive, read-only interaction with the data, an operational data hub with semantics capability allows investigators to interact with the data in a conversational, read-write way, providing a two-way platform that combines the power and scale of computing with human intuition. With such an active, operational 360-degree view of the data, it becomes possible to evaluate a fraud claim within its context, add inferred context as needed, and compare the claim to potentially similar claims in order to identify patterns.

    Rules can also be set up and refined based on discovered patterns. When properly equipped to analyze data, assign a risk score to each claim, alert the right people in real time, and hold payment on high-scoring suspicious claims, insurers will be in a position to detect fraud at a rate that saves billions of dollars each year.
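
    The score-alert-hold flow described above can be made concrete with a toy rule engine. Every rule, threshold, and claim field below is invented for illustration; a production system would derive its rules from the patterns discovered in the hub.

    ```python
    # Toy rule engine for the score / alert / hold-payment flow described
    # above. Every rule, threshold, and field name is invented.
    ALERT_THRESHOLD = 50

    RULES = [
        # (description, predicate, points)
        ("claim filed within 30 days of policy start",
         lambda c: c["days_since_policy_start"] < 30, 30),
        ("claim amount near the policy limit",
         lambda c: c["amount"] > 0.9 * c["policy_limit"], 25),
        ("claimant linked to a previously flagged claim",
         lambda c: c["linked_to_flagged_claim"], 40),
    ]

    def score_claim(claim):
        hits = [(desc, pts) for desc, pred, pts in RULES if pred(claim)]
        return sum(pts for _, pts in hits), [desc for desc, _ in hits]

    claim = {"days_since_policy_start": 12, "amount": 95_000,
             "policy_limit": 100_000, "linked_to_flagged_claim": False}

    score, reasons = score_claim(claim)
    if score >= ALERT_THRESHOLD:
        print(f"HOLD payment, alert investigator: score={score}, hits={reasons}")
    else:
        print(f"Pay normally: score={score}")
    ```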

    All Signs Point to a Data Strategy Breakthrough

    Regardless of format or source, the devil is in the details when it comes to an organization’s data, and the ability to delve into those details can become a crucial business differentiator. A flexible operational data hub allows businesses to take advantage of the power of big data, semantics and inference to gain crucial insights. When it comes to complex data integration and building an advantageous 360-degree view of your business, you don’t have to go it alone, and neither does your data.

    Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.
    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

