Data Center Knowledge | News and analysis for the data center industry
 

Friday, March 6th, 2015

    Time Event
    1:00p
    GE’s New Big Data Software Accelerates Power Outage Recovery

    While customers don’t expect 100-percent uptime from an electrical utility, timely outage recovery is the best a utility can do to keep users happy. Power companies have traditionally relied on Outage Management Systems – computer systems that help them trace the source of an outage, decide which actions should take priority during restoration efforts, and manage their crews during outages.

    Of course, outage recovery can never be done fast enough, so there is always pressure to find ways to speed up the process, and the utility industry is increasingly looking to big data analytics technologies for help. “There is growing interest in data analytics software in power grid restoration, especially in the wake of recent extreme weather events,” Keith Grassi, global product line leader for GE’s Digital Energy business, said via email.

    Hurricane Sandy, the biggest example in recent history, left millions of customers in the U.S. Northeast, including numerous data centers in New York City, without power for days.

    Earlier this week, GE unveiled a new analytics solution meant to augment power companies’ existing outage management systems to help them recover faster. The first customer is Colorado Springs Utilities, a Colorado Springs, Colorado, power company that is currently installing Phase I of the software.

    Fairfield, Connecticut-based GE, which ranks ninth on the Fortune 500 list, has been growing its focus on big data analytics over the past several years. From power and water, aviation, and healthcare to transportation and oil and gas, there are useful applications for analytics software across all of the industries the conglomerate sells into.

    In 2013, GE invested about $100 million in Pivotal, becoming a minority stakeholder in the software company controlled by EMC and VMware that focuses on data analytics solutions for enterprises. That same year, GE launched a cloud platform for what it called the “industrial internet,” or a network of devices in factories and other large industrial facilities that generate data that can be used to manage those facilities more efficiently using analytics software.

    Unlike a traditional OMS, GE’s new software for utilities, called PowerOn Response, collects damage information reported by field crews and network status reported by patrol crews via mobile devices. It integrates with existing OMS software, collects data on circuit conditions, facility damage, and device status, tracks restoration progress and customer status, and shares the data across the utility and with customers.
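    The kind of record such a system aggregates can be pictured as a structured damage-and-status report. The Python sketch below is purely illustrative, with invented field names and values; it is not GE’s data model or the PowerOn Response schema.

    # Hypothetical shape of a field-crew damage report as an outage analytics
    # system might ingest it; field names and values are invented for illustration.
    from datetime import datetime, timezone
    import json

    damage_report = {
        "report_id": "CR-2015-000123",            # assumed identifier format
        "reported_by": "field_crew_7",
        "reported_at": datetime.now(timezone.utc).isoformat(),
        "circuit_id": "FEEDER-42",
        "device_status": "breaker_open",
        "facility_damage": "pole down, transformer damaged",
        "customers_affected": 1800,
        "restoration_status": "crew_dispatched",
    }

    # Downstream consumers (OMS integration, customer notifications) would read
    # a serialized form of the report.
    print(json.dumps(damage_report, indent=2))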

    The technology is not cheap. A typical utility should expect to pay from $1 million to $2 million for the solution, according to Grassi.

    The first phase of the software being installed in Colorado Springs provides reporting, but the next phase will include predictive analytics, Grassi said. On the backend, the software is supported by Oracle and Predix databases and a geospatial server for circuit rendering. Future releases will employ Hadoop or similar technologies.

    4:00p
    Red Hat Launches Application Container OS

    Red Hat has pushed into the enterprise container OS space, launching Red Hat Enterprise Linux 7 Atomic Host into general availability. With Project Atomic serving as the foundation of the distribution, this operating system is optimized for running the next generation of applications with Linux containers.

    RHEL 7 Atomic Host gives enterprises an accessible way to begin evaluating and using container technology in hybrid infrastructure and at scale, with security addressed throughout the application lifecycle. Atomic Host will offer automated security updates, with the goal of keeping containers reliable and secure.

    The community-driven Project Atomic develops technologies for creating lightweight Linux container hosts based on next-generation capabilities in the Linux ecosystem. Those tools underpin the new RHEL variant.

    The Red Hat-sponsored Project Atomic combines Docker and Google’s Kubernetes for deploying containers at scale, rpm-ostree for managing updates, and systemd as the Linux services manager. Together, they form the container OS.

    “Red Hat Enterprise Linux Atomic Host enables enterprises to embrace a Red Hat Enterprise Linux Atomic Host container-based architecture to take advantage of the benefits of development and deployment flexibility and simplified maintenance, without sacrificing performance, stability, security, or the value of Red Hat’s vast certified ecosystem,” writes Kimberly Craven, a product marketer at Red Hat.

    Many are betting on container technologies playing a significant role in how organizations deliver and manage applications, so there’s a focus on helping streamline application delivery. Red Hat’s goal here is to keep the host footprint small and provide just the essential functionality for atomic updates and for running application containers such as Docker.
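    To make the atomic update model concrete, the sketch below drives an rpm-ostree upgrade-and-rollback cycle from Python. It is a rough illustration run against the host’s CLI (it requires root on an rpm-ostree-based system), not Red Hat’s tooling or documentation, and command behavior and output may vary by version.

    # Rough sketch of the atomic update model on an rpm-ostree-based host:
    # a new OS tree is staged as a whole and takes effect on the next reboot;
    # if the upgrade misbehaves, a single rollback switches back to the old tree.
    import subprocess

    def run(cmd):
        print("$", " ".join(cmd))
        subprocess.run(cmd, check=True)

    run(["rpm-ostree", "status"])    # show the booted and pending deployments
    run(["rpm-ostree", "upgrade"])   # stage a new tree; applied atomically at reboot
    # After validating the upgraded host, either keep it or revert in one step:
    # run(["rpm-ostree", "rollback"])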

    CoreOS and Canonical’s Snappy are two early examples of increasing activity in the container OS space. CoreOS’s Rocket container runtime launched a few months ago and is seeing healthy activity.

    Canonical is in a similar position to Red Hat. Both are known primarily for their Linux distributions and are interested in providing the larger next-generation framework built from open source components.

    All of this work is an important piece in support of next-gen Platform-as-a-Service (PaaS), an environment for developing applications easily without having to manage what sits beneath the code. Containers enable PaaS, provide portability in the bigger picture, and help prevent PaaS lock-in.

    The OpenShift Origin community project GearD complements RHEL 7 Atomic Host, enabling rapid application development and continuous integration, delivery, and deployment of application code to containerized application environments.

    GearD was created to provide integration between application containers and deployment technologies like Git to allow developers to quickly go from application source code to containerized application stacks deployed onto production systems.

    Cloud Innovation Practice Helps Cloud On-Ramping and Transition

    Another initiative to help advance next-gen application development is the recently announced Red Hat Cloud Innovation Practice. The practice includes a new team of experts that will help companies move to the cloud more quickly.

    The practice was developed out of the integration of technology and engineering expertise gained through Red Hat’s recent acquisitions of Inktank, the company behind Ceph, and OpenStack cloud firm eNovance.

    5:00p
    Friday Funny: Pick the Best Caption for Rainbow

    It appears Kip and Gary have developed cabin fever and are staying entertained using … rainbows?! Let’s make the best of this long, crazy winter with our Friday Funny Caption Contest.

    Here’s how it works: Diane Alber, the Arizona artist who created Kip and Gary, creates a cartoon and we challenge our readers to submit a humorous and clever caption that fits the comedic situation. Then we ask our readers to vote for the best submission and the winner receives a signed print of the cartoon.

    Several great submissions came in for last week’s cartoon – now all we need is a winner. Help us out by submitting your vote below!

    [Poll: Vote for the best caption]
    For previous cartoons on DCK, see our Humor Channel. And for more of Diane’s work, visit Kip and Gary’s website!

     

    7:51p
    Less Than One-Third of US Financial Companies Have Cloud Strategy: Report


    This article originally appeared at The WHIR

    Only 28 percent of American financial companies have an existing cloud strategy, according to a Cloud Security Alliance report. Released on Thursday, the Cloud Adoption Practices & Priorities Survey Report shows that while 61 percent of financial institutions are currently developing an organizational cloud strategy, the majority, especially in the US, do not yet have one in place.

    In the EMEA and APAC regions 35 and 41 percent of financial companies, respectively, have a plan in place. While cloud adoption is growing, the planning and implementation of cloud strategies are slowed by concerns about controls and security, which are driven by regulatory requirements.

    The report was compiled from the results of a survey conducted by the CSA Financial Services Working Group. More than 100 participants from financial companies of various sizes from around the world responded to the survey.

    Increased transparency and better auditing controls are the top concern for 80 percent of financial institutions moving to the cloud, followed by better data encryption for 57 percent.

    “The responses overall showed a very active market for cloud services in the financial services sector,” said Dr. Chenxi Wang, Vice President, Cloud Security and Strategy at CipherCloud, which sponsored the report. “Cloud has made solid in-roads in this industry, with many firms looking to harness the power of cloud. There’s plenty of room for growth, particularly for providers who can fill the void for the auditing and data protection controls that are at the top of respondents’ cloud wish list.”

    Asked why they are moving to the cloud, 68 percent of respondents indicated the need for flexible infrastructure capacity, while 63 percent said reduced time for provisioning. CRM (46 percent), application development (45 percent), and email (41 percent) are the top cloud services for financials.

    The report says that as cloud computing becomes more prevalent, hybrid strategies are becoming an industry norm. However, the path there is inconsistent, as the report calls adoption “ad hoc,” and points out that the degree of digital customer interaction significantly influences company plans.

    Bank hacks and government urging are expected to drive a spike in spending on data security by financial institutions. The report could also indicate a potential windfall for companies providing control and visibility to multi-cloud environments, such as ScienceLogic, which closed a $43 million funding round in February.

    This article originally appeared at http://www.thewhir.com/web-hosting-news/less-one-third-us-financial-companies-cloud-strategy-report

    8:01p
    Republican Internet Freedom Act Could Block FCC’s Net Neutrality Plan


    This article originally appeared at The WHIR

    While the Federal Communications Commission voted last week to give itself new powers to enforce net neutrality rules, Republicans have reintroduced the Internet Freedom Act, which would block the FCC’s ability to regulate the Internet as a utility.

    House Representative Marsha Blackburn (R-TN) stated this week that she reintroduced the Internet Freedom Act (PDF) in an attempt “to block the Obama Administration’s efforts to take over the Internet” through newly introduced net neutrality regulations. It now has 31 Republican co-sponsors.

    According to the legislation text, the Internet Freedom Act aims “to prohibit the Federal Communications Commission from reclassifying broadband Internet access service as a telecommunications service and from imposing certain regulations on providers of such service.”

    The policy to classify broadband providers as Title II utilities was put forth by FCC Chairman Wheeler early this year and voted on last week. It was largely championed by internet activists and also in line with President Barack Obama’s official net neutrality stance, spurring speculation that the executive branch might have ordered the FCC to adopt these measures.

    Blackburn said in a statement, “Last week’s vote by the FCC to regulate the Internet like a 1930s era public utility is further proof that the Obama Administration will stop at nothing in their efforts to control the Internet. There is nothing ‘free and open’ about this heavy-handed approach. These overreaching rules will stifle innovation, restrict freedoms, and lead to billions of dollars in new fees and taxes for American consumers.

    “Once the federal government establishes a foothold into managing how Internet service providers run their networks they will essentially be deciding which content goes first, second, third, or not at all. My legislation will put the brakes on this FCC overreach and protect our innovators from these job-killing regulations.”

    While there are several unknown aspects of the FCC’s plan around fines and procedures, it is clear that the style of FCC net neutrality enforcement would be aimed at making sure that all content has the same exact priority at a network level, contrary to what Blackburn stated.

    Ars Technica suggests that Internet service providers such as AT&T, Comcast, and Verizon have donated significant campaign funds to Blackburn, and that she could be advocating for these companies, which stand to lose considerable money if they cannot charge to prioritize data flowing through their networks. Blackburn also filed legislation last week to overturn the FCC’s decision to preempt state laws restricting municipal broadband projects.

    However, this stance against furthering the FCC’s oversight also corresponds to the Republican ideology of protecting free enterprise. Squaring net neutrality and free markets, Republicans proposed a version of net neutrality in January that wouldn’t include the FCC getting new authority.

    Factoring in the possibility that the FCC’s new authority could face court challenges, there are still many hurdles to putting net neutrality regulations in place and getting those regulations right.

    This article originally appeared at http://www.thewhir.com/web-hosting-news/aws-worth-50-billion-analysts

    8:18p
    How Google Avoids Cloud Downtime With VM Migration

    Heartbleed, the security vulnerability that affected 17 percent of all web servers on the internet when it was disclosed last April, sent ripples of downtime across users’ infrastructure deployed with various public cloud providers as the providers rebooted cloud VMs to patch against the bug.

    More widespread cloud reboots came in October of last year when major providers like Amazon Web Services, Rackspace, and IBM SoftLayer had to apply a patch to address a Xen hypervisor vulnerability. Another Xen update is driving cloud reboots this month.

    Verizon took its entire cloud offline in January to apply an infrastructure upgrade. The company later said the upgrade included a change that would enable it to upgrade the infrastructure without taking it down in the future.

    Google, however, has not had to bring customer VMs running in its Compute Engine cloud down since late 2013, when it introduced “transparent maintenance,” or a way to do live VM migration from one host to another to tinker with the infrastructure.

    Miche Baker-Harvey, a tech lead for VM migration at Google, explained how Google does this in a blog post published earlier this week. Live VM migration helps Google address a multitude of issues, from regular server, network, or data center electrical infrastructure maintenance to security updates, system configuration changes, or host OS and BIOS updates.

    Those were issues Google engineers expected to address with migration. Once the practice was implemented, however, they found that there were also other situations where live migration was useful. In one case, some servers had overheating batteries, affecting neighboring servers as well. Before bringing the offending server down to replace the battery, they moved VMs it was hosting to a different machine.

    At a high level, the process they use is simple: copy as much state data as possible to the target VM while keeping the source VM running, then move the remaining data to the target, causing a blackout so brief that it is unnoticeable to the customer. The move is registered in the customer’s log.
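    The post describes the process only at this level, but the pattern matches classic iterative pre-copy live migration. The Python sketch below simulates that loop in the abstract; it is a conceptual illustration with invented page counts, dirty rates, and thresholds, not Google’s code.

    # Conceptual simulation of pre-copy live migration: keep copying pages the
    # running source VM dirties, and pause it only for the small remainder.
    # Purely illustrative; all parameters are invented.
    import random

    def migrate(pages, max_rounds=30, stop_threshold=16):
        """pages: dict of page_id -> bytes representing the source VM's memory."""
        target = {}
        dirty = set(pages)                      # initially everything must be copied
        for _ in range(max_rounds):
            if len(dirty) <= stop_threshold:    # remainder small enough to pause for
                break
            for page_id in list(dirty):         # copy while the source keeps running
                target[page_id] = pages[page_id]
            # The source VM keeps executing, so a few copied pages get dirtied again.
            dirty = {p for p in pages if random.random() < 0.002}
        # "Blackout": pause the source, copy what is left, then resume on the target.
        for page_id in dirty:
            target[page_id] = pages[page_id]
        return target

    source_memory = {i: bytes(64) for i in range(4096)}
    assert migrate(source_memory) == source_memory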

    Here’s an infographic that explains the essentials, courtesy of Google:

    [Infographic: Google VM migration]

    Google has done hundreds of thousands of VM migrations since transparent maintenance was rolled out. “Many VMs have been up since migration was introduced, and all of them have been migrated multiple times,” Baker-Harvey wrote.

    9:26p
    Microsoft Updates Doc Database, Launches Azure Search

    Microsoft has enhanced several Azure services, including Azure DocumentDB, a fully managed, scalable NoSQL document database service. Azure Search, search-as-a-service, has entered general availability and Azure Media Services Premium Encoder has entered into preview.

    All three exemplify Microsoft moving Azure forward and differentiating its cloud with advanced functionality. The enhancements broaden the potential use cases where Azure makes sense, targeting workloads that have traditionally been run outside of the cloud.

    Microsoft also added two more high-performance instance types, A10 and A11. Both are aimed at compute-intensive applications such as video encoding, risk modeling, and simulation.

    DocumentDB will enter general availability in about a month, with three standard performance levels. Different collections of data within an account can be assigned to different performance levels, letting customers tune for the performance they need.

    The managed NoSQL document database offers rich query and transactional processing over a schema-free JavaScript Object Notation (JSON) data model.
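    The sketch below illustrates what that looks like in practice: a schema-free JSON document plus a SQL-style query over it, the parameterized form anticipating the SQL parameterization mentioned below. The client calls assume the Python SDK of the era (pydocumentdb); the endpoint, key, and collection link are placeholders, and method names may differ across SDK versions.

    # Illustrative only: storing and querying schema-free JSON in DocumentDB.
    # Assumes the pydocumentdb client library; endpoint, key, and collection
    # link are placeholders for a real account.
    from pydocumentdb import document_client

    ENDPOINT = "https://myaccount.documents.azure.com:443/"   # placeholder
    MASTER_KEY = "<primary-key>"                              # placeholder
    COLLECTION_LINK = "dbs/telemetry/colls/readings"          # placeholder

    client = document_client.DocumentClient(ENDPOINT, {"masterKey": MASTER_KEY})

    # Documents need no predefined schema; nested JSON is stored as-is.
    client.CreateDocument(COLLECTION_LINK, {
        "deviceId": "sensor-42",
        "kind": "temperature",
        "reading": {"value": 21.7, "unit": "C"},
    })

    # Parameterized SQL-style query over the JSON documents.
    query = {
        "query": "SELECT * FROM c WHERE c.kind = @kind",
        "parameters": [{"name": "@kind", "value": "temperature"}],
    }
    for doc in client.QueryDocuments(COLLECTION_LINK, query):
        print(doc["deviceId"], doc["reading"]["value"])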

    “Developers want to build cloud-based applications that support multiple platforms and different concurrent versions between multiple applications for user-generated content, IoT, and gaming scenarios,” Vibhor Kapoor, director of product marketing at Microsoft Azure, wrote on the Azure blog. “They want these applications to deliver high-scale and reliable performance. NoSQL has emerged as the leading category of database technology to address these needs.”

    Enhancements ahead of GA include Hadoop integration, a Java SDK, larger document support, and SQL parameterization. Additional regions, hourly billing, and larger account sizes have also been enabled.

    Azure Search helps developers bake search functionality into web and mobile applications.

    “Azure Search enables developers to reduce the friction and complexity of implementing full-text search, and differentiate their applications by leveraging powerful features not available with other search packages such as enhanced multilanguage support,” Kapoor wrote.

    Azure Search supports more than 50 languages at launch and uses the same natural language processing technology used in Microsoft Office and Bing.

    Updates that come with GA include easier data loading from DocumentDB, Azure SQL Database, and SQL Server on Azure VMs to Search using new indexers. A .NET software development kit (SDK) was also made available.
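    For developers not using the .NET SDK, the service is also reachable over REST. The sketch below issues a simple full-text query with Python’s requests library; the service name, index name, key, and API version are placeholders and assumptions that would have to match a real deployment.

    # Minimal sketch of querying an Azure Search index over its REST interface.
    # Service name, index, api-key, and api-version are placeholders/assumptions.
    import requests

    SERVICE = "my-search-service"        # placeholder
    INDEX = "hotels"                     # placeholder
    API_KEY = "<query-key>"              # placeholder
    API_VERSION = "2015-02-28"           # assumed GA API version of the era

    url = f"https://{SERVICE}.search.windows.net/indexes/{INDEX}/docs"
    resp = requests.get(
        url,
        params={"api-version": API_VERSION, "search": "ocean view", "$top": 5},
        headers={"api-key": API_KEY},
    )
    resp.raise_for_status()
    for hit in resp.json()["value"]:     # matching documents come back under "value"
        print(hit)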

    Azure Media Services Premium Encoder entered preview. It offers advanced encoding capabilities for on-demand media workflows. Microsoft says it is tuned for broadcast-industry and professional media transcodes.

    Enhancements include automated decision-making logic that adapts to a variety of input file formats, support for additional input and output codecs and file formats, and a powerful workflow design tool.

    Media Services is fully integrated with the Azure Content Delivery Network (CDN), and the two services are heavily used in conjunction by streaming media companies. One offers a quick way to encode, the other a quick way to provision edge services for global reach and better performance.

    A more general enhancement is improved application security. Azure Active Directory has new functionality in preview that allows shared application accounts to be assigned to groups and improves password practices through randomly generated complex passwords rotated at custom intervals.

