Data Center Knowledge | News and analysis for the data center industry
 

Tuesday, January 14th, 2014

    2:15p
    How the Cloud Learned to Stop Worrying and Love Encryption

    Mike Klein is president and COO of Online Tech, which provides colocation, managed servers and private cloud services.


    It’s straight out of a movie.

    The film buffs among you might recognize that the title of this article is borrowed from the classic dark comedy Dr. Strangelove directed by Stanley Kubrick and starring George C. Scott and Peter Sellers. It’s about a rogue general during the height of the Cold War who causes a nuclear cataclysm because he’s overly worried about what the Russians are putting in Americans’ drinking water. Although the plot of the movie is about the Cold War, many of the themes of the movie will bring a knowing smile to people who work with data centers for a living: Complex technology that might fail at any moment. Fear of breakdowns in security systems. Fail-safe protocols that are inevitably susceptible to human error. Nerdy folks in dull, fluorescent-lit, windowless rooms worrying about worst case scenarios. If it weren’t for the nuclear bombs in the movie, it could be a film about data centers, couldn’t it?

    Because of the nature of our jobs and the mission critical importance of the IT systems we support, it’s no surprise we’re a bunch of worriers. Maybe not paranoid like the cuckoo folks in Dr. Strangelove, but there are plenty of things that keep us up at night, with good reason. For those of us who work in data centers for companies in regulated industries, security has a special place atop the list of things to worry about.

    Regulations like HIPAA and PCI and Sarbanes-Oxley put a heavy emphasis on security in order to protect patient privacy and the security of financial transactions, and encryption needs to be a central part of a company’s security strategy in order to ensure that their IT operations are compliant.

    The Encryption Conundrum

    So why is encryption important? The short answer is that the regulations require it, and what regulators say goes. HIPAA has explicit rules about how encryption should be deployed in the data center and IT networks. PCI does, too, for the protection of financial information. Sarbanes-Oxley is a little more coy about encryption: the SOX language doesn’t mention it explicitly, but it’s impossible to meet all of its security requirements without employing encryption technology. In practice, then, the regulations are unanimous: Encryption isn’t negotiable.

    It’s no surprise that encryption is a major focus of each of these regulations: Proper encryption can not only prevent security breaches, but also minimize the impact if a breach happens. If customer information or patient information is lost or stolen, encryption provides an additional layer of security that prevents bad guys from doing anything with that data. As a last line of defense, it can save companies millions of dollars in costs if a security breach does happen, and help them avoid costly fines, legal actions and negative publicity.

    If encryption has so many benefits from a security perspective, why isn’t it ubiquitous? Because encryption in traditional corporate data centers is hard, expensive and prone to human error (key management alone offers plenty of opportunities to get it wrong), and it creates performance bottlenecks that can make the phone ring off the hook for data center professionals. Those issues conspire to make encryption a major pain in the you-know-what in an enterprise data center environment, but it’s even worse in the cloud. Encryption in the cloud has even more technical challenges, is even more expensive and often has an even more pronounced impact on performance. Those drawbacks have discouraged companies from being aggressive about encryption, despite what the mandates say. In fact, fewer than half of the companies that work under mandates like SOX and HIPAA have successfully implemented encryption processes in their cloud deployments.

    You Better Have A Surf Board

    So what does this have to do with us in the data center industry? Everything—because clients look to us to solve these issues when they run into a dead end themselves. We are trusted partners and collaborators on their data center challenges, and—if you can pardon the oceanic metaphor—I see a rising swell of encryption requests on the horizon that will build and build in size as it makes its way toward data center and colocation providers in the coming months. Healthcare, financial services and retail companies are increasingly looking to their IT partners to solve their encryption problem for them, so the wave will be on us before you know it—and we all need to get a surfboard sooner rather than later.

    In the past (particularly in corporate data centers), the most common strategy for encryption was to “bolt it on”. Forgive the colloquialism, but it’s the most accurate description of what the result looked like: data center folks often physically attached encryption hardware or suggested bolt-on software tools. It wasn’t pretty, but third-party solutions were the easiest way to add encryption and cross that urgent request off the to-do list.

    The “bolted on” approach does not work at scale in a cloud environment. It’s hard to deploy and support, and it can significantly degrade application performance. Data center and colocation companies need a better approach to encryption for clients in regulated industries. The approach that makes the most technical and practical sense is a “built-in encryption” model. Built-in encryption involves four elements (a rough sketch of the first and last follows the list):

    • Encryption of data in transit
    • Encryption of data at rest on the disks
    • SSL certificates
    • Encrypted backups
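
    To ground the first and last of those elements, here is a minimal Python sketch; it is an illustration under stated assumptions, not Online Tech’s implementation. It shows encryption of data in transit by wrapping a socket in TLS with the standard library’s ssl module, and an encrypted backup protected with AES-256-GCM from the third-party cryptography package. The hostname, payloads and key handling are hypothetical; in production the backup key would be held by a key-management service or hardware security module rather than generated in process memory.

        # Illustrative sketch only -- not Online Tech's implementation.
        # Requires the third-party package: pip install cryptography
        import os
        import socket
        import ssl

        from cryptography.hazmat.primitives.ciphers.aead import AESGCM


        def fetch_over_tls(host: str = "example.com") -> bytes:
            """Encryption in transit: wrap a TCP socket in TLS and verify the server certificate."""
            context = ssl.create_default_context()      # system CA bundle; certificate and hostname checks on
            with socket.create_connection((host, 443)) as sock:
                with context.wrap_socket(sock, server_hostname=host) as tls:
                    tls.sendall(b"HEAD / HTTP/1.0\r\nHost: " + host.encode() + b"\r\n\r\n")
                    return tls.recv(4096)                # the response traveled encrypted on the wire


        def encrypt_backup(plaintext: bytes, key: bytes) -> bytes:
            """Encrypted backups: AES-256-GCM provides confidentiality plus tamper detection."""
            nonce = os.urandom(12)                       # unique per backup; safe to store in the clear
            ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
            return nonce + ciphertext                    # prepend the nonce so a restore can find it


        def decrypt_backup(blob: bytes, key: bytes) -> bytes:
            nonce, ciphertext = blob[:12], blob[12:]
            return AESGCM(key).decrypt(nonce, ciphertext, None)


        if __name__ == "__main__":
            backup_key = AESGCM.generate_key(bit_length=256)   # hypothetical: would come from a KMS/HSM in production
            blob = encrypt_backup(b"nightly database dump ...", backup_key)
            assert decrypt_backup(blob, backup_key) == b"nightly database dump ..."
            print(fetch_over_tls()[:64])

    The SSL certificate element on the list is what ssl.create_default_context enforces in this sketch: the client refuses to complete the handshake if the server’s certificate chain or hostname doesn’t validate.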

    Death, Taxes and Encryption

    There is a famous saying about the only two things that are guaranteed in life, but I feel confident that cloud providers can expand that list to include a third: death, taxes and encryption. Our clients expect us to deliver encrypted solutions, and we need to be prepared.

    Online Tech has developed a model for an “encrypted cloud” that takes care of each of the encryption steps that clients have difficulty doing themselves (encryption of data in transit and at rest, SSL certificates and encrypted backups). Encryption for data at rest is built into the server architecture rather than bolted on as plug-in software. Building encryption into a SAN’s server hardware removes performance bottlenecks and makes it easy to fulfill the encryption at rest part of the equation. This ensures there is no risk of stored data exposure when drives are removed or arrays are replaced. This model for an encrypted cloud also eliminates the problems related to key management, which has a million and one opportunities for human error. A truly encrypted cloud environment automates the key generation process as well as key distribution and other key management responsibilities.
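
    Online Tech doesn’t publish the internals of that automation, but the behavior described above is commonly achieved with envelope encryption: every volume or backup gets its own randomly generated data-encryption key (DEK), and the only secret that needs long-term protection is a key-encryption key (KEK) that wraps the DEKs. The sketch below is a hypothetical illustration of that pattern in Python, using the same cryptography package as the earlier example; the names and data are made up.

        # Hypothetical envelope-encryption sketch -- not Online Tech's key-management code.
        import os
        from dataclasses import dataclass

        from cryptography.hazmat.primitives.ciphers.aead import AESGCM


        @dataclass
        class EncryptedVolume:
            wrapped_dek: bytes    # the DEK encrypted under the KEK; safe to store next to the data
            nonce: bytes
            ciphertext: bytes


        def encrypt_volume(data: bytes, kek: bytes) -> EncryptedVolume:
            """Generate a fresh DEK, encrypt the data with it, then wrap the DEK with the KEK."""
            dek = AESGCM.generate_key(bit_length=256)    # generated automatically; no human ever handles it
            nonce = os.urandom(12)
            ciphertext = AESGCM(dek).encrypt(nonce, data, None)
            wrap_nonce = os.urandom(12)
            wrapped = wrap_nonce + AESGCM(kek).encrypt(wrap_nonce, dek, None)
            return EncryptedVolume(wrapped_dek=wrapped, nonce=nonce, ciphertext=ciphertext)


        def decrypt_volume(vol: EncryptedVolume, kek: bytes) -> bytes:
            """Unwrap the DEK with the KEK, then decrypt the data."""
            wrap_nonce, wrapped = vol.wrapped_dek[:12], vol.wrapped_dek[12:]
            dek = AESGCM(kek).decrypt(wrap_nonce, wrapped, None)
            return AESGCM(dek).decrypt(vol.nonce, vol.ciphertext, None)


        if __name__ == "__main__":
            kek = AESGCM.generate_key(bit_length=256)    # hypothetical: in practice held by an HSM or key-management service
            volume = encrypt_volume(b"patient records ...", kek)
            assert decrypt_volume(volume, kek) == b"patient records ..."

    Because a drive pulled from the array holds only ciphertext and a wrapped key it cannot unwrap, this is one way to achieve the property described above: removed drives and replaced arrays expose no readable data, and rotating the KEK only requires re-wrapping the small DEKs rather than re-encrypting every disk.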

    Sarbanes-Oxley, HIPAA and PCI aren’t going anywhere any time soon, which means encryption requirements are here to stay. Encryption is our clients’ problem, so that means it is our problem. No need to worry, though. There are new models for tackling this encryption challenge in the cloud, and the data center companies that learn to stop worrying and love encryption are the ones who will save the day for our clients.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    3:47p
    LSI Expands Oracle Exadata Systems with PCIe Flash

    LSI powers Oracle Exadata X4 systems with PCIe Flash, StoneFly launches Twin Scale Out NAS appliances, and Juniper responds to a 13D filing from Elliott Management outlining the hedge fund’s suggestions for improving its business.

    LSI expands Oracle relationship with PCIe Flash for Exadata X4 systems.  LSI announced that its Nytro flash accelerator cards have been selected as the PCIe flash acceleration technology for Oracle’s Exadata X4 Database Machine. The two companies have also collaborated on bringing a new LSI Nytro technology called Dynamic Logical Capacity (DLC) to Exadata customers. “Oracle Exadata X4 is a fully integrated and optimized database platform combining hardware and software designed from the ground up to work together to deliver maximum performance and value to customers,” said Juan Loiaza, senior vice president for Exadata systems at Oracle. “Flash-based storage is a key element of Exadata, and LSI Nytro flash accelerator cards deliver the performance and reliability that Oracle demands for Exadata. The expanded flash capacity enabled by DLC technology further enhances cost-effectiveness, raising the value we’re able to deliver customers.” Oracle selected LSI as its supplier of PCIe flash acceleration technology for Exadata systems after extensive testing and qualification processes. LSI Nytro cards deliver excellent performance, endurance and reliability in real-world production environments and applications ranging from data warehousing to online transaction processing to mixed workloads.

    StoneFly launches Twin Scale Out NAS appliances.  Storage provider StoneFly unveiled its TSO “Twin Scale Out” NAS storage appliances. StoneFly TSO series appliances are designed for scaling, with up to 36 drives and 144 terabytes of storage per node, a single global namespace and a single file system. Content creation, transformation, production and archive workloads, as well as unstructured data, are well suited to large scale-out NAS appliances like the StoneFly TSO. TSO allows server nodes to be added on the fly, with each additional node contributing CPU horsepower and storage capacity. “TSO is the most scalable and resilient NAS in the market, utilizing the latest technology of no metadata, and establishing new storage standards in big data, media and cloud markets,” said Mo Tahmasebi, president and CEO for StoneFly.

    Juniper Comments on Elliott Management filing.  After making waves with Riverbed last week, Elliott Management has turned its focus to Juniper Networks. Elliott made a 13D filing stating that Elliott-related funds beneficially own 6.2 percent of Juniper in stock and options, and its accompanying presentation calls for Juniper to return money to shareholders and streamline its product offering. Juniper issued a statement, saying that it “continues to deliver improved financial and operational performance as evidenced by five consecutive quarters of year-over-year revenue growth and our continued efforts to streamline the Company’s cost base.” The company said it has not discussed the report with Elliott but intends to review it carefully. The statement continued: “Juniper has an innovative and robust product portfolio and is well positioned to deliver enhanced shareholder value and we are optimistic about the growth opportunities in the key markets we serve.”

    5:00p
    Red Hat Launches Test Drives on AWS

    This week, at its annual partner conference in Scottsdale, Arizona, Red Hat (RHT) announced new Test Drives on Amazon Web Services (AWS) with three Red Hat partners – CITYTECH, Shadow-Soft, and Vizuri. Through the AWS Test Drive program, users can quickly and easily explore and deploy ready-made solutions built on Red Hat technologies.

    “AWS Test Drive is an exciting program to help drive sales and open new business opportunities with AWS,” said Terry Wise, director, Worldwide Partner Ecosystem at AWS. “With the AWS Test Drive labs, customers can register and experience an evaluation of enterprise software solutions using the self-paced hands-on lab platform in a matter of minutes, as opposed to days or weeks. Once a customer completes the AWS Test Drive, they can quickly move to production by working with sponsoring partners. We are excited for Red Hat to launch these new test drive labs and to expand this opportunity to many more Red Hat partners.”

    For CITYTECH, the test drive allows users to interact with sample business and process workflows and modify a mortgage application’s business flow. Featuring the soon-to-be-released Red Hat JBoss BPM Suite, Red Hat JBoss Data Grid, and Red Hat JBoss Operations Network, this Test Drive gives customers hands-on experience with Red Hat JBoss Middleware on the AWS platform. Shadow-Soft is a Red Hat Premier partner and expert open source systems integrator. Its test drive provides users a hands-on experience with Red Hat’s business process management (BPM) and complex event processing (CEP) offerings in a single, flexible platform delivered on AWS.

    Strategic partner Vizuri’s test drive shows users how to securely control their data in a cloud using Red Hat’s software-defined storage offering and the complete application stack running on their own servers. It gives users the ability to see how they can manage and scale their solution as it replicates within and across their datacenters. In addition to Red Hat Storage, Vizuri’s Test Drive will feature ownCloud, an open source file sync and share solution.

    “We are excited to create Red Hat based solutions on AWS with our extended partner ecosystems,” said Mark Enzweiler, senior vice president, Global Channel Sales, Red Hat. “We’ve enjoyed working with CITYTECH, Shadow-Soft, and Vizuri to develop these initial solutions, and are eager to develop additional Test Drives with other partners. We believe these Test Drives are invaluable, they enable partners to use their expertise in pulling together complete solutions to solve complex customer challenges and illustrate how easily customers can use these tools to migrate to the cloud.”

    Red Hat plans to work with additional Red Hat partners to release more AWS Test Drives through the coming year.

    8:35p
    Court of Appeals Strikes Down FCC Net Neutrality Rule

    This post originally appeared on The WHIR.

    The U.S. Court of Appeals for the District of Columbia Circuit struck down the FCC’s net neutrality rules on Tuesday, ruling that the restrictions requiring broadband providers to “treat all Internet traffic the same regardless of source” have no legal merit.

    According to a report by PCWorld, the appeals court’s decision hinges on the fact that, unlike phone companies, broadband providers aren’t classified as common carriers. The ruling said that since the FCC “has failed to establish that the anti-discrimination and anti-blocking rules do not impose per se common carrier obligations, we vacate those portions of the Open Internet Order.”

    Verizon took the FCC to appeals court over the rules at the beginning of November, arguing that the FCC does not have the authority to treat it as a common carrier or a utility, and that broadband should instead be regulated as an information service, a category subject to much lighter regulation.

    The decision is a victory for Verizon and other broadband providers, which can now charge certain customers more for preferential treatment and faster speeds. Net neutrality advocates worry that this could create an environment in which smaller companies are unable to compete, to the detriment of innovation and the Internet as a whole. The issue could also have negative implications for hosting providers and their customers.

    The FCC will still be able to require broadband providers to disclose how data is managed on their networks, which may offer some transparency into how certain types of traffic are treated.

    “The D.C. Circuit has correctly held that ‘Section 706 . . . vests [the Commission] with affirmative authority to enact measures encouraging the deployment of broadband infrastructure’ and therefore may ‘promulgate rules governing broadband providers’ treatment of Internet traffic,’” FCC Chairman Thomas Wheeler said in a statement. “I am committed to maintaining our networks as engines for economic growth, test beds for innovative services and products, and channels for all forms of speech protected by the First Amendment. We will consider all available options, including those for appeal, to ensure that these networks on which the Internet depends continue to provide a free and open platform for innovation and expression, and operate in the interest of all Americans.”

    Recently, AT&T introduced a bandwidth sponsorship service that lets businesses pay for bandwidth so that it does not count against end users’ data allowances. Some net neutrality advocates said that although AT&T maintains sponsored data will not affect the speed or performance of delivery, the practice remains problematic.

    This post was first published here: http://www.thewhir.com/web-hosting-news/court-appeals-strikes-fcc-net-neutrality-rule

