Data Center Knowledge | News and analysis for the data center industry
 

Wednesday, June 1st, 2016

    4:01a
    Mesosphere’s Data Center OS to Get Power, Cooling Visibility

    Mesosphere, the San Francisco-based startup behind the Data Center Operating System, has gotten a lot of attention – and funding – from IT industry and venture capital heavyweights because of the way its software abstracts and orchestrates disparate IT resources and presents them as uniform pools of compute that applications can use.

    Based on open source Apache Mesos, DC/OS orchestrates bare-metal servers and VMs regardless of where they run, whether in a public cloud like Amazon Web Services or Microsoft Azure or in the user’s own data center. It owes its broad appeal to the variety of resources it can reach, but it hasn’t had visibility into one important aspect of enterprise infrastructure: the physical data center resources underlying the IT.
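    To make that abstraction concrete, here is a minimal sketch of deploying a service on DC/OS through Marathon, the scheduler DC/OS uses for long-running applications. The cluster address is a placeholder and authentication is omitted; the app definition fields shown (id, cmd, cpus, mem, instances) are standard Marathon fields.

    ```python
    import requests

    # Hypothetical cluster address; a real DC/OS cluster also requires an auth
    # token, omitted here for brevity.
    DCOS_URL = "https://dcos.example.com"

    # A minimal Marathon app definition. Marathon is the scheduler DC/OS uses for
    # long-running services; id, cmd, cpus, mem, and instances are standard fields.
    app = {
        "id": "/demo/web",
        "cmd": "python3 -m http.server $PORT0",
        "cpus": 0.1,        # fraction of a CPU per instance
        "mem": 64,          # MB of memory per instance
        "instances": 3,     # DC/OS decides which machines in the pool run them
    }

    # On DC/OS, Marathon's REST API is exposed under /service/marathon.
    resp = requests.post(f"{DCOS_URL}/service/marathon/v2/apps", json=app)
    resp.raise_for_status()
    print("deployed", resp.json()["id"])
    ```

    The request only describes what the application needs; DC/OS decides which machines in the pool actually run the instances, which is exactly the layer that has lacked awareness of the physical facility underneath.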

    A new partnership Mesosphere has reached with Vapor IO, an Austin-based data center technology startup, promises to change that. The two companies will integrate DC/OS with Vapor’s OpenDCRE (Open Data Center Runtime Environment) software, which will give DC/OS visibility into things like the data center power and cooling resources applications consume, Vapor IO announced Wednesday.
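    OpenDCRE exposes facility telemetry over a REST API, which is the kind of hook such an integration would use. The sketch below shows the general shape of a sensor read; the host, API version, and path layout are assumptions for illustration rather than details from the announcement.

    ```python
    import requests

    # Placeholder endpoint layout: OpenDCRE serves device readings over REST, but
    # the host, API version, and path scheme below are assumptions for
    # illustration, not the documented API.
    OPENDCRE = "http://opendcre.example.com:5000/opendcre/1.2"

    def read_temperature(board_id: str, device_id: str) -> dict:
        """Fetch one temperature reading from an OpenDCRE-style endpoint."""
        resp = requests.get(f"{OPENDCRE}/read/temperature/{board_id}/{device_id}", timeout=5)
        resp.raise_for_status()
        return resp.json()

    # A DC/OS-integrated scheduler could fold readings like this into placement
    # decisions, for example steering new tasks away from racks that run hot.
    print(read_temperature("00000001", "0002"))
    ```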

    Vapor IO was founded by Cole Crawford, former executive director of the Open Compute Project, the open source data center and hardware design community started by Facebook.

    Taking things further, Mesosphere and Vapor IO plan to create a scheduler they say will give users visibility into the cost of running a particular application on-premises versus in a public cloud. Once users decide which type of environment is more cost-effective for that application, they will be able to use the scheduler, called Mist, to move the application to that environment.

    Mist will work with applications that run in Linux containers, which is what will enable it to move workloads between environments. Users will be able to orchestrate containers manually or automatically, Vapor IO said.

    Both OpenDCRE and DC/OS are open source. Mesosphere open sourced its software this past April.

    This is the second major technology integration Vapor IO announced this year. In March, the company said it was integrating its software and its atypical data center pod, called Vapor Chamber, with data center management software and data center modules by Chandler, Arizona-based Baselayer.

    1:00p
    Linux Foundation Backs HPE’s Open Source Switch OS

    OpenSwitch, the operating system for data center network switches that Hewlett-Packard Enterprise launched last year as an open source project together with a number of other networking heavyweights, has become an official Linux Foundation project, the foundation announced today.

    The foundation provides infrastructure and management resources for the open source projects it accepts, as well as exposure to open source developers who may be more inclined to contribute because of the organization’s pedigree. It hosts some of the most influential open source infrastructure projects, such as Cloud Foundry, OpenDaylight, and the Xen Project.

    Being under the Linux Foundation’s wing also means a project is administered by a neutral non-profit organization rather than by one or more profit-driven vendors.

    HPE launched the Linux-based OpenSwitch project together with Arista Networks, Intel Corp., Broadcom Corp., VMware, and Accton Technology Corp. It is a play to grab a piece of the growing market for so-called white-box or brite-box data center switches – low-cost commodity switches on which customers can install software of their choice. These are a web-scale alternative to the pre-integrated, proprietary switching stacks that vendors like Cisco Systems, Juniper Networks, and HPE itself have traditionally sold to enterprise data center operators.

    OpenSwitch followed pioneering stand-alone Linux-based network operating systems for data centers by software startups Cumulus Networks and Big Switch Networks.

    HPE’s switch line for this market is called Altoline. Accton, one of the OpenSwitch project’s founders, manufactures these switches for the vendor.

    One big end-user contributor to the open source project is LinkedIn, which is in the middle of a wholesale overhaul of its data center design and strategy, switching to a web-scale infrastructure that consists largely of custom technology created in-house.

    LinkedIn has designed its own data center switches and a networking software stack, but its engineers want to focus their efforts on the control plane and application-layer features rather than on building a network OS from scratch. That is why they have been involved in OpenSwitch, Zaid Ali Kahn, the company’s senior director of global infrastructure architecture and strategy, explained in a blog post Wednesday.

    OpenSwitch is one of the open source and commercial options LinkedIn is evaluating as a potential OS that will run underneath its custom control plane.

    “While scaling our data centers out, we want to control the complexity of data center fabric by moving toward a fully-automated, self-healing, and purpose-built application-centric network that operates on its own,” Kahn wrote. “By building a native Linux-based network operating system with open interfaces, it is now possible to manage switches and extend visibility, controls, and applications to network elements in the same way we do on servers.”
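    Kahn’s point about managing switches the same way LinkedIn manages servers translates fairly directly into practice: on a Linux-based network OS, front-panel ports appear as ordinary Linux network interfaces. The sketch below is a minimal illustration of that idea using standard sysfs paths, not LinkedIn’s or OpenSwitch’s actual tooling; interface names vary by platform.

    ```python
    import pathlib

    # On a Linux-based network OS, front-panel ports show up as ordinary Linux
    # network interfaces, so the same sysfs paths used on servers work on the
    # switch. Interface names (swp1, eth0, ...) vary by platform.
    SYSFS_NET = pathlib.Path("/sys/class/net")

    def port_status(iface: str) -> dict:
        """Read link state and byte counters for one interface via sysfs."""
        base = SYSFS_NET / iface
        return {
            "operstate": (base / "operstate").read_text().strip(),
            "rx_bytes": int((base / "statistics" / "rx_bytes").read_text()),
            "tx_bytes": int((base / "statistics" / "tx_bytes").read_text()),
        }

    # Walk every interface the kernel exposes, exactly as one would on a server.
    for iface in sorted(p.name for p in SYSFS_NET.iterdir()):
        print(iface, port_status(iface))
    ```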

    1:00p
    CenturyLink Commits to Data Center Efficiency Targets under Federal Challenge

    CenturyLink has committed to improving the energy efficiency of its entire US data center portfolio by 25 percent, joining a voluntary US Department of Energy program that promotes investment in more efficient energy use in buildings.

    The Monroe, Louisiana-based telco has been upgrading its sprawling data center portfolio to improve efficiency since last year, despite the possibility that it may sell some or all of those sites. CenturyLink management has been evaluating numerous alternatives to owning its data centers.

    Bill Gast, CenturyLink’s director of global data center energy efficiency, said uncertainty about ownership of the portfolio in the future hasn’t disrupted the current push to improve its efficiency that started last year.

    “We’re still finishing up projects we started in 2015,” he said. “We’re continuing to invest, and we have funding to do that.”

    CenturyLink has more than 30 data centers in the US consuming about 200MW total. Joining a DoE program called the Better Buildings Challenge, the company has committed to improving energy efficiency in those facilities by 25 percent by 2023.

    The benchmark year for the improvements is 2013, meaning the portfolio’s efficiency in 2023 will be compared to its efficiency in 2013, before the company started its portfolio-wide improvement project. In fact, it has already reached 60 to 70 percent of the goal, by Gast’s estimate.

    The company’s data center portfolio required about 150MW of power in 2013. It has since expanded capacity, but its commitment under the challenge is to improve energy efficiency of the power and cooling infrastructure in its facilities, not to reduce their total energy consumption.

    The company and the third-party consultants that will verify its progress for the DoE will use Power Usage Effectiveness (PUE) to measure efficiency.
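    Since PUE is the ratio of total facility power to IT equipment power, gains in power and cooling overhead show up directly in the metric. The figures below are a hypothetical illustration of that arithmetic, not CenturyLink’s actual numbers; one simple reading of a 25 percent infrastructure efficiency gain is a 25 percent cut in non-IT overhead.

    ```python
    # Hypothetical figures for illustration only; not CenturyLink's actual numbers.
    # PUE = total facility power / IT equipment power, so cutting power and cooling
    # overhead lowers PUE even if total capacity grows.

    it_load_mw = 100.0                      # assumed IT equipment load
    overhead_mw = 50.0                      # assumed power/cooling overhead
    pue_before = (it_load_mw + overhead_mw) / it_load_mw        # 1.50

    overhead_after_mw = overhead_mw * (1 - 0.25)                # 25% less overhead
    pue_after = (it_load_mw + overhead_after_mw) / it_load_mw   # 1.375

    print(f"PUE before: {pue_before:.3f}, after: {pue_after:.3f}")
    ```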

    The bulk of CenturyLink’s efficiency gains so far have come from upgrading data center cooling systems, he said. The company has started deploying cooling technologies that are relatively new to the data center industry, such as the heat wheel and CenturyLink’s proprietary cooling system design called Chiller in a Box.

    Gast expects to meet the efficiency goal for the entire US portfolio by continuing to implement cooling upgrades.

    The data centers CenturyLink is improving include facilities the company owns as well as ones it leases from the likes of Digital Realty Trust, the San Francisco-based data center provider that lists the telco as its second-biggest customer, after IBM.

    There are some US data centers CenturyLink occupies but doesn’t operate, and those are not part of the initiative. They represent less than 10 percent of the company’s domestic portfolio, Gast said.

    Digital and CenturyLink are two of five data center providers that have joined the DoE’s challenge, according to the program’s website. The others are IO Data Centers, Sabey Data Centers, and Iron Mountain.

    A total of 11 companies and one university have joined the program with a focus specifically on improving efficiency in their data centers, including Facebook, Intel Corp., eBay, and Schneider Electric.

    4:50p
    Data Debunked: The Myths and Truths About File Analysis  

    Kon Leong is President, CEO and Co-founder of ZL Technologies.

    Ideal data management strategies differ for every organization, and everyone has their own take on what “information governance” is even supposed to mean. Full data management has become a multifaceted business challenge, and analysis tools promise to help. However, analytics can’t fix the absence of strategy.

    Myths arise because everyone loves a simple explanation. We’re drawn to them because they seem to streamline the vast and unwieldy. The more complex a topic, the more likely it is that myths will spread, and file analysis and file governance are excellent examples. With file share environments being the epicenter of human-generated activity within the business, organizations question how file analysis best fits into their data management strategy. With such a broad topic, though, it’s easy to fall into the occasional trap set by an attractive myth. Here are some of the more common ones.

    Myth #1: File analysis = information governance. Analysis of a file share environment – in isolation – is not information governance; it is simply a snapshot assessment of content at a given point in time. However, file analysis certainly can be used as part of an overarching information governance program. Increasingly, this is exactly the case with file shares, which have often been neglected in governance efforts. Analysis offers a first step for businesses that simply don’t know where to start with sprawling file environments.

    Truth: File analysis is part of information governance. The process of information governance, however, is a journey. The first step will only get you moving. The process that follows the initial analysis – deciding what to do and then doing it – is what pushes the data down the path of information governance. This process, called data remediation, is not an inherent function of most file analysis tools available.

    Myth #2: File analysis will classify, sort and manage data. File analysis alone cannot make business decisions; only the business can. So before initiating a file analysis effort, the organization needs to be clear about its objectives and desired outcomes. File analysis for file cleanup is like walking into a messy room and creating a detailed list of what’s there; the cleanup itself isn’t provided by the initial assessment.

    Truth: File analysis can be the first step in data management. Most file analysis tools will generate an initial report, but the decisions and heavy-lifting remain. If there is a need to move, discard, re-arrange, classify, or otherwise manage the analyzed files, the organization needs to ensure plans for the next steps have been made. Some file analysis tools offer integration with information governance or records management products, and some existing governance platforms can natively analyze file environments. Make sure the business understands what options exist long before a purchase is made.

    Myth #3: File analysis is a one-time task. For many organizations, file analysis IS treated as a one-time project. However, unless something is done with the analyzed content, it becomes necessary to eventually repeat the effort. Just as in our messy room example, file shares are a living ecosystem. There are constantly new items, changes and revisions. If file analysis is used as the first step in a one-time cleanup approach, it’s akin to doing spring-cleaning… and no more cleaning for the rest of the year.

    Truth: File analysis (ideally) is an ongoing process. A single round of analysis followed by a single cleanup has to become a recurring effort in order to maintain control over data. With the right architecture, it’s possible to conduct full-scale analysis of files once, actively categorize data or get rid of it, and then perpetually analyze changes and activity in order to manage any subsequently modified or created content, as sketched below. In the end, a file analysis project can be conducted once… but that’s likely not what you’re looking for.
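    A minimal sketch of that recurring model, assuming a file share mounted at a hypothetical path; analyze() is a stand-in for whatever classification or remediation a real governance tool would perform:

    ```python
    import os
    import time

    # Track each file's modification time and re-analyze only what changed since
    # the last pass. analyze() is a stand-in for a real tool's classification or
    # remediation step; the share path is hypothetical.

    def analyze(path: str) -> None:
        print("re-analyzing", path)

    def scan(root: str, seen: dict) -> None:
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                mtime = os.path.getmtime(path)
                if seen.get(path) != mtime:     # new or modified since last pass
                    analyze(path)
                    seen[path] = mtime

    state: dict = {}
    while True:
        scan("/srv/fileshare", state)           # hypothetical file share mount
        time.sleep(3600)                        # repeat hourly, not once a year
    ```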

    Myth #4: File analysis can only analyze metadata. Part of the selection process for a file analysis method or tool is deciding how deep the analysis needs to be in order to achieve desired business objectives. Many file analysis products rely on metadata rather than content. While metadata alone can provide a wealth of critical information commonly used to classify and manage content within a records or governance program, it doesn’t give the full picture.

    Truth: File analysis can examine data within documents. Often, the most important data is hidden within document content. Take, for example, sensitive information such as Social Security numbers or financial details. To reliably tell whether these have been stored in a public file share or another unsafe location, the content of the files needs to be assessed. If an organization wants to use file analysis as part of an effort to improve data security, content analysis is critical. Such analysis does exist, but the business needs to conduct due diligence to ensure that the tool being purchased offers content capabilities.
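    As a toy illustration of content-level analysis, the sketch below flags files whose text matches a Social Security number pattern; the share path and file types are hypothetical, and real products parse many more formats and guard against false positives.

    ```python
    import re
    from pathlib import Path

    # Flag files whose text matches an SSN-like pattern. The share path and the
    # .txt restriction are illustrative; real tools handle many formats and
    # reduce false positives with context and validation.
    SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

    def files_with_ssns(share_root: str):
        for path in Path(share_root).rglob("*.txt"):
            text = path.read_text(errors="ignore")
            if SSN_PATTERN.search(text):
                yield path

    for hit in files_with_ssns("/srv/public_share"):    # hypothetical public share
        print("possible SSN in", hit)
    ```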

    Myth #5: File analysis is an IT problem. File analysis, used properly within a comprehensive information governance strategy, is an “everyone problem.” The failure to rein in control of files affects nearly every business unit.

    Truth: File analysis is a business strategy issue. While IT teams may be key facilitators in the initial file analysis process, the downstream stakeholders are spread across the organization. End users benefit from increased efficiency of file access and productivity. Legal teams benefit from the defensible removal of junk. Compliance and risk managers benefit from better access controls and protection of sensitive content. And IT, of course, benefits from a more streamlined and secure file environment. File analysis should never be conducted simply because of an IT driver.

    The file analysis market offers more options than one might expect, but care is required to ensure that complexity doesn’t muddle decision-making. The trick is fitting file analysis into a comprehensive governance road map. To do so requires not only a deep assessment of business objectives, but also cohesion between business stakeholders.

    Recognize the myths of file analysis, and overcome the challenge of information governance.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    5:14p
    DigitalOcean Launches India Data Center


    Brought to You by The WHIR

    DigitalOcean has launched its new data center in Bangalore, India, to support the growing startup ecosystem in the country. The New York-based cloud infrastructure provider calls India one of the “most important technology markets in the world.”

    The announcement comes shortly after DigitalOcean closed a $130 million credit facility to support its global expansion.

    According to Tuesday’s announcement, the company will continue to offer a single pricing plan across all of its regions, including Bangalore, starting at $5 per month. Bangalore is DigitalOcean’s eighth region, joining New York, San Francisco, London, Amsterdam, Singapore, Frankfurt, and Toronto.
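    For developers, targeting the new region is a matter of naming it when provisioning through DigitalOcean’s v2 API. The sketch below assumes the Bangalore region slug is “blr1” and uses placeholder size and image slugs; the API’s /v2/regions, /v2/sizes, and /v2/images endpoints list the values an account can actually use.

    ```python
    import os
    import requests

    # The region slug "blr1" and the size/image slugs below are assumptions;
    # query /v2/regions, /v2/sizes, and /v2/images for the real values.
    TOKEN = os.environ["DO_API_TOKEN"]

    droplet = {
        "name": "blr-test-01",
        "region": "blr1",             # assumed Bangalore region slug
        "size": "512mb",              # assumed $5/month size slug at the time
        "image": "ubuntu-16-04-x64",  # assumed base image slug
    }

    resp = requests.post(
        "https://api.digitalocean.com/v2/droplets",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json=droplet,
    )
    resp.raise_for_status()
    print("created droplet", resp.json()["droplet"]["id"])
    ```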

    DigitalOcean has hired a local team and partnered with NASSCOM’s 10,000 Startups initiative in order to support the Indian startup ecosystem. The NASSCOM program brings corporations and early stage Indian tech companies together.

    “India is poised to unleash a tremendous amount of innovation in the next decade,” Ben Uretsky, CEO and co-founder of DigitalOcean, said in a statement. “We want to empower the next generation of software companies by providing them robust and easy-to-use cloud infrastructure they need to grow.”

    This first ran at http://www.thewhir.com/web-hosting-news/digitalocean-brings-bangalore-data-center-online

    7:42p
    Microsoft Opens Up about Lawsuits against DOJ Gag Orders


    Brought to You by IT Pro

    Over the past several years, Microsoft has become increasingly vocal about its thoughts on the balance between privacy and the law.

    “People shouldn’t lose their rights simply because technology is moving to the cloud,” argues Microsoft chief legal officer Brad Smith. It’s a stance Microsoft is investing a lot in, whether it’s the German Azure data centers it has built that even Microsoft cannot easily access, or an increasingly tough posture on disclosing government data requests.

    Smith is often the face of that push, but he said he has the full support of chief executive Satya Nadella as well as the Microsoft board.

    In a recent Q&A with the Wall Street Journal, he explained the importance of the company’s public and consistent stand.

    “Our employees are going to find themselves needing to make similar decisions in the future, or explaining these decisions to our stakeholders, including our customers,” he said. “We [also] need to get our rationale down to 122 characters so it can be put in a tweet that can be retweeted.”

    Preparing a message to be shared consistently in legal briefs and on social media is no small task, but Smith said it’s important, and a message that resonates.

    “What people want is to see information that is stored digitally in the cloud get the same kind of protection as information that is written down and stored on paper,” he told the Journal.

    Read the full interview at the Wall Street Journal, and let us know what you think about Microsoft’s recent privacy push in the comments.

    This first ran at http://windowsitpro.com/security/microsoft-opens-about-its-lawsuits-against-dojs-gag-orders