Data Center Knowledge | News and analysis for the data center industry - Industry's Journal
 

Thursday, December 8th, 2016

    Time Event
    1:00p
    Network Monitoring: Your First Line of Defense

    Richard Rauch is President and CEO of APCON.

    What’s your first line of defense?  In the military, it’s the troops on the front line.  On the football field, it’s the defensive line.  And in the data center, it’s the network monitoring system.

    These days, a business's most valuable asset, its corporate data, needs a sound strategy to defend against attack.  That strategy comes in the form of the architecture of the data center itself, with the network monitoring and security solution providing IT managers with complete visibility into potential intrusions to protect data and improve network performance.  The monitoring solution completes the architecture of the data center and should be a primary consideration when building or updating a network data center.

    The network monitoring system creates three primary benefits:

    • Protects big data and secures private information
    • Enhances the performance of monitoring and analysis tools
    • Provides better insight into critical security, network and business issues

    A reliable, secure network is designed to monitor, aggregate and filter data within large and complex architectures.  When selecting and building a network monitoring solution, IT managers should consider a flexible and scalable solution that can adapt to existing network architectures and grow as your network grows.  You also need a feature-rich solution that gives you the specific functionality to manage the network effectively.  In terms of managing the flow of data through your network, here’s what to consider:

    • Listen in on all network traffic. TAP and SPAN all nodes within the network to ensure 100% network visibility in real time.  TAPs (test access points) are installed between an analysis device and the network, while SPANs (switch port analyzers) mirror the activity in a port so it can be attached to an analysis device.  It’s important to consider both physical and virtual traffic as you structure your solution, and keep in mind that out-of-band monitoring allows nonintrusive visibility without disturbing the flow of data.
    • Aggregate the data.  Once your monitoring system can see all the data, you need aggregation switches to collect it for analysis.  An intelligent network monitoring solution can deliver the right data to the right tools at the right time.  In addition to aggregation, key features you should consider include filtering, port tagging, and load balancing.  With total network traffic visibility, you’ll increase both network security and performance.
    • Groom and filter.  Filtering data traffic is essential to optimizing the performance of your analysis tools, potentially extending the lifespan and utilization of your network tools and minimizing the expense of adding more tools.  For example, the process of deduplication removes duplicate packets, cutting total traffic by as much as 55% and effectively doubling the analysis capacity of your security tools.  Ingress and egress filtering reduces or eliminates packet oversubscription.  Other advanced features such as time stamping, packet slicing and header stripping can prime the data for use by specific monitoring tools.
    • Send data to tools. Finally, let the tools do their work.  To design an effective network, you must first understand what tools you need, based on what needs to be analyzed for the purposes of the business.  Different stakeholders will have different needs, typically identified by department or function.  Essential tool functions include monitoring overall network performance, analyzing specific applications such as VoIP traffic, customer experience monitoring, inline security analysis, and forensics.
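    The grooming step above can be sketched in code. The following is a minimal, illustrative Python sketch of packet deduplication, one of the filtering features described: each packet captured from multiple TAP/SPAN points is hashed, and repeats seen within a recent window are dropped before traffic is forwarded to an analysis tool. All class and variable names here are hypothetical, not part of any vendor's product.

    ```python
    import hashlib
    from collections import OrderedDict

    class Deduplicator:
        """Drop packets already seen within a sliding window of recent digests."""

        def __init__(self, window=10000):
            self.window = window       # how many recent packet digests to remember
            self.seen = OrderedDict()  # digest -> None, kept in arrival order

        def is_duplicate(self, packet: bytes) -> bool:
            digest = hashlib.sha256(packet).digest()
            if digest in self.seen:
                return True            # same bytes seen recently: a duplicate
            self.seen[digest] = None
            if len(self.seen) > self.window:
                self.seen.popitem(last=False)  # evict the oldest digest
            return False

    # The same mirrored packet arriving twice: the second copy is filtered out.
    dedup = Deduplicator()
    packets = [b"pkt-A", b"pkt-B", b"pkt-A"]
    unique = [p for p in packets if not dedup.is_duplicate(p)]
    ```

    Real packet brokers perform this in hardware at line rate and often hash only selected header and payload fields, but the window-of-digests idea is the same: traffic volume drops before it ever reaches the tools.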

    Scalability and flexibility are key factors in selecting a network monitoring solution and determining the tools you need to manage and secure your network.  Beyond that, advanced features and functionality will help ensure your solution is robust and reliable as you manage greater amounts of data moving at faster speeds.  Your ultimate security is dependent on a strong line of defense and a clear view of all potential intruders.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

     

    5:14p
    Verizon Finds Only Biggest Thrive Down on the Data Farm

    BLOOMBERG – Here’s a curse to hurl at your worst enemy: May you be forced to operate data centers.

    Those buildings in city centers and rural stretches are packed with computer equipment that funnels every email and each pixel of a Netflix movie to their destinations. Data centers are the invisible but essential engines of the digital world, but it’s also an awful business to be in — unless you’re one of a handful of technology superpowers.

    The latest to wave the data-center white flag is Verizon Communications Inc., which said on Tuesday that it planned to sell 29 of its data centers for $3.6 billion. Verizon, like many other telecommunications companies, thought data centers would be a lucrative growth business to serve its business customers. Companies pay Verizon to host and sometimes operate the businesses’ computer equipment in Verizon data centers.

    The problem they all encountered is that running data centers — like many aspects of the tech business — is a game where the superpowers are extending their advantages over everyone else. It’s incredibly expensive and complex to operate big farms of computer servers and internet pipes, and companies including Amazon, Google’s parent company Alphabet Inc. and Microsoft Corp. have found ways large and small to ensure their data centers run as efficiently as possible.

    Verizon is expert at building and running its telecom network, but it can’t hope to match the data-center expertise of internet companies with billions of web users and bulging wallets. Verizon sold its data centers to Equinix Inc., which isn’t an internet giant but essentially a savvy real estate company that has become a key link to superpowers in tech and beyond. Companies like Google and Wall Street banks lease space in Equinix data centers — often next to superfast internet networks — to ensure the efficient operation of their web services or financial transactions.

    If you believe the technology diehards, in the future just a handful of companies will own data centers and everyone else will pay for the computing horsepower they need — the phenomenon known as cloud computing. Executives at Amazon’s cloud-computing business have compared this vision to the way people and businesses pay the power company for the electricity they use rather than running their own electrical grids.

    We’re far from that future, and the scenario may never happen. But in a landmark moment, Intel Corp. — whose computer chips power nearly all data-center gear — said last year that more of its data-center chip revenue in 2016 would come from purchases by cloud-computing companies than by the big corporations like General Electric or Ford that run their own data centers.

    Intel’s statement was a sign of how much the data-center industry is consolidating — to the benefit of the superpowers and to the detriment of smaller data-center owners like Verizon that can’t hope to keep pace.

    5:34p
    Google Will Be Powered Completely by Clean Energy Next Year

    BLOOMBERG — Google, the world’s biggest corporate buyer of clean energy, expects to reach a major milestone next year: running the company entirely with wind and solar power.

    The Alphabet Inc. unit has been pursuing the goal since at least 2012, “but I didn’t think it would happen so fast,” Gary Demasi, Google’s director of global infrastructure and energy, said in an interview. “We’ve seen prices come down precipitously, which has helped us ramp up.”

    Google expects to purchase enough clean power in 2017 to meet or exceed all of its consumption at its offices and 13 data centers; it used 5.7 terawatt-hours of energy in 2015. The company signed its first renewable-energy deal in 2010 and now has contracts for 2.6 gigawatts of capacity from 20 wind and solar farms worldwide. The projects required about $3.5 billion to build, and about $2 billion went for power plants in the U.S., Demasi said.

    Businesses are significant drivers of clean power, and 83 major companies worldwide have pledged to power all of their operations with renewable energy as part of the global fight against climate change, under the RE100 initiative. Companies signed deals to procure 1.1 gigawatts of green power the year before Google began its push, according to Bloomberg New Energy Finance. That swelled to a record 5.1 gigawatts in 2015.

    Corporate Buyers

    U.S.-based tech companies are the leading global corporate buyers of clean energy. Amazon.com Inc. is the second-biggest buyer with 1.2 gigawatts, according to New Energy Finance. Microsoft Corp. has signed 500 megawatts of power-purchase agreements.

    “Google led the way in actual activity, but also in broadcasting what they were doing to the public, effectively peer-pressuring other companies to sign corporate PPAs as well,” Nathan Serota, a New York-based analyst at BNEF, said in an interview.

    In 2015, when 44 percent of its electricity needs were met by renewables, Google signed about 1 gigawatt of clean-energy contracts. Many of those projects will be operational next year. The company will sign additional deals to support growth in regions where it has data centers and significant operations, and also plans to use technologies that provide around-the-clock clean energy.

    “We really consider this a first step,” Demasi said. “Climate change is real and a crisis of our time.”

    5:45p
    Microsoft Unveils Windows 10 for Qualcomm Chips, a Blow to Intel
    BLOOMBERG — Microsoft Corp. is creating a version of the Windows 10 operating system that will run on laptops powered by Qualcomm Inc. chips, a move that could erode Intel Corp.’s dominance in PCs and help the software maker gain a bigger foothold in mobile computing.

    Chips from Qualcomm, the largest maker of semiconductors used in phones, are designed to work on limited battery life and have integrated cellular connections. Bringing Windows 10 to notebooks and tablets that run on these chips will result in sleeker devices that can go days without needing to be plugged in and are always connected, the companies said. The first devices will be available as soon as next year, Microsoft said in a blog post, without specifying which manufacturers have committed to make them.

    If successful, the effort will pose the first major challenge to Intel technology in personal computers since the market’s birth in the early 1980s. It’ll also bring to Windows the Qualcomm chips that have been integral to the explosion of mobile computing, a growing business Microsoft has failed to crack. The tie-up isn’t the first time the world’s largest software maker has tried to bring Windows to computers running on chip technology other than Intel’s — the previous attempt garnered few sales and resulted in huge losses on unsold inventory.

    This time, Microsoft will give Windows 10 for Qualcomm chips the ability to run programs written for the traditional version of the platform, using software called an emulator. That means the computers and tablets will be able to run regular Microsoft Office programs, as well as applications like Adobe Photoshop. Generally an emulator slows down a computer’s performance, but Microsoft and San Diego-based Qualcomm said mobile hardware is now fast enough that users won’t notice a decline.

    “This is a commitment to bring mobility into Windows,” Cristiano Amon, head of Qualcomm’s chip business, said in an interview. Windows users will get thinner and lighter machines that are always connected to the Internet, he said.

    Microsoft, based in Redmond, Washington, will announce the effort early Thursday at the annual Windows Hardware Engineering Community conference in Shenzhen, China.

    Surface Flop

    In 2012, Microsoft released a version of Windows for computers running on chips designed for mobile devices, based on ARM Holdings Plc semiconductor technology. Yet apart from Microsoft itself, which fielded the Surface RT, only four device makers signed up to make tablets to run the software. Only one other company had products available at launch, and two never shipped a compatible device in any significant volume. Within nine months, Microsoft had to slash prices and write down the value of its unsold Surface tablets, and it stopped making the RT version in 2015.

    This time Microsoft will offer an enterprise edition so corporations can manage the tablets and laptops and weave them into information technology infrastructure the same way they do with Intel-based machines. Microsoft’s previous effort targeted consumers. The new focus on business customers and the addition of more compatible applications mean this time Microsoft is more likely to succeed, said Matt Barlow, a Windows vice president.

    “Things are materially different than they were even a few years ago,” he said.

    At the conference in Shenzhen, the software maker also said it will request Chinese permission to sell its HoloLens augmented reality goggles in that country. Microsoft expects to get the OK and begin selling the devices in the first half of 2017. The company began selling its Xbox console there two years ago, ending a lengthy ban on game consoles in the world’s most populous country.

    Microsoft is still leaning heavily on Intel for many parts of its Windows plans. The two companies are working on computers that take advantage of artificial intelligence and Microsoft’s Cortana voice-controlled search agent. As part of the collaboration, called Project Evo, the two companies will work on programs like adding far-field speech communication to devices so users can ask Cortana questions or request it play music from across the room — features popularized by Amazon.com Inc.’s hit Echo voice-activated home devices.

