Data Center Knowledge | News and analysis for the data center industry
 

Monday, February 17th, 2014

    1:08p
    Germany’s Merkel Calls for Separate European Internet

    German Chancellor Angela Merkel has proposed the creation of a separate European Internet to shield users from surveillance. (Photo by Christliches Medienmagazin pro via Flickr)

    There’s been speculation among analysts that the leaks and press coverage of NSA spying on major Internet companies could lead to a splintering of the global Internet, with other countries seeking to build “NSA-free zones” for their residents and businesses.

    Those worries took a more serious form this weekend when German Chancellor Angela Merkel proposed the creation of a separate European communications network that would keep data within the EU’s borders and not allow it to pass through U.S.-based servers.

    “We’ll talk with France about how we can maintain a high level of data protection,” Merkel said in her weekly podcast. “Above all, we’ll talk about European providers that offer security for our citizens, so that one shouldn’t have to send emails and other information across the Atlantic. Rather, one could build up a communication network inside Europe.”

    In November, reports emerged that Deutsche Telekom was working to develop a national network that would keep German data within the country’s borders. The proposal could be extended to other EU countries, but not the UK, where the national spy agency GCHQ has reportedly tapped undersea cables and service provider networks to access data.

    For Merkel, the surveillance issue has been personalized by reports that the U.S. government had been monitoring her phone for years.

    Playing Politics, or Breaking the Internet?

    It’s hard to know whether Merkel’s proposal is a genuine first step towards a separate EU network, or an effort to ratchet up political pressure on President Barack Obama to rein in the spying activities of U.S. intelligence agencies. This type of partitioning of the Internet would be a huge setback for global business. Although it seems far-fetched given the benefits of an open global Internet, the possibility of a Balkanized Internet must be taken more seriously when it is raised by heads of state rather than Internet pundits.

    What are the implications for the data center industry? More than ever, the physical location of infrastructure is taking on political overtones. If a European audience is essential to your business, having a data center in that market may become necessary. Some large players with global ambitions, including Twitter and LinkedIn, have sought to serve their international users primarily through a U.S.-only data center footprint.

    A fenced-off European network would also have business implications for providers. It could increase demand for established Euro-centric colocation and wholesale data center providers, and boost the fortunes of markets like Frankfurt, Amsterdam and Paris.

    There are many potential implications to Merkel’s proposal. For now, these are purely speculative. But the proposal bears watching, given its ramifications for Internet business and the infrastructure that powers it.

    1:30p
    DCIM Takes Mid-size Data Centers to the Next Level

    Tim Hazzard serves as president of Methode Electronics – Data Solutions Group, which provides Data Center Infrastructure Management (DCIM), active energy, cabinets, cabling and customized data center solutions throughout North America and Europe.

    TIM HAZZARD
    Methode Electronics

    For mid-size data centers, a custom DCIM solution might seem out of reach.

    “Sure, the ‘big boys’ have the resources to install and maintain smart cabinets, but surely DCIM is too complex for our company. Plus, do we really need all that data?”

    The truth is, mid-size data centers stand to gain considerable benefits from DCIM information and efficiencies. In fact, we’ve heard testimonials from mid-size customers reporting major savings after utilizing DCIM intelligence. One customer in the United Kingdom recently reported to our team that it saved more than a million pounds by turning up the data center’s temperature by five degrees – based on information from its DCIM solution.

    The current market provides mid-size data centers with affordable custom options that are turn-key, requiring little installation time and providing analytics that allow managers to make real-time operating decisions.

    There are a lot of myths that could convince managers that their mid-size data centers are too small for such a significant tool. I’d like to clear up a few of the most common DCIM misperceptions that keep companies from adopting DCIM in their data center operations.

    Myth #1: DCIM is too expensive.

    Data center managers of mid-size companies may believe that custom DCIM solutions are reserved for Fortune 500 companies with well-funded IT budgets. In reality, effective DCIM solutions can deliver smaller data centers the same benefits while remaining within financial constraints.

    DCIM is a broad discipline, and your involvement can be as extensive as you want it to be. Many suppliers provide different features and functions based on what your company needs; it just depends on what you want to accomplish.

    To keep costs manageable, review your data center’s needs and implement the pieces of DCIM that will yield the biggest returns for your unique setting. Be sure to select tools with enough depth to provide the desired information without going overboard. For example, choosing applications that are manually updated might save money up front, but those initial savings could be lost, in the end, to expensive errors and miskeys.

    Next, align and enforce standards that support the DCIM applications you choose. This will ensure quick integration, saving considerable time and money.

    You can also save by looking for suppliers that offer turn-key solutions. By going with a pre-integrated product, you can save on integration fees. Plus, some suppliers will design, install and service your DCIM solution, ensuring your needs are met every step of the way.

    Myth #2: DCIM is too complicated.

    DCIM implementation doesn’t mean you need a horde of internal resources. Mid-size data centers need a complete toolset and the assistance to install and maintain those tools with minimal effort.

    Yes, DCIM can appear complicated. One system monitors temperature, humidity, air flow, server capacity, security, asset tracking… that’s a lot of moving parts. But when you break the system down into discrete pieces and tackle each individually, it’s not as daunting. Approach it comprehensively from a facilities point of view, then work down to the core aspects of each feature of a DCIM solution. That’s how you break apart the complexity. After that, it’s simple.
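
    To make this concrete, here is a minimal sketch – illustrative Python, not any vendor’s actual DCIM API – of treating each monitored condition as its own discrete check that can be built and validated independently:

        from dataclasses import dataclass
        from typing import Callable

        @dataclass
        class Check:
            name: str
            read: Callable[[], float]  # sensor read-out (stubbed below)
            low: float                 # acceptable operating range
            high: float

            def status(self) -> str:
                value = self.read()
                ok = self.low <= value <= self.high
                return f"{self.name}: {value:.1f} ({'OK' if ok else 'ALERT'})"

        # Stubbed readings; a real deployment would poll SNMP, Modbus or a
        # vendor agent. Each check stands alone, so features can be added
        # one at a time.
        checks = [
            Check("inlet temp (C)", lambda: 24.0, 18.0, 27.0),
            Check("humidity (%RH)", lambda: 45.0, 20.0, 80.0),
            Check("airflow (CFM)", lambda: 410.0, 350.0, 600.0),
        ]

        for check in checks:
            print(check.status())

    Each condition becomes one small, testable unit – exactly the “break apart the complexity” approach described above.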

    Good service can help with the complexities, too. Find a provider with a robust tool that integrates hardware, software and service support to set you on the right track. Full-service providers will make sure the solution integrates with your existing systems, and service technicians will ensure that the hardware and software stay in sync. So you’re not in it alone.

    Myth #3: DCIM can’t scale.

    Data center cabinets aren’t “one size fits all.” Each company has unique infrastructure and information needs, which can be accommodated through a custom DCIM solution. No matter the size, each data center can have resources that fit its own business requirements.

    Scalable assets are important. Hardware companies often offer multiple pricing models based on the tools selected and can scale to smaller facilities. Find a solutions provider that will work to understand your data center’s unique needs and match the functionality level you’re looking to leverage.

    A fully functional and effective DCIM solution also frees staff to be more productive. Employees will spend less time analyzing data and can instead repair equipment and provide other internal services. In addition, DCIM equips data center managers with predictive analysis so they can resolve potential failures before they happen. For example, if a server runs hot due to insufficient airflow, proactive maintenance can alleviate the issue before it becomes critical.
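
    To illustrate the predictive idea – with made-up readings and thresholds, not any particular product’s algorithm – this sketch fits a simple trend line to recent inlet-temperature samples and estimates the time remaining before a critical limit is crossed:

        # Five inlet-temperature samples, one per minute (illustrative).
        readings = [22.1, 22.6, 23.0, 23.7, 24.1]
        CRITICAL_C = 27.0

        # Least-squares slope in degrees per minute.
        n = len(readings)
        x_mean = (n - 1) / 2
        y_mean = sum(readings) / n
        num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(readings))
        den = sum((x - x_mean) ** 2 for x in range(n))
        slope = num / den

        if slope > 0:
            minutes_left = (CRITICAL_C - readings[-1]) / slope
            print(f"Rising {slope:.2f} C/min; roughly {minutes_left:.0f} "
                  f"minutes until {CRITICAL_C} C – alert now.")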

    These benefits aren’t just for large data centers, though. Mid-size data centers can use custom DCIM solutions to improve the efficiency and effectiveness of their operations, even with less space, fewer resources and smaller staffs.

    Take Your Data Center to the Next Level

    DCIM certainly is within reach for mid-size data centers. But, when considering your options, evaluate the features and services that best fit your company’s needs. Look for a customized solution that includes the features that will enhance real-time monitoring and deliver actionable metrics.

    The idea is to equip mid-size data centers with just the right amount of information – enough to make fast, smart decisions and better allocate resources, but not so much that it can’t be easily digested and used effectively.

    With the right DCIM solution and provider, you can take your data center to the next level, no matter what size it is.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    1:31p
    DCK Video: ARM Partners To Bring SOC Solutions to the Data Center

    ARM, the company that licenses low-power chip architecture to multiple companies, especially in the mobile phone market, is making a move into the networking and server areas of the data center. At the recent Open Compute Summit, Data Center Knowledge spoke with Lakshmi Mandyam, Director, Server Systems & Ecosystems, ARM, about ARM’s history and its future. More details after the jump. The video runs 4:14.

    At the event, ARM announced the collaborative development and immediate availability of a platform standard for ARMv8-A based (64-bit) servers, known as the ARM ‘Server Base System Architecture’ (SBSA) specification. ARM worked with many vendors in this effort, including Canonical, Citrix, Linaro, Microsoft, Red Hat and SUSE, and original equipment manufacturers (OEMs) including Dell and HP along with a broad set of silicon partners.

    This specification provides a framework for the deployment of ARM architecture-based solutions in data center applications, and it will help accelerate software development and enable portability between ARM-based platforms. Data centers need standards-based software and hardware offerings to ensure ease of deployment and manageability.

    For related posts, see Applied Micro Presents Server on a Chip and AMD, Dell See Opportunity in ARM Server Market.

    For additional video, check out our DCK video archive and the Data Center Videos channel on YouTube.

    3:01p
    Debunking the Top 9 Hybrid Cloud Myths

    The cloud model is here to stay. In fact, many organizations are expanding their infrastructure directly into some type of cloud platform. Initially, there were concerns about operating in the cloud: it was a new technology, adoption was low, and many businesses were worried about resources. The modern cloud has come a long way, and organizations clearly see the direct benefits of working with a cloud infrastructure.

    Here’s the reality: more solutions become available every day, but each brings a learning curve along with numerous management challenges and considerations.

    So when it comes to a hybrid cloud model – what are some of the fears that enterprises still have? In this white paper from Equinix, we take a look at the hybrid cloud infrastructure, the types of benefits it can bring to an organization – and where this type of environment can truly make an impact within your organization.

    But it’s not just about looking at benefits and best practices. It’s also about understanding the real implications of the hybrid cloud model versus the myths and fears that can prevent proper adoption. This white paper explains the top 9 hybrid cloud myths that could be slowing down your organization’s IT goals.


    These myths include:

    • Myth: “You can use the public cloud for everything.”
    • Myth: “You can’t be secure in the cloud.”
    • Myth: “The cloud is not reliable.”
    • Myth: “The public cloud will always be cheaper.”
    • Myth: “You can’t get enough bandwidth in the cloud.”
    • Myth: “You can’t move large amounts of data in the cloud.”
    • Myth: “You lose control of data in the cloud.”
    • Myth: “We can wait—nobody got fired for not moving to the cloud.”
    • Myth: “You can’t get regulatory compliance in the cloud.”

    Remember, there are many myths about the hybrid cloud, but they are just that: myths. Download this white paper today to learn the truth and become more informed about the full range of cloud options that can impact your enterprise. The flexibility and economy of scale in the cloud continue to improve – and the evolution of the cloud model has actually removed some of the biggest deployment and management challenges.

    3:30p
    Intel Launches Hadoop-Powered Big Data Platform

    Intel has launched a Big Data platform atop its Hadoop distribution.

    Expanding its investment in a growing big data portfolio, Intel (INTC) launched the Intel Data Platform, a software suite based on open source technologies designed to make it easier and faster for companies to move from big data to big discoveries. Built on the Intel distribution of Apache Hadoop, the new platform features several new data processing capabilities, including streaming data processing, interactive analytics, and graph processing.
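
    For readers new to Hadoop, the sketch below simulates the MapReduce model that any Hadoop distribution – Intel’s included – builds on. This is generic, self-contained Python rather than Intel’s platform API; real jobs distribute the map, shuffle and reduce phases across a cluster:

        from collections import defaultdict

        def map_phase(line):
            for word in line.split():
                yield word, 1                    # emit (key, value) pairs

        def reduce_phase(key, values):
            return key, sum(values)              # aggregate per key

        lines = ["big data to big discoveries", "big data platforms"]

        # Shuffle: group mapped values by key, as Hadoop does between phases.
        groups = defaultdict(list)
        for line in lines:
            for key, value in map_phase(line):
                groups[key].append(value)

        for key in sorted(groups):
            print(reduce_phase(key, groups[key]))

    The capabilities Intel highlights – streaming, interactive analytics and graph processing – layer alternative execution models on top of the same cluster.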

    “As big data shifts from hype to reality, Intel is helping to break down the barriers to adoption by easing complexity and creating more value,” said Boyd Davis, vice president and general manager of Intel’s Datacenter Software Division. “Much like an operating system for big data processing, the Intel Data Platform supports a wide variety of applications while providing improved security, reliability and peace of mind to customers using open source software.”

    Intel entered the Hadoop software market about one year ago, citing its potential as a transformational tool for big data. It launched with immediate support from many vendors, such as Cisco, Red Hat, Cray and Supermicro.

    Two Deployment Options

    The new Intel Data Platform will be available in Enterprise and Premium Editions. The Enterprise Edition will offer full platform capabilities as a free software product to customers who can support their own deployment. The Premium Edition will be available for purchase on an annual subscription basis and will provide additional technical features, including enhanced automation, proactive security fixes and alerts, ongoing feature enhancements, and live telephone technical support.

    A new Intel Data Platform: Analytics Toolkit (AT) provides a graph analytics and predictive modeling environment to help businesses uncover valuable insights from hidden relationships within data. The toolkit provides a foundation of common algorithms, such as graphs and network-based clustering, that IT teams can build on and customize with domain-specific code. The easy-to-deploy algorithms are broad enough to be applied to multiple industries, including financial services, healthcare and retail. The toolkit will also provide an enhanced development framework for unifying graph analytics and classical machine learning to ease the programming effort. The Intel Data Platform AT is available in beta now and expected to be commercially available by the end of the second quarter.
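
    The toolkit’s own interfaces aren’t detailed here, so as a generic illustration of graph and network-based clustering, the following sketch uses the open source networkx library to group entities by the density of their connections:

        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        # Toy interaction graph: edges link entities (e.g., accounts,
        # patients, products) that are observed together.
        G = nx.Graph()
        G.add_edges_from([
            ("a", "b"), ("b", "c"), ("a", "c"),  # one tight cluster
            ("x", "y"), ("y", "z"), ("x", "z"),  # another
            ("c", "x"),                          # weak bridge between them
        ])

        for i, community in enumerate(greedy_modularity_communities(G)):
            print(f"cluster {i}: {sorted(community)}")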

    Real World Examples

    Intel has already worked with companies of all sizes on implementing its new platform. Using Intel-based hardware and software solutions, China Mobile Guangdong was able to improve billing processes and customer service by enabling online bill payment as well as the retrieval of up to six months’ worth of call data records in near real time. China Mobile Guangdong’s detailed billing statement inquiry system can now retrieve 300,000 records per second and insert 800,000 records per second, the equivalent of 30 terabytes of subscriber billing data per month.

    Intel also worked with Living Naturally, a retail technology provider, to develop business analytics algorithms based on the Intel Distribution for Apache Hadoop to help retailers better manage supply chain and product promotions. The algorithms analyze a mix of internal and external data, such as social media, search engines and weather sites, to provide retailers with better insight and help determine when to reorder products in optimal quantities to minimize surpluses, shortages and shelf life expirations.
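
    Living Naturally’s actual models aren’t public, but the core calculation such supply-chain analytics refine is the textbook reorder point; external signals like weather or social buzz would adjust the demand forecast fed into it. A minimal sketch with illustrative numbers:

        import math

        daily_demand = 40       # units/day, from sales history
        demand_stddev = 12      # day-to-day variability
        lead_time_days = 5      # supplier lead time
        z = 1.65                # service-level factor (~95% in-stock)

        # Safety stock covers demand variability over the lead time.
        safety_stock = z * demand_stddev * math.sqrt(lead_time_days)
        reorder_point = daily_demand * lead_time_days + safety_stock

        print(f"reorder when stock falls to {reorder_point:.0f} units "
              f"(includes {safety_stock:.0f} units of safety stock)")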

    4:00p
    Riverbed Launches Steelhead Appliance for the Data Center

    Riverbed launches a new Steelhead DX Edition 8000 appliance for wide area network acceleration, Citrix introduces CloudBridge CSX Solution to consolidate servers at branch locations, and Dell introduces thin client infrastructure support for Windows Server 2012 R2 and vWorkspace.

    Riverbed introduces Steelhead DX 8000 appliance. Riverbed (RVBD) announced that it has expanded its Steelhead wide area network (WAN) optimization product family with a new DX Edition 8000 Series to address the unique needs of datacenter-to-datacenter data replication workloads. The appliance delivers up to 60 times WAN performance acceleration and up to 99 percent bandwidth reduction. It supports up to 2 Gbps of optimized WAN capacity and up to 10 Gbps of optimized LAN capacity while optimizing up to 10,000 TCP and UDP application flows. In addition, the solution provides unique optimizations for data replication applications such as NetApp SnapMirror and EMC Symmetrix Remote Data Facility (SRDF), providing advanced performance, enhanced visibility and fine-grained control of data replication processes end-to-end across the WAN. “The Steelhead product family is already the most complete WAN optimization solution in the market, and with this announcement we added another important and business-critical use case,” said Paul O’Farrell, senior vice president and general manager, Steelhead Products Group, Riverbed. “The new Steelhead DX Edition eliminates distance between datacenters as a barrier to achieving true location-independent computing, which in this case means that data can be moved, stored and backed-up regardless of where facilities are located, but with performance similar to a local area network. Only Riverbed offers a complete solution for both large-scale branch-to-datacenter and datacenter-to-datacenter environments.”
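
    Riverbed’s implementation is proprietary, but the core idea behind bandwidth reduction of this magnitude is data deduplication: split the stream into chunks and send a short fingerprint instead of any chunk the far end has already stored. A simplified sketch with illustrative parameters:

        import hashlib

        CHUNK = 4096
        seen = set()   # fingerprints the remote side is known to hold

        def bytes_on_wire(data):
            sent = 0
            for i in range(0, len(data), CHUNK):
                chunk = data[i:i + CHUNK]
                digest = hashlib.sha256(chunk).digest()
                if digest in seen:
                    sent += len(digest)           # 32-byte reference only
                else:
                    seen.add(digest)
                    sent += len(digest) + len(chunk)
            return sent

        payload = b"A" * CHUNK * 100  # repetitive replication traffic
        print("first pass:", bytes_on_wire(payload), "bytes on the wire")
        print("repeat pass:", bytes_on_wire(payload), "bytes on the wire")

    On the repetitive byte streams typical of replication traffic, nearly every chunk repeats, which is how reductions above 90 percent become possible.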

    Citrix launches CloudBridge CSX Solution. Citrix (CTXS) announced the CloudBridge CSX solution, powered by the new CloudBridge 2000WS appliance. The new CloudBridge appliance helps consolidate the number of physical servers needed at the branch location by uniting full wide area network (WAN) optimization capabilities with a new, pre-configured Windows Server. The CSX solution combines the new CloudBridge appliance with applications provided by partners Qumu, Talon and Cortado to offer the most critical services needed at the branch – video distribution and streaming, multi-site file collaboration, and localized print services. “CIOs today are thinking about how to address the transition to the cloud, particularly at branch locations,” said Chalan Aras, Vice President and General Manager, CloudBridge Product Group at Citrix. “As enterprise IT tries to become more efficient by moving some non-critical apps at the branch to the cloud there’s a need to address application delivery. Traditional WAN Op as a stand-alone technology must evolve to become a holistic cloud services delivery model. Citrix is leading the path toward this new cloud services model with the capabilities included in today’s announcement of the CloudBridge CSX Solution.”

    Dell launches Wyse Thin Client for Windows Server and vWorkspace. Dell announced the expansion of its Cloud Client-Computing solutions portfolio to incorporate support for Windows Server 2012 R2 and vWorkspace, combining existing infrastructure offerings for Windows Server 2012 R2 and for vWorkspace into a single, end-to-end Hyper-V based desktop virtualization solution. The solution supports Microsoft Windows Server 2012 R2, Windows 8.1, Intel Ivy Bridge processors, and Microsoft RemoteFX (RDP), with support for a broad range of RDP 8.0 features. It also offers virtualized shared graphics using AMD FirePro GPUs, and features unified communications support with the Microsoft Lync 2013 VDI plug-in for Microsoft Windows 7 and Windows 8 end points, which increases user productivity without affecting server user density. New VDI features include support for iOS, Mac OS and Android end points. “We are excited to work with Dell on this offering for customers with Microsoft-based IT environments,” said Klaas Langhout, principal director of project management, Windows Server, Microsoft. “Dell Wyse Datacenter for Microsoft VDI and vWorkspace will allow IT to use their existing skills on Windows Server 2012 R2 with Hyper-V to reduce operating and capital expenditures with RDP deployments.”

    4:30p
    LSI, EMC, Mellanox and Supermicro Join for Hyper-Converged VDI Appliance

    LSI collaborates with EMC, Supermicro and Mellanox for a hyper-converged appliance supporting VMware virtual desktops, F5 and VMware collaborate on a number of efforts to provide secure access control for virtual desktop deployments, and Silver Peak enables VMware vCloud Hybrid Service with accelerated data transfers to the cloud.

    LSI, EMC, Mellanox and Supermicro deliver converged appliance for virtual desktops. LSI announced it has collaborated with EMC, Mellanox and Supermicro to offer a high-performance virtual desktop infrastructure (VDI) appliance for VMware Horizon View deployments. The appliance combines storage, compute and networking hardware in a single box to deliver a scalable hyper-converged system to meet the growing customer demand for desktop virtualization. It is designed to provide customers starting out with a few servers and a few hundred desktops with the ability to simply and efficiently scale out VDI environments to meet changing business requirements. One hyper-converged appliance combines LSI Nytro 3.2TB PCIe flash accelerator cards, a Mellanox 40GbE interconnect solution, and Supermicro servers, powered by EMC ScaleIO software. The appliance is designed to accelerate VDI deployments, increase desktop performance and maximize VDI density per node at a fraction of the cost and complexity associated with traditional SAN-based storage infrastructures. “Maximizing the number of virtual desktops per server while providing real-time response to mobile users is essential to build an efficient VDI infrastructure that meets enterprise business needs,” said Kevin Deierling, vice president of marketing at Mellanox Technologies. “Using Mellanox’s end-to-end 40GbE interconnect solution helps to eliminate the I/O bottlenecks that can hamper traditional VDI deployments, allowing organizations to achieve their business processes at a lower total cost of ownership.”

    F5 and VMware collaborate. F5 Networks and VMware (VMW) announced a number of collaborative efforts designed to provide secure access control for customers’ virtual desktop deployments. New versions of the F5 BIG-IP Access Policy Manager (APM) solution, tailor-made to deliver secure access and optimized performance for VMware Horizon View, will be brought to market. F5 also introduced a dedicated iApp and reference architecture to significantly speed deployments and provide prescriptive guidance on how these new solutions support VMware technologies. “F5 shares our commitment of helping customers prepare for and embrace the mobile-cloud era,” said Sanjay Poonen, executive vice president and general manager, End-User Computing, VMware. “Adoption of VMware’s virtual desktop solutions continues to grow at a rapid pace, and with F5’s new offerings, customers have options of how to best combine technologies to enable mobile workforces to access their desktops and apps securely anytime and anywhere. I am excited about the strategic alignment and collaboration between VMware and F5, which will bring even richer solutions to our customers.”

    Silver Peak powers vCloud Hybrid Service over long distances. Silver Peak announced it has collaborated with VMware to offer its virtual WAN optimization software to support deployments on VMware vCloud Hybrid Service. Silver Peak accelerates data transfers to the cloud and provides an ongoing LAN-like experience for cloud-hosted applications, enabling VMware vCloud Hybrid Service users to more quickly and easily transition to a public cloud or hybrid cloud environment. “The performance of any cloud application is directly related to the performance of the network,” said Marc Trimuschat, VP of alliances, Silver Peak. “Our unique competency in virtualized data acceleration has allowed us to quickly support vCloud Hybrid Service environments, and we are excited to be working with VMware to make the WAN cloud-ready for vCloud Hybrid Service customers. Whether you plan to use vCloud Hybrid Service for hosting business applications or simply backing up data, our software installs quickly and easily to make your cloud resources feel like they are in the same building.”

    5:30p
    Google May Expand Dublin Data Center Footprint

    Rows of networking equipment inside a Google data center. The company has filed plans to expand its data center campus outside Dublin, Ireland. (Photo for Google by Connie Zhou)

    Google is planning to expand its data center footprint in Dublin, Ireland, with a new $200 million facility, according to reports in the Irish press. The project continues the building boom on the western outskirts of Dublin, reinforcing the city’s status as a key hub for the world’s cloud computing infrastructure.

    The giant search company may invest up to €150m ($205 million U.S.) in a massive new data center, according to The Independent, which said Google has filed planning documents with local officials. The project will create up to 300 construction jobs over a year or more and up to 60 new full-time jobs once it is operational.

    “The data centre that we built in Dublin in 2012 has worked well for us and created around 30 full-time jobs,” a Google spokesperson told the paper. “We’re now considering whether to expand our operations, and so we’re submitting a planning application. This will ensure that we’ve taken into account local opinion and rules, if we do decide to build in the future.”

    In December, Microsoft announced plans for a $230 million expansion of its existing cloud hub in Dublin, where it has now invested more than $806 million (€594 million). Microsoft is several miles down the road from Google’s Dublin campus in Profile Park, where it completed a $100 million expansion in 2012.

    Dublin is unique amongst major European data center hubs in that its appeal is based on climate, rather than connectivity. While the thriving data center communities in London, Amsterdam and Frankfurt are built atop network intersections in key business cities, Dublin has become one of the world’s favored locations for free cooling – the use of fresh air to cool servers. It is a prime example of how free cooling is giving rise to clusters of energy-efficient facilities in cool climates.

