Data Center Knowledge | News and analysis for the data center industry
 

Thursday, January 8th, 2015

    12:30p
    GraphLab Becomes Dato, Raises $18M for Machine Learning

    Internet giants like Google and Amazon have benefited greatly from using machine learning algorithms to enhance their online services. Other, smaller businesses want similar tools but can’t afford the scientific and engineering muscle required to do it right. Both new and established high tech players are now racing to make machine learning technology accessible to more companies, and there doesn’t seem to be a shortage of investor cash for smart startups in the field.

    One such startup is Seattle-based Dato, which until today was called GraphLab. The company announced the name change this morning, along with an $18.5 million Series B funding round. It is a huge round for Dato, which had raised only $6.75 million previously.

    Its customers include Adobe, PayPal, and Cisco. Online music radio service Pandora uses Dato’s software to power its recommendation engine. Real estate database Zillow uses it to generate price estimates.

    Roots in Academia

    Dato leadership’s goal is to democratize machine learning, its founder and CEO Carlos Guestrin said. About seven years ago, when he was an associate professor at Carnegie Mellon University, Guestrin and a group of students started an open source project called GraphLab. The goal was to develop large-scale machine learning algorithms for analyzing graphs. They tried doing it on Hadoop, but it proved too slow, so they built a new system that allowed them to write those algorithms faster and more easily.

    GraphLab the company was formed only in 2013, after Guestrin moved from Pittsburgh, Pennsylvania, to Seattle to join the faculty of the University of Washington as Amazon Professor of Machine Learning. The word “Amazon” is in his title because Jeff Bezos donated money to the university to attract academic talent like Guestrin, and his appointment was a result of that grant.

    Machine Learning at Scale

    The company launched its flagship product, called GraphLab Create, in October 2014. As Guestrin describes it, the product is meant to enable software engineers and data scientists to build creative, intelligent applications that can transform their businesses.

    While it retains some elements of the original open source project, GraphLab Create is its own animal, built from scratch, he said. It is also open source. Unlike the original academic project at Carnegie Mellon, the company’s technology is not limited to graphs. It can work with a variety of data types, including text and images, which is why the name was changed.

    “Dato” is the Spanish and Portuguese word for “datum,” a single piece of information – the singular of “data.” Guestrin is Brazilian, and Spanish and Portuguese are both his first languages. He likes the word because it’s short and simple but has a meaning – a rarity in the startup world. “I find it very beautiful,” he said. “GraphLab didn’t capture where we were today.”

    The company gave a lot of thought to infrastructure and scale when engineering GraphLab Create. It supports every stage of the application lifecycle, from development to production. A developer can prototype, build, and debug an application using its machine learning capabilities on a desktop, then deploy it on a single Linux server or, if it needs to run at scale, on a Hadoop YARN cluster in a public cloud.

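    As a rough illustration of that workflow – a minimal sketch in the style of GraphLab Create’s Python API of the period, where the file name and column names are hypothetical and details varied between releases – a recommender could be prototyped locally like this:

        import graphlab as gl

        # Load tabular data into an SFrame, GraphLab Create's scalable data frame.
        ratings = gl.SFrame.read_csv('ratings.csv')

        # Train a recommender model of the kind services like Pandora rely on.
        model = gl.recommender.create(ratings, user_id='user_id',
                                      item_id='item_id', target='rating')

        # Score locally during prototyping; the same code can later be pointed
        # at a Linux server or a Hadoop YARN cluster for production-scale runs.
        print(model.recommend(users=['user_42'], k=5))
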
    Dato founder and CEO Carlos Guestrin is Brazilian. The word “dato” is Portuguese and Spanish for “datum.” (Photo: Dato)

    Money Pouring Into Machine Learning

    Guestrin says Dato’s only competition is customers trying to build something similar themselves. And that may be the case for the specific type of technology the company has built, but the machine learning market in general is buzzing with activity.

    Venture capital has been pouring into the space. Vulcan Capital, which led Dato’s recent round, and Madrona Venture Group, which participated in the round, also took part in a $21 million Series B for a Seattle-based machine learning startup called Context Relevant last May. Another example was a CRM startup called Clari, which raised $20 million in June to invest in machine learning capabilities for its product.

    In December, a company called Scaled Inference announced a $13.6 million Series A round led by Khosla Ventures. Vinod Khosla, the famous Silicon Valley investor, likes to say in public appearances that the world would be better off if machines took over some of the functions currently performed by humans.

    Established high tech giants have also been investing a lot of cash in machine learning. After years of using machine learning technology to fuel its online services, Google last year launched a neural network that recommends ways to optimize its global data center fleet for efficiency. Facebook has had a dedicated artificial intelligence lab since 2013.

    Microsoft is working on real-time language translation during Skype conversations (the company launched a preview version in December); CERN and Yandex used a machine learning system to recognize certain particle collisions in the Large Hadron Collider in a simulation of the moments that followed the Big Bang; and a system created by IBM Research recently identified malignant skin cancer in a set of 3,000 images with 95 percent accuracy.

    Going After the Generalist AI Niche

    There are myriad applications for machine learning. Some companies are building machine learning capabilities for specific purposes, while others, like Dato, want to enable developers to make machine learning part of their applications, whatever each individual application’s function may be. The goal is along the same lines as that of IBM Watson, perhaps the most widely publicized machine learning technology.

    Ultimately, Dato wants to enable developers to be creative with their data on any type of machine or cloud, Guestrin said. “You can be super creative, explore, and build a cool intelligent application on a laptop and deploy it as a service.”

    4:30p
    5 CIO Trends Driving DCIM Adoption

    Mark Gaydos is the Chief Marketing Officer for Nlyte Software, the leading developer of Data Center Infrastructure Management (DCIM) software focused on the management and optimization of data centers.

    The data center today looks very different than it did just 10 years ago. Massive shifts in how companies conduct business have increased the importance of applications, IT, and the data center, and as a result the pace of change inside data centers has accelerated dramatically. New trends are driving organizations to maintain more data centers across varying geographical locations.

    In the past, companies managed data center infrastructure and the processes surrounding those resources using spreadsheets. This approach, albeit cheap and easy, was a stopgap in place of a real enterprise management platform. Data center infrastructure management (DCIM) materialized to fill this need.

    What is Data Center Infrastructure Management (DCIM)?

    A DCIM solution allows an organization to plan, manage, and optimize the physical elements and processes that exist in a data center. But the key to DCIM is its capacity to help a team plan and operate disciplined processes around the management of its assets and environmental resources, while tying this information seamlessly into its ITSM systems.

    Although DCIM solutions have been around for more than a decade, five worldwide trends have rapidly increased their adoption in recent years.

    1. Software is Eating the World. Apps Define the Enterprise.

    In 2011, Marc Andreessen wrote in The Wall Street Journal that “software is eating the world.” He discussed how new software services are threatening well-established businesses across a wide variety of industries. You don’t have to look far to see how Amazon completely re-invented (and devastated) the book buying industry or how Pixar revolutionized how movies are made. Most companies around the globe have taken notice and are trying to adapt by developing their own software applications.

    Thus, organizations are investing in new applications and the underlying infrastructure to support those applications. CIOs have made investments in virtualization, CMDBs, service management software, and a variety of other IT service management (ITSM) technologies to automate what sits on top of their data center infrastructure. Yet while they have automated and implemented new technologies in the layer between the data center and the applications (the application infrastructure), their data center resources are often under-managed with antiquated approaches.

    As a result, many CIOs are finding that their ability to deliver service levels to the business is hampered by the weak link in the chain – the management of their actual physical data center infrastructure. They are now implementing DCIM in order to manage this infrastructure more efficiently.

    2. The Data Center Boat May Get Swamped. Mobile, Big Data and Internet of Things.

    Another worldwide trend is the adoption of mobile and handheld devices, virtual desktops, big data, and Internet of Things initiatives. The growing use of these technologies is generating massive increases in network traffic, data processing, data storage, and data analysis. These new demands are causing such an increase in all data center resources that they threaten to swamp the “data center boat” in a sea of information.

    DCIM solutions not only enable more effective capacity planning and cost reduction through technology refreshes, but also help organizations extract efficiencies from limited systems and personnel so they can stay ahead of these new applications.

    3. What Are We Buying With All this Money? Increasing IT Budgets Forcing Financial Discipline.

    With the increased emphasis on new applications and technologies, overall spend on IT is growing for many organizations. CFOs are now compelled by their stakeholders and shareholders to understand how past investments are being used before they approve future purchases.

    A DCIM solution not only enables data center professionals to plan and manage data center infrastructure more efficiently; the use of these systems also empowers and drives more discipline around all aspects of managing data center resources. This enables the data center manager to know which systems support which business initiatives, where those assets are located, and what resources those assets are using, and to answer the myriad questions finance may ask in order to verify efficient use of IT spend.

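    To make that concrete, here is a deliberately simplified Python sketch of the kind of asset record a DCIM system maintains and the kind of question finance might ask of it; the fields, names, and figures are all hypothetical, not any vendor’s actual schema:

        from dataclasses import dataclass

        @dataclass
        class Asset:
            name: str
            location: str        # site / rack
            initiative: str      # business initiative the asset supports
            power_draw_kw: float

        inventory = [
            Asset('db-01', 'AMS1 / rack 12', 'payments', 3.2),
            Asset('web-07', 'FRA2 / rack 03', 'e-commerce', 1.1),
            Asset('db-02', 'AMS1 / rack 14', 'payments', 2.9),
        ]

        # "Which systems support the payments initiative, where do they live,
        # and how much power are they drawing?"
        for a in inventory:
            if a.initiative == 'payments':
                print(a.name, a.location, f'{a.power_draw_kw} kW')
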
    4. We Don’t Want to Be on the Front Page of The Wall Street Journal! Security Breaches Drive Data Center Compliance.

    You don’t have to wait long to hear about the latest major security breach in the news. It’s every CIO’s greatest fear that their company will end up on the front page of The Wall Street Journal because of a data breach.

    The strength of any data security strategy is only as good as the weakest link in the chain. Thus, organizations are starting to examine their entire technology stack, from the data center floor all the way up through their applications and Internet connectivity.

    An enterprise-level DCIM solution will not only allow an organization to implement discipline and rigor around how assets are added, refreshed, and decommissioned, but will also provide a comprehensive audit trail of what is happening to these resources.

    5. We’re Buying Yet Another Company? Acquisition-mania Demands Improved Data Center Processes.

    Acquisitions are now a way of life amongst large organizations worldwide. This typically leads to CIOs being asked to integrate and manage new IT assets, some of which may be in entirely new geographical locations around the globe.

    With DCIM in place, an organization can effectively absorb new assets while also directing changes to existing assets, as required. As changes are requested and approved in IT change management systems, these changes can then be pushed into a DCIM system, which can then direct personnel worldwide to perform a multitude of specific sub-tasks.

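    A hypothetical sketch of that hand-off, with a made-up ticket format and task list rather than any real ITSM or DCIM product’s API:

        # Decompose an approved change ticket from an ITSM system into the
        # concrete sub-tasks a DCIM system might dispatch to personnel.
        def subtasks_for(change):
            if change['type'] == 'install_server':
                rack = change['rack']
                return [
                    f'reserve 2U of space in {rack}',
                    f'verify the power budget for {rack}',
                    f'assign network ports in {rack}',
                    'update the asset inventory record',
                ]
            return []

        ticket = {'id': 'CHG-1042', 'type': 'install_server', 'rack': 'AMS1 / rack 12'}
        for task in subtasks_for(ticket):
            print(ticket['id'], '->', task)
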
    DCIM, Keeping Up With the Times

    In the past, organizations could keep up with change by using server automation and virtualization technologies while the underlying infrastructure of the data center remained relatively constant. DCIM adoption is on the rise around the world as organizations realize that, if they are to keep up with the pace of change, adhere to corporate policies, stay agile, and keep the finance department and auditors content, they also need to automate the processes around the management of their data center infrastructure.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    5:30p
    Understanding the Benefits of Dynamic Cooling Optimization

    Cooling optimization in a data center provides a significant opportunity for reducing operating costs, since cooling systems consume 34 percent of the power used by a data center, according to Lawrence Berkeley National Laboratory.

    Due to the need to maintain five-nines (99.999 percent) reliability, data centers are too often overdesigned in terms of cooling capacity, and operating budgets pay the price for this practice. When you look at all of the other components within the modern data center, it’s no wonder so many administrators are looking for better ways to reduce costs and control cooling efficiency.

    The best practice for cooling optimization in a data center starts with a thorough computational fluid dynamics (CFD) analysis of the cooling design. Today’s common practice is to apply CFD tools at the design or major retrofit stage in order to specify and optimize the cooling system components and layout, based on a proposed IT equipment environment. Yet this approach falls short because data centers are dynamic environments where the equipment population and layout change over time, and where the heat signature changes constantly in response to computing traffic. A typical CFD analysis therefore represents only a snapshot in time and rarely reflects the 24/7/365 dynamic operation of the data center.

    Now data center managers can use a more effective, next-generation analysis and planning tool, ActiveCFD software, to dynamically manage their cooling infrastructure and optimize performance and reliability while reducing operating costs.

    In this white paper from CES Group, you can see the benefits of using the ActiveCFD program in the day-to-day monitoring and optimization of a data center’s cooling infrastructure.

    The ultimate idea is to create true cooling efficiency across all systems. The only way to do this is with real-time monitoring and dynamic optimization. A real-time CFD model of the data center is used to drive each air conditioner, chiller, and other component of the cooling infrastructure through its entire operating range. The goal of these simulations is to gauge the sensitivity of the inlet temperature of every server and piece of rack equipment to changes in the performance of each critical component of the cooling infrastructure. The cooling parameters analyzed by the System Performance Analyzer (SPA), over which a toy sensitivity sweep is sketched after this list, are:

    • Supply temperature of each cooling unit (i.e. temperature setting)
    • Volumetric flow rate of each air mover (i.e. fan speed)
    • On/off setting of each air conditioner
    • Chilled water temperature
    • Chilled water flow rate
    • Ambient temperature (for systems equipped with a water-side or air-side economizer)
    • Ambient relative humidity (for systems equipped with a water-side or air-side economizer)
    • IT rack heat load

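    To illustrate the sensitivity idea in the abstract – this is not CES Group’s actual SPA, and the linear thermal model below is a purely hypothetical stand-in for a real-time CFD solve – a finite-difference sweep over one of the parameters above might look like this:

        def inlet_temps(supply_temp_c, fan_flow_m3s):
            """Hypothetical surrogate for a CFD solve: per-rack inlet temps (deg C)."""
            base = [22.0, 24.5, 23.1]  # inlet temperatures at nominal settings
            return [t + 0.8 * (supply_temp_c - 18.0) - 1.5 * (fan_flow_m3s - 4.0)
                    for t in base]

        def supply_temp_sensitivity(delta_c=0.5):
            """Degrees of inlet-temp change per degree of supply-temp change."""
            nominal = inlet_temps(18.0, 4.0)
            perturbed = inlet_temps(18.0 + delta_c, 4.0)
            return [(p - n) / delta_c for n, p in zip(nominal, perturbed)]

        print(supply_temp_sensitivity())  # ≈ [0.8, 0.8, 0.8] for this linear toy model
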
    Download this white paper today to learn how the Dynamic Cooling Optimization module with ActiveCFD can help a data center manager proactively monitor and control the facility’s cooling infrastructure by dynamically matching the cooling output to the prevailing heat load. As a result, operating expenditure can be reduced because wasted or unnecessary cooling capacity is eliminated.

    6:34p
    Intel’s James Named Vice Chair of U.S. President’s Telco Security Committee

    Intel President Renée James has been appointed as vice chair of the industry committee that advises the U.S. president on reliability and security of the nation’s telecommunications services, the company announced this week.

    President Barack Obama appointed James as a member of the National Security Telecommunications Advisory Committee in 2013. She now holds the second-highest role on the committee, after chair Mark McLaughlin, president and CEO of Palo Alto Networks, a Silicon Valley network security company.

    James has been at Intel for 25 years. Before becoming the semiconductor giant’s president, she led its software and services group, and prior to that, she served as chief operating officer of the company’s data center services business.

    Early in her career, James was chief of staff to former Intel CEO Andy Grove, a legendary Silicon Valley figure credited as one of the leading minds that made Intel into the giant it is today.

    Former president Ronald Reagan created NSTAC in 1982. It is essentially a panel of telecommunications industry experts that evaluates the effectiveness of government programs for protecting communications infrastructure critical to national security and examines the technological feasibility of new programs.

    One recent example of NSTAC’s work is a 2014 report on the potential impact of the Internet of Things, or Industrial Internet, on the government’s national security and emergency preparedness telecommunications systems. Also in 2014, the committee began work on a set of recommendations to improve the government’s ability to respond to what it refers to as “low-probability, high-impact” cyber-attacks.

    All panel members are appointed by the president. Other members on the current panel include Scott Charney, a corporate vice president at Microsoft, Glen Post, president and CEO of CenturyLink, and Akamai Executive Vice Chairman Paul Sagan, in addition to top execs from Verizon, AT&T, Ciena, Ericsson, and Avaya, among others.

    7:48p
    Microsoft Azure Launches Monster Cloud Instances

    Microsoft Azure has launched its most powerful cloud instances to date. The new G-series instances go up to 32 cores, 448 GiB of RAM, and 6,596 GB of local SSD storage. (A GB is 1000³ bytes, while a GiB is 1024³ bytes.)

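    A quick back-of-the-envelope conversion shows why that unit distinction matters when comparing spec sheets:

        # GB is a decimal unit and GiB a binary one, so the same quantity
        # reads differently depending on which is quoted.
        GB, GiB = 1000**3, 1024**3

        ram_gib = 448    # G-series maximum RAM, quoted in GiB
        ssd_gb = 6596    # G-series maximum local SSD, quoted in GB

        print(ram_gib * GiB / GB)   # ~481.0: the RAM figure expressed in decimal GB
        print(ssd_gb * GB / GiB)    # ~6143.1: the SSD figure expressed in binary GiB
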
    The company claims they are the mightiest instances in the public cloud today. Amazon Web Services, Azure’s biggest public cloud competitor and market leader, provides 244 GiB of memory for its highest-memory instance and 6,400 GB of SSD storage for its highest-SSD-capacity instance. AWS also offers cloud VMs with 32 CPU cores.

    Google Compute Engine, the other big contender, does not have 32-core instances. The highest-memory instance available on GCE is 104 GB. Google does not include instance-tied storage capacity in the individual instance parameters.

    The Azure announcement comes before the expected roll-out of new high-octane cloud instances by AWS. Intel customized its Xeon E5 chips specifically for Amazon to support the upcoming C4 instances, which will go up to 36 virtual CPU cores and 60 GB of RAM.

    Intel designs custom CPUs for lots of big high tech firms. The ones the company has mentioned publicly, besides AWS, are Facebook, eBay, and Oracle. But there were more than 30 custom CPU orders from different clients in the chipmaker’s pipeline in 2014, Diane Bryant, general manager of Intel’s data center group, said last year.

    The latest Azure cloud instances are powered by chips from the Xeon E5 v3 family. The company did not say whether they were custom or off-the-shelf.

    In a blog post, Drew McDaniel, Azure principal program manager, wrote that the new instances were designed for mission critical workloads. “This powerful VM size easily handles deployments of mission critical applications such as large relational database servers (SQL Server, MySQL etc.,) and large NoSQL databases (MongoDB, Cloudera, Cassandra etc.),” he wrote.

    Docker Comes to Azure Marketplace

    Microsoft also announced availability of the first Ubuntu image fully integrated with Docker on the Azure Marketplace. Docker is a popular application container technology.

    Microsoft announced it would support Docker on Azure in 2013. The company deepened its partnership with the San Francisco-based startup last year, announcing that the next release of Windows Server would support Docker natively and rolling out a Docker command line interface for Windows.

    The latest addition to Azure means a user can select a Docker on Ubuntu Server image and provision a VM with the latest Docker engine pre-installed and running on Azure.

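    Once such a VM is up, a client can drive its Docker engine remotely. Here is a minimal sketch using the docker-py client library; the hostname and port are placeholders, and a real deployment would secure the endpoint with TLS:

        from docker import Client

        # Point the client at the Docker remote API exposed by the VM
        # (hypothetical endpoint; production setups require TLS certificates).
        client = Client(base_url='tcp://my-docker-vm.cloudapp.net:2376')

        print(client.version())  # confirm the engine is reachable

        # Pull an image and run a throwaway container on the cloud VM.
        client.pull('ubuntu:14.04')
        container = client.create_container(image='ubuntu:14.04',
                                            command='echo hello from Azure')
        client.start(container)
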
    8:04p
    Datacloud Global Congress & Exhibition (DCG) 2015

    Datacloud Global Congress & Exhibition (DCG) will be held June 3-4, 2015 at The Grimaldi Forum in Monaco.

    This two-day international congress and exhibition takes a unique approach to data center and cloud challenges – integration, converged infrastructure, IT procurement, cloud management, and security, among many other enterprise-critical topics – and assesses their implications for data centers, energy needs, applications, platforms, services, and technologies, which remain central to the conference and exhibition.

    Why you can’t afford to miss this event:

    • Meet people who matter
      1,800+ executives from 50 countries are expected in 2015, with 30 percent enterprise end users. The audience profile also includes heads of data center, cloud, and hosting service provider businesses, cloud brokers, systems integrators, managed services companies, fibre and connectivity owners and operators, telcos, power specialists, and critical infrastructure equipment vendors; 82 percent of 2014 attendees were from Europe.
    • It’s where deals are done
      Transactions concluded at or as a result of the 2014 event ran into many hundreds of millions of dollars; 2015 will include a range of amazing opportunities for datacentre construction companies, enterprises shifting to colocation, enterprises assessing low-cost clean energy facilities, buyers of solutions to upgrade facilities, sellers of dark fibre, and providers and seekers of finance and investment …
    • Engage with prospects
      Monaco provides a unique conference location and hosts a series of informal networking opportunities outside of conference hours, where you can meet, discuss, and learn from expert speakers and delegates, and engage with the 100+ leading industry vendors.
    • Discover the latest and best cloud and datacentre content outside of the USA
      Datacloud Global Congress is built on the participation and expertise of 150 speakers, including enterprise end users. An exceptional programme will deliver the most important content and insight for an international audience outside of the US, with Plenary, Cloud, Datacenter, and Energy theatres, hands-on labs, a Demo Theatre, and workshops.

    For more information about this year’s event, visit the Datacloud Global Congress & Exhibition (DCG) 2015 website.

    To view additional events, return to the Data Center Knowledge Events Calendar.

    8:14p
    Dutch Data Center Trade Group Formed

    A new European data center trade group called the Dutch Datacenter Association has been formed in a bid to stimulate the country’s data center industry. The association represents 20 top Dutch data center operators.

    The Netherlands is one of the world’s leading data center markets and DDA wants it to continue to be so. The organization combines both national and international players in the market.

    Data centers are good for the Dutch economy, so the DDA wants to help spur as much activity as it can. It will act as the voice of the data center industry in negotiations with stakeholders and the government, support projects related to the environment, society, and education, weigh in on regulation and policy issues, and develop best practices.

    Several international data center, network, and cloud providers have established themselves in the Netherlands, which the new data center trade association argues has both a direct and indirect positive effect on employment and state income.

    The Netherlands is strategically important, and acts as a logical point of entry into Europe for international providers. It is well connected and business friendly.

    All major international players have presence in the market, including colocation providers like Equinix, wholesale providers like Digital Realty Trust, and big Internet companies, such as Google, which is building a data center there. Google’s interest is in part spurred by availability of renewable energy. The Netherlands is home to some of the most environmentally sound data centers in the world.

    Apple is reportedly considering a data center in the Netherlands as well.

    The twenty data center organizations that make up the country’s new data center trade group operate over seventy data center locations in the Netherlands. The DDA hopes to expand membership in early 2015.

    The founding DDA members are: Alticom, BT, BIT, Cofely, Datacenter Brabant, Datacenter Groningen, Dataplace, Digital Realty Trust, Equinix, Eurofiber, EvoSwitch, Global-E, KPN, Interconnect, Interxion, Previder, SmartDC, TelecityGroup, The Datacenter Group, and TCN.

    8:30p
    CES 2015 Unveils a Cloud and Hosting-Powered Future

    This article originally appeared at The WHIR

    The International Consumer Electronics Show this week has showcased countless gadgets and services designed to transform the way we work, live, and play, and online connectivity is a more-or-less common feature across the board.

    Many of the developments presented at CES focus on bringing online services to product categories that weren’t connected before, or that relied on other technologies. They touch on the following themes: a shift toward reconciling internet-connected devices and the workplace, cloud streaming replacing traditional media, and the growth of the Internet of Things.

    Tackling the BYOD Dilemma

    AT&T showcased its “AT&T Work Platform”, which provides options for bring-your-own-device (BYOD). The Work Platform lets organizations add data, voice and messaging services to a variety of mobile enterprise management solutions including MobileIron, AirWatch by VMware, and Good Technology.

    It also tracks work-related usage on an employee’s personal device, letting employees use the device they want and expense the added corporate usage, rather than being issued a separate “work” device.

    While this mostly applies to phones and tablets, as more devices such as internet-connected wearables enter corporate environments, management capabilities will have to keep up to ensure these devices aren’t harmful.

    Streaming Media

    While new TVs are a major draw at CES (including Android TV-based offerings from Sharp, Sony, and Philips), one of the huge developments in how people consume media has been the steady move from cable and satellite subscriptions to streaming services.

    Cisco demonstrated its cloud-based broadcast, DVR, on-demand, and pay-per-view services in a new cloud-powered solution. This “video-hub for the home” is being rolled out in Germany in partnership with the country’s largest cable company, Kabel Deutschland.

    Satellite TV provider Dish Network has unveiled its web TV app “Sling TV” (not associated with the Slingbox). This allows subscribers access to live channels such as ESPN and CNN on computers, smart TVs, consoles, and smartphones for $20 per month. Streaming services like Netflix have underserved sports and news viewers, and services like Sling TV could appeal to those who need to see programs in real time.

    Also at CES, Tablo, maker of a DVR that records and streams over-the-air HD video, unveiled the new Tablo Metro. The new version combines a self-contained digital antenna, DVR, and wireless streaming appliance for $250, with an expected Q1 2015 launch date.

    The Internet of Things Takes Shape

    While it might be some time before self-driving cars become a common sight, the connected car seems to be becoming a reality more quickly. BMW showcased its cloud-connected BMW i3 at CES, showing that digital devices such as a smartphone can be synchronized with the vehicle’s navigation system. This might enable a driver to automatically send a text message if they’re going to be late.

    Automatic unveiled a system that sends car diagnostic data to an Android or iOS device and logs the vehicle’s miles, hours, MPG, and fuel cost. It’s designed to help users save up to 30 percent on gas and repairs. It will also communicate with the Nest thermostat, telling Nest to start turning on the heat when the driver is nearly home.

    And even things as seemingly low-tech as plants are becoming internet-connected. The $100 Edyn Garden Sensor One tracks humidity and the soil’s moisture, electrical conductivity, and nutrients to gauge the plant’s health. For $60, the Edyn Water Valve can water plants when soil moisture is low.

    Finally, with the prevalence of security incidents – and the possibility that new devices might open up new security risks – Bitdefender is hoping to bring strong security to the IoT. The Bitdefender Box is a hardware security platform that acts as a router and encrypts the communication of internet-connected devices.

    We often think of CES as presenting a vision of the future. Clearly not everything succeeds in the real world, but what we see at CES 2015 is that product makers are intent on producing more connected devices and services that connect us in smarter ways. Cloud services and hosting underpin many of these new services – although that is becoming less of a headline feature and more a matter of course.

    This article originally appeared at: http://www.thewhir.com/web-hosting-news/ces-2015-unveils-cloud-hosting-powered-future

    9:00p
    Pro-Russian Hackers Take Down German Government Websites with DDoS Attack

    This article originally appeared at The WHIR

    Several websites belonging to the German government were down on Wednesday. Pro-Russian hacker group CyberBerkut has claimed responsibility for the DDoS attacks.

    According to a report by Bloomberg on Thursday, while most of the websites are back online, it is unclear why some components of the government’s main website remain offline.

    CyberBerkut launched the attack, urging “Germany to stop financial and political support of criminal regime in Kiev” in a message on its website on Wednesday.

    Websites belonging to the lower house of parliament and the foreign ministry, as well as Chancellor Angela Merkel’s page, all suffered downtime related to the attack, which targeted the data center of their service provider, Dusseldorf-based Babiel GmbH.

    According to a report by SC Magazine, German intelligence officials told Reuters that on average there are 3,000 hacker attacks daily on government websites, but this is the first time that hackers have been successful in bringing the sites down.

    Large-scale DDoS attacks are on the rise. A recent report by VeriSign showed that DDoS attacks of 10 Gbps and up grew significantly in Q3 2014. It is unclear just how big the attacks against the German websites were.

    On Wednesday, German Chancellor Angela Merkel met with Ukrainian Prime Minister Arseny Yatseniuk to sign a loan agreement.

    In an interview with German broadcaster ZDF, Yatseniuk said: “I strongly recommend that the Russian secret services stop spending taxpayer money for cyber-attacks against the Bundestag and Chancellor Merkel’s office.”

    This article originally appeared at: http://www.thewhir.com/web-hosting-news/pro-russian-hackers-take-german-government-websites-ddos-attack

