Data Center Knowledge | News and analysis for the data center industry
Friday, May 12th, 2017
12:00p |
Top 10 Data Center Stories of the Month: April 2017
Here are the 10 most popular stories that appeared on Data Center Knowledge in April:
Google Reveals Espresso, Its Edge Data Center SDN – Espresso is a Software-Defined Networking stack that speeds up Google’s services for its end users as the company’s network hands traffic off to third-party internet service providers who take it the last mile.
How Amazon Prevents Data Center Outages Like Delta’s $150M Meltdown – One piece of technology Amazon built in-house is meant to circumvent what one of the company’s top infrastructure engineers described as misplaced priorities in the way electrical switchgear vendors design their products.
VMware Sells Off Cloud Services Business to OVH – The data center software giant (now a subsidiary of Dell Technologies, following its merger with EMC) changed its cloud strategy last year, acknowledging that it was better off focusing on technology and leaving the costly business of infrastructure operations to those who do it best.
 The OVH data center in Gravelines, France (Photo: OVH)
Top AWS Engineer Calls Hurd’s Cloud Data Center Bluff – Is Oracle prepared to spend big on cloud infrastructure to really compete? Hurd appears to be reluctant to make that commitment, at least in public.
 Oracle Co-CEO Mark Hurd speaking at Oracle Open World in September 2013 in San Francisco. Hurd was then the company’s president. (Photo by Justin Sullivan/Getty Images)
The New York Times to Replace Data Centers with Google Cloud, AWS – As it continues to modernize its infrastructure, the publisher is planning to shut down three of the four data centers hosting its content and internal applications in the near future, migrating most of the workloads to Google Cloud Platform and Amazon Web Services.
 The New York Times building in Manhattan (Photo by Mike Coppola/Getty Images)
How The New York Times Handled Unprecedented Election-Night Traffic Spike – When he woke up the morning of October 21, 2016, Nick Rockwell did the same thing he had done first thing every morning since The New York Times hired him as CTO: he opened The Times’ app on his phone. Nothing loaded…
Equinix Exec: We Spent $17B on Data Centers, but Cloud Giants Spend Much More – Equinix, one of the world’s two largest data center providers, has spent $17 billion over its 18 years in business to expand its global data center empire, including construction and acquisitions, but cloud giants like Microsoft, Google, and Amazon each easily beat that number in two years’ time.
This Hacker Can Talk His Way inside a Data Center – While finding a technological exploit to break into a system is just a matter of time for sophisticated hackers, people are still the weakest link in any cybersecurity scheme today.
 Legendary hacker Kevin Mitnick on stage at Data Center World 2017 at the Los Angeles Convention Center (Photo: Kyle Espeta for Data Center World)
IT Certifications: How Valuable Are They? – There seem to be two schools of thought on the value of IT certifications.
Apple’s Leased Data Center Use Skyrocketed Over Last Four Years – The rate of growth illustrates just how much hyper-scale cloud platforms still rely on leased data centers, despite also spending enormous sums on building out their own server farms around the world every year.
Stay current on data center industry news by subscribing to our RSS feed and daily e-mail updates, by following us on Twitter or Facebook, or by joining our LinkedIn Group – Data Center Knowledge | 3:30p |
Perimeter Security: Strategies for Data Center Protection
Data centers are under attack. Hardly a day goes by without some kind of hack being uncovered. Intellectual property is stolen, cash ripped off from bank systems, websites brought down and millions of identities stolen.
It might seem to some that the IT people they trusted for decades to look after their data are no longer up to the task. But that isn’t a fair assessment. What’s happened is that the size and volume of attacks have exploded, along with the number of potential attack vectors. It’s a bit like a fortified city under attack from insurgents already inside, while officials refuse to close the gates because trade is booming.
That’s how it looks from the data center perspective. Line-of-business managers demand cloud apps NOW. They aren’t willing to wait a year for an app to be developed internally, or even a month or two for it to be approved by IT.
“It’s a fool’s errand to be able to block or vet the thousands of cloud apps out there,” says Sanjay Beri, CEO and co-founder of security firm Netskope. “Further, much of the information you’re trying to safeguard is being shared by apps in a way that never touches the network perimeter device—direct to the cloud in places like airports and coffee shops.”
That means that a firewall with an exhaustive list of blocked apps never gets the chance to act when the usage of the app is remote or mobile. Similarly, anti-virus (AV) software is struggling to cope with today’s threats.
The New Perimeter
Perimeter defense has traditionally been about controlling traffic flowing in and out of a data center network. Best practices include the implementation of a layered set of complementary defenses. Beyond a router, which connects the internal and external networks, the primary technology that underpins perimeter protection is a firewall, which filters out potentially dangerous or unknown traffic that may constitute a threat based on a set of rules about the types of traffic and permitted source/destination addresses on the network. Most organizations also deploy intrusion detection or intrusion prevention systems (IDS/IPS), which look for suspicious traffic once it has passed through the firewall.
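To make the rule-matching concrete, here is a minimal sketch, in Python, of how a perimeter filter of the kind described above evaluates traffic against an ordered rule set by source, destination and port. The rules and addresses are hypothetical, and real firewalls do this in kernel or hardware code, but the first-match, default-deny logic is the same idea.

```python
# Minimal sketch of first-match, default-deny perimeter filtering (illustrative only;
# the rules and addresses below are hypothetical, not a recommended policy).
from dataclasses import dataclass
from ipaddress import ip_address, ip_network
from typing import Optional

@dataclass
class Rule:
    action: str            # "allow" or "deny"
    src: str               # permitted source network, CIDR notation
    dst: str               # permitted destination network, CIDR notation
    port: Optional[int]    # destination port, or None for any port

RULES = [
    Rule("allow", "0.0.0.0/0",  "203.0.113.0/24", 443),   # public HTTPS into the DMZ
    Rule("allow", "10.0.0.0/8", "10.0.0.0/8",     None),  # internal east-west traffic
    Rule("deny",  "0.0.0.0/0",  "0.0.0.0/0",      None),  # default: deny everything else
]

def evaluate(src_ip: str, dst_ip: str, dst_port: int) -> str:
    """Return the action of the first matching rule, the way most firewalls do."""
    for rule in RULES:
        if (ip_address(src_ip) in ip_network(rule.src)
                and ip_address(dst_ip) in ip_network(rule.dst)
                and (rule.port is None or rule.port == dst_port)):
            return rule.action
    return "deny"  # fail closed if nothing matches

print(evaluate("198.51.100.7", "203.0.113.10", 443))  # allow
print(evaluate("198.51.100.7", "10.1.2.3", 22))       # deny
```

An IDS or IPS sits behind this filter and inspects the traffic that the rules let through, which is why the two are treated as complementary layers rather than substitutes.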
“The most effective strategies implement multiple layers of complementary controls, all of which a potential intruder must circumvent to gain access,” says Rob Sadowski, director of technology solutions at RSA, the security division of EMC. “However, perimeter defense alone is not enough to protect against sophisticated threats. Organizations need to develop intelligence-driven network monitoring, threat detection, and incident response capabilities as well.”
While firewalls, network perimeter appliances and AV may have lost some of their potency, that doesn’t mean they should be abandoned. They still have a role to play in preventing a direct attempt to “storm the ramparts.”
“Firewalls should still play a role, but the ‘human firewall’ should be given more attention,” says Stu Sjouwerman, CEO of security firm KnowBe4. “The perimeter has to be extended to every device and every employee.”
Boisvert concurs.
“Think about how easy it still is to exploit phishing emails,” he says. “Cyber security is as much about people as it is about technology, so training is a big part of prevention.”
A recent phishing attack on a company on the northeastern seaboard, for example, had data center staff scrambling for days. It all started with someone opening a cleverly engineered link in an email. That let the bad guys into the company address books. Shortly thereafter, employees were receiving emails from trusted internal sources asking them to open an attached fax. Many did. The infection spread rapidly and brought down several systems.
Such incidents make it clear that staff training is a vital element of the data center security arsenal. According to the Cybercrime Survey, companies that train employees spend 76 percent less on security incidents than those that don’t, a savings that amounted to $500,000 per year.
The data center perimeter, then, must be protected at all modern entrance gates. This extends from the network edge and the corporate firewall outward to mobile applications and the cloud, and inward to every employee and every device. But that’s a daunting task for anyone. It’s a bit like trying to protect the president on a visit to Manhattan. The only option is to place the city in virtual lockdown, and spend a fortune to deploy an army of Secret Service staff backed up by drones in the air as well as jet fighters on standby. Few data centers can afford that level of protection.
The good news is that they may not need to. Boisvert thinks that prioritization is essential, not only to contain costs, but to increase effectiveness in the fight against cyber-attacks.
“Stop trying to protect everything,” he says. “Protect what’s vital and accept that the rest may be compromised.”
Threat Intelligence
Just as it is possible to contain costs by concentrating on the data center’s “crown jewels,” data centers can make the job easier by incorporating analytics and intelligence techniques.
“State-of-the-art tools such as network forensics and analytics can help the incident management and response teams get the information they need when time is of the essence,” says Sadowski.
What is evolving is a big data approach to analytics. The idea is to use software to do the heavy lifting to combat cyber-threats.
Analytics vendor SAS already has products in this space, but it also has an ongoing project that aims to analyze data at scale far more effectively. The goal is to detect whether a machine is behaving normally.
“The hacker is deviating from normal by communicating with machines they don’t normally communicate with,” says Bryan Harris, director of R&D for cyber analytics at SAS. “With the context of what machines should be doing, and the hosts, ports and protocols they interact with, you can identify outliers.”
If one machine is doing something even a little different, the data center manager is alerted. He or she can then determine if an actual threat is present. This approach to security is expanding. Expect the Symantecs, RSAs and McAfees of this world to either partner with analytics firms like SAS or to develop their own analytics engines.
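As a rough illustration of the baselining Harris describes (not SAS’s product), the sketch below builds a per-host profile of the destinations and ports each machine normally talks to from historical connection logs, then flags any connection outside that profile. The host names and log records are invented.

```python
# Toy sketch of baseline-and-outlier detection on connection logs (illustrative only).
# Each record is (source host, destination host, destination port).
from collections import defaultdict

HISTORY = [
    ("web01", "db01", 5432), ("web01", "db01", 5432),
    ("web02", "db01", 5432), ("app01", "cache01", 6379),
]

# Build the baseline: the set of (destination, port) pairs each host normally talks to.
baseline = defaultdict(set)
for src, dst, port in HISTORY:
    baseline[src].add((dst, port))

def is_outlier(src, dst, port):
    """Flag a connection the source host has never made before."""
    return (dst, port) not in baseline[src]

# A web server suddenly talking to a new host over SMB would be flagged for review.
print(is_outlier("web01", "db01", 5432))   # False: normal behavior
print(is_outlier("web01", "dc01", 445))    # True: deviation worth investigating
```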
“Real-time, high-speed advanced analytics will be the best solution for high-level resilience,” says Boisvert.
He also advocates what he calls threat intelligence. One aspect is the sharing of data on attempted incursions among companies or industries as a means of leveling the playing field. After all, the bad guys have gotten very organized. They can buy code for Distributed Denial of Service (DDoS) attacks online. In Eastern Europe and perhaps areas of Asia, there appears to be a convergence of government interest and organized crime.
“Organized crime has been a major threat actor, acting at the behest of the state in some cases and even getting some direction on targets,” said Boisvert. “If you mess up our banking and retail industries, for example, it disrupts the U.S. economy.”
The takeaway is that data centers can no longer act in isolation. They should be actively pooling resources and presenting more of a united front against the black hats.
Management and Response
Many data centers are heavily focused on responding quickly to immediate threats. While this is certainly important, it isn’t a winning long-term approach. Jake Williams, a certified instructor for the SANS Institute, thinks some data center managers need to understand the difference between security incident management and incident response. The two are closely related, but incident management is more of a business function, while incident response is more technical.
“Those that attempt incident response without good incident management processes tend to be overwhelmed by constant requests for status updates,” says Williams. “Neither of these roles works well without the other.”
Best practices in incident response call for a documented process that is always followed. Doing so requires drilling and testing. It may be easy to recall all of the steps required to contain an incident today, but stress levels rise substantially during an actual breach. One answer, says Williams, is the creation of checklists to ensure that all tasks are accomplished in the order intended.
“Documentation during the incident is key and checklists can help,” says Williams. (Free incident response checklists are available at sans.org).
Another crucial aspect of becoming better organized is to install a Security Information and Event Management (SIEM) system to collect, correlate, automate and analyze logs. Though a SIEM can be a costly investment, open source options exist: AlienVault’s OSSIM, for example, is a free SIEM, and the Security Onion Linux distribution bundles a suite of free network security monitoring tools.
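For a sense of the correlation work a SIEM automates, here is a toy Python sketch that flags any source address generating a burst of failed logins within a short window. The log entries, threshold and window are made up; a production SIEM correlates many log feeds at far larger scale.

```python
# Toy illustration of SIEM-style log correlation: flag any source address with a burst
# of failed logins inside a short window. Log entries, threshold and window are made up.
from collections import defaultdict
from datetime import datetime, timedelta

FAILED_LOGINS = [  # (timestamp, source IP) parsed from authentication logs
    (datetime(2017, 5, 12, 9, 0, 1), "198.51.100.7"),
    (datetime(2017, 5, 12, 9, 0, 2), "198.51.100.7"),
    (datetime(2017, 5, 12, 9, 0, 3), "198.51.100.7"),
    (datetime(2017, 5, 12, 9, 5, 0), "203.0.113.9"),
]

def brute_force_alerts(events, threshold=3, window=timedelta(minutes=1)):
    """Return the set of sources with at least `threshold` failures inside `window`."""
    by_source = defaultdict(list)
    for ts, src in sorted(events):
        by_source[src].append(ts)
    alerts = set()
    for src, times in by_source.items():
        for i in range(len(times) - threshold + 1):
            if times[i + threshold - 1] - times[i] <= window:
                alerts.add(src)
                break
    return alerts

print(brute_force_alerts(FAILED_LOGINS))  # {'198.51.100.7'}
```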
Like Boisvert, Williams is a fan of training, emphasizing the education of data center staff in incident response.
“Incident responders and managers alike need training and periodic drilling in their own environments,” he says.
Some of the most effective ingredients are incident dry runs, where incident responders and managers work through a mock incident. These exercises often highlight deficiencies in training, procedures or availability of resources.
With so many cautions, best practices, technologies and attack vectors to take into account, Rajneesh Chopra, vice president of product management at Netskope, reminds data center managers not to leave end users out of the loop. Take the case of a group of users who have had their credentials stolen.
“Immediately inform affected users that they should change their passwords,” says Chopra. “You might also inform them of apps with weak password controls and that they’re at risk if they continue to use the app. In extreme circumstances, you might even have to lock down that app entirely.”
Piero DePaoli, senior director for Global Product Marketing at Symantec, says the best way to protect data center infrastructure is to assume the perimeter doesn’t exist and protect each component inside the data center.
“Organizations need server-specific security with default-deny policies on every server in the data center,” he says. “Simply applying antivirus or the same security that’s on laptops is not enough. Laptop security by default allows all and attempts to block malicious items. Security on a server needs to be applied in the exact opposite fashion: block everything and only allow approved items to run.”
This entails hardening the infrastructure so physical and virtual servers are only authorized to communicate over specific ports, protocols and IP addresses; using application whitelisting to allow only specific, approved applications to run and deny all others; and using file integrity and configuration monitoring to identify attempted changes and even suspicious administrator actions in real time, says DePaoli.
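The file-integrity piece of that advice can be illustrated with a short, hypothetical sketch: hash the files you care about while the server is known-good, then periodically compare. The watched paths below are placeholders, and commercial tools also track permissions, registry keys and configuration state.

```python
# Minimal sketch of file-integrity monitoring: hash a baseline of watched files,
# then alert on any drift. Paths are hypothetical placeholders.
import hashlib
from pathlib import Path

WATCHED = [Path("/etc/ssh/sshd_config"), Path("/usr/local/bin/deploy.sh")]

def snapshot(paths):
    """Return {path: sha256 digest} for every watched file that exists."""
    return {p: hashlib.sha256(p.read_bytes()).hexdigest() for p in paths if p.exists()}

def drift(baseline, current):
    """Report files that were added, removed, or modified since the baseline."""
    return [p for p in set(baseline) | set(current) if baseline.get(p) != current.get(p)]

baseline = snapshot(WATCHED)        # taken when the server is known-good
# ... later, run periodically (e.g., from a scheduled job) ...
for path in drift(baseline, snapshot(WATCHED)):
    print(f"ALERT: {path} changed or disappeared since the baseline")
```

Production tools store the baseline somewhere tamper-resistant so that an intruder cannot simply rewrite it along with the files.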
No Stone Unturned
One final word of advice: If a serious breach occurs, leave no stone unturned in the investigation. A tactic used recently by attackers is to bury malware deep within the data center and have it lie dormant for a while after insertion. That way, even if the incursion is discovered and mop-up efforts are carried out, the malware can remain inside. Several banks, for example, fell prey to this approach. The attackers quietly withdrew funds little by little over many months from various accounts, never quite enough to draw much attention but amounting to millions over time.
“Follow every last piece of evidence you have until you are certain that you have uncovered all of the attackers, and clearly identified the hosts they have compromised and understood the tactics and tools used against you,” says Scott Crane, director of product management for Arbor Networks. “This analysis can be time consuming, but it is the best way to learn from an incident and ensure you are properly prepared to deal with the next one.”
Drew Robb is a freelance writer based in Florida. | 4:41p |
Apple Doubling Down on Reno Data Center
It’s official: the Apple data center campus in Reno, Nevada, is going to get a lot bigger than it is today in another win-win for Apple and one of the biggest little data center hubs in the country.
Apple announced that it will pour another $1 billion into the data center campus at the Reno Technology Park in the Truckee River canyon east of Sparks, bringing “hundreds of jobs in operations and construction,” according to Apple executive Mike Foulkes.
In turn, Apple’s $89 million in state property and sales tax abatements, awarded in 2012 with its original $1 billion investment, will increase — and that’s not all.
Part of the addition will include a $4 million, 27,000 square-foot warehouse for shipping and receiving built on a vacant lot in downtown Reno, originally part of a tourism improvement district created in 2009. Because the Reno City Council voted on Wednesday to allow Apple to buy the land instead of leasing it, the company is now eligible to take advantage of a tourism tax break that will lower its sales tax rate to 0.5 percent from 8.265 percent, according to the Reno Gazette Journal.
Nevada Governor Brian Sandoval said in a statement that the Reno facility was “the first major economic development success in northern Nevada and helped place this region on the technology and innovation map.” He continued, “Apple’s decision to increase their local investment by $1 billion is a testament to our successful partnership and a demonstration that the best companies in the world are coming to Nevada, creating hundreds of jobs, investing in our communities and making our state their permanent home.”
See also: How Reno Became a Data Center Hub: A Timeline
The Apple data center is not the only massive server farm that feels at home and appreciated in Nevada. Switch’s massive SuperNAP campus is in the Reno area, with eBay as the anchor tenant, and news came out in April that Google had acquired 1,210 acres in Nevada’s Tahoe Reno Industrial Center to house a future data center.
In a show of support for creating more manufacturing jobs in the US, Apple last week announced a new $1 billion fund, underscoring its role in the national economy. The company broke down employment by state, showing where its 80,000 employees work. As you might have figured, more than half are in Silicon Valley. | 4:49p |
Furious Land War Erupts Outside CME Data Center
Brian Louis (Bloomberg) — It was an odd transaction from the outset: $14 million, double the going rate, for a 31-acre plot of flat, undeveloped land just west of Chicago. In the nine months since, the curious use of the space has only added to the intrigue. A single, nondescript pole with two antennas was erected by a row of shrubs. Some supporting equipment was rolled in. That’s it.
But those aren’t ordinary antennas. And the buyer of the property isn’t your typical land investor. It’s an affiliate of a company called Jump Trading LLC, a legendary and secretive trading firm that’s a major player in some of the most important financial markets. Just across the street, it turns out, lies the data center for CME Group Inc., the world’s biggest futures exchange. By placing its antennas so close to CME’s servers, Jump may be trying to shave maybe a microsecond — one-millionth of a second — off its reaction time, potentially enough to separate a winning from a losing bid in trading that takes place at almost the speed of light.
It’s the latest, and perhaps boldest, salvo in an escalating war that’s being waged to stay competitive in the high-speed trading business. The war is one of proximity — to see who can get data in and out of CME the quickest. A company called McKay Brothers LLC recently won approval to build the tallest microwave tower in the area while another, Webline Holdings LLC, has installed microwave dishes on a utility pole just outside the data center.
“It tells you how valuable being just a little bit faster is,” said Michael Goldstein, a finance professor at Babson College in Babson Park, Massachusetts. “People say seconds matter. This is microseconds matter.”
Platform Shoes
Traders have long fought ferociously to gain an edge, even to the point of wearing ultra-high platform shoes to stand out in the era when they shouted and waved their hands to execute an order. The dubious fashion was mercifully ended in 2000 by CME’s predecessor, the Chicago Mercantile Exchange, which cited a rash of injuries in banning shoes with soles higher than 2 inches.
See also: CyrusOne Plans Huge Expansion at CME Data Center Campus
The battle for speed was later waged over fiber-optic cable and then, within the past decade, microwave technology, which can convey data in nearly half the time.
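A back-of-envelope calculation shows why the medium matters. Light in glass fiber travels at roughly two-thirds the speed of light in a vacuum, while microwaves through the air travel at very nearly full speed. The sketch below assumes a straight-line Chicago-to-New Jersey path of about 1,200 km, a simplification, since real fiber routes are considerably longer and less direct.

```python
# Back-of-envelope comparison of fiber vs. microwave transit time on an assumed
# 1,200 km straight-line Chicago-to-New Jersey path (real routes are longer).
C = 299_792_458              # speed of light in a vacuum, m/s
DISTANCE_M = 1_200_000       # assumed straight-line distance, meters

fiber_speed = C / 1.47       # silica fiber slows light by its refractive index (~1.47)
microwave_speed = C * 0.999  # microwaves through air travel at very nearly c

fiber_ms = DISTANCE_M / fiber_speed * 1_000
microwave_ms = DISTANCE_M / microwave_speed * 1_000

print(f"one-way via fiber:     {fiber_ms:.2f} ms")      # roughly 5.9 ms
print(f"one-way via microwave: {microwave_ms:.2f} ms")  # roughly 4.0 ms
print(f"advantage: {fiber_ms - microwave_ms:.2f} ms per one-way trip")
```

Because real fiber paths are far from straight, the practical gap grows to roughly the “half the time” the industry cites, and the remaining microseconds are what the antenna-placement battle is about.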
Jump Trading declined to comment, but in Aurora, Illinois, where the CME data center is located, it appears that Jump, too, was reacting to competitors in the latest round of jockeying. In October 2015, McKay Brothers, a company that sells access to its microwave network to high-speed traders, leased land diagonal to the CME data center under the name Pierce Broadband LLC, according to DuPage County property records.
Last month, the county gave McKay approval to erect a 350-foot-high microwave tower that could be 600 feet closer to the data center than its current location, records show. Two trading firms, IMC BV and Tower Research Capital LLC, own minority stakes in McKay. Co-founder Stephane Tyc said his firm may never build the tower, but that it would be part of the firm’s continual efforts to speed transmission time.
Utility Pole
Then there’s Webline Holdings. In November 2015, it was granted a license to operate microwave equipment on a utility pole just outside the data center, according to Federal Communications Commission records. Webline has licenses for a microwave network stretching from Aurora to Carteret, New Jersey, where Nasdaq Inc.’s data center is located. Messages left for Webline were not returned.
Last year, the Jump Trading affiliate World Class Wireless purchased the 31-acre lot for $14 million, according to county records. “They paid probably twice as much as it’s worth,” said David Friedland, an executive director in commercial real estate firm Cushman & Wakefield’s Rosemont, Illinois, office. “I don’t see anyone else paying close to that price.”
The license for the transmission dishes is held by a joint venture between World Class and a unit of KCG Holdings Inc., a trading firm that Virtu Financial Inc. is acquiring.
Fiber Cable
It’s unclear which firm is now closest to CME’s servers. Trading data first leaves CME computers via fiber cable and travels to nearby antennas, which send it by microwave to other towers until it reaches New Jersey, where all the major U.S. stock exchanges house their computers. The moves in Aurora are intended to reduce the time the data spends traveling through cable.
Sending data back and forth between the U.S. Midwest and East Coast allows high-frequency traders to profit from price differences for related assets, including S&P 500 Index futures in Illinois and stock prices in New Jersey. Those money-making opportunities often last only tiny fractions of a second.
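The sketch below is a deliberately simplified illustration of that index-arbitrage logic: compare the futures price seen next to CME with the cash index implied by stock prices in New Jersey, and act only when the spread exceeds a threshold. The prices and threshold are invented, and real strategies account for fair value, fees and execution risk; the point is that whoever sees both numbers first gets to act on the gap, which is why microseconds of transmission time are worth fighting over.

```python
# Highly simplified illustration of index arbitrage: trade the futures/cash spread
# only when it exceeds a threshold. Prices and threshold are made up.
def arbitrage_signal(futures_price, cash_index, threshold=0.25):
    """Return a toy trading decision based on the futures-versus-cash spread."""
    spread = futures_price - cash_index
    if spread > threshold:
        return "sell futures, buy stocks"   # futures look rich relative to cash
    if spread < -threshold:
        return "buy futures, sell stocks"   # futures look cheap relative to cash
    return "no trade"

print(arbitrage_signal(2396.75, 2396.10))   # sell futures, buy stocks
print(arbitrage_signal(2396.00, 2396.05))   # no trade
```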
There may be a simple way to avoid the skirmishing among traders. A microwave tower could be installed on the roof of the CME data center itself, eliminating the need for jockeying around the site. CME said in a statement that it and CyrusOne Inc., the company that bought the data center last year, are indeed looking at allowing roof access. Traders being traders, however, they may continue to battle, this time for the most advantageous position on the microwave tower itself.
“We are confident the CME can provide an alternate and better solution which offers a level playing field to all participants,” said McKay’s Tyc. | 9:01p |
The Hidden Bias in Machines
Vatsal Patel is a Software Engineer for Accusoft.
Michael Archambault is the Software Development Manager for Accusoft.
Despite technology milestones in recent years, artificial intelligence (AI) can still suffer from unintended consequences. In 2016, Microsoft’s Twitter bot, Tay, tweeted racist and highly inappropriate statements after learning from her peers on the platform. Unable to overwrite the malicious information Tay had absorbed through social learning, Microsoft shut the account down, and it remains private.
Other incidents like Tay have surfaced as well. One of the most notable involved Google’s image recognition software: Google Photos was accused of discrimination after labeling some non-white users as “gorillas.” Google says the incident was unintentional, yet it remains a concern, turning an overlooked piece of code into a question of race.
Unfortunately, algorithms like Microsoft’s and Google’s still depend on human input, and their context is limited by their parameters. This is why Tay was unable to distinguish truth from internet trolling, and why Google Photos couldn’t differentiate some non-white users from gorillas. The same issue exists in more algorithms than we are aware of.
This machine-based bias stems from the point at which humans program artificial intelligence to automate machine learning. Because humans build the datasets used to train artificial intelligence, biases, limitations and human error can affect the output. The fault lies in how humans train these machines from the beginning.
Machines Are the Products of Human Interaction
With AI, humans are the puppet masters. It is the human input that guides machines when they process the information used to classify datasets. In its simplest form, AI analyzes unfamiliar inputs within a database of known values to arrive at the correct output. Just like in human learning, the more algorithms are fed indexed images, the more accurate the software processing becomes. If you train an algorithm with hundreds of cat photos, it will be able to classify a photo of a Siamese that it’s never seen before as “cat.” However, issues can form when algorithms are trained with typical or perfect images in a controlled environment. If developers do not train these machines with data that represents diverse conditions, complications can arise.
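A toy example makes the point about training conditions. The sketch below trains a trivial nearest-centroid “classifier” on two invented feature values per image: trained only on clean, well-lit examples, it misreads a dimly lit cat, and adding varied examples fixes the decision boundary. It illustrates the principle, not how production image recognition is actually built.

```python
# Toy illustration: a classifier trained only on "clean" examples misreads inputs
# captured under different conditions. Features and labels are invented.
def nearest_centroid(train, sample):
    """Classify a sample by the label whose training centroid is closest."""
    centroids = {label: [sum(col) / len(col) for col in zip(*features)]
                 for label, features in train.items()}
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(centroids[label], sample))

# Training set built only from well-lit studio photos (narrow conditions).
narrow = {"cat": [(0.9, 0.1), (0.85, 0.15)], "dog": [(0.2, 0.8), (0.25, 0.75)]}
# The same cat photographed in poor light produces quite different features.
dim_light_cat = (0.55, 0.5)

print(nearest_centroid(narrow, dim_light_cat))  # "dog": the narrow model misclassifies

# Retraining with examples from varied conditions moves the boundary.
diverse = {"cat": [(0.9, 0.1), (0.85, 0.15), (0.6, 0.45)],
           "dog": [(0.2, 0.8), (0.25, 0.75), (0.35, 0.7)]}
print(nearest_centroid(diverse, dim_light_cat))  # "cat"
```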
These issues can even affect unexpected applications of image processing software, like barcode recognition. Standard 1D barcodes consist of alternating black and white bars of various widths that encode a value. Scanners decode it by measuring the widths of the bars and spaces and matching them against a preselected set of parameters. If a bar is ambiguous due to poor lighting or print quality, the computer is unable to decipher the encoded data. In these cases, a computer may detect a variety of potential matches but needs additional information to identify the correct value.
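The width-matching step can be sketched in a few lines: classify each measured bar as narrow or wide relative to a nominal module width, and flag anything that falls in between as ambiguous. The widths and tolerance below are invented; real symbologies such as Code 39 or EAN-13 define exact ratios and include check digits.

```python
# Toy sketch of the width-matching step in 1D barcode decoding. Widths and tolerance
# are invented; the "ambiguous" case models the poor-print problem described above.
MODULE = 1.0          # nominal width of a narrow bar, arbitrary units
WIDE = 2.5 * MODULE   # nominal width of a wide bar
TOLERANCE = 0.35      # how far a measurement may drift and still be accepted

def classify(width):
    """Return 'narrow', 'wide', or 'ambiguous' for one measured bar width."""
    if abs(width - MODULE) <= TOLERANCE:
        return "narrow"
    if abs(width - WIDE) <= TOLERANCE:
        return "wide"
    return "ambiguous"   # the scanner needs more information, or a rescan

measured = [0.95, 2.4, 1.1, 1.7, 2.6]   # the fourth bar is smeared by poor print
print([classify(w) for w in measured])
# ['narrow', 'wide', 'narrow', 'ambiguous', 'wide']
```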
A misread barcode can go beyond someone receiving an incorrect product from an online order. In hospitals, barcodes identify patients’ critical health information, like medication-specific allergies; an incorrect or partial scan could lead to serious consequences like anaphylactic shock or even death. Constantly having to correct machine mistakes leaves users vulnerable to these errors.
Accuracy Requires Holistic Inputs
Scanners are trained to recognize images using perfect examples, like a well-lit, clear photo. In reality, barcodes are often imperfect. Barcodes found on shipping labels can easily become distorted in transport, leading to errors when processing. In order to preempt this, developers need to use a variety of conditions and expand the range of inputs when building algorithms.
In the case of barcode scanners, algorithms need to be trained using codes in imperfect condition. For applications like Google Photos, exposing the software to a diverse range of subjects allows it to correctly identify them and achieve the intended results. Like any good teacher, developers must create a realistic environment that the computer can use to process and compare features. For example, a person may see both a tiger and a zebra and be able to differentiate the two based on the knowledge that they are different species. A computer, if not properly trained, will see the stripes and assume the animals belong to the same classification. Humans know it is illogical for a zebra and a tiger to be classified the same, but a computer needs to be fed holistic inputs in order to clearly decipher the differences.
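One practical way to expose a model to imperfect conditions is to generate degraded variants of each clean training image. The sketch below assumes the Pillow imaging library and a hypothetical file path; the dimming, blurring and rotation are stand-ins for whatever distortions the deployment environment actually produces.

```python
# Minimal sketch of training-set augmentation: derive imperfect variants (dim,
# blurred, skewed) from each clean example so the model sees realistic conditions.
# Assumes the Pillow library; the file paths are hypothetical.
from PIL import Image, ImageEnhance, ImageFilter

def imperfect_variants(img):
    """Yield degraded copies of one clean training image."""
    yield ImageEnhance.Brightness(img).enhance(0.4)        # shot in poor light
    yield img.filter(ImageFilter.GaussianBlur(radius=2))   # out of focus
    yield img.rotate(8, expand=True)                       # label applied at an angle

clean = Image.open("training/barcode_0001.png")            # hypothetical path
for i, variant in enumerate(imperfect_variants(clean)):
    variant.save(f"training/barcode_0001_aug{i}.png")      # feed these to training too
```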
While it seems logical to create a comprehensive database with clean datasets, in reality most situations involve some ambiguity. AI-powered machines are capable of accuracy when algorithms encompass as many inputs as possible, but this is not a fix-all solution. More input will also expose the same biases present in humans, so how the machines decipher those inputs and features is an essential factor. As image recognition technology continues to be refined, developers need to be conscious of the images they use in their solutions and the implications they will have on their technology and, in the case of Google, on society.
Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library. | 9:15p |
Extortionists Mount Global Hacking Attack Seeking Ransom
Robert Hutton, Jeremy Kahn and Jordan Robertson (Bloomberg) — Extortionist hackers who may be using leaked computer exploits from the U.S. National Security Agency infiltrated computers in dozens of countries in a fast-spreading attack that forced British hospitals to turn away patients and breached systems at Spain’s Telefonica SA and organizations from Russia to Taiwan.
The ransomware used in Friday’s cyber-attacks encrypts files and demands that victims pay $300 in bitcoin for them to be decrypted, the latest in a vexing style of security breach that, at the very least, forces organizations to revert to backup systems to keep critical systems running. The malicious software infected more than 75,000 computers in 99 countries on Friday, most of them concentrated in Russia, Ukraine and Taiwan, according to Dutch cybersecurity company Avast Software BV.
The attackers were exploiting a vulnerability in Microsoft Corp. software that was patched in March, according to cybersecurity researchers. Attack code targeting that vulnerability was released publicly by Shadow Brokers, a group that has been leaking stolen hacking tools purportedly from the NSA. That connection has given critics of U.S. hacking ammunition for their argument that governments finding flaws in commercial technologies and keeping them secret for the purpose of exploiting them can carry a public risk.
See also: This Hacker Can Talk His Way inside a Data Center
“These attacks underscore the fact that vulnerabilities will be exploited not just by our security agencies, but by hackers and criminals around the world,” said Patrick Toomey, a staff attorney at the American Civil Liberties Union’s National Security Project. “It is past time for Congress to enhance cybersecurity by passing a law that requires the government to disclose vulnerabilities to companies in a timely manner. Patching security holes immediately, not stockpiling them, is the best way to make everyone’s digital life safer.”
While the victim tally is likely to grow, the ransomware, called WanaCrypt0r, only affects computers that haven’t applied Microsoft’s two-month-old fix, a reminder that individuals and organizations that don’t routinely update their machines are vulnerable. Hospitals are notoriously slow to apply security fixes, in part because of how disruptive it is to take patient-facing equipment and databases offline. That has made them a reliable target of ransomware and identity-theft attacks, and it explains why they routinely fall victim even to random mass attacks.
See also: Global Hacking Operation is Targeting MSPs, Stealing Customer Data
Hospital Warnings
In the U.K. on Friday, hospitals urged people with non-emergency conditions to stay away after the cyber-attack affected large parts of the country’s National Health Service. Sixteen NHS organizations were hit, while a large number of Spanish companies were also attacked using ransomware.
“A number of NHS organizations have reported that they have suffered from a ransomware attack,” U.K. Prime Minister Theresa May told reporters. “It’s an international attack and a number of countries and organizations have been affected. We’re not aware of any evidence that patient data has been compromised.”
See also: Tips for Disinfecting Your Data Center
Hospitals in London, North West England and Central England have all been affected, according to the BBC. Mid-Essex Clinical Commissioning Group, which runs hospitals and ambulances in an area east of London, said on Twitter that it had “an IT issue affecting some NHS computer systems,” adding “Please do not attend Accident And Emergency unless it’s an emergency!”
 A message informing visitors of a cyber attack is displayed on the NHS website on May 12, 2017 in London, England. (Photo by Carl Court/Getty Images)
The impact on services is not due to the ransomware itself, but due to NHS Trusts shutting down systems to prevent it from spreading, said Brian Lord, a former deputy director of Government Communications Headquarters (GCHQ), the U.K.’s signals intelligence agency, who is now managing director of cybersecurity firm PGI Cyber. Lord, who described an attack of this type as “inevitable,” said the impact was exacerbated because most NHS Trusts had “a poor understanding of network configuration meaning everything has to shut down.”
Ransom Message
A screenshot of an apparent ransom message sent to a hospital showed a demand for $300 in bitcoin to decrypt files that had been encrypted.
Workers across the NHS have since been sent emails from the health service’s IT teams warning not to open or click on suspicious attachments or links.
Spain’s National Cryptologic Center, which is part of the country’s intelligence agency, said on its website that there had been a “massive ransomware attack” against a large number of Spanish organizations, affecting Microsoft Corp.’s Windows operating system. El Mundo reported that the attackers sought a ransom in bitcoin.
“We’re aware of reports and are looking into the situation,” said a Microsoft spokesman.
While Friday’s attack could damage the reputation of Microsoft’s security, it’s likely to be limited, said Sid Parakh, a fund manager at Becker Capital Management, which owns Microsoft stock. There have been so many high-profile hacks that if a fix is available it’s the user’s responsibility to download it, he said.
“Every time this happens it hurts the underlying product’s reputation,” Parakh said. But Microsoft has “been in a worse state in the past.”
Unsuspecting Victims
Ransomware typically gets onto a computer when a person unsuspectingly downloads a file that looks like a normal attachment or web link. A hacker can then trigger the malware to freeze the computer, prompting a person to pay a ransom or lose all their files.
Hospitals have been a common target because the culprits know how critical digital records are for treating patients. There have been several incidents in the U.S., including one in Indiana where a hospital’s IT system was taken down and patients had to be diverted to other facilities, according to a local news report.
Ransomware attacks have also been soaring. The number of such attacks increased 50 percent in 2016, according to an April report from Verizon Communications Inc. These attacks accounted for 72 percent of all malware incidents involving the health-care industry in 2016, according to Verizon.
“The large-scale cyber-attack on our NHS today is a huge wake-up call,” said Jamie Graves, chief executive officer of cybersecurity company ZoneFox.
Andrew Barratt, managing principal of Coalfire, a company that provides cybersecurity risk assessments to the health-care sector, said that many NHS hospitals used personal computers running outdated Windows-based operating systems, which made them easy to attack. He said many of these systems were too old to patch and that many NHS Trusts did not spend enough time on technical best practices and audits, leaving them vulnerable to a variety of potential cyber-attacks, including ransomware.