Data Center Knowledge | News and analysis for the data center industry
Thursday, July 13th, 2017
1:00p
How Data Center Space Will Grow Despite Decreasing Number of Data Centers
Erich Hamilton is Director of Engineering for DAMAC.
The number of data centers has been rising continuously since 2009, yet that is about to change. Experts predict that after peaking at roughly 8.6 million in 2017, the number of data centers will begin to decline. The driving factor behind this decline is the migration from smaller, in-house IT facilities to data centers operated by larger service providers. Although the number of data centers will decrease, data center space will not, because overall capacity will continue to grow.
Due to the shift to larger service providers, the role of the corporate data center will change as well. In the past, corporate data centers solely supported operations. Today, they serve a variety of functions, including testing new business models, developing and improving products, and maintaining lasting relationships with customers. Because a data center must now support this variety of functions, its infrastructure must be able to change continuously. That is much harder to accomplish for smaller data centers, which is why they will disappear in the coming years. Today’s data centers need to be flexible, dependable, and easily scalable.
The shift to larger data centers provides organizations with the infrastructure needed to adapt to a variety of needs. It is predicted that over the course of the next five years, most organizations will stop managing their own data centers. This will result in a steady decline of in-house data centers and lead to higher demand for larger service providers. According to International Data Corporation (IDC), by 2018, ‘mega data centers’ will account for 72.6 percent of all service provider data center construction projects.
Virtualization is also eroding the need for smaller data centers. With the increase in virtualization, investments in better power and energy efficiency are a top priority for data centers. Cloud data centers are currently in high demand; while they are not necessarily the less expensive option, they provide speed, flexibility, and efficiency. New technologies are pushing more organizations to operate in the cloud to improve overall operational efficiency.
Organizations are getting more creative in how they design energy-efficient data centers compared to traditional brick-and-mortar facilities. From the design of server racks and power distribution to the placement and quantity of CRAC units, data center designers and manufacturers are exploring new and innovative ways to save energy. The other piece of the efficiency equation is internet availability. Regions of the country with the greatest internet capacity are becoming attractive to organizations, because more and more data is being pushed through the cloud, and keeping latency at an acceptable level requires fatter internet pipes. Small interactions such as transactions or social networking activity are single point-and-clicks that can travel over modest connections, whereas cloud services must move large database payloads across the internet. The latter requires a higher level of connectivity and larger internet pipes, which is another factor driving the shift to larger service providers and data centers.
The need for flexible, dependable, and faster technology has driven the shift from smaller to larger data centers. While the total number of data centers is decreasing, virtualization and the cloud are becoming the new standard for extending value and scale. With numerous changes and trends currently impacting data centers, the industry will continue to expand, but it will most likely consolidate around a few key players.
Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
4:33p
Millions of Verizon Customer Records Exposed through Open Amazon S3 Bucket
Brought to you by IT Pro
An organization is only as secure as its most vulnerable partner, which is a lesson Verizon is learning the hard way this week as 14 million records belonging to its subscribers were found on an unprotected Amazon S3 storage server managed by Israeli software company Nice Systems.
The records include each customer’s name, cell phone number, and account PIN, which could be used to gain access to a subscriber’s account, according to a report by ZDNet. Verizon said there is no evidence that the information in the data set had been compromised.
UpGuard researcher Chris Vickery found the data and told Verizon of the exposure at the end of June. It took over a week before the data was secured. Last month, UpGuard discovered exposed personal information belonging to more than 198 million registered U.S. voters stored in an Amazon S3 bucket belonging to data firm Deep Root Analytics.
The records also included a customer’s home address, email address, and account balance, as well as whether the subscriber has a Verizon federal government account, among other data. There were no audio files found on the server.
The data was contained in log files generated over the last six months as Verizon customers called customer service. Nice stores the records to analyze “intent, and extract and leverage to deliver impact in real time,” which allows Verizon to improve customer service.
A Verizon spokesperson told ZDNet that the company had provided the vendor with “certain data to perform this work and authorized the vendor to set up AWS storage as part of this project. Unfortunately, the vendor’s employee incorrectly set their AWS storage to allow external access.”
Nice has said little about the issue, noting only that the data was part of a “demo system” and that no other Nice customer has been impacted. The company said it will continue to investigate the exposure as Verizon does the same.
5:02p
Cisco Beefs Up Security Arsenal With Observable Networks Acquisition
As businesses are expected to spend $1 trillion by 2021 to combat cybercrime, Cisco has been taking measures to beef up its security offerings with a series of acquisitions in this arena.
Today, the San Jose-based networking company announced that it has acquired cloud security startup Observable Networks for an undisclosed amount, reported the St. Louis Business Journal. Based in St. Louis, the company, which monitors activity in the cloud to help detect security breaches, had raised less than $5 million, according to Crunchbase.
Cisco also bought Georgia-based Lancope in 2015 for $453 million and Massachusetts-based Cloudlock last year for $293 million. Microsoft and Oracle have also made recent key security acquisitions.
“The ability to dramatically improve visibility, security and response capabilities across an entire IT surface, including highly distributed branch environments and public cloud infrastructures, is becoming increasingly important as companies and organizations continue their digital transformation,” Rob Salvagno, Cisco’s head of mergers and acquisitions, said in a blog announcing the acquisition.
Also read: Is a Retreat from Private Cloud Also Under Way? Cisco Weighs In
Some say the foray into new security systems goes hand-in-hand with Cisco’s attempt to make up for its stumbling legacy business.
“The acquisition of Observable Networks supports Cisco’s strategic transition toward software-centric solutions,” Salvagno said.
Observable Networks’s technology will be folded into Cisco’s Stealthwatch product, which is part of its Security Business Group run by David Ulevitch. Cisco bought his security company, OpenDNS, in 2015 for $635 million.
5:10p
CenturyLink Sued by Minnesota Amid $12 Billion Class Action
(Bloomberg) — CenturyLink Inc., already facing a $12 billion lawsuit alleging that consumers were saddled with costly unwanted services, now must deal with a complaint by Minnesota, the first state to sue the telecommunications company over its billing practices since the class action was filed.
The state attorney general’s suit, filed Wednesday, accuses CenturyLink of consumer fraud and deceptive trade practices. “CenturyLink has regularly misquoted the price of its internet and television services to Minnesota consumers,” it alleges. “In a response to a complaint from the Minnesota Attorney General’s Office on behalf of a consumer, a CenturyLink employee stated that, of the sales recordings she reviews, ‘maybe 1 out of 5 are quoted correctly or close enough.’ ”
“Shopping for internet and cable TV service isn’t easy if companies don’t give straight answers about the prices they will charge,” Attorney General Lori Swanson said in a statement.
The complaint comes as the Monroe, La., company is in the midst of a $34 billion merger with Level 3 Communications Inc. that will put CenturyLink up against powerhouses such as AT&T Inc. in bidding to provide communications services to businesses.
“We have been cooperating with the Minnesota Attorney General’s Office since its inquiry began and have provided all information requested,” Mark Molzen, a CenturyLink spokesman, said in an e-mail. “We are disappointed that the Attorney General has chosen a press conference to communicate her concerns instead of contacting CenturyLink directly. We take these allegations seriously and will review and respond in due course.”
Shares of CenturyLink closed down 3.2 percent, to $22.51, on the news.
The Minnesota attorney general’s office received hundreds of complaints about billing from CenturyLink customers, Swanson said in a phone interview.
“We’ve been investigating for over a year,” she said, first serving the company with a civil investigative demand. In one instance, her office asked CenturyLink for recordings of customer calls, and the company replied that they didn’t exist, she said. She said she got them only after her office subpoenaed a third-party vendor that had custody of the recordings. According to the complaint, CenturyLink called the attorney general’s request for full price information “unduly burdensome.”
“First and foremost, we want the company to stop doing what it’s doing to people,” Swanson said. “We would like to see injunctive relief so the company quotes accurate pricing. And, second, we want to see money back for people.”
CenturyLink’s Molzen said the company “produced a significant volume of documents and information, including audio tapes. We have kept an open line of communication with the [attorney general’s] Office throughout the investigation. At no time did the Attorney General’s Office suggest that CenturyLink was not providing information or responding adequately to the Office’s requests.”
The earlier suit, seeking class action status and damages as high as $12 billion, is making its way through the courts. It was brought by the Geragos & Geragos law firm, led by celebrity attorney Mark J. Geragos. Class actions are common after contentious allegations against large companies.
CenturyLink was accused of inappropriately charging customers by Heidi Heiser, a former employee who claims she was fired for whistleblowing and who sued the company in Arizona for wrongful termination in June. Her complaint alleged that CenturyLink “allowed persons who had a personal incentive to add services or lines to customer accounts to falsely indicate on the CenturyLink system the approval by a customer of new lines or services.”
Additional cases have been filed in California, Oregon, Idaho, Colorado, Nevada, and Washington.
“We are proud to have exposed this massive fraud on consumers, which has unjustly enriched CenturyLink to the tune of billions of dollars nationwide,” said Ben Meiselas, the attorney with Geragos & Geragos who represents plaintiffs in those cases and is seeking to have them treated as a class. “We applaud Attorney General Swanson and call on all Attorney Generals across the nation to protect its consumers against CenturyLink.”
Of the class action complaint, CenturyLink’s Molzen said last month, “The fact that a law firm is trying to leverage a wrongful termination suit into a putative class action lawsuit does not change our original position.” He said Heiser failed to report her allegations to the company’s 24-hour Integrity Line, that they are “completely inconsistent” with company policy and culture, and that “we take these allegations seriously and are diligently investigating this matter.”
Heiser didn’t report her concerns to the Federal Communications Commission or other authorities.
CenturyLink, which provides data services nationwide, including hosting, cloud, and information technology services, booked $816 million in net income on $17.5 billion in sales last year.
5:22p
Intel’s Xeon Scalable Designed From the Ground Up for Data Centers
While the focus early this week might have been on Microsoft’s Inspire in the nation’s capital, Intel was having an event of its own in New York City on Tuesday.
Promising it will revolutionize the data center, Intel launched its latest Xeon Scalable line based on its Skylake architecture.
“Today Intel is bringing the industry the biggest data center platform advancement in a decade,” said Navin Shenoy, vice president and general manager of Intel’s data center group. “A platform that brings breakthrough performance, advanced security, and unmatched agility for the broadest set of workloads — from professional business applications, to cloud and network services, to emerging workloads like artificial intelligence and automated driving.”
Intel wants to tighten its grip on the data center market, where workloads are accelerating as new technologies such as blockchain and IoT compete for bandwidth. The new line promises a quantum leap in performance, with Platinum-level processors supporting two, four, or eight sockets and offering up to 28 cores with 56 threads and up to three 10.4 GT/s UPI links. Add to this a clock speed of 3.6GHz, 48 PCIe 3.0 lanes, six memory channels of DDR4-2666 DRAM, and support for up to 1.5TB of top-line memory capacity, and you have a server ready for some heavy lifting.
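As an aside, the arithmetic below shows one way the 1.5TB-per-socket figure can be reached. It is a back-of-the-envelope sketch only: the six-channel count comes from the article, while the two-DIMMs-per-channel and 128GB-per-DIMM assumptions are illustrative and vary by SKU.

```c
#include <stdio.h>

/* Back-of-the-envelope check on the 1.5TB memory figure.
 * Assumptions (not from the article): 2 DIMMs per channel, 128GB DIMMs. */
int main(void)
{
    int channels = 6;            /* six memory channels, per the article */
    int dimms_per_channel = 2;   /* assumed */
    int dimm_gb = 128;           /* assumed DIMM size in GB */
    int total_gb = channels * dimms_per_channel * dimm_gb;   /* 1536 GB */
    printf("per-socket capacity: %d GB (~%.1f TB)\n", total_gb, total_gb / 1024.0);
    return 0;
}
```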
If that sounds like overkill, it isn’t.
The launch comes just as AMD seems to be becoming a player that matters again with the recent launch of its less expensive Epyc server line, which offers up to 32 cores, 64 threads, and a 3.2GHz boost clock. Benchmarks probably aren’t as good as with Xeon Scalable, but that also depends on who’s talking. There’s additional competition in the pipeline from power-sipping, even cheaper ARM-based server chips, which will undoubtedly find increased data center uptake for some workloads.
The company showed no indication that it’s looking over its shoulder, however, as it launched the Xeon Scalable line. There was no mention of any competition, just a forward-looking focus on the new offering and what it can do for data centers.
“The Intel Xeon Scalable Platform includes a completely rearchitected microprocessor designed from the ground up for the data center specifically,” Shenoy explained early on at the launch event, “offering greater levels of integration and workload specific accelerators. The Xeon Scalable Platform is the industry’s highest performance per watt platform.”
The platform offers more than a mere increase in performance or faster speeds, however, as Lisa Spelman, vice president and general manager of Xeon Products and Data Center Marketing, pointed out.
“We’re delivering on traditional performance drivers, like more cores, higher performing cores, more memory, and more I/O,” she said. “Intel’s taking it far beyond that. We’re moving beyond the basics and we’re adding unique innovations that truly address the unique requirements of data center workloads.”
These innovations include AVX-512, an instruction set in the Xeon Scalable processor that accelerates data processing.
“This has traditionally been used or valued in the high performance computing space,” she said, “but is actually expanding and growing into new workloads as well. It takes vector performance to a new level by doubling the width of the registers and doubling again the number of registers available. This results in a 2X flops-per-clock efficiency compared to the previous generation, and it makes vector computation a lot more effective and more applicable across those wider workloads.”
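For readers curious what the wider registers look like in code, here is a minimal, hypothetical C sketch (not Intel’s code) of a fused multiply-add loop written with AVX-512 intrinsics. Each 512-bit register holds 16 single-precision floats, double the 8 floats of a 256-bit AVX2 register; the function name and the -mavx512f build flag are illustrative assumptions.

```c
#include <immintrin.h>
#include <stddef.h>

/* Illustrative sketch: c[i] += a[i] * b[i] using AVX-512 intrinsics.
 * Each 512-bit ZMM register carries 16 floats, double the width of AVX2.
 * Build with an AVX-512-capable compiler, e.g. gcc -O2 -mavx512f. */
static void fma_avx512(const float *a, const float *b, float *c, size_t n)
{
    size_t i = 0;
    for (; i + 16 <= n; i += 16) {
        __m512 va = _mm512_loadu_ps(a + i);
        __m512 vb = _mm512_loadu_ps(b + i);
        __m512 vc = _mm512_loadu_ps(c + i);
        /* 16 multiply-adds per instruction */
        _mm512_storeu_ps(c + i, _mm512_fmadd_ps(va, vb, vc));
    }
    for (; i < n; i++)   /* scalar tail for any leftover elements */
        c[i] += a[i] * b[i];
}
```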
Quick Assist is also integrated into the platform; the technology was originally used to accelerate compute-intensive operations by reducing the size of data packets. Here it serves as something of a security tool as well, reducing the server performance hit caused by encryption.
“We’ve also worked with our customers across the cloud service providers and across enterprises and realized that Quick Assist technology delivers great speedups for security algorithms,” she explained. “With Quick Assist integrated into the chipset in this generation, you get a hundred gigabits of cryptography and a hundred gigabits of compression, which allows you to increase the overall workload performance, while freeing up that precious compute capacity on the cores for higher order function.”
There’s also a new architectural design for the processor itself, utilizing a mesh rather than a ring design, which was necessitated by the higher number of cores available in Xeon Scalable.
“In the previous design, data would have to go around the ring — if you’re starting at the last core and need to get to the one on the other side — through a buffer and all the way back around the second ring to get to the final destination. With Mesh, the data simply cuts across the improved and increased number of data pathways and avoids the buffer. Also, CPUs don’t actually process one cache line at a time. A real data center CPU needs to have all cores accessing all memory, all I/O, at the same time, and the mesh fundamentally provides this.”
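To make the ring-versus-mesh point concrete, the toy model below compares worst-case hop counts for a bidirectional ring and a simple rows-by-columns mesh. It is a deliberately simplified sketch: the grid dimensions are hypothetical and it ignores the real Skylake floorplan and buffer crossings, but it shows why worst-case distance grows roughly with the core count on a ring and only with rows plus columns on a mesh.

```c
#include <stdio.h>

/* Toy model only: worst-case hop counts on a bidirectional ring of n stops
 * versus a rows x cols mesh. Real Skylake-SP mesh details differ. */
static int ring_worst_hops(int n)              { return n / 2; }
static int mesh_worst_hops(int rows, int cols) { return (rows - 1) + (cols - 1); }

int main(void)
{
    int cores = 28, rows = 4, cols = 7;   /* hypothetical 28-core layout */
    printf("ring worst case: %d hops\n", ring_worst_hops(cores));       /* 14 */
    printf("mesh worst case: %d hops\n", mesh_worst_hops(rows, cols));  /*  9 */
    return 0;
}
```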
The benchmarks offered by Intel were impressive, showing performance improvements of up to 1.65X over Xeon’s last generation and a 2X increase in data protection performance. The company also claims 4.2X greater VM capacity, with total cost of ownership dropping by as much as 65 percent.

More interesting than Intel’s own benchmarks were performance numbers reported by software companies that were given early access to Xeon Scalable. Running its cloud-based video-stitching application optimized for the new platform, Tencent saw a 72 percent gain, and SAS, running its business analytics stack, reported a doubling of performance over Xeon’s last generation.
The platform has been field tested under production workloads as well.
“We took our most aggressive early ship program that we’ve ever delivered in the data center group,” Spelman said, “delivering production processors ahead of launch to cloud service providers, enterprises, high performance computing users, and communications service providers like AT&T. Starting in November of last year, we have sold over 500,000 processors optimized for data center workloads to over thirty customers.”
Google was the first of these customers to launch Xeon Scalable instances, on its Google Compute Engine platform in February, and now has customers in retail, financial services, oil and gas, and education running in production.
Bart Sano, Google’s vice president of platforms, spoke briefly at the event by video. “Our customers have reported up to 40 percent improvement in many cases versus previous platforms,” he said. “In some cases, where the customers tuned for the AVX-512, they saw more than 100 percent improvement.”
5:44p
Microsoft-Backed Security Startups Outsmart Hackers with Hackers
By WindowsITPro
To beat a hacker, you have to think like a hacker, and in some cases even work with them, according to two Microsoft Ventures startups that spoke during the Microsoft Inspire 2017 event this week.
Microsoft Ventures launched last year to focus on startup investments, unlike Microsoft Accelerator, which focuses on startup enablement through its accelerators around the world. Two companies that were added to its portfolio this year were on hand at Microsoft Inspire to talk about their approaches to cybersecurity, which remained a hot topic over the week dominated by news about Microsoft 365 and Azure Stack.
Synack Crowdsourced Cybersecurity Platform Helps Close Talent Gap
In a breakout session on Wednesday, Synack CEO and co-founder Jay Kaplan talked about its crowdsourced approach to cybersecurity, which he said is much more scalable than hiring security pros, particularly in light of the massive cybersecurity talent gap.
“Clearly there’s a big issue here; we need to recruit more talent and frankly most organizations are not going to be able to fill the positions they need,” he said.
The company is based in Redwood City, Calif., and was launched in 2013 when Kaplan and Mark Kuhr left the NSA. In March, it closed $21 million in Series C funding, with Microsoft Ventures participating in the round.
Kaplan said that crowdsourced security is not a new concept, pointing to bug bounty programs, which have been around since the mid-nineties. But he says Synack lets customers reap the benefits of crowdsourced cybersecurity expertise without worrying about negotiating payouts or verifying that the research is legit.
Its security platform offers crowdsourced penetration testing, vulnerability orchestration, analytics and risk reporting. According to Kaplan, 75 percent of its programs find a severe vulnerability in the first 24 hours.
“From a process perspective it is much more scalable, it’s on-demand, you’re not waiting for someone to come on board, and you have a whole stable of people working on these problems,” he said.
As the gap in cybersecurity talent continues to mount, Kaplan said that Synack helps organizations close this by leveraging cybersecurity expertise from researchers and experts all over the world. This diversity is critical in helping companies find threats faster, he said – much faster than relying on consultants or a handful of on-premise staff.
“You can crowdsource security; to beat a Russian you have to have one on your team,” he said, noting that the cybersecurity talent that you need may not be available in your local market.
Through its platform, Synack is able to address some of the potential pitfalls of a crowdsourced cybersecurity approach, namely verifying that random people on the internet are who they say they are.
“We were able to solve these problems with process and technology,” he said.
Synack does background checks on its security researchers, and they sign an NDA. Its platform also handles triage and prioritization of reports, which takes a considerable amount of work, particularly if a company was to do that in-house, Kaplan said.
Illusive Networks: Beating Hackers at their Own Game
Israel-based Illusive Networks became a Microsoft Ventures portfolio company in Q1 2017. It was the first company launched out of the Israeli cybersecurity incubator Team8, whose founders are alums of the Israel Defense Forces.
Tracy Pallas, VP of channel sales and strategy, who joined Illusive Networks from roles at Extreme Networks and Palo Alto Networks, spoke at Microsoft Inspire on Wednesday about the company’s platform, which rethinks the common post-breach strategy.
Post-breach technologies can be problematic for security operations centers because they generate a lot of false positives, making it hard to determine which alerts should get priority. Pallas claims that Illusive Networks’ solution has no false positives; she said its motto is, if it’s positive, it’s positive.
“A new post-breach detection strategy is a must,” she said. “We approach the problem by thinking like an attacker.” Part of this approach includes understanding the psychology of attackers and what they are looking for by watching how they move laterally within a network, she said.
“We don’t focus on signatures; we focus on lateral movements,” she said. Once attackers breach a perimeter, they need to determine where they have landed, where they can go from there, and how they will get there. Illusive Networks thwarts this approach by making the network look much larger than it actually is, Pallas said, basing its methodology on the OODA loop.
Illusive Networks deploys deceptions such as fake browser history and false credentials, which resolve back to a trap server. These deceptions are used to track the attacker and send the information collected back to incident response teams.
It uses an agentless approach, which means it doesn’t require any additional IP space, and is distributed through a 100 percent channel model.