Data Center Knowledge | News and analysis for the data center industry
Tuesday, March 29th, 2016
12:00p
How Open Source is Changing Data Center Networking
This month, we focus on the open source data center. From innovation at every physical layer of the data center coming out of Facebook’s Open Compute Project to the revolution in the way developers treat IT infrastructure that’s being driven by application containers, open source is changing the data center throughout the entire stack. This March, we’ll zero in on some of those changes to get a better understanding of the pervasive open source data center.
The perfect data center masks the complexity of the hardware it houses from the requirements of the software it hosts. Compute, memory, and storage capacities are all presented to applications and services as contiguous pools. Provisioning these resources has become so automated that it’s approaching turnkey simplicity.
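The pooling described above can be illustrated with a toy scheduler. The sketch below is purely hypothetical (the host names, capacities, and first-fit policy are invented for the example, not drawn from any specific product): an application requests resources from the pool, and a placement routine finds a host that can satisfy the request.

```python
# Hypothetical sketch: provisioning a workload against pooled capacity.
# Host names, capacities, and the first-fit policy are illustrative only.

def first_fit(hosts, request):
    """Return the first host with enough free CPU, memory, and storage."""
    for host in hosts:
        if all(host["free"][k] >= request[k] for k in request):
            # Reserve the resources so later requests see the updated pool.
            for k in request:
                host["free"][k] -= request[k]
            return host["name"]
    return None  # no single host can satisfy the request

pool = [
    {"name": "rack1-node1", "free": {"cpu": 8,  "mem_gb": 32, "disk_gb": 200}},
    {"name": "rack1-node2", "free": {"cpu": 16, "mem_gb": 64, "disk_gb": 500}},
]

placement = first_fit(pool, {"cpu": 12, "mem_gb": 48, "disk_gb": 100})
print(placement)  # rack1-node2 -- the only node with 12 free cores
```

Real schedulers weigh far more than a first-fit pass (affinity, failure domains, oversubscription), but the turnkey experience the paragraph describes reduces to this shape: declare what you need, and the platform finds room for it.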
This is the part of the story where the open source movement stands up, takes a bow, and thanks its various supporters, agents, and parents for making everything possible for it. To say that open source efforts are responsible for the current state of data center networking would be like crediting earthquakes for the shapes of continents. Yes, they play an undeniably formative role. But the outcome is more often the result of all the elements — many of them random — they put into play.
One of these elements is a trend started by virtualization — the decoupling of software from the infrastructure that supports it. Certain open source factions may be taking credit for the trend of disaggregation now, but the next few years of history may record it as something more akin to a gathering storm.
“A very fundamental architectural principle that we believe in is, first of all, we want a platform in the future that allows hardware innovation and software innovation to grow independently,” said Equinix CTO Ihab Tarazi, in a keynote address to the Linux Foundation’s Open Networking Summit in Santa Clara, California, earlier this month.
 Ihab Tarazi, CTO, Equinix (Photo: Scott Fulton III)
“I don’t think the industry has that today,” Tarazi continued. “Today, if you innovate for hardware, you’re still stuck with this specific software platform, and vice versa; and all the new open source software may not have… a data center, without customized adoption in specific areas by service providers. So what we want to create in our data center is the platform that allows the new explosion of hardware innovation that’s coming up everywhere, in optics and switching and top-of-rack — all of that, to have a home, to be able to connect to a platform independently of software. And also we want all the software innovation to happen independently of the hardware, and be able to be supported.”
From Many Routes to One CORD
It isn’t so much that open source, in and of itself, is enabling this perfect decoupling of hardware from software that Tarazi envisions. Rather, it’s the innovation in data center automation and workload orchestration happening within the open source space over just the past three years that is compelling the proprietors of the world’s largest data centers to rethink the architecture, dynamics, and purpose of their networks. Telecommunications providers especially now perceive their networks as data centers, and not just in the figurative sense.
Read more: Telco Central Offices Get Second Life as Cloud Data Centers
“We want to scale out [the network] in the same way that we scale out compute and storage,” explained AT&T SDN and NFV engineer Tom Anschutz, speaking at the event. It’s part of AT&T’s clearest signal to date that it’s impressed by the inroads made by Docker and the open source champions of containerization at orchestrating colossal enterprise workloads at scale. But it wants to orchestrate traffic in a similar way, or as similar as physics will allow, and it wants open source to solve that problem, too.
Last June, AT&T went all-in on this bet, joining with the Open Networking Lab (ON.Lab) and the Open Network Operating System (ONOS) Project to form what’s now called Central Office Re-imagined as a Datacenter (CORD, formerly “Re-architected”). Its mission is to make telco infrastructure available as a service in an analogous fashion to IaaS for cloud service providers.
Anschutz showed how a CORD architecture could, conceivably, enable traffic management with the same dexterity that CSPs manage workloads. Network traffic enters and exits the fabric of these re-imagined data centers using standardized interfaces, he explained, and may take any number of paths within the fabric whose logic is adjusted in real-time to suit the needs of the application.
“Because there’s multiple paths, you can also have LAN links that exceed the individual thread of capacity within the fabric,” he said, “so you can create very high-speed interfaces with modest infrastructure. We can add intelligence to these types of switches and devices that mimic what came before, so control planes, management planes, and so forth can be run in virtual machines, with standard, Intel-type processors.”
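The multi-path behavior Anschutz describes is commonly implemented with equal-cost multi-path (ECMP) hashing: each flow's 5-tuple is hashed so all of its packets take the same link while different flows spread across the fabric. The sketch below is a simplified illustration (the spine names and addresses are invented; real switches hash in silicon, not in Python):

```python
import hashlib

# Illustrative ECMP-style path selection for a leaf-spine fabric.
# Path names and the 5-tuple values are made up for the example.

def pick_path(paths, src_ip, dst_ip, src_port, dst_port, proto="tcp"):
    """Hash the flow's 5-tuple so every packet of a flow takes one path."""
    key = f"{src_ip}|{dst_ip}|{src_port}|{dst_port}|{proto}".encode()
    digest = int.from_bytes(hashlib.sha256(key).digest()[:8], "big")
    return paths[digest % len(paths)]

spines = ["spine-1", "spine-2", "spine-3", "spine-4"]
path = pick_path(spines, "10.0.1.5", "10.0.2.9", 49152, 443)
# The same flow always hashes to the same spine, keeping packets in order,
# while distinct flows spread across all four uplinks -- which is how the
# fabric's aggregate capacity can exceed any single link's capacity.
```

This is also why the control logic can live in software on commodity processors, as Anschutz notes: the hash function and path table are policy, separable from the links that carry the traffic.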
12:00p
Resolve to Make More Data-Driven Decisions: 5 Trends to Consider
Lance Walter works in global marketing for Datameer.
In our new data economy, the best business decisions are data driven. To stay ahead of the curve, companies must use data to better understand and reach customers, develop new revenue streams, and dramatically improve operational efficiencies.
Without data they will fall behind. As companies map out their goals for the next year, they should commit to making data analytics the crux of their strategies. In order to build a comprehensive data strategy, businesses should be aware of these top five trends shaping the industry:
The Internet of Things
As data grows more complex, the barrier to entry for data analytics rises with it. That barrier will climb sharply with the Internet of Things (IoT), as the number of connected devices is predicted to rise by 285 percent, according to a recent report from Juniper Research. Organizations that want to bridge the technical gap and enable data-driven decisions across teams must find a way to deliver user simplicity while staying under the umbrella of data governance.
While the IoT has already emerged as the next mega-trend, it will continue to grow beyond just hype. Companies must actively change their strategy and infrastructure to harness the power and insight of IoT technologies and their data.
New Big Data Case Studies are Emerging
Big data also allows companies to find new revenue streams with new data-driven products and services. By combining, integrating and analyzing data – regardless of source, type, size, or format – business leaders can quickly and affordably scale to huge volumes of data and analyze them for insights.
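As a minimal illustration of combining data "regardless of source, type, size, or format," the sketch below joins a CSV-style order feed with a JSON clickstream on a shared customer ID using only the Python standard library. The datasets and field names are invented for the example:

```python
import csv
import io
import json

# Made-up sample data in two different formats.
orders_csv = "customer_id,total\nc1,120.50\nc2,80.00\n"
clicks_json = '[{"customer_id": "c1", "pages": 14}, {"customer_id": "c2", "pages": 3}]'

# Parse each source into a dict keyed on the shared customer ID.
orders = {row["customer_id"]: float(row["total"])
          for row in csv.DictReader(io.StringIO(orders_csv))}
clicks = {rec["customer_id"]: rec["pages"] for rec in json.loads(clicks_json)}

# One merged view, regardless of where each field came from.
merged = [{"customer_id": cid, "total": orders[cid], "pages": clicks.get(cid, 0)}
          for cid in orders]
```

At real volumes this normalize-then-join step runs on a distributed engine rather than in-memory dicts, but the pattern is the same: map heterogeneous sources onto a common key, then analyze the combined view.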
Customer analytics is traditionally considered the primary use case for big data analytics. However, as companies better understand their customer base, use cases will expand from external to internal – making operational analytics the main use case.
Cloud Analytics is a Game Changer
Right now, a few cloud-only Hadoop players exist, while other vendors offer distinct on-premise and cloud editions of their products. As companies recognize the advantage of sidestepping Hadoop hardware requirements, which become outdated every 18 months, cloud adoption will surge. A Gartner report found that skills gaps remain a major adoption inhibitor for 57 percent of respondents. Vendors, particularly distributors, will pivot their offerings in order to keep up with demand for cloud-based solutions.
Streaming Data Goes Mainstream
The desire to gather data – and analyze it – in real-time is growing immensely, driven by IoT applications and our increasing tendency toward instant gratification. Mitigating the complexities of mashing up real-time and historical data will become a focal point – pointing toward technologies like Kafka and others.
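The "mashup" of real-time and historical data the paragraph points to often takes a simple shape: compare each incoming reading against a precomputed historical baseline while keeping a rolling window of recent context. The sketch below is hypothetical (the baseline, readings, and threshold are invented); in production the stream might arrive via Kafka, but here it is a plain Python iterable:

```python
from collections import deque

# Made-up historical aggregate, e.g. last quarter's average, precomputed
# from batch data rather than from the live stream.
HISTORICAL_BASELINE = 100.0

def detect_spikes(stream, window_size=3, threshold=1.5):
    """Flag readings exceeding the historical baseline by `threshold`x,
    recording the rolling window of recent values alongside each alert."""
    window = deque(maxlen=window_size)  # rolling real-time context
    alerts = []
    for reading in stream:
        window.append(reading)
        if reading > HISTORICAL_BASELINE * threshold:
            alerts.append((reading, list(window)))
    return alerts

alerts = detect_spikes([90, 110, 180, 95, 160])
# 180 and 160 exceed 1.5 x 100, so two alerts fire.
```

The complexity the paragraph mentions lies in doing this at scale with out-of-order, never-ending streams, which is exactly the gap technologies like Kafka and its stream-processing ecosystem aim to fill.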
The Big Data Industry Will Undergo Major Transitions
Best-of-breed solutions are starting to lose their luster. As big data has grown as a technology category, multiple companies emerged with individual products that provided a solution for a portion of the space. This has forced customers to buy multiple tools and learn how to use them together. Customers are seeing that multiple tools often slow down the process, create data redundancies, and complicate compliance. This is driving them toward more integrated, comprehensive stacks.
Likewise, industry consolidation is reaching critical mass. A number of acquisitions have taken place over the last two years as larger organizations try to fill holes in their offerings, shrinking the number of pure-play, standalone big data vendors.
Companies not using data to guide decision-making are at a serious competitive disadvantage. As more businesses resolve to make data-driven decisions, staying on top of these trends will help guide the most successful data strategies.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
1:45p
Crypto War Cease Fire: DOJ Finds Way into iPhone Without Apple’s Help, Order Vacated
By WindowsITPro
It appears the mysterious third party that promised a way to access locked iPhones has come through: The order stating Apple must decrypt the iPhone used by one of the San Bernardino shooters has been vacated, at the request of the Department of Justice.
A week ago, the DOJ filed asking for a delay in a hearing on the order, stating that during the ongoing court battles (a court initially ruled in the Department of Justice’s favor, which Apple appealed), a third party had reached out to them with a method it believed could successfully unlock the iPhone of one of the San Bernardino shooters.
This came after the DOJ had previously stated that it had exhausted all other options regarding accessing the contents of the county-issued phone.
What that method was — or who approached the DOJ — is still unknown. Officials, speaking on background to USA Today, denied that the company was Cellebrite, as was widely reported elsewhere.
Whatever relief to the tensions this provides is likely to be short-lived: A number of other cases in which the government is demanding assistance in accessing encrypted devices are working their way through the courts, and neither side appears willing to back down.
Original article appeared here: Crypto War Cease Fire: DOJ Finds Way into iPhone Without Apple’s Help, Order Vacated
5:51p
Oracle Wants $9.3B from Google as 6-Year-Old Java Copyright Battle Continues
By The WHIR
Oracle is looking for $9.3 billion in damages from Google as its long-running copyright battle goes back to court on May 9.
The case centers on Oracle’s claim that Google needs a license to use parts of the Java platform in Android. In 2012, the jury was split on whether the search giant’s use of Java was protected by fair use. The new trial will cover six additional versions of Android, up to and including Lollipop.
Oracle is asking for about 10 times the amount it sought when the case initially went to trial in 2012. That figure could be lowered before the case goes to trial in less than six weeks.
Read more: Oracle Pitches On-Prem Cloud for Compliance-Conscious Enterprises
IDG said that $9.3 billion “reflects the dramatic growth of both Android and the smartphone market in the intervening years.” Consider this: in Q2 2012 Android accounted for 69.3 percent of the smartphone market; by Q2 2015 that number had increased to 82.8 percent.
On Friday, US District Judge William Alsup called on Google and Oracle to explain how they plan to use the information available on social media profiles of potential jurors. Prospective jurors will be given a chance to adjust the privacy settings on their social media accounts prior to the jury selection process.
In the last trial, Alsup found Oracle’s Java programming interfaces were ineligible to be copyrighted, but the Federal Circuit reversed his ruling in May 2014. The US Supreme Court refused to hear an appeal of the reversal in June 2015, according to Courthouse News.
Original article appeared here: Oracle Wants $9.3B from Google as 6-Year-Old Java Copyright Battle Continues
8:50p
Survey: Containers and Microservices Now Entering Production Environments
By Talkin’ Cloud
Containers — the data center technology you’ve heard all about, but probably not seen often in production — are finally making their way into the real world. That’s according to a survey out this week from NGINX.
Titled “2016 Future of Application Development and Delivery Survey,” the report reveals data based on responses from more than 1,800 IT professionals. It covers a broad range of topics related to application development and deployment, ranging from cloud trends to security issues.
But the most interesting finding was about container adoption. According to NGINX, 20 percent of respondents say they are now using containers in production. And those that have already adopted containers are using them in a big way, with one-third reporting running them for more than 80 percent of workloads. This suggests that, while the move to containers may have been slow to get underway, companies are doing much more than just dabbling with them once they finally start putting them into production.
Overall, two-thirds of organizations say they are “either investigating containers or are using them in development or production,” according to the survey. About 70 percent of respondents said the same of microservices and related technologies, such as unikernels (Docker recently acquired unikernel specialist Unikernel Systems).
NGINX didn’t provide details on which types of containers and microservices are now entering production environments. It seems a safe bet that Docker containers top the list, however, since the company’s container platform has enjoyed the greatest amount of funding and attention in past years.
But the container ecosystem is diverse. You can’t write off the competing container platform from CoreOS. And Docker containers can be combined with tools from a variety of other vendors, such as container orchestration platforms like Kubernetes. For these reasons, no one should make bets yet about any one container solution dominating production environments.
Other notable — if unsurprising — findings from the survey include Amazon AWS’s continuing dominance in the public cloud, and organizations reporting that they are still struggling to deliver satisfactory app performance and security. Full details are available from NGINX’s website.
This first ran at http://talkincloud.com/cloud-computing/survey-containers-and-microservices-now-entering-production-environments