Data Center Knowledge | News and analysis for the data center industry
Friday, March 27th, 2015
12:00p
Ayasdi Raises $55M to Blend AI and Machine Learning

People talk about the challenges of big data, but the real problem is not size, it’s the growing number of variables within that data that make it complex, Ayasdi CEO Gurjeet Singh said.
Ayasdi provides machine-intelligence software, a combination of machine learning and artificial intelligence that solves very complex problems for the largest institutions. The company recently raised a $55 million round led by Kleiner Perkins Caufield & Byers for more than $100 million total raised to date.
Just as data is becoming increasingly complex, yet essential for operations, the number of possible insights is also growing and rendering the traditional approach to analytics ineffective. So, it’s no surprise that the number of data scientist or data analyst positions is on the rise. However, Ayasdi doesn’t think it’s a problem you can hire away: the fundamental approach needs to change.
The story of Ayasdi starts with the Defense Advanced Research Projects Agency (DARPA) and the National Science Foundation (NSF) awarding $10 million in grants to Stanford professor Gunnar Carlsson to apply Topological Data Analysis (TDA) to real-world problems. Ph.D. mathematics student Gurjeet Singh joined his then professor Gunnar Carlsson on the TDA research project, which continued to grow and ultimately became Ayasdi (Ayasdi means “to seek” in Cherokee).
Its customer base has grown significantly over the years as many pilot deployments mature, said Singh. Ayasdi deals exclusively with Fortune 1000-type companies in healthcare, financial services, and other industries. These are sizable, seven-figure deals, with customers showing even more substantial savings. Some publicly disclosed customers include Lockheed Martin and Citigroup. Ayasdi plays in the big leagues of analytics and is doing well, as illustrated by what the company claims was a ninefold increase in annual recurring revenue last year.
Automating Insights
A typical analytics approach is very manual, time-consuming, and inefficient.
“Traditional analytics have hit the wall,” said Ayasdi Chief Marketing Officer Patrick Rogers. “It starts with an analyst asking questions, and then applying them against data that may or may not find insight. You must then go back and reformulate until you find something impactful. There are a lot of tools, but it’s still fundamentally a human-driven process. That model is not going to scale—the number of possible questions grows exponentially with data sets.”
Rather than the hypothesis-and-test approach, Ayasdi takes a machine-driven one to address complex data. At the heart of Ayasdi’s machine intelligence is topological math, which underpins a more automated discovery process and eliminates manual steps. Ayasdi identifies what you should be looking for through patterns in complex data.
“The old methodology is essentially guesswork,” said Singh. “There are many tools to validate or invalidate hypotheses. For large businesses, relying on luck is not the best strategy. We use this concept of topological math. We exploit TDA, we look at the shape of the data and accelerate the discovery process. It’s why we’re able to do it so quickly and accurately.”
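Ayasdi has not published its algorithms, but the best-known TDA construction, Mapper, conveys the “shape of the data” idea: cover the dataset with overlapping slices of a filter function, cluster the points inside each slice, and connect clusters that share points. The Python sketch below is purely illustrative (toy data, arbitrary parameters) and is not Ayasdi’s implementation.

```python
# A minimal, illustrative sketch of the Mapper construction from topological
# data analysis (TDA). This is NOT Ayasdi's code; it only demonstrates the
# "shape of the data" idea described in the article.
import numpy as np
from sklearn.cluster import DBSCAN
from itertools import combinations

def mapper_graph(points, n_intervals=10, overlap=0.5, eps=0.5, min_samples=3):
    """Build a simple Mapper graph from a 2-D point cloud.

    Filter (lens) function: projection onto the first coordinate, a common toy choice.
    """
    filt = points[:, 0]                      # filter values
    lo, hi = filt.min(), filt.max()
    length = (hi - lo) / n_intervals
    step = length * (1.0 - overlap)

    nodes = {}                               # node id -> set of point indices
    node_id = 0
    start = lo
    while start < hi:
        in_slice = np.where((filt >= start) & (filt <= start + length))[0]
        if len(in_slice) >= min_samples:
            labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points[in_slice])
            for lab in set(labels) - {-1}:   # ignore DBSCAN noise points
                nodes[node_id] = set(in_slice[labels == lab])
                node_id += 1
        start += step

    # Connect clusters that share data points (the overlapping slices make this possible).
    edges = [(a, b) for a, b in combinations(nodes, 2) if nodes[a] & nodes[b]]
    return nodes, edges

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    theta = rng.uniform(0, 2 * np.pi, 500)   # noisy circle: its Mapper graph forms a loop
    data = np.c_[np.cos(theta), np.sin(theta)] + rng.normal(scale=0.05, size=(500, 2))
    nodes, edges = mapper_graph(data)
    print(f"{len(nodes)} clusters, {len(edges)} connections")
```

The resulting graph summarizes the dataset’s shape: instead of testing hypotheses one at a time, an analyst (or a machine) inspects the graph for loops, flares, and isolated groups that suggest where to look next.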
Plugged Into Heart of Data Center
Automating this requires a lot of compute, according to Lawrence Spracklen, the company’s vice president of engineering. “We’ve spun up a force around high-performance computing,” he said. “We’ve focused on looking at algorithms, so we can scale linearly on a single node and across a cluster.”
The company has relationships with Intel and others that help teach it tricks of the trade. It is continuously looking at ways to leverage things like NVIDIA GPUs – super-dense accelerator cards – in a way that is space- and power-efficient, said Spracklen.
It can horizontally scale to datasets that take up billions of rows and do so in a way that minimizes bottlenecks. It doesn’t require a lot of memory, so the company can deploy on public clouds. However, given the very sensitive nature of the data it works on, most deployments are on premises.
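The article gives no implementation detail, but the general pattern behind processing datasets far larger than memory is to stream them in fixed-size chunks and keep only running aggregates, so each node touches a bounded amount of RAM and shards can be processed in parallel. A minimal, hypothetical sketch (the file path and column name are invented):

```python
# Illustrative only: streaming aggregation over a dataset that does not fit
# in memory, processed one fixed-size chunk at a time. The file name and
# the "value" column are hypothetical.
import pandas as pd

def streaming_mean(path, column="value", chunk_rows=1_000_000):
    """Compute a column mean over an arbitrarily large CSV using O(chunk) memory."""
    total, count = 0.0, 0
    for chunk in pd.read_csv(path, usecols=[column], chunksize=chunk_rows):
        total += chunk[column].sum()
        count += len(chunk)
    return total / count if count else float("nan")

# Each worker in a cluster could run the same loop over its own shard; the
# partial (total, count) pairs merge trivially, which is what lets this kind
# of computation scale horizontally to billions of rows.
```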
The company does spin up dedicated clusters in its own private cloud from time to time, used mostly by healthcare customers because of HIPAA certification.
A lot of customers want Ayasdi to deploy on their Hadoop data lake, said Spracklen. It natively runs on YARN and has a library of connectors to things like Teradata machines. “We’re plugged into the heart of the data center,” he said.
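How Ayasdi’s YARN integration works internally isn’t described, but for readers unfamiliar with the pattern, a generic analytics job submitted to a Hadoop cluster looks roughly like the PySpark sketch below, with YARN scheduling the executors. The paths, column name, and application name are placeholders, not Ayasdi’s.

```python
# Hypothetical sketch: a generic analytics job running under YARN and reading
# from a Hadoop data lake. This is not Ayasdi's integration code.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("analytics-on-data-lake")
    .master("yarn")                 # in practice often set via spark-submit --master yarn
    .getOrCreate()
)

# Read a (hypothetical) Parquet dataset from HDFS and compute a simple summary.
df = spark.read.parquet("hdfs:///data/lake/transactions")
df.groupBy("account_id").count().show(10)

spark.stop()
```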
Mercy, the fifth-largest Catholic health care system in the U.S., used Ayasdi to build a decision system from its patient data. It projects savings of well over $100 million over the next three years and recently selected Ayasdi for another difficult problem, claims denial.
Lockheed uses it to better manage program outcomes. Citigroup, which previously failed Federal Reserve stress tests, used Ayasdi to take thousands of variables and distill them down to a handful, passing just two weeks ago.
The majority of the funding is going toward marketing and sales expansion as the company goes after multiple verticals; it will also continue to invest in product enhancement and engineering.
In addition to KPCB, the latest round (Series C) was supplied by existing investors Institutional Venture Partners, Khosla Ventures, Floodgate, and Citi Ventures, as well as new investors Centerview Capital Technology and Draper Nexus.

3:30p
Migrating to Windows Server 2012 and Updating Your Infrastructure in 2015

Ravi Pendekanti is Vice President of Server Solutions Marketing at Dell, Inc. His organization is responsible for developing and bringing Dell’s flagship line of PowerEdge Servers and Converged Infrastructure systems to market.
In 2003—the same year a Harvard undergrad named Mark Zuckerberg began development on Facebook and the first convergent Blackberry smartphone was introduced—the rise of cloud computing was still years away. It would be even longer before the terms “big data” and “BYOD” reached the common parlance.
It was during that simpler time that Microsoft released the Windows Server 2003 OS, on a CD, for Intel Xeon and Itanium 2 processors. At the time a somewhat “beefy” hardware configuration for a server running the OS was a processor in excess of 900MHz, with 512MB of RAM and a SCSI drive array with at least three 20GB drives.
Needless to say, the IT landscape has changed drastically since then. Explosive data growth, increased mobility, globalization, virtualization and regulatory demands are forcing IT to rethink how they address challenges within their organizations. And with extended support for Windows Server 2003 ending on July 14, 2015, there is no better time than now to build a new, more flexible environment with Windows Server 2012 R2 and modern server infrastructure.
More Powerful Software and Hardware
While migrating to Windows Server 2012 R2 may be inevitable, it’s also a good move – and not just because of the potentially costly regulatory and compliance issues and heightened security threats from remaining on an unsupported OS. It’s a good opportunity to benefit from advances in new technologies.
Some of the new features added in Windows Server 2012 R2 – that were not available in Windows Server 2003 – include scalable, feature-rich virtualization via Hyper-V, the addition of network virtualization to isolate network traffic from different business units or customers on a shared infrastructure, high-availability, affordable storage, improved management, dynamic access control and support for hybrid applications.
To get the most out of this software investment, customers should look to pair it with best-of-breed hardware. Windows Server 2012 R2 is optimized for some of the hardware features in the latest Intel Xeon processor platforms, such as increased density, greater memory capacity, I/O optimization, enhanced reliability and security, and energy efficiency. As a result, today’s servers can speed application performance, handle workloads at any scale, and simplify systems management.
Where to Begin
Even with Windows Server 2003 end of extended support rapidly approaching, many data centers have yet to upgrade their operating environment. Migration can appear to be a daunting process but once organizations decide to refresh their server infrastructure and migrate to Windows Server 2012 R2, there is a general three-step process to follow: Assess, Integrate and Migrate (AIM).
- The Assessment stage is used to better understand the existing server environment and to identify and catalog all the software and workloads running on earlier versions of Windows Server. To get started, organizations can leverage complimentary tools such as the Microsoft Assessment and Planning (MAP) Toolkit, which provides an agentless, network-wide inventory. The inventory provides a view of all the applications running on earlier Windows Server environments in several ways – by type, by criticality, by complexity and by risk. For many organizations, this phase presents an opportunity to retire under-utilized or redundant applications, consolidate licenses across the organization, and update older applications to reap the benefits of more current versions. After the assessment, organizations can prioritize the applications and workloads for migration to help determine the sequence of updates to perform (a simple sketch of this prioritization step appears after this list).
- Next, consider the integration options for each workload. Some applications may be best suited to move to new hardware running Windows Server 2012 R2, others may be deployed and managed within a hybrid cloud environment, while still others may be suitable for the public cloud. When it comes to new hardware, organizations that replace aging infrastructure with future-ready servers built on the latest Intel Xeon processors can increase performance by up to 585 percent, decrease idle power consumption by up to 66 percent and reduce latency by up to 30 percent. Choosing servers that also integrate systems management capabilities with Microsoft System Center can provide enhanced automation and data center efficiencies.
- The final step is to conduct the actual migration. Data center migrations can be a complex and time-consuming activity that must be executed without affecting business operations. Part of this process includes migrating foundation services such as Active Directory and DNS to Windows Server 2012 R2. As anyone who has been involved in previous migration efforts knows, moving services like Active Directory can bring its own challenges, so organizations should look for automated software that will ensure users have the same access to resources after the migration and promote security and compliance in the new Windows Server environment.
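As a rough illustration of the Assess step’s prioritization (referenced above), the sketch below scores a hypothetical application inventory along the same dimensions the MAP Toolkit reports (criticality, complexity, risk) plus utilization, retires near-idle applications, and orders the rest for migration. The data, fields, weights, and thresholds are invented for the example, not taken from any Microsoft or Dell tool.

```python
# Hypothetical prioritization of an application inventory ahead of a Windows
# Server migration. Fields mirror the "type, criticality, complexity, risk"
# view described above; the scores and thresholds are made up.
from dataclasses import dataclass

@dataclass
class App:
    name: str
    criticality: int    # 1 (low) .. 5 (business critical)
    complexity: int     # 1 (simple) .. 5 (many dependencies)
    risk: int           # 1 (low) .. 5 (unsupported components, compliance exposure)
    utilization: float  # average utilization, 0..1

def migration_order(apps, retire_below=0.05):
    """Retire near-idle apps, then migrate the rest: riskiest and simplest first."""
    keep, retire = [], []
    for app in apps:
        (retire if app.utilization < retire_below else keep).append(app)
    keep.sort(key=lambda a: (-a.risk, a.complexity, -a.criticality))
    return retire, keep

inventory = [
    App("payroll",  criticality=5, complexity=4, risk=5, utilization=0.40),
    App("intranet", criticality=2, complexity=1, risk=3, utilization=0.10),
    App("old-wiki", criticality=1, complexity=1, risk=2, utilization=0.01),
]
retire, order = migration_order(inventory)
print("retire:", [a.name for a in retire])
print("migrate in order:", [a.name for a in order])
```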
Modern Technology Will Protect and Maximize Your Investments
Although some organizations might approach migration with the sense that their hand has been forced, it’s undeniably an advantageous move. The data center of today is highly flexible, with efficient and reliable systems perfectly geared toward making life easier for administrators, employees and customers. It not only enables cost savings, but is optimized for growing into the future and creating new business opportunities.
The time to invest in modern infrastructure is now. Even if migration isn’t exactly your choice this year, it’s a move you definitely won’t regret.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

4:00p
Friday Funny Caption Contest: Bunny Eggs

Data centers store more than just switches and servers – if you’re lucky you may even find eggs! Help us figure out where the heck these came from by submitting your caption below.
Diane Alber, the Arizona artist who created Kip and Gary, has a new cartoon for Data Center Knowledge’s cartoon caption contest. We challenge you to submit a humorous and clever caption that fits the comedic situation. Then, next week, our readers will vote for the best submission.
Here’s what Diane had to say about this week’s cartoon: “I always thought the raised floor would be a good hiding spot for Easter eggs!”
Congratulations to the last cartoon winner, Emily, who won with: “No, Sir Charms, I said ‘China Unicom’!”
For more cartoons on DCK, see our Humor Channel. For more of Diane’s work, visit Kip and Gary’s website.

5:12p
Vendors Push Telco Cloud NFVs Atop OpenStack

Two pairs of technology giants have teamed up on Network Function Virtualization (NFV) offerings. Oracle is using Intel’s Open Network Platform, while Canonical and Ericsson are teaming up to target the telecom cloud space. Both pairings are focusing their efforts atop OpenStack.
Network Function Virtualization is a way to package functions traditionally performed by specialized physical appliances into virtual machines that can run on any physical server. As part of a wider software-defined movement, many of these capabilities are moving away from the appliance model and are instead delivered through software. The data center revolution that was server virtualization over the past 10 years or so is now occurring across the network, as well as storage.
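As a deliberately simplified illustration of that shift, a packet-filtering rule that once lived in a dedicated firewall appliance can be expressed as ordinary software and run inside any virtual machine. The rules and packets below are invented for the example and are not tied to any vendor’s product.

```python
# A deliberately tiny illustration of "network function as software": a packet
# filter of the kind a dedicated appliance used to provide, written as an
# ordinary program that can run in any VM. Rules and packets are made up.
BLOCKED_PORTS = {23, 445}          # e.g. telnet, SMB
ALLOWED_PREFIX = "10.0."           # naive source-subnet check for the example

def allow(packet):
    """Return True if the packet should be forwarded."""
    if packet["dst_port"] in BLOCKED_PORTS:
        return False
    return packet["src_ip"].startswith(ALLOWED_PREFIX)

packets = [
    {"src_ip": "10.0.1.7",    "dst_port": 443},
    {"src_ip": "192.168.0.9", "dst_port": 443},
    {"src_ip": "10.0.1.7",    "dst_port": 23},
]
for p in packets:
    print(p, "->", "forward" if allow(p) else "drop")
```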
Cloud NFVs as services hold a lot of market potential for companies that provide them and the end users themselves, who gain unprecedented network flexibility. Cloud NFV for telcos is an especially hot spot in the market, as network carriers are one of the groups standing to benefit from the technology the most.
In other recent NFV news, Brocade beefed up its play with the acquisition of Riverbed’s virtual Application Delivery Controller, and VMware rolled out 30 or so cloud NFVs delivered as services on top of its cloud infrastructure by partners.
Carrier-Grade Ubuntu to Enable Ericsson’s Cloud NFV
Canonical, known primarily for its Linux distribution Ubuntu, formed an alliance with the Swedish telco-gear vendor Ericsson to move into the cloud NFV space. The two firms will align engineering and go-to-market efforts over the next three years.
Ericsson will incorporate Ubuntu as the operating system of choice for its cloud. The partnership gives Canonical a foot in the door with CSPs to further push Ubuntu, and Ericsson can leverage Canonical’s rich technology ecosystem.
Ericsson’s networks see 40 percent of the world’s mobile traffic and connect more than 2.5 billion subscribers globally. Its cloud is built on an OpenNFV platform, which promotes an industry standard for NFV; Ericsson, HP, and others launched the initiative last year.
“Cloud platforms for the network have to be secure, resilient, robust, and high-performing,” said Magnus Furustam, a vice president of cloud systems at Ericsson, in an Ubuntu blog post. “Partnering with Canonical for carrier-grade operating systems provides an opportunity for innovation to meet stringent telecom requirements in these areas.”
Oracle, Intel Team Up on OpenStack for Telcos
Oracle worked with Intel and leveraged Enhanced Platform Awareness in OpenStack, the popular open source cloud software suite, to deliver what it said was carrier-grade network performance. Oracle used Intel’s Open Network Platform, which combined Intel server architecture with open source software.
Network activity is directed to Intel processors, providing more capacity in software than limited, fixed-function hardware can. CSPs can leverage the combination to develop new services while maintaining service quality levels.
Oracle optimized products in its network orchestration framework, including its Communications Network Service Orchestration Solution. Customers can dynamically establish data center resource pools that mimic the specialized characteristics of a network appliance, such as large memory pages. The orchestration software routes work to the correct pool based on each function’s unique needs. Multi-vendor support is baked in.
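Oracle has not published the orchestration logic, but the matching idea it describes, routing each network function to a resource pool that advertises the capabilities it needs (such as large memory pages), can be sketched in a few lines. The pool names, capabilities, and VNF requirements below are hypothetical, loosely modeled on the platform attributes OpenStack Enhanced Platform Awareness exposes.

```python
# Illustrative only: routing virtual network functions (VNFs) to resource
# pools whose advertised capabilities satisfy each function's requirements.
# Pool names, capabilities, and VNF requirements are hypothetical.
pools = {
    "general":      {"huge_pages": False, "sriov": False, "cpu_pinning": False},
    "packet-heavy": {"huge_pages": True,  "sriov": True,  "cpu_pinning": True},
}

vnfs = [
    {"name": "virtual-firewall",   "needs": {"huge_pages", "cpu_pinning"}},
    {"name": "session-border-ctl", "needs": {"huge_pages", "sriov"}},
    {"name": "dns-cache",          "needs": set()},
]

def place(vnf, pools):
    """Return the first pool whose capabilities cover the VNF's requirements."""
    for name, caps in pools.items():
        if all(caps.get(req, False) for req in vnf["needs"]):
            return name
    raise RuntimeError(f"no pool satisfies {vnf['name']}")

for vnf in vnfs:
    print(vnf["name"], "->", place(vnf, pools))
```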
“This initiative does more than just optimize Oracle Communications products for the Intel Open Network Platform,” said Liam Maxwell, vice president of products for Oracle Communications, in a press release. “It takes the theory of delivering carrier-grade capabilities in a commercial data center and turns it into reality. We’ve proven that we can orchestrate services and network functions from the top of the management and orchestration stack all the way to individual network processors, and we can do it at scale.”

6:01p
Open Source Cloud Firm GreenQloud to Stop Offering IaaS

Icelandic cloud provider GreenQloud, which has been a major open source cloud supporter, has informed customers it is closing its public cloud service. The company will instead focus on selling its QStack cloud platform to be managed by others. The public compute and storage services are ending in October 2015.
By offering both public cloud services of its own and providing QStack infrastructure to be managed by others, the company was competing with some of its own customers. That will no longer be the case. QStack is the provider’s own distribution of the open source cloud software CloudStack.
In a letter to its customers GreenQloud explained: “As we do not wish to hinder this adoption [of QStack] by competing against our own customers, we have decided to focus our expertise and resources on QStack – continuing to bolster its position as the best IT infrastructure management solution available.”
Founded in 2010, GreenQloud offered a public cloud powered 100 percent by renewable energy. Iceland touts renewable energy as part of its appeal as a data center location. In 2013, GreenQloud entered the U.S. market, allowing customers to replicate their data between Iceland and its Seattle data center. The U.S. cloud was also powered with renewable energy bought from Seattle City Light.
There is increasing pressure on service providers to extend their services as customers’ hybrid needs grow and they look to hand over a larger part of the relationship. The options for service providers are to build, buy, or partner – GreenQloud is shifting from building to partnering. Limited resources mean it no longer makes sense for the company to manage a public cloud of its own.
There is also enormous competition in the public Infrastructure-as-a-Service space from the giants such as Amazon Web Services, Microsoft Azure, and Google Cloud Platform.
Public cloud is a game of scale. GreenQloud had an interesting differentiator in its green take on cloud, but the internet giants have been making massive investments in renewable energy for their data centers.
Such competition puts small cloud providers like GreenQloud in a tough position, and some analysts are forecasting that more of them will have to pivot or shut down in the near future.
The Icelandic company has decided to focus its resources on QStack. It was an early investor in Cloud.com, the startup whose software eventually became CloudStack.
Following its acquisition by Citrix, CloudStack lost some ground to OpenStack, the other big open source cloud suite, due to uncertainty about how Citrix would handle the project. But Citrix kept it open and community-driven, and it has remained an Apache project.

9:01p
Telstra Customers Can Access Metadata for the Same Fee as Law Enforcement 
This article originally appeared at The WHIR
Telstra will let customers pay to access their own metadata as part of its ongoing efforts to improve transparency about what information is tracked and provided to law enforcement agencies without warrants. The company, Australia’s largest telecom, revealed the new policy in a blog post by Chief Risk Officer Kate Hughes.
A form will be made available to customers starting April 1st through Telstra’s Privacy Portal, which will allow simple requests to be made and processed for “around $25.” Requests covering multiple services or extended periods “will be charged at an hourly rate.”
“This is the same practice of cost recovery that is applied to requests from law enforcement agencies,” Hughes says.
Telstra published its first ever transparency report in March 2014, and has also posted information on metadata and its corporate responsibilities to law enforcement.
The company presumably hopes to restore customer trust by “offering the same access to a customer’s own metadata as we are required to offer to law enforcement agencies.”
“This new approach is all about giving you a clearer picture of the data we provide in response to lawful requests today,” Hughes says. “As new technologies evolve and data management practices change (including potentially through the introduction of a data retention regime), we see this principle as continuing to apply.”
According to The Register, the new measures indicate that Telstra’s metadata retention efforts are “well advanced,” and the use of the word “data” rather than “metadata” in some cases may indicate uncertainty (or worse, an admission) about what personal records can reveal.
Australia’s Privacy Act also contains stipulations that individuals must be allowed access to personal information companies keep about them, so the new policy may be motivated by compliance, The Register notes.
Telstra is rumoured to be considering acquiring Pacnet. The acquisition, with Pacnet’s network of undersea cables and data center business, would give Telstra a lot more personal data to manage, and more international considerations.
While Telstra was well behind many companies in issuing transparency reports, it may be pushing the transparency envelope by granting the same privileges to customers as law enforcement has. Some may disagree, such as fans of Swedish ISP Bahnhof, which simply stopped retaining some customer metadata before being threatened with a fine in October.
A Dutch court threw out ISPs’ legal requirement to retain metadata earlier in the month.
This article originally appeared at http://www.thewhir.com/web-hosting-news/telstra-customers-can-access-metadata-fee-law-enforcement