Data Center Knowledge | News and analysis for the data center industry
Thursday, April 3rd, 2014
Getting a Handle on Supply Chain Security Risk
Winston Saunders has worked at Intel for nearly two decades and currently leads server and data center efficiency initiatives. Winston is a graduate of UC Berkeley and the University of Washington. You can find him online at “Winston on Energy” on Twitter.
Asset and privacy protection face increased challenges as the devices in our homes and workplaces become “pervasively connected.” In my own home, the number of connected devices appears to be doubling about every two to three years, with little sign of relenting.
Pervasively Connected in the DC
The same trend is true in the data center. IP-enabled power strips, temperature sensors, humidity sensors and cameras are just a few of the “connected” devices available. Add to that servers, routers and storage devices and other devices feeding software-based intelligence in the data center and you start to get the picture.
More connectivity brings benefits, but also the potential for security vulnerabilities and even malicious code, whether intentional or unintentional.
As an example, consider the recent disclosure that user video-chat data was insecure. I’ll ignore the “Big Brother” overtones and instead treat the case as a good paradigm of a “supply chain” vulnerability. You select a product based on supplier reputation and capability, perhaps without a thought to potential vulnerabilities, and then you use it with confidence, only to discover you are not as private as you thought. Had users or the supplier simply asked, “What is my risk if basic security controls like encryption are not implemented?” some of that exposure might have been avoided.
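As a concrete illustration of the kind of “basic control” that question refers to, here is a minimal Python sketch that encrypts a payload before it leaves a device, using the widely available cryptography package. It is a generic example, not the actual video-chat product’s code, and the key-handling step is deliberately simplified.

```python
# Illustrative sketch only: authenticated symmetric encryption of a payload
# before it travels over the network, using the Python "cryptography" package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in a real product the key would be provisioned and stored securely
cipher = Fernet(key)

frame = b"video-chat frame or other sensitive payload"
token = cipher.encrypt(frame)    # this ciphertext is what should go over the wire
assert cipher.decrypt(token) == frame   # only a holder of the key can recover the data
```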
Evaluating Risk
NIST has long advocated a proactive security posture (“built in, not bolted on”) and has acted on that by integrating supply chain risk into NIST SP800-53r4, “Security and Privacy Controls for Federal Information Systems and Organizations.”
As a testament to the robustness of NIST’s approach, the U.S. military recently announced its convergence to the NIST Framework.
A growing industry practice is to use the framework to review supply chain risks based on the controls assessments in NIST SP800-161. I believe it will continue to grow rapidly in importance.
So it’s worth a look. While the publication is long (almost 300 pages), even the highest-level key concepts make solid sense, and from there one can drill into specific areas of vulnerability.
At the highest level, the assessment breaks down into three tiers:
- System components
- Development and Operational Environment
- Logistics and Transportation.
In the camera example above, unencrypted camera data is both a system-component vulnerability and, depending on perspective, potentially a development-environment vulnerability. Two questions SP800-161 asks are “What are the vulnerabilities?” and “Was vulnerability testing done?” Depending on where you are in the supply chain, either question may be relevant to ask.
Now turn that into a set of questions about the connected infrastructure in your data center. While the risk framework, at first glance, may seem daunting, I’d strongly encourage someone in your organization to read through it to understand and select a subset of risks. Have your suppliers considered appropriate controls? Understanding your vulnerabilities and getting a handle on what it takes to address them is a good way to stay ahead of the “bad guys.”
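As a rough sketch of what selecting “a subset of risks” might look like in practice, the hypothetical Python snippet below tracks supplier answers against the three tiers. The two system-component questions are the ones quoted from SP800-161 above; the remaining questions and the device example are illustrative assumptions, not text from the publication.

```python
# Hypothetical checklist, not an excerpt from NIST SP800-161.
SUPPLY_CHAIN_QUESTIONS = {
    "system_components": [
        "What are the vulnerabilities?",
        "Was vulnerability testing done?",
    ],
    "development_and_operational_environment": [
        "Is device data encrypted in transit and at rest?",        # assumed example question
        "How are firmware updates signed and delivered?",          # assumed example question
    ],
    "logistics_and_transportation": [
        "Is chain of custody documented from factory to rack?",    # assumed example question
    ],
}

def open_items(supplier_answers):
    """Return (tier, question) pairs the supplier has not answered satisfactorily."""
    return [
        (tier, question)
        for tier, questions in SUPPLY_CHAIN_QUESTIONS.items()
        for question in questions
        if not supplier_answers.get((tier, question), False)
    ]

# Example: an IP-enabled power strip whose vendor has only confirmed vulnerability testing.
answers = {("system_components", "Was vulnerability testing done?"): True}
for tier, question in open_items(answers):
    print(tier, "->", question)
```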
Cybersecurity Threats Are Real
As Director of National Intelligence James Clapper recently told Congress, the threat of destructive cyber-attacks on critical infrastructure is real and growing.
Your data center is your company’s critical infrastructure. If we all embrace risk assessments and insist on good practices throughout our supplier chain, the whole industry will benefit. Your challenge, and the industry’s challenge, is to begin that journey.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
Pivotal Launches Pay-as-You-Go Big Data Bundle
Aiming to redefine the economics of big data through a customer lens, Pivotal has launched a big data bundle of annual subscription-based software, support and maintenance that combines Pivotal Greenplum Database, Pivotal GemFire, Pivotal SQLFire, Pivotal GemFire XD, Pivotal HAWQ and Pivotal HD into a flexible pool of products. Raising the ante once more, for customers meeting a cumulative contract minimum Pivotal will make Pivotal HD available on an unlimited basis at no extra cost, including support.
Pivotal was launched from EMC and VMware about a year ago as a platform company for a new era, complete with a $105 million investment from GE. Demand has grown immensely, and the value of big data solutions has been validated over the past year by open source and commercial ventures around Apache Hadoop.
The Pivotal Big Data Suite ties all of the company’s leading data technologies together, enabling customers to leverage any one of them whenever and wherever they need it. It fills a gap in the market, helping enterprise companies capitalize on explosive data growth by offering a multi-faceted data portfolio with a “use it as you need it” pricing model. This enables organizations to flexibly move their investment from one technology to another within the suite, at any time. Per-core pricing ensures that data that is simply being stored is not taxed, which will be important as enterprises seek to consolidate more and more information into a Business Data Lake. With the Pivotal Big Data Suite, unlimited Pivotal HD empowers enterprises to stretch their Business Data Lakes to store everything and analyze anything, without fear of runaway license and support costs.
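To make the “flexible pool” idea concrete, here is a small Python sketch with entirely hypothetical numbers (the announcement does not disclose Pivotal’s actual prices or core counts): a fixed pool of subscribed cores is reallocated between suite products over time, while data parked in unlimited Pivotal HD consumes no licensed cores.

```python
# Hypothetical illustration of the pooled, per-core subscription model.
SUBSCRIBED_CORES = 256   # assumed pool size, not a Pivotal figure

# Allocation can shift between suite products without buying new licenses,
# as long as the total stays within the subscribed pool.
allocations = {
    "Q1": {"Greenplum": 128, "GemFire": 64, "HAWQ": 64},
    "Q2": {"Greenplum": 64, "GemFire": 64, "HAWQ": 128},   # analytics-heavy quarter
}

for quarter, alloc in allocations.items():
    used = sum(alloc.values())
    assert used <= SUBSCRIBED_CORES, "allocation exceeds the subscribed pool"
    print(quarter, "cores in use:", used, "of", SUBSCRIBED_CORES)

# Data stored in unlimited Pivotal HD is not metered against the pool at all.
```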
“Enterprise customers are ready to move onto the next generation of data computing that gives them the speed, scale and efficiency they need to stay relevant in the market,” said Paul Maritz, CEO of Pivotal. “They should be able to take advantage of modern database technologies and use them collectively without fear of penalty or waste of investment. With Pivotal Big Data Suite, we are taking the lead for the industry by removing the technical barriers to data off the plates of our customers. Now the choice isn’t about Hadoop or a SQL database, in-memory or real-time processing, but efficiency and value.”
Pivotal Senior Vice President of Research and Development Hugh Williams spoke at the Big Data Paris event Wednesday, announcing the Pivotal big data bundle.
Apple, Google, Facebook Earn High Marks from Greenpeace
Apple, Facebook and Google are leading the shift to a greener Internet, according to a new report from Greenpeace that examines the use of renewable power in cloud computing. All three companies are working to power their data center operations with 100 percent renewable energy. Earning lower marks is Amazon Web Services, which currently sources just 15 percent of its electricity demand with clean energy.
Amazon scored low marks, as did wholesale data center providers Dupont Fabros Technology and Digital Realty, which all lost points for lack of transparency about their energy sourcing.
The Greenpeace report examines a single component of sustainability, focusing on the use of electricity from renewable sources rather than coal or nuclear energy (which Greenpeace opposes). The data center industry has historically focused instead on energy efficiency measures that reduce the volume of electricity consumed.
Rising Profile for Renewables?
The three companies with the highest rankings from Greenpeace have all made very public moves not only to make their own operations greener, but also to raise the profile of green energy in the industry as a whole.
“Apple, Facebook and Google are powering our online lives with clean energy, and building a greener offline world for everyone in the process,” said Gary Cook, Greenpeace Senior IT Analyst. “These companies have proven over the past 24 months that wind and solar energy are ready and waiting to power the internet, and the rest of our economy, with clean electricity.”
If the Internet were a country, its electricity demand would rank sixth globally. Industry research estimates that Internet data will triple from 2012 to 2017, meaning the push is on to make sure renewable and green energy is powering the data centers storing this mountain of information.
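As a quick sanity check on that growth estimate (my own arithmetic, not a figure from the report), tripling between 2012 and 2017 works out to roughly 25 percent compound annual growth:

```python
# Back-of-the-envelope: what annual growth rate triples a quantity in five years?
growth_factor = 3.0
years = 2017 - 2012
cagr = growth_factor ** (1.0 / years) - 1.0
print("implied compound annual growth: %.1f%%" % (cagr * 100))   # about 24.6%
```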
The report examines 19 leading internet companies, surveying their electricity supply chains of over 200 data centers. Five of these companies have committed to going 100 percent renewable.
Apple Gets Kudos
Apple hit the 100 percent renewable energy mark for its data centers roughly a year ago. This followed sustained pressure on the part of Greenpeace for Apple to set an example in the tech community. Apple has continued its green ways, operating the largest privately owned solar installation in the US at its North Carolina data center, which also contributed to its high ranking.
“Apple’s rapid shift to renewable energy over the past 24 months has made it clear why it’s one of the world’s most innovative and popular companies,” Cook said. “By continuing to buy dirty energy, Amazon Web Services not only can’t seem to keep up with Apple, but is dragging much of the Internet down with it.”
Facebook used its leverage to push its utility in Iowa, convincing MidAmerican Energy to power its data center with wind energy. MidAmerican is investing $1.9 billion in wind power generation, placing the largest order of onshore wind turbines to help meet Facebook’s demands. Facebook also uses a solar array for ancillary on-site power in Oregon and taps hydroelectric power for its new Swedish server farm.
Google is the king of power purchase agreements for wind energy; for example, it recently bought even more wind power for its Finnish data center.
Amazon and ‘Dirty’ Power
Amazon Web Services looks to be a potential next focal point for Greenpeace. The company received an “F” for transparency, policy and advocacy. Only two AWS regions – Oregon (US-West) and AWS GovCloud (US) - currently offer 100 percent renewable energy options. Coal powers 28 percent of the company’s cloud, with nuclear as 27 percent, and gas as 25 percent.
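Tallying those reported percentages (my own arithmetic, assuming the remainder is other or unspecified sources) shows how little room is left for clean energy in the AWS mix:

```python
# Reported shares of the AWS electricity mix, per the Greenpeace figures above.
mix = {"coal": 28, "nuclear": 27, "gas": 25, "clean": 15}   # percent
dirty = mix["coal"] + mix["nuclear"] + mix["gas"]
other = 100 - dirty - mix["clean"]   # assumed to be other/unspecified sources
print("coal+nuclear+gas: %d%%, clean: %d%%, other/unspecified: %d%%"
      % (dirty, mix["clean"], other))   # -> 80%, 15%, 5%
```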
The major factor dragging down Amazon Web Services, Dupont Fabros and Digital Realty in the “stuck in dirty energy past” section was a lack of transparency. Dupont Fabros’ and Digital Realty’s report cards were littered with Ds, though DFT earned a C for transparency.
Good transparency kept Equinix out of that category and in the “middle of the road.” Several of the higher-ranked brands evaluated in the report use Equinix or have made 100 percent renewable commitments, so the report suggests that Equinix’s important place in the digital economy might motivate it to make big public moves and commitments to green energy.
Also in the middle was eBay, which has promoted data center efficiency via its DSE dashboard and has taken several steps toward committing to green energy. Rackspace was also judged to be in the middle of the pack, though the company committed to becoming renewably powered back in 2012 and is on its way.
Data Centers as Instruments of Change
One takeaway is that large brands have the power to change the power grid. Greenpeace wants large utility customers like Facebook to use their leverage to institute changes beyond their own infrastructure, influencing utility companies to provide more renewable options.
In the best example of this to date, Google has helped nudge Duke Energy, the largest utility in the US, to offer a new renewable energy rate plan for large electricity buyers in North Carolina.
Oracle Beefs Up Its NoSQL Database Offering
Introducing the latest version of its distributed key-value database, Oracle (ORCL) announced Oracle NoSQL Database 3.0. With this release, Oracle offers developers an enhanced NoSQL solution for building high-performance, next-generation applications, combining security, availability, scalability and data model flexibility.
Enhancements in version 3.0 include increased security, with OS-independent, cluster-wide password-based user authentication and Oracle Wallet integration enabling greater protection from unauthorized access to sensitive data. It also supports tabular data models for simplifying application design and enabling seamless integration with familiar SQL-based applications. Data center performance enhancements include automatic failover to metro-area secondary data centers and secondary server zones. Oracle NoSQL Database 3.0 Enterprise Edition and Oracle NoSQL Database 3.0 Community Edition are now available for download.
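For readers unfamiliar with the distinction, the sketch below contrasts a plain key-value record with a tabular row using ordinary Python dictionaries; it is a conceptual illustration only and does not use the Oracle NoSQL Database driver API.

```python
# Conceptual sketch, not the Oracle NoSQL Database API.
# Pure key-value: the value is an opaque blob the application must interpret itself.
kv_store = {}
kv_store[("users", "u42")] = b"\x02Ada\x07eu-west"     # arbitrary serialized bytes

# Tabular model: named fields per row, which maps naturally to SQL-style access.
users_table = {
    "u42": {"name": "Ada", "region": "eu-west", "last_login": "2014-04-03"},
    "u43": {"name": "Grace", "region": "us-east", "last_login": "2014-04-01"},
}

# Fielded rows allow simple query-like filtering that opaque values cannot support directly.
eu_users = [uid for uid, row in users_table.items() if row["region"] == "eu-west"]
print(eu_users)   # ['u42']
```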
“As the leading provider of data management solutions, Oracle is committed to providing customers with the most comprehensive solutions to address the evolving needs of enterprise data management including NoSQL,” said Andrew Mendelsohn, Executive Vice President, Database Server Technologies, Oracle. “Oracle NoSQL 3.0 helps organizations fill the gap in skills, security and performance by delivering the industry’s best enterprise-class NoSQL database that empowers database developers and DBAs to easily, intuitively and securely build and deploy next-generation applications with confidence.”
Oracle also recently reached a development milestone with the release of MySQL 5.7, along with other MySQL product releases of MySQL Fabric, MySQL Workbench 6.1 and early access to features under development such as Geographic Information Systems and multi-source replication improvements.
Oracle recently swapped places with IBM, moving up a spot to become the second largest software company in the world, according to Gartner rankings.
Interxion, OnApp Team Up for Speedy Cloud Deployments
Data center provider Interxion and Infrastructure as a Service platform provider OnApp have teamed up for CloudPOD, a package for the rapid test and launch of new IaaS cloud services. The CloudPOD is available in partnership with Dell and Custom Connect, and gives service providers a turnkey way to deploy ready-to-run clouds at Interxion data centers across Europe, reducing time to market for new cloud services.
“With Interxion we’re making it easier than ever to bring new clouds online, by combining the OnApp platform with best-of-breed hardware, hosting and connectivity services,” said Ditlev Bredahl, CEO of OnApp. “With data centers in 11 countries, and a choice of Connectivity providers, the Interxion CloudPOD is particularly attractive to large hosts and telcos looking for an easy way to expand their cloud footprint in Europe.”
CloudPOD is an attempt to cut out the complexity and time it normally takes to launch an enterprise-grade cloud platform or expand into new geographies. For those looking to launch such services in Europe, it’s a turnkey way to do so. The complete cloud management system enables service providers to offer high-availability service-level agreements (SLAs) and can be delivered at any of Interxion’s 36 data centers in 11 European countries.
“The cloud market is crowded with various providers vying for service superiority and market share, with many providers uncertain about how their cloud will perform,” said Jelle Frank van der Zwet, Interxion’s Cloud Marketing Manager. “To help service providers improve their service development and stay competitive, CloudPOD ensures they can instantly deploy a new cloud service when and where it’s market-ready.”
Service providers can try it out for free before they buy at the CloudPOD Test Lab in Interxion’s City of London data center campus. It’s a way to become familiar with the platform and test different scenarios and integrations. After testing, service providers can launch via Interxion’s Cloud Hubs, a community of hosting and infrastructure providers, hyperscale platforms, systems integrators, software providers and networks.
Big Data News: Hortonworks, MapR, SAS
Hortonworks delivers the latest open enterprise Hadoop with release 2.1 of its Hortonworks Data Platform, MapR expands big data search with Elasticsearch, and SAS updates its Master Data Management and SAS Federation Server software.
Hortonworks launches HDP 2.1. Hortonworks announced the availability of its open enterprise Hadoop platform Hortonworks Data Platform (HDP) 2.1. HDP 2.1 delivers required enterprise functionality for data management, data access, data governance, integration, security and operations developed and delivered completely in the open. Incorporating the very latest community innovations across all Apache Hadoop projects, HDP 2.1 provides the foundational platform for organizations looking to incorporate Hadoop in a modern data architecture. HDP 2.1 features Apache Hive 0.13 for interactive SQL Query, new enterprise capabilities for data governance and security, new processing engines for streaming and search, and enhanced management and operations capabilities. “Hortonworks remains committed to innovating and delivering a certified, stable, and completely open source enterprise Hadoop platform comprised of the most recent Apache project releases,” said Shaun Connolly, vice president of corporate strategy, Hortonworks. “This HDP 2.1 release delivers a comprehensive set of enterprise Hadoop capabilities that span data management, data access, data governance, security, and operations, and our completely open source approach ensures our customers can confidently deploy a Hadoop platform that is not only built for the modern data architectures of tomorrow, but also deeply integrated with existing datacenter technologies.”
MapR expands big data search with Elasticsearch. MapR Technologies and analytics company Elasticsearch announced the integration of Elasticsearch’s real-time search and analytics capability with the MapR Distribution for Apache Hadoop, enabling customers to search and store tremendous amounts of information in real time. The combination gives developers a scalable, distributed architecture to quickly perform search and discovery across tremendous amounts of information. It is already in use at leading enterprise companies including Solutionary, the leading pure-play managed security service provider, and several Fortune 100 financial services institutions. “Starting with the Elasticsearch-Hadoop project announced six months ago and underscored by this partnership with MapR, we are helping Hadoop users enhance their workflows with a full-blown search engine that scales no matter the amount of data,” said Shay Banon, Elasticsearch creator, founder, and CTO. “With Elasticsearch, developers and data-hungry businesses now have a way to ask better questions of their data and get clearer answers, significantly faster, in return.”
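For a sense of what the search side of such an integration looks like in code, here is a minimal sketch using the official elasticsearch-py client against a local cluster; the index name, field names and document are made-up examples, and nothing here is specific to the MapR distribution.

```python
# Minimal indexing-and-search sketch with the elasticsearch-py client.
from elasticsearch import Elasticsearch

es = Elasticsearch(["localhost:9200"])   # assumes a locally running cluster

# Index a sample event document, then make it searchable immediately.
es.index(index="clickstream", doc_type="event", id=1,
         body={"user": "u42", "action": "checkout", "message": "payment gateway timeout"})
es.indices.refresh(index="clickstream")

# Full-text match query over the "message" field.
result = es.search(index="clickstream",
                   body={"query": {"match": {"message": "timeout"}}})
for hit in result["hits"]["hits"]:
    print(hit["_score"], hit["_source"])
```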
SAS updates data management software. To help organizations better access, manage and use data from any source, SAS has updated its SAS Master Data Management (MDM) and SAS Federation Server software. SAS Federation Server, shipping this quarter, will include big data virtualization, providing access to Hadoop, SAP HANA, Oracle, DB2, and other data sources, creating virtual representations without physically moving data. The latest SAS MDM includes embedded data quality to ensure clean and accurate data; improved usability to drive business user engagement; and pervasive data governance to improve collaboration between business and IT. Invacare Corp., a manufacturer and distributor of non-acute medical equipment, uses SAS to support its global MDM efforts. “We are creating master product and supplier data hubs with the SAS MDM and SAS Data Management solutions by integrating disparate product data from 25 ERP systems,” said Greg Rossiter, Director of Global Business Information and Data Management for Invacare. “We’re able to perform de-duplication and data classification – and enhance the data around our vast product set.”
Blimp Spreads Encouragement, Shame in Silicon Valley Flyover
Greenpeace’s campaign for a renewably powered Internet took to the skies over Silicon Valley Thursday, conducting a flyover of major tech company headquarters with a blimp festooned with messages highlighting the environmental group’s new report on energy sourcing and sustainability.
The Greenpeace thermal airship bore several messages – one praising Facebook and Google for their use of renewable energy, and one challenging four other tech companies that it cited for their continued use of coal-derived power. The “Who’s Next to Go Green?” side bore the Twitter hashtag #clickclean below the logos of Amazon, Twitter, Netflix and Pinterest, four companies Greenpeace says are relying upon “polluting energy” in their data center operations.
The 135-foot long, 41-foot diameter thermal airship took off from the Palo Alto airport at 8:00 am and flew over the 101 highway rush hour commute on its way to Facebook’s campus, then over to Google’s campus, before returning to the airport to land after the 1 hour and 10 minute flight. Another Bay Area flight is planned for next week, depending on weather conditions.
Previous Greenpeace campaigns have called out Facebook and Apple for their data center energy sourcing. The group has clearly shifted its focus to Amazon Web Services, which Greenpeace says used just 15 percent renewable energy in its data centers. Netflix and Pinterest both host all of their data with Amazon Web Services, and thus “have tethered their fast-growing services to polluting sources of energy,” Greenpeace said.
The Greenpeace blimp hovers over the Googleplex in Mountain View, Calif. Thursday morning. The sign on the airship challenges Amazon Web Services and its customers to make greater use of renewable energy in their data centers. (Photo: George Nikitin, Greenpeace)
Akamai, Vubiquity Partner for Content-as-a-Service Offering
Vubiquity, a global provider of multiplatform video services, and Akamai (AKAM) announced a partnership designed to help network service providers and content owners efficiently and cost-effectively launch over-the-top (OTT) video services. Combining Vubiquity’s AnyVU Cloud and the Akamai Intelligent Platform will offer a robust licensed content library along with cloud-based content workflows, storage and delivery at global scale.
As part of the new content-as-a-service offering, Akamai’s Media and Delivery Solutions, including cloud storage and efficient high-speed content delivery network (CDN), are designed to help ensure high-quality video delivery and superior user experiences across networks and devices. The content-as-a-service offering can also be deployed using Akamai’s Aura licensed or managed CDN solutions to provide network operators an on-net and off-net OTT solution to help manage traffic, reduce network costs, and engage their subscribers with compelling new video services. Together, the companies expect to deliver content directly to consumer devices on behalf of their joint service provider customers in over 90 countries.
“AnyVU Cloud was a natural evolution for Vubiquity which supports more than 700 customers on a global scale. Coupled with Akamai’s globally deployed network that carries up to 30 percent of the world’s web traffic at any time, we are providing a platform of unparalleled reach,” noted Vubiquity’s CEO Darcy Antonellis. “We’ve architected a highly scalable platform that supports the growing needs for deep library content, with very flexible integration paths to enable new consumer-facing product and service offers. The modularity of the platform means clients can pick and choose services from content licensing to delivery, marketing and data analytics, to fill existing gaps and experiment more rapidly with new products to support growth.”
Akamai is showcasing its full range of digital media solutions, including cloud-based storage, processing, analytics and delivery services at the 2014 NAB Show in Las Vegas next week.
Akamai accelerates content availability with Aspera
Akamai and IBM company Aspera announced a new high-speed option for Akamai customers to accelerate upload and distribution of rich media content on the Akamai Intelligent Platform. The Aspera Upload Acceleration option integrates Aspera’s proven transfer technology directly into Akamai’s NetStorage cloud-based online storage platform, providing a significantly faster means for Akamai customers to upload files regardless of file size or geographic distance. The integration of Aspera’s patented FASP transfer software into Akamai’s NetStorage is designed to significantly increase upload transfer speeds while creating a seamless end-user experience with no need to re-architect the workflow or build custom API calls.
“Combining the global scale and reach of the Akamai Intelligent Platform with the speed of Aspera’s FASP technology can dramatically increase the rate at which content is uploaded and ultimately delivered to viewers,” said Michelle Munson, founder and CEO, Aspera. “Our two organizations worked extensively to build a tightly integrated and cohesive solution for owners of content of all sizes so they can realize new efficiencies in asset management and support higher quality delivery and playback of online video.”