Data Center Knowledge | News and analysis for the data center industry
Friday, April 19th, 2013
11:00a | OpenStack Summit: Focus on Hadoop Support, File-Sharing

As the Portland OpenStack Summit draws to a close this week, Hortonworks, VMware, and NetApp all had announcements.
Hortonworks, Mirantis and Red Hat boost Project Savanna. Leading Hadoop contributor Hortonworks announced that it was working with OpenStack systems integrator Mirantis and large OpenStack contributor Red Hat (RHT) to contribute significantly to Project Savanna under the OpenStack community guidelines to deliver Apache Hadoop on OpenStack. This collaboration aims to provide benefits including open source APIs and simpler transitions when moving Hadoop workloads between public and private clouds. Started by Mirantis as an OpenStack project, Savanna enables users to easily provision and manage elastic Hadoop clusters to speed the development and deployment of cost-effective Hadoop on OpenStack. “With its efficient use of hardware and unparalleled agility, the cloud is a logical deployment platform for Apache Hadoop and one that we see many of our customers preferring,” said Bob Page, vice president, products, Hortonworks. “Coupled with the fact that Hadoop is a net new workload for many organizations, deployment on OpenStack is a logical fit. By committing efforts to simplify the deployment and management of Hadoop on OpenStack through linkage with Ambari, we believe we can meaningfully accelerate the time to production for organizations building out new Hadoop projects.”
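For a rough sense of what provisioning an elastic Hadoop cluster through Savanna could look like from a client’s perspective, here is a minimal Python sketch that posts a cluster request to a Savanna-style REST endpoint. The endpoint URL, plugin name, Hadoop version, template and image IDs, and payload shape are illustrative assumptions, not the project’s settled API.

    # Illustrative only: a Savanna-style "create cluster" request.
    # The endpoint, template IDs and payload fields are assumptions,
    # not the project's finalized API.
    import requests

    SAVANNA_URL = "http://controller:8386/v1.0/{tenant_id}"  # assumed endpoint
    TOKEN = "<keystone-auth-token>"

    def create_hadoop_cluster(tenant_id, name, template_id, image_id):
        """Ask the (hypothetical) Savanna API to provision a Hadoop cluster."""
        payload = {
            "name": name,
            "plugin_name": "vanilla",           # assumed Apache Hadoop plugin name
            "hadoop_version": "1.1.2",          # assumed version string
            "cluster_template_id": template_id,
            "default_image_id": image_id,
        }
        resp = requests.post(
            SAVANNA_URL.format(tenant_id=tenant_id) + "/clusters",
            json={"cluster": payload},
            headers={"X-Auth-Token": TOKEN},
        )
        resp.raise_for_status()
        return resp.json()

The point of the sketch is simply that a cluster becomes a single API object the cloud can create, scale and tear down, which is what makes Hadoop “elastic” on OpenStack.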
Canonical and VMware Collaborate for OpenStack clouds. Canonical and VMware (VMW) announced a collaboration that will enable organizations to deploy VMware technologies, including VMware vSphere and Nicira NVP, with Canonical’s OpenStack distribution. The Canonical Ubuntu Cloud Infrastructure will now include the plugins required to use OpenStack with vSphere and NVP. VMware also reaffirmed its support of Ubuntu as a fully supported guest operating system (OS) on vSphere. “By fulfilling our promise to deliver VMware vSphere support in OpenStack, and teaming with Canonical to serve our collective customers, we’re delivering customer choice by providing a powerful platform for those interested in OpenStack clouds,” said Joshua Goodman, vice president, Product Management, vSphere, VMware. “Canonical’s Ubuntu technology is widely used by those deploying OpenStack, and joint customers will be able to leverage the familiar and proven capabilities of the vSphere infrastructure in which they’ve already invested.”
NetApp Proposes File Share Service for OpenStack. NetApp (NTAP) announced it has submitted a prototype and proposal for a file share service capability for consideration by the OpenStack Foundation Technical Committee and community at large. The proposal will be a topic of discussion for inclusion in the Havana release. Native management support for file-based storage systems is not a part of OpenStack, and NetApp is proposing adding a “file share service” that is broad enough to address a range of file system types, either as an extension to the existing Cinder project or implemented as a separate project. “NetApp is eager to work with the OpenStack community to establish the optimal path for bringing critical shared file services capabilities into the core of OpenStack,” said Jeffrey O’Neal, senior director, Solutions Integration Group, NetApp. “We have received good feedback on our blueprint through the Grizzly development cycle and look forward to identifying the best path for adoption in the Havana release. Our proposal is constructed to be broadly applicable, with the file system type abstracted to address any number of shared or distributed file system types, from CIFS and NFS/pNFS to something such as Gluster or Ceph.”
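To illustrate the kind of abstraction NetApp is describing, here is a minimal sketch of a protocol-agnostic share-driver interface in Python. The class and method names are hypothetical and are not taken from NetApp’s blueprint; the point is that one generic API can front NFS, CIFS, Gluster or Ceph backends.

    # Hypothetical sketch of a "file share service" abstraction; names are
    # illustrative, not NetApp's actual OpenStack proposal.
    from abc import ABC, abstractmethod

    class ShareDriver(ABC):
        """Backend-specific driver behind a generic file share API."""

        @abstractmethod
        def create_share(self, name: str, size_gb: int) -> str:
            """Create a share and return its export location (e.g. host:/path)."""

        @abstractmethod
        def allow_access(self, share_id: str, client: str) -> None:
            """Grant a client (IP address or user) access to the share."""

    class NFSShareDriver(ShareDriver):
        def create_share(self, name, size_gb):
            # Provision an NFS export on the backend (details omitted).
            return f"nfs-server:/exports/{name}"

        def allow_access(self, share_id, client):
            # Add the client to the export's access list (details omitted).
            pass

    # A CIFS, GlusterFS or Ceph driver would implement the same interface,
    # which is what keeps the service protocol-agnostic.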
11:39a | Michigan County Offline After Data Center Fire

IT services in Macomb County, Michigan are offline after a fire damaged the building that houses the county’s data center. Macomb County, which is just north of Detroit and has 850,000 residents, did not have a backup data center.
The fire Wednesday was in the basement of the Old County Building. The data center is on an upper floor, but the county is unable to assess damage to the equipment because the building has no electricity. County Executive Mark Hackel declared a state of emergency Thursday, saying the building could be closed for months and require millions of dollars’ worth of repairs and upgrades.
The fire left county staff without Internet or phone service. “We ask the public to be patient with us while we assess the damage to our IT, internet and phone systems to determine the extent of damage,” the county said on its web site.
Email is available due to a recent shift to Gmail, but with many other county computer systems unavailable, Macomb officials are resorting to pen and paper, carbon copies, and makeshift networks of laptops to try and maintain services. “The computers are down. What to do?” County Clerk Carmella Sabaugh told the Macomb Daily. “We have to go old-school and do everything on paper.”
Old Buildings, Inadequate DR Planning
The outage in Macomb County is the latest in a series of incidents that have underscored the vulnerability of local governments, which often house data centers in older buildings and maintain inadequate backup and disaster recovery plans.
Last year a data center fire in a Shaw Communications facility in Calgary, Alberta crippled city services and delayed hundreds of surgeries at local hospitals. The incident knocked out both the primary and backup systems that supported key public services, providing a wake-up call for government agencies to ensure that the data centers that manage emergency services have recovery and failover systems that can survive a series of adversities.
Macomb County was in the process of building a new data center, but never established a backup site for the existing facility at the Old County Building, a 13-story structure that was built in the 1930s and lacked a modern fire suppression system.
Hackel, the county executive, said he had warned county officials about the need for a backup facility, but the county was unable to implement a plan before the fire.
The Macomb County Circuit Court is operating, but its case management system also is down, Court Administrator Jennifer Phillips said. “We’re up and running, but we’re asking people to be very patient,” Phillips told the Detroit Free Press. “We’re reverting back to processes not as efficient and not used in a long time.”
12:31p | Technology Proves its Value in Wake of Boston Bombings

Bill Kleyman is a virtualization and cloud solutions architect at MTM Technologies, where he works extensively with data center, cloud, networking, and storage design projects. You can find him on LinkedIn, and you can find more of his regular contributions here on Data Center Knowledge.
In light of this week’s events in Boston and elsewhere, one of the strongest statements we can live by is “the good guys will always outnumber the bad ones.” While some people have said that these types of events push people to live close to the edge (as in You Only Live Once, or YOLO), the reality is that these horrible events actually bring people closer together and deepen our appreciation of each other’s humanity.
In the wake of the Boston Marathon bombings – which brought many reminders of 9/11 – we saw a new hero emerge: technology. The fast responses of medical professionals already onsite likely saved numerous lives. Furthermore, those runners who finished the race and then went on to donate blood at the local hospital should be praised as well. The human element, no doubt, played a vital role in minimizing casualties and helping people get medical attention quickly. Still, as the smoke clears and we begin to analyze and understand the situation, law enforcement and the officials working on this case have some interesting new tools at their disposal.
Technology: The ‘Good Guy’ Multiplier
According to Boston Police Commissioner Ed Davis, the site of the bombing and the surrounding area – Boylston Street, which serves as the finish line of the Boston Marathon – is one of the most well-photographed and documented areas in the country. Although the crime scene is complex, the use of technology can help pinpoint the chain of events that led to this horrible incident. In 2001, the prevalence of recording and digital equipment wasn’t anywhere near as high as it is today. On April 15, 2013, a lot more “eyes” were watching the course of the day unfold. Let’s look at some of the areas where technology was involved in the event and aftermath.
- IT consumerization. This is a common marketing term, along with BYOD, but the true magnitude of IT consumerization was on display on Monday. Because so many people have cell phones or other devices with cameras, thousands of high-quality photos were taken from almost every angle and vantage point during the Boston Marathon. People were on the ground, in buildings, at the finish line and everywhere in between. Within minutes, photos of the bombing were circulated and analyzed. These digital photos were used to piece together a very difficult puzzle. Modern phones are capable of taking 8-12 megapixel images; compare that with phones from 2002, which could manage only 0.3 megapixels. The videos and photos people took documenting the event are higher resolution than ever before, capturing more detail that authorities can potentially use to find those responsible.
- Everyone is a “digital technician.” In the aftermath, the numerous high-quality images produced at the event have helped law enforcement in its efforts to bring light to the situation, and citizens are helping to analyze them. All across the web, amateur digital technicians are examining photos and processing individual video frames to catch inconsistencies. Just like law enforcement, these technicians have an eagle eye and can actually help officials pinpoint irregularities. Cloud computing and social media have been busy with shared pictures, analysis of the event, and discussion that helps everyone involved understand what happened.
- CCTV and surveillance. As runners approached the finish line, they made their way through 26 miles of very public road. Along the way, hundreds of cameras and CCTV systems recorded and documented live video. A department store’s high-quality outdoor security camera has already helped police identify people of interest. The ability to zoom in on a face or feature is paramount to bringing those responsible to justice. These technologies are becoming more and more prevalent, and high-quality feeds are capable of far more than ever before. As the picture becomes clearer, officials will use every single frame from every source they can. If the perpetrators took public transportation, video surveillance from around the city can help retrace their footsteps.
- Social media and the cloud. The events of April 15 were captured both on video and, simultaneously, on the Internet. Social media reports were posted as quickly as professional reporters brought the news on TV, radio or the Internet. Twitter, Facebook, and other heavily used sites became hot spots for conversation. Social media served as a way to determine whether friends and family on the ground were all right. In fact, I found out – via a Facebook update – that a dear friend who was only two blocks away from the blast was safe. Furthermore, valuable pictures, recordings, and new vantage points have helped people put together the course of events that day. Above anything else, social media (and cloud computing) helped bring people together. Whether it was words of support, an image of a friend, or just a thought posted on Facebook, technology pushed aside human differences and the sharing through social networks brought everyone closer together.
Today’s always-on, always-connected world strives to bring people and information closer together. During these types of events, technologists all over the world offer their support and work to utilize technological advancements to help people progress. Everyone from pro photographers to ordinary people with their high megapixel smartphone cameras can help authorities solve one of the most complex crime scenes in recent history. During these difficult times, the IT community has continued to offer its support to anyone who needs it.
As a technologist, journalist and writer for Data Center Knowledge – I say with my whole heart – Boston, we stand with you.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
3:38p | Google Expands in North Carolina, Will Boost Renewables

The Google data center campus in Lenoir, North Carolina at night. Google is investing an additional $600 million to expand the campus. (Photo: Connie Zhou for Google)
Google today announced a major expansion of its data center campus in Lenoir, North Carolina, saying it will spend $600 million to build new server farms and populate them with IT equipment. The search giant also said it will use its purchasing power to jump-start a renewable energy program for Duke Energy, the utility that provides electricity to the Lenoir facility.
The announcement brings Google’s investment in Lenoir to $1.2 billion. The small town in western North Carolina, where the economy was once driven by the furniture industry, is now a major conduit for Internet traffic for Google search, Gmail and YouTube videos.
“When Google builds a data center, it chooses a site large enough to accommodate growth and a site where we want to establish a long-term commitment to the local community,” said Data Center Operations Manager Enoch Moeller. “We are proud to be a part of the City of Lenoir and Caldwell County community where our employees live, work and play. North Carolina and the Lenoir community are great places in which to work and grow.”
Google also said that it will be the pilot customer in a new program from Duke Energy, which will create a new service tier to deliver entirely renewable energy to large customers. Duke will file the plan with regulators within 90 days, according to Gary DeMasi, Director of Global Infrastructure for Google.
Prompting Utilities to be Greener
Google’s approach is significant for several reasons: it meets Google’s goal of supporting programs that create new renewable generation, and it prompts utilities to boost their focus on clean energy sources. This has been a major focus in North Carolina, where Greenpeace has pressured Apple over the energy sourcing for its huge data center in Maiden, which is about 30 miles south of Google’s Lenoir facility. Apple has made a huge commitment to on-site renewable generation in Maiden, building a huge solar power array and a fuel cell farm powered by gases from nearby landfills.
Google is taking a different approach, tracking a model championed by Greenpeace, in which the data center industry uses its purchasing power to encourage utilities to offer more renewable options. In North Carolina, Duke is pledging to create a new rate plan (or “tariff”) for customers that want energy sourced solely from renewable sources.
“The concept of a ‘renewable energy tariff’ is simple,” Google explained in a white paper. “Utilities would offer companies like Google the choice to buy renewable energy through a new class of service. The service would be voluntary, provided only to those companies that request it but open to all customers that want it and meet basic criteria. A key aspect of the tariff is that the costs of procuring the renewable power would be passed on to the customer that has elected this option, so the goal would be to avoid impact on other ratepayers.”
Data Centers Greening the Grid
Google has been considering this type of approach for some time. In March 2012, Google data center executive Joe Kava said the data center industry could use its leverage to prompt greener practices from major utilities.
“I’d like to challenge the industry to pool its resources,” Kava said at an industry conference. “Why can’t we, as an industry, form a consortium to buy renewable power and push it to the grid? That way we can green the power we are all using.” … “To us, it’s about increasing the content and percentage of renewables on the grid. If we can increase the green content on that grid, we’ll also green our data center.”
When asked whether Google had contemplated such an approach in North Carolina, Kava said there had been no active effort to organize other players. But if Duke succeeds in creating an all-green service tier, other data center companies could participate, further boosting demand for renewable generation.
“Offering companies like Google a renewable energy option has many advantages,” DeMasi wrote in a post on the Google blog. “Because the service is made available to a wide range of customers, companies that don’t have the ability or resources to pursue alternative approaches can participate. And by tapping utilities’ strengths in power generation and delivery, it makes it easier for companies to buy renewable energy on a larger scale.”
Google’s move was hailed by Greenpeace.
“Google’s announcement shows what forward-thinking companies can accomplish when they are serious about powering their operations with clean energy,” said Greenpeace International Senior IT Analyst Gary Cook. “Before today, even large energy users in North Carolina were only offered dirty energy by Duke Energy: coal, nuclear and gas. In living up to its commitment of powering 100% of its operations with renewable energy, Google has given Duke Energy the push it needed to offer a Renewable Tariff which could finally mean access to clean energy for Duke Energy’s customers in North Carolina.”
Will Google Pay a Green Premium?
The approach is not without its challenges. Utilities will need to work out the details of the service with state regulators, and find cost-effective renewable projects.
But the major issue for data center providers will be cost, as renewable energy is currently more expensive than Duke Energy’s industrial rate of about 4.5 cents per kilowatt hour, which buys power sourced primarily from coal and nuclear plants. Google is aware of this discrepancy.
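For a rough sense of what such a premium could mean, the back-of-the-envelope sketch below compares the 4.5-cent industrial rate quoted above against an assumed renewable rate for a hypothetical data center load. The 20 MW load and 6.5-cent renewable price are illustrative assumptions, not figures from Google or Duke.

    # Back-of-the-envelope green-premium estimate.
    # Only the 4.5 cents/kWh industrial rate comes from the article; the
    # load and renewable price are illustrative assumptions.

    HOURS_PER_YEAR = 8760

    def annual_premium(load_mw, industrial_rate, renewable_rate):
        """Extra annual cost (USD) of buying power at a higher per-kWh rate."""
        kwh_per_year = load_mw * 1000 * HOURS_PER_YEAR
        return kwh_per_year * (renewable_rate - industrial_rate)

    premium = annual_premium(load_mw=20, industrial_rate=0.045, renewable_rate=0.065)
    print(f"Assumed annual premium: ${premium:,.0f}")  # roughly $3.5 million per year

Under those assumptions, a 2-cent premium works out to roughly $3.5 million a year, which gives a feel for the bet Kava describes below.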
“We might take a loss (at first),” said Kava in his presentation last year. “But over a 20-year period, I’m betting the price of power will go up.”
By committing to be the pilot customer for Duke, Google appears to be ready to pay a premium for renewable energy, at least in North Carolina. Even a “green tariff” would likely include some non-renewable energy, since solar power is only available during the day and wind power can be intermittent.
“If needed, a supplemental ‘shaping’ service from other (likely non-renewable) generation would fill in the gaps of variable renewable resources and ensure that customers receive continuous and reliable service,” Google said in its white paper. “Thus, the tariff will eliminate many of the complexities of intermittent renewable energy production for customers.”
4:30p | Network News: Mellanox Launches Cloud Interconnect

Here’s a roundup of some of this week’s headlines from the network sector:
Mellanox Launches ConnectX-3 Pro interconnect. Mellanox (MLNX) announced ConnectX-3 Pro, a network adapter IC and card with hardware offload engines that support virtualized overlay networks commonly used in cloud infrastructures. Networking overlay technologies were developed to enable application and tenant scalability and resource tunneling across the cloud, but adoption has been limited by the high CPU overhead imposed on cloud resources. ConnectX-3 Pro implements these overlay technologies within the interconnect hardware itself, allowing clouds to take advantage of virtually unlimited scalability and resource mobility without the CPU overhead. “To meet the growing demand of cloud computing services, cloud providers must be able to take full advantage of new software techniques to scale up their cloud networks without reducing performance or efficiency of the infrastructure,” said Gilad Shainer, vice president of marketing at Mellanox Technologies. “With ConnectX-3 Pro, cloud providers will be able to easily scale and grow their business and provide new value-add services while reducing the cost of their cloud infrastructure, ushering in the age of Cloud 2.0.”
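For background on why offloading overlay processing matters, encapsulation protocols such as VXLAN wrap each tenant frame in outer Ethernet, IP, UDP and VXLAN headers, roughly 50 bytes in total, and that encapsulation work normally falls on the host CPU. The short sketch below computes the wire overhead for a few frame sizes; it illustrates encapsulation cost in general and is not specific to ConnectX-3 Pro, whose value is moving that processing into the adapter hardware.

    # Wire overhead added by VXLAN encapsulation (standard header sizes).
    # General illustration of overlay cost; not specific to any NIC.

    OUTER_ETH = 14   # outer Ethernet header (no VLAN tag)
    OUTER_IP = 20    # outer IPv4 header
    OUTER_UDP = 8    # outer UDP header
    VXLAN_HDR = 8    # VXLAN header
    OVERHEAD = OUTER_ETH + OUTER_IP + OUTER_UDP + VXLAN_HDR  # 50 bytes

    for frame in (64, 512, 1500, 9000):
        pct = 100.0 * OVERHEAD / (frame + OVERHEAD)
        print(f"{frame:5d}-byte inner frame: {pct:4.1f}% of wire bytes are overlay headers")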
Level 3 names new CEO, selected by VoiceAmerica. Level 3 Communications (LVLT) announced that Jeff K. Storey has been appointed by the Board of Directors to be the company’s president and chief executive officer, effective immediately. Storey was also nominated for election to the Board of Directors at the company’s Annual Meeting of Stockholders to be held on May 23. Storey joined Level 3 in 2008 as the company’s president and chief operating officer. “We are extremely pleased to name Jeff as our new CEO and look forward to him joining our Board of Directors,” said Walter Scott, Jr., chairman of the Board of Directors of Level 3. “Jeff was the clear and unanimous choice of the Board. With 30 years of industry experience and his intimate knowledge of Level 3’s customers, employees and operating environment, Jeff is the right executive to lead Level 3 into the future.” Level 3 also announced it is providing content delivery network (CDN) services for VoiceAmerica, a producer of more than 300 original live Internet talk radio programs delivered on a weekly basis. Level 3’s CDN services will help enable VoiceAmerica to deliver its online content to a more global audience. “VoiceAmerica’s popularity is rising across the globe, and in order to continue providing a superior streaming experience to a growing audience, we must have a CDN that can handle high demand and scale to meet our expansion needs,” said Jeff Spenard, President of VoiceAmerica. “Level 3’s suite of media delivery services and its extensive IP backbone allows us to seamlessly deliver our content to more browsers, mobile devices and listeners around the globe.”
Avaya innovates fabric-enabled networking. Avaya unveiled new solutions demonstrating the company’s continual innovation in network fabric technologies, including the industry’s first fabric-enabled multi-service edge device and a new model for IP Multicast that significantly improves efficiency, reliability and scalability over traditional approaches. Based on an open, standards-based implementation of Shortest Path Bridging, Avaya VENA Fabric Connect delivers an array of network services, including Layer 2 and Layer 3 virtualization with optimized routing and now, fully integrated support for IP Multicast. The VSP 4000 fabric-enabled multi-service / multi-tenant edge device further extends Avaya VENA Fabric Connect across the entire network to the campus, metro or WAN edge. “Avaya is fundamentally changing the way that networks are designed, built and operated with our Fabric Connect technology,” said Marc Randall, Senior Vice President and General Manager, Avaya Networking. “With this announcement, we are delivering all the services capabilities with far greater simplicity, agility and availability than the wide majority of current rigid, complex and error-prone networks supporting enterprises today.”
5:00p | Akamai Partners To Deliver Federal Mobile Authentication

Akamai Technologies (AKAM) announced that in collaboration with Daon, a leader in identity authentication technology and services, the two companies will offer Mobile Authentication as a Service (MAaaS). The solution is designed to provide cloud-based multi-factor authentication to increasingly mobile federal employees.
The authentication service can be used across a variety of mobile devices, and will be delivered as a cloud-based application in conjunction with CGI Group Inc., the first large IT services provider to receive FedRAMP authorization. By allowing federal agencies to maintain security control at the application level even if they do not manage the actual device, MAaaS can be used in conjunction with increasingly popular “Bring Your Own Device” (BYOD) programs.
“Public sector computing is happening on a wide variety of mobile devices, many of which are privately owned by federal employees,” explained Tom Ruff, vice president, Public Sector, Akamai Technologies. “As such, Federal agencies are looking for more effective ways to manage devices, applications and data in smart, secure and affordable ways, while at the same time adhering to programs such as Cloud First.”
The MAaaS solution can allow authentication parameters to be customized based on application and associated risk policy. This multi-factored, layered approach can help ensure the right level of protection is applied to protect information and privacy. Unlike mobile authentication solutions that employ single or 2-factor authentication, the MAaaS solution can incorporate as many as seven factors of authentication provided by Daon.
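As a rough illustration of how authentication requirements might be tied to an application’s risk policy, the sketch below maps a risk tier to a set of required factors. The tier names and factors are hypothetical and do not represent the actual Daon or Akamai MAaaS policy.

    # Hypothetical risk-based factor selection; tiers and factor names are
    # illustrative only, not the actual Daon/Akamai MAaaS policy.

    FACTOR_POLICY = {
        "low":    ["password"],
        "medium": ["password", "one_time_code"],
        "high":   ["password", "one_time_code", "device_certificate", "fingerprint"],
    }

    def required_factors(app_risk_tier):
        """Return the authentication factors required for an application's risk tier."""
        return FACTOR_POLICY[app_risk_tier]

    print(required_factors("high"))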
“As part of the Daon pilot for the National Strategic Trusted Identities in Cyberspace (NSTIC) initiative, we have been able to provide our members a secure and easy way to authenticate themselves to the restricted areas of our website,” said Carter Morris, senior vice president, Transportation Security Policy at the American Association of Airport Executives (AAAE).
The new solution is also designed to allow government agencies to incorporate existing Common Access Card (CAC) or Personal Identity Verification (PIV) card implementations into their mobile authentication strategies. This should allow agencies to take full advantage of the government’s current investments in efforts such as X.509 compliance technology, while allowing greater flexibility and security to their workforce.