Data Center Knowledge | News and analysis for the data center industry
Tuesday, September 30th, 2014
12:00p
Digital Realty Intros Private Links to VMware’s Hybrid Cloud
Digital Realty Trust announced it is now offering its customers direct private network connectivity to VMware’s hybrid cloud services infrastructure hosted at one of Digital Realty’s Dallas data centers.
A private network link increases performance and security for customers that want to connect their infrastructure to VMware’s vCloud Air (formerly vCloud Hybrid Service). VMware announced the service in May 2013, promising customers with existing VMware environments seamless integration with its hosted multi-tenant cloud infrastructure.
This is yet another step in Digital Realty’s quest to reinvent itself into something more than a company that simply provides large chunks of data center capacity. The company has been on this quest since early this year, when it announced that it would change its strategy and find partners who can help it deliver fuller IT infrastructure solutions.
Through agreements with network carriers, such as tw telecom, Level 3 and Zayo, Digital Realty also offers direct connectivity to Amazon Web Services and Microsoft Azure clouds. The VMware cloud option is different, however, since the cloud infrastructure lives within a Digital Realty facility.
The company interconnects its data centers within each metro where it is present, and then interconnects the metros with each other. This enables any customer in any of its data centers to establish a direct link to VMware’s cloud in Dallas.
The Dallas facility is one of several that host vCloud Air infrastructure. The VMware hybrid cloud service also lives in four more U.S. data centers operated by other providers, one data center in Japan, and another Digital Realty facility in Slough, UK (just outside London).
Freebies await first customers
Joe Goldsmith, vice president of sales and global alliances at Digital Realty, said the company will provide the direct connection to vCloud Air for free to customers who sign up between now and the end of the year. VMware will also give the data center provider’s customers who take advantage of the offering a 15-percent discount on the cloud services.
Customers can also get free cross-connects between their equipment and the network meet-me rooms at the data centers where they are located, as well as between VMware’s environment and the meet-me room in the Dallas facility. Setup and installation takes three to four weeks.
Digital’s growing ecosystem play
Digital Realty has been pursuing partnerships with service providers quite aggressively. The VMware deal is the latest in a series of agreements the San Francisco-based real estate investment trust has made.
Only last week it announced a partnership with Carpathia Hosting to provide infrastructure solutions that combine data center space and Carpathia’s managed hosting and cloud infrastructure services.
Michael Bohlig, a former AWS exec who came to Digital Realty six months ago to grow this partner ecosystem, said the company has struck partnerships with a range of other cloud and managed services providers.
Digital Realty also has an online cloud marketplace where customers can buy cloud services from its partners.
4:00p
IT Analytics Startup Numerify Closes $15M in Series B
Numerify, which applies Big Data analytics to IT infrastructure management, announced a $15 million Series B funding round led by Sequoia Capital, with existing investor Lightspeed Venture Partners also participating.
After launching Numerify360, its turnkey IT analytics offering, earlier this year, the company remains focused on the niche market of IT executives in an industry with plenty of competitors, big and small.
Numerify aims to provide business analytics tools for those running the business of IT. Its entire stack runs in the cloud.
Numerify says it wants to replace costly custom analytics solutions with Numerify360, which provides a single view of a customer’s IT business. The company says it is already working with more than a dozen enterprises, including Netflix, University of San Francisco, Incomm and Spansion.
“Most IT executives are ‘flying blind’ when it comes to understanding the financial and operational dimensions of their service delivery, and they don’t usually have the money, time, or talent to invest in the type of analytics solution needed to effectively run IT like a business,” said Doug Leone, the Sequoia Capital partner who has now joined Numerify’s board of directors. “Numerify’s IT Business Analytics application solves this problem beautifully, with the entire stack in the cloud, making it fast and easy for customers to arrive at precise and actionable insights.”
Numerify said that the new funding will help expand sales and marketing efforts, as well as fill out its suite of cloud analytics applications.
The company has raised $23 million in two rounds since it was founded in 2012. Other investors include Amit Singh, president of Google for Work; Deep Nishar, senior vice president of products and user experience at LinkedIn; and Lane Bess, former CEO of Palo Alto Networks.
4:00p
The Hybrid Approach: Rewriting the Rules for Backup Storage
More users than ever are accessing complex applications that generate a variety of new data points, and organizations must enable these users to access critical resources from any device, at any time.
So how does a business keep up? How can an organization maintain control over an ever-evolving data management process?
Exponential data growth and long-term retention requirements are putting tremendous pressure on IT professionals who are expected to accommodate huge, rapidly-expanding quantities of unstructured and structured data while simultaneously slashing their storage costs.
Of course, simply adding more high performance primary storage is very expensive and highly inefficient. Not all data is equal, and IT managers are acutely aware that the cost effectiveness of any storage solution will directly correlate with how efficiently it matches data’s performance, capacity, and connectivity needs.
Today, many IT professionals are struggling to shrink backup windows and reduce their Recovery Point Objective (RPO) and Recovery Time Objective (RTO) targets, while concurrently coping with quantities of data that are an order of magnitude greater than those seen just a few years ago. Simply put, backup performance (along with scalability and manageability) has failed to keep pace with the needs of today’s organizations, driving up IT costs and reducing efficiency while jeopardizing effective data protection.
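The RPO and RTO targets mentioned above can be made concrete with a small back-of-envelope sketch. All figures below are hypothetical, chosen only for illustration: RPO is bounded by how often backups run, while RTO is driven by how fast data can be restored.

```python
# Hypothetical figures to illustrate RPO vs. RTO; none of these numbers
# come from the whitepaper.
backup_interval_hours = 6        # a backup runs every 6 hours
dataset_size_gb = 2_000          # data that must be restored after a failure
restore_speed_gb_per_hour = 500  # assumed throughput of the backup target

# Worst-case data loss: everything written since the last backup completed.
rpo_hours = backup_interval_hours

# Time to get back online: how long a full restore takes at that throughput.
rto_hours = dataset_size_gb / restore_speed_gb_per_hour

print(f"RPO: up to {rpo_hours} hours of data lost")
print(f"RTO: about {rto_hours:.1f} hours to restore")
```

Shrinking either number means running backups more often or restoring faster, which is exactly where backup storage performance becomes the bottleneck.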
Fortunately, there is good news.
In this whitepaper from Nexsan (Imation), we learn how innovative hybrid technology, combining the speed of SSDs with the capacity and economy of HDDs, is now bringing affordable high-performance backup storage within reach for organizations of virtually any size – even those working within tightly-constrained budgets. No longer must IT pros play catch-up when it comes to backups and restores; cost effective data protection has finally entered the fast lane.
There once was a time when SANs (block storage) were used exclusively to support transactional, structured data driven by database applications—often the core of an organization’s business; NAS (file storage) was associated with user home directories and office productivity applications. But now, even the most critical databases are being run on NAS devices. Server virtualization has further raised the profile of file storage, with applications like VMware storing and manipulating entire server instances as individual files.
Download this whitepaper today to learn how block and file data types are both driving the fundamental storage requirements of virtually every IT infrastructure. Additionally, you’ll find that the ability to support SAN and NAS in a single unified hybrid backup solution pays multiple dividends:
- Greater consolidation
- Easier management
- Higher capacity utilization
- Lower CAPEX and OPEX
- Future-proof expansion
Hybrid technology is leading a new era of backup performance and, when allied with a unified storage architecture, can provide unprecedented levels of efficiency, manageability and cost-effectiveness in data protection.
4:33p
Microsoft to Launch Azure Data Center in India
Microsoft is planning to establish Azure data centers in India to provide cloud services inside the country from local facilities, the company announced Tuesday. CEO Satya Nadella made the announcement while on a visit to New Delhi, saying the new facilities will come online by the end of 2015, but he did not specify how many data centers were planned or how big they would be.
Cloud providers establish local data centers to improve performance for customers in the surrounding areas. “Data sovereignty” has also become a more acute concern in the post-Snowden world. The third reason Microsoft gave is geographic redundancy: the more physical locations data is stored in, the more resilient the infrastructure is as a whole.
Bhaskar Pramanik, chairman of Microsoft India, said a local Azure data center would give the company a chance to sell to new sets of customers in the country that have to keep their data within its borders. “This opens new possibilities in e-governance, financial inclusion, healthcare and education, and will help us positively impact the lives of a billion people,” he said in a statement.
“With more than 250 million Indians using Internet-connected devices today, there is incredible demand and opportunity for India with Microsoft’s cloud services,” Nadella said in a statement.
Microsoft said its cloud services revenue in India has grown 100 percent over the past year. Customers include Bajaj Finance, Fortis Hospitals and the advertising agency FCB/ULKA.
In addition to well established enterprises, Microsoft is aiming Azure services at Indian startups. The company has a program to provide Azure service credits to local startups, totaling $60,000.
Microsoft is also reportedly planning to launch an Azure data center in Busan, South Korea. Nadella visited the port city earlier this month to discuss the project with local officials, the Korea Herald reported.
Another major international data center capacity expansion is expected in Germany, according to news reports.
5:20p
Oracle to Add Two Data Centers in Germany
Oracle is going to establish two data centers in Germany from which to serve Oracle cloud services. The company said there will be a primary German data center in an Interxion facility and a secondary one in an Equinix facility.
Oracle said the data centers will be in Frankfurt and Munich. Equinix has data centers in both cities, while Interxion does not have a presence in Munich, meaning Oracle will use an Interxion data center in Frankfurt as the primary site and an Equinix one in Munich as the secondary facility.
Oracle is going after the cloud services market with a vengeance. At its OpenWorld conference in San Francisco this week, the company rolled out close to 200 new Software-as-a-Service applications. The company also has market domination ambitions in the Platform-as-a-Service and Infrastructure-as-a-Service markets.
These will be the first Oracle cloud data centers in Germany, although the company already has two other data centers in Europe, one in the UK and another in the Netherlands. Oracle said the new facilities will be dedicated to serving customers who are sensitive about the location of their data.
The concept of “data sovereignty” has always been important in certain verticals, such as financial services, government and healthcare. These concerns have grown in importance around the world since last year’s revelations about the U.S. government’s Internet and telecommunications surveillance programs.
According to recent reports, new German data centers by Amazon Web Services and Microsoft Azure, both cloud services giants, are in the works. Oracle executives said this week they are trying to take on both of these companies, and others, across the SaaS, PaaS and IaaS markets.
Recently appointed Oracle CEO Mark Hurd said in a press conference Monday that the company’s IaaS offering would match the already low prices the incumbents offer. “We will price our infrastructure right on top of or exact same … as Amazon and other providers,” he said.
When they launch in 2014, Oracle’s new German sites will serve the company’s ERP, HCM, Sales, Service and Talent Management cloud services. They will add to the company’s existing 19 data centers across eight countries.
Oracle has data centers in North America, Latin America and Asia Pacific. “And my guess is we’ll move across other geographies as we go into the future,” Hurd said.
8:12p
Social Security Finally Gets Shiny New (and Huge) Maryland Data Center
After a long and arduous road, the Social Security Administration has finally opened a data center to replace a 34-year-old building in Baltimore, Maryland. The project suffered several delays, including site selection snafus, and was funded by $500 million in economic stimulus funds from the American Recovery and Reinvestment Act of 2009.
The new 300,000-square-foot Social Security data center is in Frederick, along Interstate 270. It replaces a legacy data center that was “severely limited” in its capacity. The agency said it “remain[ed] troubled about the growing risk of structural problems” at the old data center.
The data center is responsible for maintaining earnings and benefits information for nearly every American worker, processing 75 million transactions per day.
The new facility is one-third smaller than the building it replaces and uses about 30 percent less electricity than a typical data center, officials told the Washington Post. Four acres of photovoltaic panels supplement its power supply.
The General Services Administration’s timetable for the data center was September 2014, so the new facility opened on time. However, the road to this point goes back several years.
In April of 2010, government auditors expressed concern that the site selection process had not given enough consideration to the cost of electric power. By August the selection had been narrowed to two sites in Maryland, one in Urbana and another in Woodlawn, not far from the existing Woodlawn site of the agency’s primary data center.
The decision on location was set to occur in September 2010, but questions were raised about the SSA’s decision to buy new land rather than find existing space at the Woodlawn campus. The purchase was scheduled for December 2010, but the land was not finally bought until September 2011, in Frederick, near an existing Fannie Mae data center.
There is another Social Security data center in North Carolina, serving as a backup facility for the Woodlawn site. The agency previously used a commercial data center for backup.
Data centers continue to be a priority issue for the federal government, which has been consolidating its sprawling critical facilities infrastructure.
The Government Accountability Office reported on the Federal Data Center Consolidation Initiative (FDCCI) last Thursday, estimating that agencies could save as much as $3.1 billion through next year by consolidating. Agencies had reported only $876 million in savings for the same period, however, and the report charged that there were significant problems with how savings were being reported.
The U.S. Senate voted Monday in favor of legislation mandating consolidation.
8:36p
OpenStack Founding Father McKenty Joins Pivotal’s Cloud Foundry Team as CTO
Pivotal has landed major talent in the form of Joshua McKenty, a cloud visionary and one of the original creators of OpenStack. McKenty joins Pivotal as field CTO for Cloud Foundry, the popular open source Platform-as-a-Service.
McKenty’s fingerprints are all over non-proprietary cloud technology as we know it today. In addition to being co-founder of OpenStack, the widely adopted open source framework for creating public and private clouds, he also founded Piston Cloud Computing, provider of a cloud operating system based on OpenStack. Piston was the first to have a commercially supported OpenStack-Cloud Foundry integration.
He also acted as founding technical architect of NASA Nebula, the federal government’s first cloud computing platform, which played a major role in OpenStack’s creation.
Pivotal is stacking its bench with talent. In addition to McKenty, the software company (majority owned by EMC) also recently hired Andrew Shafer, co-founder of the DevOps IT automation firm Puppet Labs, and Simon Elisha, who previously served as cloud architect at Amazon Web Services.
McKenty brings two decades of experience across everything from maturing a startup to building cloud architecture.
“With OpenStack and Piston both on firm course, I’m taking this opportunity to climb ‘up the stack’ as it were, and join Pivotal as field CTO for Cloud Foundry,” McKenty wrote on Piston’s blog. “This position will give me an opportunity to work directly with the Fortune 100, on the front lines of 3rd platform adoption. I will also be working with the Cloud Foundry product team, translating these real-world challenges into user stories.”
Cloud Foundry, an open source project born at VMware and taken over by Pivotal after EMC formed the startup, is an enterprise application platform. It automates resources across OpenStack, vSphere and Amazon Web Services, enabling a reduction in the complexity of application lifecycle management for developers and IT operators. Pivotal CF is the company’s commercial distribution of the PaaS.
“I believe Cloud Foundry will become as boring (which is to say, fundamentally ubiquitous) as IaaS, as the software platform for modern app development, and I’m delighted to be part of that,” said McKenty.
9:19p
CoreSite, Digital Realty Join Obama’s Better Buildings Challenge
CoreSite Realty and Digital Realty Trust have joined the Department of Energy’s Better Buildings Challenge, a major component of President Barack Obama’s Climate Action Plan, whose goals include reducing data center energy consumption.
By joining the program, the two data center providers (both among the largest wholesale players in the U.S.) have committed to reducing their data center energy use by at least 20 percent over the next 10 years. They joined along with 17 other companies and government organizations that pledged the same reductions.
The other companies that joined were eBay, Schneider Electric, Home Depot and Staples. Also joining were numerous DoE national labs, other federal agencies and Michigan State University.
Together, the new participants operate data centers that draw more than 90 megawatts of power, according to the DoE.
The department estimated that if all data centers in the U.S. became 20 percent more energy efficient, the country’s energy consumption as a whole would go down by more than 20 billion kWh by 2020, Energy Secretary Ernest Moniz said in a statement. “As the Better Buildings Challenge expands, leading organizations are partnering with the Department to apply energy efficiency measures and energy management strategies that will shape the nation’s next-generation of data centers,” he said.
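The DoE’s savings figure is simple arithmetic once a baseline is assumed. The sketch below assumes total U.S. data center consumption of roughly 100 billion kWh per year, a figure assumed for illustration and not stated in this article, and applies the 20 percent efficiency gain:

```python
# Back-of-envelope check of the DoE estimate. The ~100 billion kWh baseline
# is an assumption for illustration, not a figure from this article.
us_dc_consumption_kwh = 100e9  # assumed annual U.S. data center energy use
efficiency_gain = 0.20         # 20 percent improvement across the board

savings_kwh = us_dc_consumption_kwh * efficiency_gain
print(f"Estimated savings: {savings_kwh / 1e9:.0f} billion kWh per year")
```

Under that assumed baseline, a 20 percent efficiency gain lands right at the 20 billion kWh figure the department cited.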
About 200 organizations have joined the Better Buildings Challenge so far. According to the DoE, they have already completed upgrades to more than 9,000 facilities.
The DoE expects CoreSite, Digital Realty and others to report results of their energy efficiency improvements in the first year.
Here is the full list of government agencies that have recently joined the challenge:
- Argonne National Laboratory
- Department of Defense, Defense Information Systems Agency
- Environmental Molecular Sciences Laboratory
- Environmental Protection Agency
- Department of Justice, Drug Enforcement Administration
- Lawrence Berkeley National Laboratory
- Los Alamos National Laboratory
- Michigan State University
- National Aeronautics and Space Administration
- National Energy Research Scientific Computing Center
- National Renewable Energy Laboratory
- Social Security Administration
- Department of Veterans Affairs
The DoE has a similar challenge program just for federal agencies. The government’s Data Center Energy Challenge is organized by Lawrence Berkeley National Laboratory and the Government Information Technology Executive Council.
9:30p
Oracle in Discussions to Bring Cloud Data Center to China
Oracle may soon partner with a Chinese service provider to establish an Oracle cloud data center in China to provide services to customers in the country.
Thomas Kurian, Oracle’s executive vice president of product development, said the company was currently in discussions with several players who could help it establish data center presence in the country but had not made any concrete decisions yet.
“We haven’t made a decision on who,” he said during a press conference at Oracle OpenWorld in San Francisco Tuesday. “As soon as that happens … you’ll see a number of partnerships.”
Oracle already has customers in China using its cloud-based software services, such as HR, ERP or CRM. A local data center would improve service performance for those customers and enable the company to serve customers that have to have their data stored within the country.
Cloud service performance usually improves when a service is hosted at a local data center instead of one overseas. While this is true elsewhere around the world, it is an even bigger deal in China.
The Chinese government’s Internet surveillance and censorship system, called Golden Shield and also referred to as the Great Firewall of China, can slow down Internet traffic crossing the country’s borders and cause errors, according to Brian Klingbeil, senior vice president of international development at CenturyLink, which recently established a data center in Shanghai.
Even though CenturyLink has data centers in Hong Kong, Singapore and Tokyo, it decided to launch one in mainland China to improve performance for local customers as well as to serve the more location-sensitive organizations in the country.
A partnership with a local provider is a must for a foreign company to establish data center presence in China, where regulations preclude outsiders from doing business on their own. CenturyLink is using Chinese IT services provider Neusoft to buy hardware and lease data center space from a company called GDS in Shanghai on its behalf.
Oracle is likely to do something along similar lines if it does decide to start hosting cloud services in China.
At the kick-off of this week’s conference in San Francisco the company announced it would establish two data centers in Germany to host Oracle cloud services. It is using data center providers Interxion and Equinix to do that.