Data Center Knowledge | News and analysis for the data center industry
Thursday, February 5th, 2015
1:22a | The Data Centers Sues University of Delaware
Fallout from the failed data center construction project in Newark, Delaware, continues to get nastier and nastier.
First, the University of Delaware broke its lease agreement with the developer (the facility was to be built on university-owned property), dooming the project. What followed were lawsuits against The Data Centers LLC (the developer), filed by companies it had hired to do initial work on the project. Then came a lawsuit by one of the partners in TDC against another.
Now, TDC is suing the university, claiming its leadership sabotaged the $1.3 billion project, which would have included a 280-megawatt cogeneration plant fueled by natural gas. The company filed the suit in Delaware Superior Court Wednesday, Delaware Online reported.
Essentially, the plaintiff alleges that UD officials caved to community pressure and killed the project after signing a 75-year lease agreement with the developer, costing the developer $200 million or more. TDC claims the officials were disingenuous about their stated reasons for breaking the lease, framing them to make sure the decision was legally defensible.
Members of the community who opposed the project were mainly concerned with the impact the power plant would have on the area. TDC was also accused of not being forthcoming enough with details about its plans.
The plans were for something unprecedented: a large-scale data center powered completely by a cogeneration plant. But, perhaps ironically, the power plant, its most cutting-edge feature, was what brought it to its knees.
There is at least one other planned large-scale data center construction project that is facing strong community opposition today. Amazon subsidiary Vadata wants to build a data center for Amazon in Haymarket, Virginia.
A group of residents in the area has been vocal in opposing construction of an electrical transmission line that would be required to power the future facility. Two state legislators have sided with the opposition, and one of them has penned a bill that would make it much harder for the project to move forward.
1:58a | VMware’s Government Cloud Gets FedRAMP Certification
VMware’s vCloud Government Service has achieved FedRAMP certification, which means government agencies can now use a service that has been officially confirmed to meet standard government security requirements.
VMware’s government cloud service, provided by its partner Carpathia out of its data centers, joins a relatively short list of FedRAMP-certified cloud service providers. Others on the list include Akamai, AT&T, HP, IBM, Lockheed Martin, Microsoft, Oracle, Salesforce, Verizon, and QTS.
The government created FedRAMP to fast-track adoption of cloud services by its agencies. Instead of each agency individually evaluating different vendors to make sure they are compliant with security requirements, agencies can simply pick a service from the list of pre-screened vendors.
VMware’s government cloud received the certification Tuesday, according to the FedRAMP website.
VMware’s government cloud services are similar to its commercial cloud services. The company offers seamless integration between the customer’s own data center and VMware’s multi-tenant cloud hosted in the provider’s facilities (in the government cloud’s case, Carpathia’s facilities).
So far, no public cloud provider has been able to beat Amazon Web Services in the amount of Infrastructure-as-a-Service business it does with federal agencies (or in the private sector for that matter). VMware is a very serious contender in this space because of the ubiquity of its data center software in customers’ data centers.
Users often want to keep some of their infrastructure in-house but integrate it with public cloud for elasticity. If VMware can convince federal IT leaders that it’s better for them to extend their in-house environments with its cloud services than with AWS or others’, it wins.
1:00p | Defense Department Warming to Commercial Cloud Services
The U.S. Department of Defense is moving more data to the cloud and wants a closer partnership with commercial cloud providers.
Currently, the Defense Information Systems Agency (DISA) is providing these services, but cost savings considerations have the DoD assessing commercial alternatives for some types of data.
The DoD plans to use more commercial IT services and infrastructure on the whole where it makes sense. At a recent industry day, acting DoD CIO Terry Halvorsen revealed the DoD is considering a commercial solution for the next version of its unclassified enterprise email.
The move to cloud is driven by cost reductions, technical efficiencies, and security considerations. In the early stages of transitioning to the cloud, it’s important to communicate with defense industry partners, Halvorsen said in a recent DoD release.
He also said that it’s important to move all non-sensitive data, such as public-facing websites, to the commercial cloud as soon as possible.
Halvorsen has spoken extensively at conferences about the DoD and cloud. During a March MeriTalk event, the Navy’s move to cloud was highlighted.
DISA is feeling pressure to reduce costs. Halvorsen praised the agency during a recent DoD Cloud Industry Day for reducing costs 10 percent, but said these reductions are not enough.
The event was the first in a series of planned Cloud Industry Days. The events are meant to create an open dialog on driving modernization and streamlining of government IT.
The DoD released a new security requirements guide several weeks ago, which outlines security demands specifically from cloud service providers. Those providers meeting FedRAMP standards are eligible to handle the DoD’s less sensitive data without any additional security.
FedRAMP is a government-wide program that provides a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services. Just this week, VMware was added to the short list of FedRAMP-certified IaaS providers, which also includes Amazon Web Services, Microsoft, IBM, and HP, among others.
Currently, the department requires a “FedRAMP Plus” certification that adds layers of security when dealing with more sensitive data. The goal is to bring FedRAMP up to par as a national standard, which will involve treating different types of data differently and outlining vertical-specific needs.
Certain types of data will remain inside the government because of financial, technical, and political risk. This is where VMware has a special advantage, since it is already omnipresent in government data centers and since its IaaS is specifically designed to make it easy for users to extend their in-house environments to a public cloud.
Commercial milCloud?
DISA has a cloud services product portfolio called milCloud. It combines commercial and government-built technologies for users to build and maintain DoD applications.
The agency opened the milCloud marketplace to Defense Department mission partners last October. It provides access to both classified and unclassified services.
DISA is not simply a cloud broker, as DoD Principal Deputy Alan Estevez pointed out at the industry day event. It develops tools to show the pluses and minuses of going to cloud. The biggest cost drivers are people and security. It’s also important to determine the right level of security for different types of data.
Security Remains a Top Concern
Recent surveys, such as MeriTalk’s “Cloud Without Commitment” and “Heart of the Network: Data Center Defense,” show security issues continue to be top-of-mind among agencies dealing with cloud and data center consolidation.
The surveys look at what types of applications are moving and where concerns lie. Both reveal strong interest, but also apprehension, in the very early days of the overhaul.
Fewer than 20 percent of federal agencies deliver more than one-quarter of their IT services either fully or partially via the cloud, according to one survey. The early movers are email, web hosting, storage, collaboration environments, and testing and development.
What hasn’t been moving are traditional business applications, custom applications, and disaster recovery: only a third of agencies using cloud have moved these workloads to the cloud in some way.
Security’s Big Role in FDCCI
While FedRAMP deals with security, the Federal Data Center Consolidation Initiative’s goal is to reduce the number of physical data centers.
In a MeriTalk survey, over 40 percent of respondents said integration will prove to be the top security challenge of FDCCI.
Nearly half say that cyber security is more challenging as they modernize, and 70 percent are concerned about security within the data center fabric. The study was underwritten by Palo Alto Networks.
Nearly three-quarters give their agency a grade of “A” or “B” for security efforts during modernization, but half say key security measures are still absent. Automation, mobile device management, and endpoint security management top the list of missing measures.
The report identifies advanced targeted attacks and advanced persistent threats, malware on host servers, and network viruses as the top three concerns.
“Many agencies have focused security efforts at the perimeter,” Steve Hoffman, a regional sales vice president at Palo Alto Networks, said. “But, as we consider increasingly sophisticated cyber security attacks, all government agencies need a platform approach to protect the heart of their network – the data center – while safely enabling business applications. They need to be able to correlate known and new threats and take preventative action, not just detect and remediate.”
4:30p | How to Adopt a Hybrid Cloud Strategy When One Size Does Not Fit All
Lief Morin is the president of Key Information Systems, Inc., a leading regional systems integrator with world-class compute, storage and networking solutions and professional services for the most advanced software-defined data centers.
Gartner predicts that 74 percent of enterprises are pursuing hybrid cloud strategies, and given the conversation around hybrid cloud this year, that number is only expected to rise in 2015.
Large-scale public clouds deliver real value in terms of compute power and economic advantage, but there are still workloads and use cases that are best kept on-premise. Whether due to regulatory reasons, company preference or other factors, public cloud is not always the best solution.
The interest in hybrid cloud might be widespread, but that doesn’t mean the needs are uniform. There is no one-size-fits-all hybrid cloud approach. You need to find the strategies, technologies and expert help that best match your company’s individual needs.
Which Cloud is Best for You?
In most instances, the best way to determine which cloud is best for your business is to look at the infrastructure you have today, and then consider what you’re looking to build and what you want to see as the end result. Enterprises need to ask fundamental business questions, including:
- What are we doing as a business and why?
- What does our budget look like?
- What are our security needs?
- What cloud skillsets do we have in house and what do we need to outsource?
If you are evaluating public cloud alternatives that offer limited customer service, where your data and applications are secondary to the provider’s business model, you need to look at more than just price. Organizations considering a public cloud solution need to ask themselves:
- What kind of ongoing support do we need?
- Do we need to know where our data is being housed?
- What kind of service-level agreement is necessary for our business?
With most of the big players, the price is a reflection of the service; low prices likely mean you won’t have access to people when you need support.
Private clouds, while offering the maximum in security and privacy, can be expensive. Unless you need a maximum level of security for all of your data, private clouds can be overkill. This is where hybrid cloud has found a niche – it offers the affordability and accessibility of a public cloud while providing the option of privacy and security for your sensitive data.
Driving Your Own Hybrid Strategy
When crafting a hybrid cloud strategy, it is best for IT teams to take a phased approach. Step one is to examine all the different cloud options and solutions that meet your company’s needs – not the needs of your competitor, your neighbor or an enterprise you saw featured in a case study. Identify the blend of public cloud and on-premise solutions that fit your company to create the best possible mix.
If you decide to rely on your own internal team for cloud implementation, the next step is to develop a clearly defined strategy and ask key questions, including:
- Have we considered the capabilities the cloud might offer next year or in five years?
- Is the cloud solution we’ve chosen the best bet for scaling over time?
- Which capabilities can public cloud providers offer us, and which will our team have to provide?
- What kind of post-migration support will the team need, and are we equipped to provide it?
Cloud implementation presents a constant learning curve. To be successful, in-house teams need to be knowledgeable, open-minded and willing to continue their overall cloud education. Many organizations provide ongoing training for IT staff to stay ahead of cloud innovation, but others choose to partner with a cloud provider for specialized expertise.
Partnering With a Provider for Hybrid Cloud Help
As they pursue hybrid cloud strategies, enterprises are looking for highly integrated, well-architected infrastructures. For this reason, many enterprises choose to work with a channel partner or a consultant that can provide strategic guidance around cloud adoption. This guidance should be based on a provider’s holistic understanding of and experience with all the related components of the IT system. In addition, an external partner can save an enterprise the cost of keeping its IT staff, which can be transient, trained on rapidly developing technologies.
If you opt to seek out expert assistance with your hybrid cloud strategies, keep regional issues in mind. Are you partnering with someone who can:
- Deliver superior service?
- Meet your latency requirements?
- Tell you where your data will reside – preferably somewhere nearby?
Regional cloud providers can answer these questions affirmatively.
Hybrid Cloud Trends in 2015
With companies eager to take advantage of an architecture that offers the benefits of cloud computing alongside the option of on-premise operations, cloud interest and adoption will continue to rise. As your IT team determines how to best map out and adopt hybrid cloud strategies, you should look first and foremost at your own requirements, rather than seeking out one-size-fits-all solutions for highly individualized needs.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
5:00p | Going Cloud: How to Squeeze More Out of Your White Space
Data center operators often wonder how to make more revenue within their existing facility. Partnering with a cloud services provider may boost their capacity and performance, without a lot of overhead.
“Every day we are talking to data centers about how do they make their white space a commodity? How do they make $20,000 or $30,000 per rack instead of $2,000?” said Daniel Pfeiffer, vice president of marketing and partnerships at OrionVM, a turn-key, wholesale cloud infrastructure provider. The answer, he says, is leveraging a wholesale cloud partner.
Sheng Yeo, OrionVM CEO, said that as a turn-key, wholesale cloud infrastructure provider, his company is able to white label cloud infrastructure for multiple industries and applications, including customers in telecommunications, pharmaceuticals, government and healthcare. “We have a cloud platform built for enterprise apps,” he said.
How to approach cloud and make the most of cloud services is one of the continuing issues faced by data center managers and operators, and the spring Data Center World event has several sessions on cloud and cloud options. Yeo will be presenting on the concept of wholesale cloud and how it fits for enterprises as well as within the service offerings of data centers. In advance of the spring Data Center World Global Conference, Data Center Knowledge had the opportunity to discuss cloud strategy with Yeo and Pfeiffer.
Pfeiffer explained that OrionVM sees data centers as fundamental partners. “We cloud enable them and make them relevant to their customers,” he added.
“The hosting and managed services companies are interested,” said Pfeiffer. “They have explored AWS and Google and the “build your own” options. They are thinking how do we deliver capacity without diluting business? AWS requires that you change what you are doing. It is different from what you do.”
Yeo said that OrionVM’s cloud approach is “market tested and market proven.” The company currently has 450 global clients. At present, they use Equinix data centers in the Bay Area and have several facilities in Australia. They are looking to expand to the East Coast of the United States.
The future is moving toward larger cloud providers, said Yeo. “We are going away from the disparate retail cloud providers. It’s kind of like the ecosystem of data center providers, which has turned from thousands of tiny providers to fewer, larger providers.”
[Graphic by OrionVM: Traditional cloud architecture]
[Graphic by OrionVM: How OrionVM structures its technology stack]
Looking to the Supercomputing World for Inspiration
“Our next-generation wholesale cloud is built on a very efficient platform,” said Yeo. “It is a faster platform; for example, our cloud storage is 25-30 times faster than AWS.”
The performance focus comes from the world of supercomputing, said Pfeiffer. “The heritage of OrionVM is we looked at super-efficient computing for enterprise environments, and looked at the high-performance computing (HPC) space,” Pfeiffer explained. “In the HPC space, they optimize on performance, getting it out of every dollar spent on infrastructure.”
The company has people on the team who come from supercomputing backgrounds and couples their expertise with an enterprise-grade architecture. While it uses Linux, a development team wraps its own code around the operating system for software-defined storage, software-defined networking, and virtualization functions.
“We have a development team in Sydney who have been working on this for the last five years. We have built it from the ground up. We are removing the client’s infrastructure costs,” Pfeiffer said.
To find out more about wholesale cloud services and the growing cloud ecosystem, attend the session by Yeo at spring Data Center World Global Conference in Las Vegas. Learn more and register at the Data Center World website.
6:14p | DuPont Fabros Names NTT Exec Eldredge CEO
DuPont Fabros Technology has named Christopher Eldredge its president and CEO following a two-year search for a successor to Hossein Fateh, who will be stepping down after 18 years in the role. Eldredge comes from NTT America, where he has served as executive vice president since 2013.
DFT originally announced it was looking for a successor to Fateh in 2013. The original plan was to appoint someone as president and have them transition into the CEO role later. The real estate investment trust put the search on the fast track in late 2014, saying it was looking for someone ready to take over the CEO role immediately. Fateh, one of the company’s founders, will remain on the board, serving as vice chairman.
Eldredge joins during a time of change for DFT and for the wholesale data center industry at large. The company recently brought online the massive ACC7 data center in Ashburn, Virginia, the first facility employing a new data center design it plans to replicate in all future builds. The design was also used in a 9-megawatt expansion in Santa Clara, California.
The North American wholesale data center market in general is doing well, up 37 percent in 2014, according to North American Data Centers, a commercial real estate firm specializing in data center space. But the market is undergoing some changes. Wholesale providers are agreeing to smaller-than-usual deals and have been differentiating their services beyond the traditional powered-shell or turnkey data center space.
DFT announced intentions to pursue smaller deals and enter the retail colocation market early last year. Its biggest competitor, Digital Realty Trust, which also recently named a new CEO, has been making moves along similar lines.
Eldredge’s previous company, a subsidiary of Japan’s giant Nippon Telegraph and Telephone Corp., has been expanding its data center business in the U.S. aggressively. Its biggest move was acquiring a majority stake in RagingWire, a DFT competitor in some markets, last year.
At NTT America, he was responsible for the network services business.
Prior to NTT, Eldredge was president and general manager of Ethernet exchange and product management at Telx Group. In the past he has also held executive leadership roles at Broadview Networks and Frontier Communications (formerly Citizens Communications). His early career was with Cablevision Lightpath.
“It’s an honor to take the helm of a company built on such a solid foundation,” Eldredge said in a statement. “I look forward to collaborating with the board, executive team, and employees to build on DFT’s momentum.”
In his statement, Fateh said he was confident in Eldredge’s ability to assume leadership immediately. “In my role as vice chairman, co-founder and shareholder, I remain committed to DFT and available to Chris for insight and support,” he added.
DFT held its initial public offering in 2007, during a time of economic turmoil. The company even halted some projects early on in its public life, before the credit spigot finally reopened. The data center sector proved to be one of the most resilient and best-performing sectors through the recent recession.
The company’s portfolio now consists of 11 data centers located in four major U.S. markets, totaling 2.75 million gross square feet and 240 megawatts of critical load.
7:35p | Dimension Data to Add 300 Data Center Jobs
Dimension Data, a South Africa-based global IT services giant, announced plans to fill 300 data center jobs over the next 18 months as the company seeks to take the size of its data center business from $1 billion to $4 billion by 2018.
The company said it will be recruiting experts for data center jobs around the world: in North America, Europe, Middle East, Africa, and Asia-Pacific. Dimension Data already provides a wide range of services, but said it was planning to invest big in new consulting, managed, and cloud services.
The $6.7-billion company is a wholly owned subsidiary of Japan’s Nippon Telegraph and Telephone Corp., which bought it for $3.2 billion in 2010.
“Our clients want to do things differently, and look to us to help them with alternative and innovative delivery and consumption approaches,” Steve Joubert, group executive of the company’s data center business unit, said in a statement. “With NTT’s backing, we’re in a strong position to disrupt the market with our value proposition, footprint and people, and take market share.”
Dimension Data announced its plans to quadruple the size of its data center business in 2014. To that end it acquired a California IT services company called Nexus in April of last year and launched global managed services for data centers in September. The Nexus acquisition expanded its U.S. operations by 40 percent.
8:38p | IBM Puts More Machine Learning Services in Watson Cloud
IBM has added five new services in beta to Watson Developer Cloud, which offers developers a way to build applications that incorporate Watson’s cognitive computing capabilities. IBM made some of those capabilities available to the masses in 2013 and now claims that over 6,000 applications tapping them have been built to date.
The new services are speech to text, text to speech, visual recognition, concept insights and trade-off analytics. Watson Developer Cloud now has 13 services available in beta. These services aim to enable a new class of applications that employ machine learning, understand natural language, and identify hidden patterns.
About the new services, available today:
Speech to Text: IBM claims this is one of the first real-time services with accurate, low-latency speech recognition capabilities. Watson is adept at understanding natural language. The service applies machine intelligence related to grammar and language structures within a specific context for more accurate transcription. In other words, it’s good with homonyms. If you’re converting Jimi Hendrix lyrics from speech to text, the likelihood of the transcription reading “’scuse me while I kiss the sky” instead of “excuse me while I kiss this guy” is greatly increased.
Text to Speech: This service supports speech synthesis in both English and Spanish, with three optional voices across the two languages. One of those options is the voice Watson used in its famous 2011 Jeopardy match. You can get Watson to say “I’m sorry, Dave. I’m afraid I can’t do that,” in your modern retelling of ‘2001: A Space Odyssey.’
Visual Recognition: Visual recognition analyzes various forms of media and helps to collect and organize large sets of visual data to build semantic associations. Again using machine learning technology, it recognizes an array of visual entities like settings, objects, and events to understand content of images and videos. It can even tell the difference between Bill Paxton and Bill Pullman.
Concept Insights: The service looks at text in a conceptual manner. It has a search capability that discovers new insights in text compared to traditional keyword searches. Concept Insights links user-provided documents with a pre-existing graph of concepts based on Wikipedia. These links can be explicit (the document directly mentions a concept) or implicit (the service links the document to relevant concepts that are not directly mentioned). Kids, when I was young, I had to do this kind of thinking manually.
Trade-off Analytics: This service helps you make better choices by dynamically weighing multiple and often conflicting goals. It uses Pareto filtering techniques to identify optimal alternatives across multiple criteria. It uses various analytical and visual approaches. Here, IBM might have finally built something that can help married couples figure out what to eat for dinner when one person says “they’re up for anything” but rejects everything the other suggests.
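To make the Pareto-filtering idea concrete, here is a minimal, illustrative sketch in Python of how non-dominated (Pareto-optimal) alternatives can be identified across multiple criteria. It is not IBM’s implementation; the dinner-choice data and the assumption that every criterion is scored so that higher is better are ours.

```python
# Illustrative Pareto filter: keep alternatives that no other alternative
# dominates (i.e., is at least as good on every criterion and strictly
# better on at least one). Assumes higher scores are better for all criteria.

def dominates(a, b):
    """True if alternative a dominates alternative b."""
    return (all(x >= y for x, y in zip(a["scores"], b["scores"]))
            and any(x > y for x, y in zip(a["scores"], b["scores"])))

def pareto_filter(alternatives):
    """Return the alternatives not dominated by any other alternative."""
    return [a for a in alternatives
            if not any(dominates(b, a) for b in alternatives if b is not a)]

# Hypothetical dinner options scored on (taste, price, speed); higher is better.
options = [
    {"name": "sushi",     "scores": (9, 3, 5)},
    {"name": "pizza",     "scores": (7, 8, 8)},
    {"name": "salad",     "scores": (5, 9, 9)},
    {"name": "leftovers", "scores": (4, 10, 10)},
    {"name": "burger",    "scores": (6, 7, 7)},  # dominated by pizza, filtered out
]

for option in pareto_filter(options):
    print(option["name"], option["scores"])
```

The service layers analytics and visualization on top of this kind of filtering, but the core idea of weighing conflicting goals against each other is the same.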
Cognitive computing has implications in many fields, a point highlighted by some of the promotional work IBM did prior to making Watson widely available to developers. Some notable mainstream uses include the appearance on Jeopardy, starring in late-night monologues, and coming up with unique recipes at SXSW in Austin.
Watson is versatile when it comes to wider industry applications. In healthcare, for example, Watson can make some of the recommendations a doctor would make, delivered to a smartphone.
IBM already has numerous cloud services that leverage Watson and provides APIs developers can use to tap into the capabilities in their own applications through its Bluemix Platform-as-a-Service. Watson is one of the most active communities within the IBM Bluemix PaaS ecosystem.
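As a rough illustration of what tapping those capabilities through an API can look like, here is a sketch that posts an audio file to a speech-to-text REST endpoint using Python’s requests library. The endpoint URL, credentials, and response fields shown are placeholders and assumptions, not documented values; in practice they come from the service credentials Bluemix generates when a Watson service is bound to an application.

```python
# Illustrative only: the endpoint URL, credentials, and response shape are
# placeholders -- substitute the values Bluemix provides for your service instance.
import requests

SERVICE_URL = "https://example.invalid/speech-to-text/api/v1/recognize"  # placeholder
USERNAME = "service-username"   # from your Bluemix service credentials
PASSWORD = "service-password"   # from your Bluemix service credentials

def transcribe(audio_path):
    """Send a WAV file to the (hypothetical) recognize endpoint and return
    the transcript text assembled from the JSON response."""
    with open(audio_path, "rb") as audio:
        response = requests.post(
            SERVICE_URL,
            auth=(USERNAME, PASSWORD),
            headers={"Content-Type": "audio/wav"},
            data=audio,
        )
    response.raise_for_status()
    result = response.json()
    # Assumed response layout: a list of results, each with ranked alternatives.
    return " ".join(r["alternatives"][0]["transcript"]
                    for r in result.get("results", []))

if __name__ == "__main__":
    print(transcribe("kiss-the-sky.wav"))
```

Broadly, the other beta services follow a similar pattern: an authenticated HTTP call carrying the input payload, returning JSON the application can act on.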
In early 2014, the company committed $1 billion to investments in the various businesses around Watson and opened a headquarters building in New York City dedicated exclusively to Watson.
9:53p | Oracle Linux Image Added to Docker Hub
The wave of hype around Docker continues to rise, and Oracle wants a piece of the action. The Redwood Shores, California, giant has made images of its own Linux distribution available on Docker Hub, the public registry for tools and components developers can use as building blocks for their Docker-enabled applications.
Docker has created a lot of excitement among developers around its open source application container technology. As a recent survey funded by Canonical found, however, while there’s a lot of interest in Docker, there’s not a whole lot of adoption taking place. After all, the company has officially been around for less than one year.
Linux is extremely popular. More than one-third of the world’s websites run on Linux-powered servers, for example, but Oracle Linux is not a very popular distribution. The most widely used distro on the web is Debian, followed by Ubuntu, CentOS, and Red Hat, in that order, according to W3Techs, which conducts web technology surveys.
Ubuntu, the Linux distribution by Canonical, is also the most downloaded image on Docker Hub. CentOS is second.
While not widely deployed in production, Docker has garnered a lot of support from giant IT vendors and service providers. Google has a private Docker container registry service on its cloud platform; Amazon Web Services has a Docker container management service; Microsoft Azure supports Docker, and Windows has a command-line interface for Docker.
Oracle Linux is not the giant’s only presence on Docker Hub. A MySQL image was already available on the registry. Oracle owns MySQL, even though, like Linux, it’s open source.
“With Oracle Linux and MySQL images available on the Docker Hub Registry, users will be able to quickly create and publish custom Docker containers that layer applications on top of Oracle Linux and MySQL, which is a great time-saver for both independent software vendors and IT departments,” Wim Coekaerts, senior vice president for Linux and virtualization engineering at Oracle, said in a statement.
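For readers curious what using the new image looks like in practice, here is a small sketch that drives the Docker command-line client from Python to pull the Oracle Linux image and verify the base layer. The “7” tag and the release-file check are assumptions for illustration; check the image’s Docker Hub page for the tags Oracle actually publishes. Layering an application on top would then be a matter of writing a Dockerfile that starts FROM this image.

```python
# Illustrative: pull the Oracle Linux image from Docker Hub and start a
# throwaway container from it using the docker CLI. The "7" tag is an
# assumption -- check Docker Hub for the tags actually published.
import subprocess

def docker(*args):
    """Run a docker CLI command, echoing it first for clarity."""
    cmd = ["docker"] + list(args)
    print("$", " ".join(cmd))
    subprocess.check_call(cmd)

# Fetch the image Oracle published to the public registry.
docker("pull", "oraclelinux:7")

# Start a short-lived container and print the release file to confirm
# the base layer is Oracle Linux.
docker("run", "--rm", "oraclelinux:7", "cat", "/etc/oracle-release")
```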