Data Center Knowledge | News and analysis for the data center industry
Monday, November 24th, 2014
1:00p
Open Cloud Alliance Formed to Answer Germany’s Data Privacy Concerns
A group of companies including IBM, Univention, and Open-Xchange has launched the Open Cloud Alliance with the goal of making it easier for local cloud and hosting providers to offer diverse cloud services on the back of open source technology and to improve data portability. The alliance appears to be a way to address Germany’s data privacy and data sovereignty concerns.
OCA provides a single open source platform that integrates with many open source enterprise software solutions, including a single identity management solution. The aim is to make users less dependent on the large providers, a message that resonates in the European market.
The software basis for the platform is Univention Corporate Server, the open source cloud management system OpenStack, and the Univention App Center, a repository, or marketplace, of enterprise open source apps.
Reference hardware for the Open Cloud Alliance consists of Intel-based IBM/Lenovo server systems running IBM Cloud Manager, IBM’s OpenStack implementation.
Peter Ganten, CEO of Univention, said OCA is currently working with a handful of cloud service providers, focusing on a small initial phase it aims to get up and running within the next quarter. “At the moment, we are not looking to start with thousands, because there’s also the need to gain a bit more experience,” he said.
Univention and Open-Xchange are headquartered in Germany, which is an important detail. OCA’s focus starts in Germany, extends to the rest of Europe, and, its backers hope, eventually worldwide. While each European country is a unique market, data privacy and sovereignty concerns are especially acute in Germany.
Yes, Another Cloud Alliance
It can be hard to keep track of all the cloud alliances and consortia. They are rising as a result of the move away from proprietary systems as the IT playbook is rewritten for cloud. Another example is the Open Data Center Alliance, which focuses heavily on hypervisor interoperability. There’s also the Open Compute Project, fostering openness in data center hardware.
At the server level, there’s the OpenPOWER Foundation, with IBM a prominent founding member. There are also countless organizations and alliances formed around networking.
While a lot of work is being done on the Infrastructure-as-a-Service level, with projects like OpenStack, the OCA is focused on bringing the same open source philosophy to the software and services level. It is defining an open software stack.
Germans Care About Privacy More Than Others
The OCA reveals a mentality in the German market that all Internet infrastructure and service providers need to take into account. The German view on data privacy and the demand for in-country data residency have propelled local cloud providers to success.
“This idea was born in Europe, born with a European focus,” Ganten said. “We think that there is a good opportunity for smaller cloud service providers. Local initiatives are much more visible in Europe than the U.S. due to data privacy and awareness.”
The U.S. is catching up. A recent survey found only 5 percent of Americans are unaware of government surveillance. How much each country’s populace ultimately cares might be where the disparity lies.
This German background has a lot to do with the OCA’s philosophy, said Rafael Laguna, CEO of Open-Xchange. Open-Xchange is an open source email, collaboration, and productivity app provider that has massive traction with many hosting providers worldwide.
“The European and US market is different,” said Laguna. “European companies are much more conscious and careful and more wary of big U.S. cloud providers. The big proprietary silos are very anti-competitive, very monolithic, and channel-unfriendly.”
One example of how Germany and the U.S. view privacy differently is email. In Germany, it is considered extremely strange for an employer to have access to an employee’s email. In the U.S., sometimes you communicate with superiors by keeping a draft saved in your mailbox.
Data privacy concerns are born out of a history that saw the Gestapo, and later the Ministry for State Security, or Stasi, encourage spying on neighbors when the country was split, explained Laguna. That history has a profound effect. “That’s why we generally distrust governments,” said Laguna. “Because they’ve been doing bad things.”
Market Favors Smaller Players
For this reason, the European cloud market on the whole is fragmented and not dominated by a single player. Colo provider Interxion notes that it has done well with local cloud providers because the market share is spread out.
As a result, a lot of U.S.-based cloud providers have been opening data centers in Germany to meet in-country data needs. Amazon Web Services recently opened a region in Frankfurt; VMware is opening two data centers in the country; Oracle has two data centers planned, and Salesforce one. However, even having a local data center might not be enough.
“[Opening a local data center is] a good step and will be enough of a move for many,” said Laguna. “The problem is with the legislation in the U.S., and AWS being a U.S. company. Even if data sits in Germany, the U.S. government can still force access to that data.” Whether or not this actually happens, the issue is the possibility of it happening.
European confidence in the future of data privacy in the U.S. is not high. Consider the Microsoft meeting with a group of CIOs in Berlin regarding potential unilateral access to data, or Angela Merkel’s call for a separate European Internet.
Germany is a country that doesn’t have Street View updates in Google Maps.
Being born out of this mindset might be a very good thing. Open source has moved from alternative to prime time, so that part of the equation is a no-brainer. However, OCA’s Europe-centric take on data privacy is an interesting wrinkle. Will it be able to expand its influence beyond Germany? The answer depends on whether you think the world will move toward or away from current data privacy concerns.
4:30p
The Ethics of Cloud Computing
Seth Payne is a senior product manager at Skytap and previously worked as a technical product manager at the New York Stock Exchange.
“Cloud Computing” is the reigning buzzword in high-tech these days. Everything, it would seem, is either in the cloud, powered by the cloud, or some variation on the theme.
I remember when hosted email was just that: hosted email. Today, such offerings are considered cloud offerings even though they have not changed since first being introduced over 15 years ago. After all is said and done, we are still working with the same essential technology: compute, network, and storage. The reason cloud computing is so attractive is because it affords us incredible flexibility in how these basic elements are utilized – with both speed and specialized toolsets.
Cloud computing is a disruptive technology because it delivers existing technologies in new, efficient ways. As such, the essential unwritten – yet widely recognized – code of ethics that has informed IT operations for decades applies to cloud computing as well. However, because cloud computing involves third-party service providers as a matter of course, it is important to broadly review this code to determine whether the cloud introduces new challenges or presents new ethical questions.
IT Ethics Broadly Defined
Perhaps the most widely recognized ethical rule in IT is to respect the privacy of system users. IT admins, engineers, and others will often have access to employee email and personal files. It goes without saying that these admins absolutely need this access in order to manage a system and troubleshoot issues. That access, however, carries a corresponding obligation: it is to be used only for those purposes, never for snooping.
Another widely accepted ethical rule in IT is to protect customer data. When a customer pays for a firm’s services, the customer has placed significant trust in the provider and consequently, the provider has an absolute duty to protect its customers from data theft or misuse. If a data breach does occur, the provider has a duty to make the customer as whole as possible.
Providers also have the moral duty to be honest with customers regarding security policies, and at times, even system architecture. I once found myself in the unfortunate circumstance where a company I was working with had a poorly designed network architecture with a massive single point of failure. Inevitably, that single point of failure did fail, in spectacular fashion, bringing down the entire system and impacting several customers. After the mad scramble to get the system back up and running, it was important for me to discuss the details with the affected customers. I wanted to be completely honest with them, as any attempt to obfuscate would have been not only unethical but also counterproductive.
Broadly speaking, cloud services providers have a moral duty similar to members of the customer IT operations team – to be responsive. It is easy for service providers to fall into the trap of thinking that since they provide a self-service product, customers can generally manage and resolve many challenges unaided.
In crisis situations, it is also tempting for vendors to focus solely on the needs of the biggest accounts. There is some sense to this, as the majority of larger accounts pay for certain levels of support and responsiveness, something smaller accounts often forgo. However, service providers must not forget that while a major outage impacts larger customers, it also impacts one-person IT consultancy shops.
In short, service providers have a responsibility to make their customers – big or small – successful. There are myriad ways to accomplish this, through proper communication, effective product design, and thorough documentation.
Cloud Computing Ethics in Practice
Just like the core technologies, the essential ethical principles of IT remain unchanged with the advent of cloud computing. And even though the governing ethics remain largely the same, it is important to reexamine them, especially in light of the fact that so much of what used to be an entirely internal matter of operations and risk management has been entrusted to providers and individuals who sit well outside direct organizational control.
Service providers must understand the operational risk they are assuming for their customers. Providers become stewards of customer data, functional operation, and risk mitigation. Customers also have a responsibility as they are, most likely, providing services to customers of their own. Consumers of cloud services must have a deep understanding of the technology being utilized and its accompanying risks. The only way to meet this responsibility is to 1) perform due diligence when considering a third party cloud services provider and 2) maintain consistent communication with their chosen provider.
Ultimately, it comes down to some pretty simple ideas: be honest, responsible, and respectful of privacy, and treat both customers and vendors as we would like to be treated. Cloud computing can only reach its full potential if a real, lasting trust is established between providers and customers through a well-defined system of ethics.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
5:30p
Digital Realty Names Bill Stein Permanent CEO
Digital Realty Trust has named William Stein its permanent CEO. Stein, the company’s CFO, has been at the helm as interim chief executive since the resignation of founding CEO Michael Foust in March.
Stein’s appointment signals that the real estate investment trust’s board likes the direction he has taken the company since taking the helm earlier this year, and that the world’s largest wholesale data center landlord will stay the course. Digital Realty’s shares were down 0.22 percent late Monday morning Eastern Time.
This has been a year of changes for Digital Realty. The two biggest strategic shifts are pruning its massive real estate portfolio (something it had never done since its founding in 2004) and pursuing partnerships with a variety of service providers to offer packaged deals that combine its bread-and-butter wholesale data center space with services up the stack, such as network connectivity and cloud.
The company’s stock has been going up throughout the year, from about $50 per share in early January to close to $70 per share this month.
Getting Rid of Non-Core Properties
On October’s earnings call, Stein said Digital Realty had identified the first nine properties it was going to divest. The company’s CTO Jim Smith told us earlier this year that pruning the portfolio was standard practice in the real estate business, and that the current effort was something that should have been done earlier.
“We will continue to focus on growing our portfolio with the highest quality properties while culling under-performing assets, and will strive to improve operational efficiencies and strategically expand our global footprint,” Stein said in a statement.
Unneeded, or non-core, properties for Digital Realty are ones that aren’t data centers, or data center properties in markets the company’s leadership doesn’t want to grow in. They can also simply be overvalued properties in markets that are hot at the moment and can be sold at a premium, Smith said.
Diversifying Available Services
To help grow its ecosystem of service provider partners, Digital Realty hired Michael Bohlig, a former business development exec at Amazon Web Services.
The company has struck deals with carriers that can provide private network connections from its data centers to big cloud providers, such as AWS and Microsoft Azure. It has also partnered with smaller cloud service providers.
One of the most recent partnerships is with Carpathia Hosting. The two companies will jointly market infrastructure solutions that combine colocation space and a range of services offered by Carpathia, including hosting and cloud.
Another big partnership was with VMware, giving tenants at one of Digital Realty’s Dallas data centers direct access to VMware’s public cloud service called vCloud Air (formerly vCloud Hybrid Service).
5:30p
Seeing is Believing: Learn How DCIM Improves Data Center Visibility
Take a moment and examine the modern IT and corporate landscape. We have a lot more devices, many more applications, and a truly consumerized user. All of this has resulted in a boom in both data and data center utilization. Why? Because the data center is home to all of these modern technologies.
Running a data center comes with consistent requirements: maintain uptime of critical infrastructure, operate as efficiently as possible and manage costs. But today’s complex business environments also demand agility and flexibility to meet the requirements of your organization.
In this whitepaper, Emerson tests how much visibility you have into your data center and its operations. Remember, the key to true agility and flexibility is having the important information you need at your fingertips. That means knowing and seeing all of the data center assets, their consumption, and their connectivity, so you can make important decisions in an informed and conscious manner. Without this information you cannot effectively enable the business, because you can’t manage what you can’t measure, and you can’t measure what you can’t see.
Let’s take a second to test your vision: how many of the items below can you see accurately and in real time? (A small worked example of the first item follows the list.)
- Available power capacity
- Available space to add new equipment and servers
- The actual assets you have and their exact location
- Utilization for chargeback to business units
- Which power components are connected to which
- Impact of planned changes
- Power systems affected by outages
- Assets in alarm state
- Interactive 3D alarms
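As a small worked example of the first item, the sketch below computes available power capacity from per-rack provisioned budgets and measured draw. It is a minimal, vendor-neutral illustration in Python; the rack names and readings are invented and say nothing about how Trellis itself is implemented.

```python
# Hypothetical per-rack snapshot: provisioned power budget vs. measured
# draw, in kilowatts. A DCIM platform collects readings like these
# continuously; the headroom arithmetic is simple once the data exists.
racks = {
    "A01": {"provisioned_kw": 8.0, "measured_kw": 5.2},
    "A02": {"provisioned_kw": 8.0, "measured_kw": 7.9},
    "B01": {"provisioned_kw": 12.0, "measured_kw": 3.1},
}

for name, rack in sorted(racks.items()):
    headroom = rack["provisioned_kw"] - rack["measured_kw"]
    pct_used = 100.0 * rack["measured_kw"] / rack["provisioned_kw"]
    flag = "  <- nearly full" if pct_used > 90 else ""
    print("%s: %.1f kW free (%.0f%% used)%s" % (name, headroom, pct_used, flag))

total_free = sum(r["provisioned_kw"] - r["measured_kw"] for r in racks.values())
print("Total available capacity: %.1f kW" % total_free)
```

The hard part, as the list above suggests, is not the arithmetic but collecting accurate, real-time readings in the first place.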
Download this whitepaper today to learn how the Trellis platform, the leading DCIM solution from Emerson Network Power, delivers unprecedented, real-time visibility into critical infrastructure and the impact of changes.
Trellis monitors all IT and facilities resources in the data center and automates management and control to help your IT and facilities organizations realize their objectives. With this unified and complete solution, you gain the power to visualize the real situation in your data center, make informed decisions, and take action with confidence.
7:09p
Hosted Hadoop Services Firm Expands in Denver With Fortrust
Bit Refinery, which hosts and manages Hadoop clusters and VMware private clouds for customers, has expanded in the Rocky Mountain region with Fortrust.
Apache Hadoop services are a growing market, as the open source framework for storing and processing large amounts of data in parallel across clusters of commodity servers becomes more and more commonplace in the enterprise. Major Hadoop distribution providers, such as Hortonworks, MapR, and Cloudera, have built businesses by providing paid Hadoop services around free distributions of the framework.
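To make the parallel-processing model concrete, here is a minimal word count written for Hadoop Streaming, the standard utility that lets any executable act as a mapper or reducer over data stored in HDFS. This is an illustrative sketch, not code from Bit Refinery or any distribution vendor; the file names and cluster paths are hypothetical.

```python
#!/usr/bin/env python
# mapper.py - emits "word<TAB>1" for every word read from stdin.
# Hadoop runs one copy per input split, in parallel across the cluster.
import sys

for line in sys.stdin:
    for word in line.split():
        print("%s\t%d" % (word.lower(), 1))
```

```python
#!/usr/bin/env python
# reducer.py - sums the counts for each word. Hadoop sorts mapper output
# by key, so all lines for a given word arrive contiguously.
import sys

current_word, current_count = None, 0
for line in sys.stdin:
    word, count = line.rstrip("\n").split("\t", 1)
    if word != current_word:
        if current_word is not None:
            print("%s\t%d" % (current_word, current_count))
        current_word, current_count = word, 0
    current_count += int(count)
if current_word is not None:
    print("%s\t%d" % (current_word, current_count))
```

A job like this would be submitted with something like `hadoop jar hadoop-streaming.jar -input /data/logs -output /data/wordcount -mapper mapper.py -reducer reducer.py -file mapper.py -file reducer.py` (paths hypothetical); the framework distributes the work and re-runs tasks on failed nodes, which is why commodity servers suffice.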
Earlier this month, Hortonworks became the first among its peers to file for an initial public offering.
If demand for hosted Hadoop environments continues to grow, it will be a growing opportunity for data center providers like Fortrust, which signed Bit Refinery for a colocation cage in its Denver data center.
Bit Refinery uses a variety of colo providers to host its infrastructure in other markets: Telx in Silicon Valley, Atlanta, and New York, Telehouse in London, and Digital Fortress in Seattle, among others.
“Although Hadoop’s model is built around redundancy, hosting with a company located within a secure and reliable data center is a top priority,” Brandon Hieb, managing partner at Bit Refinery, said in a statement commenting on the Fortrust deal.
In addition to Denver, Fortrust offers data center services in Phoenix and New Jersey within IO.Anywhere modules by Phoenix-based IO.
8:26p
Moogsoft Adds Health Monitoring for Docker Containers
Moogsoft, a startup that sells IT monitoring software for DevOps teams, has added the capability to monitor Docker containers and OpenStack clouds to its platform.
The startup’s two founders, Phil Tee and Mike Silvey, played a key role in the creation of Netcool, the technology that provides a lot of monitoring capabilities in IBM Tivoli, one of the most widely deployed IT operations management platforms.
Moogsoft’s core product, Incident.Moog, is an early-warning system that identifies problems in the IT environment. According to Moogsoft, it relies on machine learning rather than preset rules or models.
Docker is an open source technology that enables applications to share server resources without the added layer of a hypervisor and a guest operating system on every VM – the traditional form of server virtualization. It also standardizes the way applications communicate their resource requirements, making them portable across different kinds of physical or virtual servers or clouds.
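As an illustration of that standardization, the sketch below starts a container with its resource requirements declared alongside it, using the Docker SDK for Python (the `docker` package). The SDK, the image, and the limits are assumptions for the example, not anything named in the article.

```python
# A minimal sketch, assuming "pip install docker" and a local Docker daemon.
import docker

client = docker.from_env()  # connect using the standard environment settings

# The resource requirements travel with the container itself rather than
# being configured on the host, which is what makes the workload portable.
container = client.containers.run(
    "python:3",                      # hypothetical image
    "python -c 'print(\"hello from a container\")'",
    mem_limit="256m",                # cap memory at 256 MB
    cpu_shares=512,                  # half the default relative CPU weight
    detach=True,
)
container.wait()                     # block until the process exits
print(container.logs().decode())
container.remove()
```

Because every container declares its needs in the same format, any Docker host – physical, virtual, or cloud – can schedule it.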
There is also a company called Docker, which leads the open source project and sells tools that help enterprises use containerization in their data centers. The company has been around for less than two years but the technology has already become popular with developers because it frees them from worrying about the infrastructure their applications are going to run on when they are writing them.
Moogsoft said its new monitoring feature offers visibility both into code running within Docker containers and into the software layers outside them. Other tools, while aware of containers, do not provide such a deep level of visibility, according to the company.
“Critically, it is impossible today to understand when an operational issue occurs in a containerized app, whether the issue is with the application or elsewhere in the supporting stack from the network through bare metal servers, hypervisors, OS, orchestration, containers, and so on,” Tee, the company’s CEO, said in a statement.
The software looks at events from the containers themselves, output logs of containerized apps, management software, and monitoring systems. It integrates tightly with Docker and OpenStack APIs.
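The article doesn’t describe Moogsoft’s internals, but the raw signals listed above are easy to picture. This hedged sketch – again using the Docker SDK for Python, an assumption rather than a tool the article names – prints the two feeds a container-aware monitor would consume: per-container logs and the daemon’s lifecycle event stream.

```python
# Illustrative only: shows the kind of raw input an early-warning system
# would correlate, not how Incident.Moog actually works.
import docker

client = docker.from_env()

# Recent output logs from each running containerized app.
for container in client.containers.list():
    print("--- logs: %s ---" % container.name)
    print(container.logs(tail=5).decode(errors="replace"))

# Lifecycle events (start, die, oom, ...) from the daemon itself.
# This generator blocks and runs until interrupted.
for event in client.events(decode=True):
    actor_id = event.get("Actor", {}).get("ID", "")[:12]
    print(event.get("Type"), event.get("Action"), actor_id)
```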
Moogsoft started shipping its software in 2012. The company has raised $23 million to date from Wing Venture Capital, Redpoint Ventures, and Cisco Investments, among others.
9:00p
Massive Yahoo Outage Keeps Customers Disconnected from Email for Days 
This article originally appeared at The WHIR
A cut underwater fiber cable has left some Yahoo email customers unable to access their accounts for days. Customers of UK-based Internet providers BT and Sky are affected, since both use Yahoo’s email servers.
According to the BT website, the Yahoo email issue is not yet resolved. Sky updated its site Monday to say it has a temporary fix and “that engineers arrived at the site of the break and have started repair work. We don’t yet have confirmation on when a permanent fix will be in place but we’ll provide further updates soon.”
Consistent with other outages at Comcast, HostGator, and 24/7 Hosting, customers’ biggest complaints on Twitter are about the lack of clear communication by the company. This seems to be a theme with outages in general. Service providers should note that customers like to be updated often, even when there is nothing new to report.
Yahoo’s help site said the cable problem is due to a third party but has not said when it expects the repair to be finished. Metro is reporting that a ship cut through the data cable while fixing a separate pipe nearby. The last update to Yahoo’s Twitter feed regarding the incident was on Friday, and the help site hasn’t been updated since Thursday. Google’s Gmail also experienced an outage in October but updated its customers much more quickly.
Users began reporting login issues on Tuesday, and the Yahoo Twitter feed said problems had been resolved for most customers that same day. Some users are still reporting they can’t access email. Twitter users are using the hashtag #yahoomaildown to express their outrage.
This outage comes shortly after Mozilla dumped its longtime partner Google to take on Yahoo as its default search engine. Yahoo also had outages earlier this month.
This article originally appeared at: http://www.thewhir.com/web-hosting-news/massive-yahoo-outage-keeps-customers-disconnected-email-days
9:30p
Unknown Government Using Advanced Hacking Spyware to Attack Russia and Saudi Arabia 
This article originally appeared at The WHIR
Data collection and spying continue to make news, this time with a never-before-seen piece of complex surveillance software called Regin. A report released Monday by Symantec says, “The level of sophistication and complexity of Regin suggests that the development of this threat could have taken well-resourced teams of developers many months or years to develop and maintain.” This level of investment in software designed to stealthily collect data is indicative of a nation state.
“We are probably looking at some sort of western agency,” Orla Cox, director of security response at Symantec, told the Financial Times. “Sometimes there is virtually nothing left behind – no clues. Sometimes an infection can disappear completely almost as soon as you start looking at it; it’s gone.”
Regin is different from traditional advanced persistent threats (APTs) and trojans. APTs usually collect intellectual property, whereas Regin continuously monitors a targeted organization or individual and collects all kinds of data. What this software can do goes well beyond the malware used in the JP Morgan, Kmart, Dairy Queen, and Home Depot attacks. Payloads include:
- Capturing screenshots
- Taking control of the mouse’s point-and-click functions
- Stealing passwords
- Monitoring network traffic, including traffic to Microsoft Internet Information Services (IIS) web servers
- Gathering information on processes and memory use
- Scanning for and retrieving deleted files
- Collecting administration traffic for mobile telephony base station controllers
- Parsing mail from Exchange databases
Symantec first began to explore this threat in fall 2013, when it found several samples in the wild affecting a variety of targets. Version 1.0 was used from 2008 to 2011; version 2.0 has been used from 2013 onward but may have been deployed earlier.
The software is designed to hide the data it’s stealing, and most of the time that data is not written to disk. A computer can be infected in a variety of ways. Targets may be tricked into visiting a spoofed website where the threat is installed via the browser. At least one infection originated in a Yahoo! instant message. Known infection files are usbclass.sys (version 1.0) and adpu160.sys (version 2.0).
No specific industry is being targeted; the attacks have included several different types of organizations, government systems, and research institutes. Almost half of the targets were small businesses and individuals. Russia and Saudi Arabia were the countries most affected by Regin, at 28 and 24 percent respectively. The U.S. does not appear to be affected.
“We believe Regin is not coming from the usual suspects. We don’t think Regin was made by Russia or China,” Mikko Hypponen, chief research officer at F-Secure, told the Guardian. His company first spied Regin hiding on a Windows server inside a customer’s IT infrastructure in Northern Europe.
This article originally appeared at: http://www.thewhir.com/web-hosting-news/unknown-government-using-advanced-hacking-spyware-attack-russia-saudi-arabia