Data Center Knowledge | News and analysis for the data center industry
 

Thursday, March 26th, 2015

    1:16a
    Report: Suit Claims Facebook Stole Data Center Designs

    A U.K. engineering company has filed a lawsuit against Facebook, accusing the social network of using its proprietary data center designs and promoting their public use through its Open Compute Project, IDG News Service reported.

    BladeRoom Group filed the lawsuit in a federal court in San Jose, California, alleging that Facebook had stolen its intellectual property by using a pre-fabricated-module approach to build its data center in Luleå, Sweden.

    Facebook was very vocal about the data center design and construction methodology it called “Rapid Deployment Data Center.” The idea was that a massive data center could quickly be deployed anywhere in the world if all of its major components were pre-fabricated at a factory and shipped to the site for quick assembly.

    Facebook design engineer Marco Magarelli described the approach in detail in a blog post on the Open Compute Project’s website.

    Emerson Network Power was one of the key suppliers of modules for the Luleå facility. The two companies said last year that Emerson would provide power skids, evaporative cooling air handlers, a water treatment plant, and data center superstructure modules.

    Facebook’s second Luleå data center would be the first site to use the approach.

    BladeRoom started under a different name more than 20 years ago as a provider of modular buildings for commercial kitchens, later adding hospital operating rooms and other complex technical facilities to its portfolio. It rebranded in 2008 and shifted its focus to data centers.

    In 2013, it partnered with U.S. company Modular Power Solutions to form a U.S. subsidiary that would manufacture modules at a factory in Michigan.

    In its lawsuit, BladeRoom claims it approached Facebook’s infrastructure team about using its data center design concept in 2011. The U.K. company said it might have remained silent about “Facebook’s misdeeds” had the social network not started “to encourage and induce others to use BRG’s intellectual property through an initiative created by Facebook called the ‘Open Compute Project’.”

    OCP is a Facebook-led open source data center and hardware design initiative. Its members include companies such as Microsoft, Apple, Goldman Sachs, Fidelity Investments, Capital One, Cisco, and HP, among many others.

    12:00p
    Data Center Tax Bill Goes to Oregon Governor

    A bill that seeks to exempt data centers and telcos from Oregon’s unusual “central assessment” property tax has cleared both houses of the State Legislature and has gone to Governor Kate Brown for signing.

    Democratic Senator Mark Hass, the bill’s main sponsor, told The Oregonian that Brown had personally promised him she would sign it once it made its way through the House of Representatives and the Senate.

    The state established central assessment back in the 1800s, seeking to collect taxes from railroads based on the value of their entire networks rather than just the property they had within state borders, The Oregonian explained. The law was later updated to include telegraph and telephone companies, and again to include microwave towers.

    Some Oregon officials have sought to apply central assessment to cable TV companies and data centers, which are now abundant in the state. Comcast, which has both network assets and data center space in Oregon, has been fighting it in court over the past six years.

    Operators that participate in a data center tax break program have been exempt from central assessment, but that exemption has an expiration date.

    If Brown signs the bill, it will likely mean millions in lost tax revenue for counties.

    Hass and State Representative Mike McLane, a Republican who co-sponsored the bill, have argued that central assessment may have jeopardized whatever future Oregon data center expansion plans companies like Facebook, Apple, and Amazon had.

    The bill’s fate is also likely to affect the future plans of multi-tenant data center providers in the state. If the governor does not sign the bill, companies thinking of leasing data center space in Oregon will have to reevaluate their plans, because the current tax code exposes them to the risk of higher tax liabilities, said Aaron Wangenheim, chief operating officer at T5 Data Centers.

    “We’d obviously have to evaluate how customers are viewing this. Everyone will have to stop and evaluate what is the true impact on their business. We don’t want to be exposing our customers to risk.”

    Oregon is not immune to competition from other states for data center construction projects, Wangenheim said. Companies often evaluate it alongside Washington, Utah, and even Idaho during site selection.

    4:00p
    Government Surveillance Dilemmas Present Challenges for Data Centers

    Since the summer of 2013, when Edward Snowden, a systems administrator and intelligence contractor with insider access, revealed secret government collection of large volumes of citizens’ data (without legal tools such as search warrants), media coverage and discussion of government surveillance operations have been widespread.

    It has been revealed that U.S. law enforcement and spy agencies known by three-letter acronyms (NSA, FBI, CIA, and so on) run enormous data-gathering operations, accessing and analyzing data about phone calls as well as the content of emails, documents, and web visits of U.S. and foreign citizens.

    The Internet infrastructure industry is feeling business and operational impacts, as these headlines have renewed focus on data privacy and surveillance issues.

    How does one sort fact from fiction while reassuring jittery customers and maintaining business relationships? David Snead, co-founder and public policy chair of the Internet Infrastructure Coalition (I2C), will address these issues in a session titled “Surveillance, Privacy and the New Congress” at the spring Data Center World conference, which convenes in Las Vegas April 19-23. The conference’s educational tracks include many topical sessions covering the issues and new technologies facing data center managers, service providers, owners, and operators, including external challenges such as security, privacy, and surveillance in the Internet age.

    Snead said his session will cover “understanding what the debate is about, how the world has responded and how to deal with it.”

    Snead, an attorney, makes presentations globally on behalf of the I2C, an industry group representing the interests of those who build and run the “nuts and bolts” of the Internet. Coalition members include Google, Equinix, and Rackspace, among many others.

    Surveillance Erodes Trust

    “The debate about privacy has been going on for a significant amount of time,” Snead said. However, the NSA-triggered discussion has become a big issue for business, and particularly, for data centers and infrastructure providers, he explained.

    The business relationships have changed. “Customers don’t trust infrastructure providers,” Snead added. “They don’t want to be in a situation where they won’t know that their data is being disclosed. People are getting questions from customers.” Concerned clients are moving their business to providers outside the United States, he added. That is a significant business impact.

    Privacy advocates and businesses already distrusted law enforcement agencies, but now trust in companies in general has eroded. “Government and people are responding in different ways to restore that trust,” Snead said, adding that the I2C is “pushing hard” to create more transparency and improved surveillance laws.

    Current Hot Issues

    Since Snowden’s revelations, there have been no successful changes to U.S. laws or regulations altering how the government conducts its surveillance operations. However, a part of the Patriot Act (Section 215) that is used as legal justification for phone metadata collection comes up for renewal before Congress this summer.

    “This is going to be a huge fight,” Snead said. The coalition believes that the situation needs to be addressed and it will be hotly debated, he added.

    This week, the Obama administration said that if the provision is not renewed, the government will stop the bulk data collection, according to a statement on the Electronic Frontier Foundation website. The EFF also notes there is a possibility that data collection could continue even after Section 215 expires. Snead said continued collection is a slim possibility.

    Other current issues involve the data privacy laws put in place in the European Union, which impact U.S. companies as well as global ones. “Privacy and surveillance are a continuing concern for people around the world,” he added. “Germany is particularly concerned about this issue. Other countries are concerned as well, such as Brazil.”

    The European Union is finalizing a new regulation, the General Data Protection Regulation, which will apply to American businesses operating in any European country and hold them to European standards of privacy protection.

    There are also issues of data localization. New regulations are emerging that require knowing where data resides and where it transits. This presents operational challenges: data centers need to audit their policies and procedures, review IT applications, and know precisely where data lives. For example, if a business requires its data to stay entirely within the United States, Snead said, even a small application or script cannot hit a server in Canada for any reason.

    “Data centers will have to work with customers to ensure that data will not leave this bandwidth, that it won’t transit outside this area,” he said. This brings real operational concerns to infrastructure providers, and calls for working with bandwidth providers and detailing customer requests in contracts.
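    To make the operational point concrete, here is a minimal sketch (not from Snead’s session; the host names, region labels, and mapping are hypothetical) of the kind of residency check a provider could run against its own asset inventory:

        # Hypothetical sketch: flag outbound calls to hosts outside the approved regions.
        # The host-to-region mapping would come from the provider's own asset inventory.
        APPROVED_REGIONS = {"us"}

        HOST_REGIONS = {
            "app-db.example.com": "us",       # hypothetical in-country database
            "metrics.example.ca": "canada",   # hypothetical service hosted abroad
        }

        def assert_in_region(host: str) -> None:
            """Raise if a request would send data outside the approved regions."""
            region = HOST_REGIONS.get(host)
            if region not in APPROVED_REGIONS:
                raise RuntimeError(
                    f"Data residency violation: {host} is mapped to region {region!r}"
                )

        # This call would raise, catching the script before it "hits a server in Canada."
        # assert_in_region("metrics.example.ca")

    Real deployments would pair a check like this with the contract language and bandwidth-provider coordination Snead describes, rather than rely on application code alone.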

    What Can Data Centers Do?

    In terms of building trust with customers, Snead suggests a review of privacy policies or other customer statements and agreements is in order. He adds that being as transparent as possible with customers helps.

    Infrastructure providers would be well served to “set out how information is disclosed, and how they share info (to the extent that you can), have a detailed compliance policy (what happens when the provider is served a warrant or subpoenas and differentiate between the two), and explain how any data request activity is disclosed to customers,” Snead said.

    One challenge with some data requests made by law enforcement is a provision that imposes a “gag order” on the infrastructure provider. “Google and larger companies have an agreement with the Attorney General,” he noted, “that allows them to reveal the number of requests for data that they have responded to, in ‘bands.’ For example, a band is 0-100, in number of requests. Most companies are a lot smaller than Google or Yahoo!. When the bands are larger rather than smaller, customers assume the worst, so if a band is 0 to 100, they assume 100 requests. It would be better to have narrower bands.”
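    As a rough illustration of why band width matters (the band sizes below are arbitrary, not ones from any actual agreement), banded reporting collapses very different request counts into the same disclosure:

        # Illustrative only: map a raw count of government data requests to a disclosure band.
        # A real agreement defines its own ranges; 100 here simply mirrors the 0-100 example above.
        def disclosure_band(request_count: int, width: int = 100) -> str:
            low = (request_count // width) * width
            high = low + width - 1
            return f"{low}-{high}"

        # A provider that answered 2 requests reports the same band as one that answered 95,
        # which is why customers of small providers tend to assume the worst.
        print(disclosure_band(2))    # "0-99"
        print(disclosure_band(95))   # "0-99"
        # Narrower bands give customers more information:
        print(disclosure_band(95, width=25))  # "75-99"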

    What Lies Ahead?

    Snead said the heightened awareness of privacy and surveillance will remain at front of mind for a while. “Requests are coming from all customers,” he noted. “The privacy pendulum is moving towards increased awareness of these issues, rather than away from it.”

    To learn more about privacy, surveillance and its business impacts on data centers, attend Snead’s session at spring Data Center World Global Conference in Las Vegas. Learn more and register at the Data Center World website.

    5:49p
    Analysis: Amazon’s Unlimited Storage No Threat to Cloud Storage Rivals

    Amazon launched unlimited cloud storage plans. The unlimited everything plan is $60 a year, while unlimited photo storage is $12 a year. Fire device owners and Prime members already had unlimited photo storage. Do these new plans spell bad news for the rest of the cloud storage market?

    Pricing for the top storage tier of 1 TB used to run $500 a year. That ceiling has been eliminated and the price slashed. The price cut is drastic, but it doesn’t spell doom and gloom for cloud storage.
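    The scale of the cut is clear from the article’s own figures (no other pricing data is assumed here): even a customer storing just 1 TB now pays 88 percent less than under the old top tier.

        # Back-of-the-envelope arithmetic using only the numbers cited in this piece.
        old_price_1tb_year = 500.0   # former top tier: 1 TB for $500 a year
        new_unlimited_year = 60.0    # new "unlimited everything" plan, per year

        cut = 1 - new_unlimited_year / old_price_1tb_year
        print(f"A 1 TB customer now pays {cut:.0%} less")  # prints: A 1 TB customer now pays 88% less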

    Unlimited cloud storage for consumers is a complement rather than the product itself. Amazon wants consumers using Cloud Drive because it effectively keeps them on the Amazon platform. It makes them more likely to purchase digital music and arguably even physical goods, because that consumer is already inside Amazon’s ecosystem. The move isn’t about making consumer cloud storage a profitable product so much as a play for consumer hearts.

    Google and Amazon have the data center infrastructure scale to make these low price points work. However, Google Drive and Amazon Cloud Drive could be completely free and still benefit the giants in terms of getting people on their platforms.

    There is a cloud storage price war, but of more concern are business-focused storage cost cuts. Google recently fired enterprise cloud storage shots of its own, unveiling cold storage that is a serious competitor to AWS.

    IBM SoftLayer caught up with the other enterprise cloud storage product offerings earlier this week, announcing the addition of block storage and file storage to its services.

    These moves can be seen as a competitive threat to Dropbox and other consumer cloud storage providers, but most of those providers have shifted their paid focus to business offerings. The raw storage itself is a commodity – it’s the services and security around storage that businesses pay for.

    This is visible in Box’s earnings. The focus on earnings calls is business-account progress: investors want to hear the number of paid business accounts, not the number of raw users. Almost all acquisitions in the cloud storage space have gone toward capturing the enterprise market.

    Also of note is the string of consumer cloud storage closures leading up to the creation of Dropbox and others. AOL shuttered its consumer cloud storage service Xdrive in 2008 after deeming it unprofitable – and that was at a much higher price point, with AOL’s scale behind it.

    Microsoft, Google, and Dropbox all offer 1 TB of storage for $10 or less a month. In the consumer space, 1 TB is enough for most users. Some will really test the ceiling with Amazon, but Amazon won’t run out of space, and the very hands-off service is cheap for it to operate.

    These price wars can be devastating for other players, but the consumer space does not dictate the health of the business space. A similar price war occurred in web hosting, when the rise of cheap, unlimited plans let giant web hosts essentially price out the competition. It was another game of unlimited chicken – the majority of users don’t need that much capacity, and the ones that do are better suited to business offerings with bells and whistles.

    The web hosting market has consolidated considerably since and moved away from price as a differentiator. What happened to web hosting is now happening to cloud storage. Amazon isn’t interested in making money on unlimited cloud storage so much as becoming a platform for user content.

    6:11p
    Energy Firm’s Cray System One of World’s Largest Commercial Supercomputer Deployments

    Cray announced it has been awarded a contract for a 5-petaflop supercomputer by Norwegian oil-and-gas services company Petroleum Geo-Services (PGS).

    One of the fastest supercomputers deployed by a private-sector organization and one of the vendor’s largest commercial sales ever, the PGS system comprises a Cray XC40 supercomputer and a Cray Sonexion 2000 storage system. PGS will use it to produce high-resolution seismic maps and three-dimensional models of the earth’s sub-surface, which oil-and-gas companies use to explore and produce offshore reserves worldwide, according to Cray.

    The Top500 list of the world’s fastest supercomputers is dominated by public-sector entities. Cray noted that the PGS system will be among the largest commercial supercomputers deployed. At 5 petaflops, the new system would likely rank among the top 10 supercomputers in the world, three of which are currently Cray installations.

    Launched late last year, the XC40 focuses on driving down cost for I/O-intensive applications and includes Cray’s DataWarp I/O accelerator technology. The system is expandable to 100 petaflops and is loaded with the latest Intel Xeon processors, Intel Xeon Phi coprocessors, Cray’s Aries system interconnect, and a number of other HPC-optimized features. The companion scale-out Sonexion 2000 storage system can scale to more than 2 petabytes in a single rack and delivers 1.7 terabytes per second in a single file system.

    “With the Cray supercomputer, our imaging capabilities will leapfrog to a whole new level,” Guillaume Cambois, executive vice president of imaging and engineering at PGS, said in a statement. “We are using this technology investment to secure our market lead in broadband imaging and position ourselves for the future.”

    6:38p
    Bye Parallels, Hello Odin: Parallels Renames Service and Hosting Provider Unit Odin


    This article originally appeared at The WHIR

    Parallels has split its service and hosting provider business from its cross-platform solutions unit, renaming its service provider business Odin as part of its re-commitment to the hosting industry.

    In an interview with the WHIR at WHD.global 2015 on Wednesday, Odin CEO Birger Steen said the move was a natural next step as the company doubles down on investments in containers, WordPress and other services of interest to hosting providers.

    “Our hosting and service provider business used to be known as SWsoft until 2008 when we changed names to Parallels because we had acquired the Parallels software business,” Steen said. “What we realized is we’re actually in two great businesses, and there is very little overlap or synergy between them.”

    Steen said there are about four million users of its Parallels Desktop and Parallels Access products, and 10,000 service providers using Parallels Plesk and Virtuozzo.

    “It wasn’t very hard to choose which one was the Parallels side because the first Parallels product was Parallels Desktop for Mac which is a tremendously successful product,” he said. “And a very strong mass market appeal and roots for that brand, particularly in the Apple community.”

    Steen said that Odin represents “unity, strength and wisdom, and is certainly memorable. We’re very happy with the choice.”

    So far, Steen said that he has received positive feedback from the hosting and cloud companies in attendance at WHD.global, where Odin has revealed its new branding.

    “The reception has been great,” he said. “People like the simplicity and the symbolism, they like not having to figure out if Parallels is spelled with two, three or four Ls.”

    Steen said that while the name is different, Odin is as committed to the service provider business as Parallels has been.

    “In a sense the priorities and the choices we made over the last year or two in that business are priorities we’re going to continue to execute and move forward with,” he said.

    “What’s important there is we’ve reaffirmed our commitment to the hosting community with Plesk 11 and Plesk 12. I think we showed that not only are we investing in control panels but we’re also innovating the category.”

    “With Plesk 12 we’re going to be much more value-driven and focused on the user needs by saying, ‘here’s what you need if you’re a small business user, here’s what you need if you’re a web professional who sees the control panel as a tool for making money from web design and limited scale hosting, here’s how you will use it as a small hoster and here’s how you’ll use it as an app developer’,” he said.

    Steen also said that Odin has brought the Virtuozzo brand back to the forefront.

    “Virtuozzo was a container-based virtualization technology and it was developed specifically for hosters to run virtual private server as an upgrade to shared hosting,” Steen said. “We’ve been carrying the torch for the container technology per se for the last 15 years, the last five years or so you’ve seen people like Google and Facebook and others starting to use it internally but the last year to 18 months has really exploded.”

    “People are starting to see that virtual machines are good if you’re enterprise IT and if you have a bunch of server sprawl that you want to consolidate but if you want to run an agile and a fast-paced dev-ops infrastructure what you need to do is containers,” he said.

    This article originally appeared at http://www.thewhir.com/web-hosting-news/bye-parallels-hello-odin-parallels-renames-service-hosting-provider-unit-odin

    7:07p
    Microsoft Aims New Azure App Service at Developer Hearts

    Microsoft this week unveiled Azure App Service, a collection of backend web and mobile application creation services that bundles existing services such as Azure Websites (now dubbed Web Apps), Azure Mobile Services, and Azure BizTalk Services. Also included are additional features such as Logic Apps and a service for building APIs.

    The bundled Azure App Service will cost the same as the standalone Azure Websites service.

    This is Microsoft’s bid to entice developers, with the goal of becoming the platform for the new breed of apps. By bundling existing app development services, it lets developers build applications that scale, support multiple platforms, and connect to other data feeds.

    “App Service is a new, one-of-a-kind cloud service that enables developers to build web and mobile apps for any platform and any device,” wrote Bill Staples, corporate vice president, App Platform, Microsoft Azure. “App Service is an integrated solution that streamlines development while enabling easy integration with on-premises and SaaS systems while providing the ability to quickly automate business processes.”

    The service is specifically for writing apps on Azure, though it can hook into and consume data and services from other clouds and applications with public APIs.

    Web Apps allows hosting of websites, web apps, and APIs in the cloud and supports multiple languages and frameworks. It features enterprise capabilities like hybrid connectivity and Active Directory integration.
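    As a rough illustration of the kind of workload Web Apps hosts (the framework choice and code below are an assumption for this sketch, not part of Microsoft’s announcement), a minimal Python web app might look like this:

        # Minimal sketch of a web app of the sort a service like Azure Web Apps can host.
        # Flask is just one of many supported frameworks; this is not an official sample.
        from flask import Flask

        app = Flask(__name__)

        @app.route("/")
        def index() -> str:
            return "Hello from a web app that could run on Azure Web Apps"

        if __name__ == "__main__":
            # Locally this starts a development server; on a hosting platform the app
            # would normally sit behind the platform's own web server instead.
            app.run(host="0.0.0.0", port=8000)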

    Mobile Apps is a mobile application development platform that provides a comprehensive set of SDKs for developing on multiple phone platforms and makes it easy to build mobile app-type functionality such as push notifications.

    The newest addition, Logic Apps, is a service that uses a visual designer to build complex, automated business processes and workflows. These apps can then plug into publicly available programming interfaces; Microsoft has built in dozens of connectors to SaaS applications like Dropbox, Twitter, and Office 365.

    API Apps provides versioning capabilities for APIs, making it easy to manage new versions and updates in real time.

    Those currently using one of the formerly separate services will migrate over to the new suite. Azure App Service is backed by a common development, management and billing model.

    Backing the new service is the Azure cloud. Azure runs in 19 data center regions around the world with multiple copies of apps running simultaneously.

    The company also announced a student development program that provides high school students with free access to Azure.

