Data Center Knowledge | News and analysis for the data center industry
 

Wednesday, March 25th, 2015

    Time Event
    12:00p
    The Allure of Singapore, the World’s Second Gateway to China

    It’s difficult to accuse George Slessman of subtlety. Bold statements and bold actions make up the style of Phoenix-based data center provider IO’s founder and CEO.

    IO began as a fairly standard colo company but very quickly differentiated itself by putting an unusual spin on the colocation model: it would provide colo space inside shipping-container-like modules it designed and manufactured at its own factory outside of Phoenix. A customer could also deploy modules in a location of their choosing, with IO providing management services. Finally, customers could simply buy an IO module outright.

    It was a colo with a highly differentiated business model that never stopped evolving.

    IO’s bold strategic evolution has always been accompanied by its CEO’s equally bold public statements. During a presentation at an industry symposium in 2011, Slessman said that no “snowflake” data center would be built after May 2013. By “snowflake” he meant custom non-standardized data center facilities.

    The rather specific prediction was obviously meant to raise eyebrows and help convince the industry that IO’s modules were indeed a one-size-fits-all data center solution. Snowflake data centers have not gone away, of course. Nobody could accurately predict in 2011 whether they would, and nobody can today, but the statement served its purpose. The message was out.

    Following ‘the Center of the Internet’

    His most recent bold statement has to do with Asia. Chinese e-commerce giant Alibaba’s IPO in September – reportedly the largest tech IPO ever – marked the moment when “the center of the Internet moved to China,” Slessman told Data Center Knowledge in an interview.

    That bold statement, as customary, was accompanied by a bold move. Earlier this year, Slessman and his wife changed their place of permanent residence from Phoenix to Singapore, and his belief that the center of the Internet was now in China had everything to do with it.

    The idea to move to the Southeast Asian city-state came around the time IO launched its Singapore data center in September 2013. “I just became increasingly intrigued with the [Singapore] market in general,” he recalls.

    George Slessman, founder and CEO of IO

    The hundreds of millions of people in Southeast Asia and the region’s rapid growth in Internet users remind him of the U.S. a decade or so ago. Alibaba’s blockbuster IPO was a sign that Asian markets were just starting to mature. And, compared to major U.S. metros, the Singapore data center market was fairly open. There are data center providers, but not nearly as many as in mature North American or European markets.

    “There just aren’t a lot of international companies that want to take the time to understand the local geography and market,” Slessman says. He thinks Singapore will be the largest growth market for IO in the next five years. The company entered the New York market about two years ago, and is now also preparing to launch in Slough, U.K.

    The Second Gateway to China

    Singapore is one of Southeast Asia’s more mature data center markets, Jabez Tan, senior analyst at Structure Research, says. Telcos dominate data center markets in other parts of the region, while Singapore has a good mix of both telcos and data center specialists.

    The primary reason the small island nation has such an active data center market is that it has become an Internet gateway between China and the rest of the world, Tan explains. Now on its way to reaching a gateway status that’s on par with Hong Kong, Singapore is where international companies go to serve customers in China, and where Chinese companies go to serve customers in Europe or North America.

    Demand is Growing, but so is Supply

    Demand for space in the Singapore data center market comes from companies that already have a presence there and want to expand, as well as from newcomers. “You’re primarily seeing [demand from] cloud and IT service providers,” Tan says. Just this month, CenturyLink announced the launch of a Singapore location for its cloud services, and so did Fujitsu.

    There is also quite a bit of demand from financial services companies. One example was Goldman Sachs, which was the anchor tenant at IO’s Singapore data center.

    (Watch a video of a motorcade consisting of Goldman’s data center modules and local police on motorcycles rolling through Singapore streets.)

    Major international players in the Singapore data center market include Silicon Valley giants Equinix and Digital Realty, U.K.’s Global Switch, Japan’s NTT, and New Zealand’s OneNet. Local companies with substantial presence there include Keppel and ST Telemedia.

    There is also a new local provider called Rack Central that’s “trying to enter Singapore in a big way,” Tan says. Rack Central is an example of a company using the island nation as a gateway to China. It is planning two locations in Singapore and one in China.

    Equinix announced the launch of its third Singapore data center this month, its largest in Asia Pacific, and more than a handful of other existing players are looking to expand over the next several years. They include OneNet, Global Switch, Telecom Indonesia, and StarHub, an IO partner. Not all of this capacity is going to come online this year. It will happen over a period of three years or so, Tan says. “I don’t think you’re going to see a supply glut in Singapore.”

    New Market, New Focus

    Singapore being such a rapidly growing market, Slessman felt it was important for him to be there physically. His vision extends beyond Singapore alone. “We are going to be expanding significantly in the Asian market,” he says. “It’s important to us to have that focus and the clarity of having the senior leadership team there.”

    Now that IO has been split into two companies – IO, the colocation provider, and BaseLayer, the maker of data center modules and management software led by Slessman’s brother William – Slessman can focus on growing the colocation business in one of Asia’s hottest markets. Giving senior leadership the ability to focus on their respective missions was the primary reason given for the split.

    But Slessman also simply likes Singapore as a place to live. “It’s the safest country in the world,” he says. “It’s a wonderfully organized, clean, safe place.”

    3:30p
    Video: The Calibrated Data Center

    Chris Crosby is CEO of Compass Datacenters.

    The issue with many new “next big things” is that they tend to skip one or more essential steps. In this brief video, Compass Datacenters’ CEO Chris Crosby explains why calibrating your data center is the essential step required to accurately measure and model data center performance, and how it provides the necessary bridge to new capabilities like the Software Defined Data Center.


    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    3:40p
    Snowden Urges Cloud Providers to Take Action Against Mass Surveillance


    This article originally appeared at The WHIR

    If you attended the WHD.global 2015 keynote with former NSA contractor and whistleblower Edward Snowden on Wednesday morning, it is very likely that you were being watched.

    “Unfortunately in many ways I am the X,” Snowden told the packed conference room at WHD.global. “I expect and accept that at this point I’m going to be scrutinized by every government and every bad actor in the world.”

    “In a real way it’s interesting because it means that there are records that are being made right now of you, the people who are here. Not because you’re a target, but because you are associated.”

    Snowden spoke to an audience of cloud and hosting service providers via a live video stream, and was interviewed by Wikileaks journalist Sarah Harrison, who attended the event in-person. Harrison escorted Snowden to safety in Russia two years ago. She currently lives in Berlin.

    Speaking from her experience as a journalist, Harrison started the presentation by discussing how journalists’ attitudes towards security have changed in the wake of Snowden’s revelations.

    “Unlike most other media, one of the first things I learned when I joined my job [at WikiLeaks] was security,” she said.

    Prior to 2013, when Snowden’s first revelations about government mass surveillance came out, Harrison said journalists were not comfortable using encryption, and often told her that she must be hiding something illegal, like child porn, if she was using Tor. Things have come a long way since then.

    “I can’t tell you how annoying it was to tell journalists that encryption was necessary, especially in the West…there has been quite a drastic change,” Harrison said.

    That drastic change hasn’t just applied to journalists, as online users and technology companies around the world have had to face an entirely new security landscape brought forth by Snowden’s revelations about mass surveillance.

    “When we talk about journalists and progress we kind of zoom out from that 2013 moment. The NSA revelations changed the fabric of the Internet,” Snowden said.

    Use of Encryption is Growing

    Snowden said that the amount of encrypted traffic has more than doubled since 2013, and that a lot of work on encryption is happening in academia and at technology companies.

    The type of security actions a person or organization might take “ultimately depends on what security specialists call a threat model,” Snowden said. “You need to think what the likely vectors are for attack.”

    When Harrison mentioned that more journalists were clearing their browser histories, Snowden said that “as a basic practice, clearing your browser history is great…however that’s not really how surveillance works.”

    “You have routes across the Internet between them; that is where the majority of surveillance happens online today,” he said. “Your cookies can flag you, however your IP address, your email address, all of that is visible when it crosses the wire, particularly when it’s not encrypted.”

    So, can we trust encryption? Snowden said that one of the biggest and most important steps is removing the NSA from the standards process, since as it stands the agency is able to influence existing standards.

    “We need to have community standards that are internationally selected,” he said.

    Service providers who rely on open source standards should set aside part of their budgets to help fund the projects their infrastructure relies on.

    “The problem is this is developed by volunteers. While it is open and it can be reviewed, if we are not funding review, [we] will miss critical issues,” Snowden said.

    “We need greater adoption, but we also need companies to look at where they are relying, where we can kick them some tiny amount of money each year,” he said. “It’s a positive PR hit and is just common sense when you’re relying on this common infrastructure.”

    Cloud Service Providers and the Reputation Risk

    Protecting users’ privacy is not just the right thing to do, it is also a smart business move.

    “The most important thing that you need to think about are the promises you’ve made to your users and how they would expect you to operate,” Snowden said. Making headlines as a company that has been the target of NSA attacks has a negative effect on a service provider’s business.

    “You want to be the guy that the users, that the customers trust,” Snowden said. “You have the opportunity today to have the trusted service provider relationship with your customers simply by changing your policies,” and only retaining information necessary to your business, he said.

    “So when people come knocking, people are reminded that investigations are their job, not yours.”

    Companies like Gemalto or Cisco, who have been targeted by GCHQ and NSA, respectively, “do have a legal cause of action here” but haven’t moved on it because “they’re trying to stay out of headlines.”

    Service providers that make it clear to their customers that they will protect their interests, even if it means taking legal action, are “the ones that are going to be successful.”

    “Make a commitment to your users when they say, ‘if it is shown that someone has attacked our networks, particularly government, we will litigate this’,” Snowden said.

    In the case of Cisco, which Snowden elaborated on, he said the company could create methods for verifying that upon receipt of shipment the same code is received as was sent. It could set up a trace, physically on hardware or on software, that indicates a change has been made. These methods don’t have to cost a lot of money either, he said.

    Snowden said that service providers who take trust in their brand “out of the equation” and instead design their systems and products in such a way that even their “worst, most cutthroat competitors will trust” will see real success and be thought-leaders in security and privacy.

    Can Service Providers Stand Up to the US Government?

    “What do you do when the most powerful government in the world shows up at your doorstep and tells you to change your business process?” Snowden asked.

    He said that service providers should cooperate with the government, but make it clear that they will not change their operations entirely to comply with their requests.

    Snowden’s own email provider, Lavabit, went out of business after it refused to comply with the US government’s request. “The FBI demanded that [Lavabit] provide access to Snowden’s personal email, and everyone within their service,” he said. He called this a “teachable moment” for service providers.

    He said companies could also set up a global presence in order to house data through wholly owned subsidiaries. He likens the approach to “the incredible extent that some enterprises have gone to avoid tax liability.”

    With multiple subsidiaries, service providers will be able to better protect themselves through being under different jurisdictions.

    “When you limit your liability, you’re limiting your vulnerability. For business, that’s really important,” Snowden said.




    Snowden’s Take on Cloud Security

    The cloud is vulnerable to a number of different security issues, but the “idea here is you want to have a number of options and make use of all of them,” Snowden said.

    “We have companies like Dropbox who you can’t trust, because they are actively hiring people that worked on warrantless wiretaps,” he said.

    Being aware of the low hanging fruit and the weaknesses in your network can lead to higher security. Still, “we don’t want the technical community to be dictating the way society works,” Snowden said. “When we think about free and open liberal democracies” we think about debate and participatory government.

    Technology companies should be spending some of their budgets on lobbying changes to existing legal frameworks.

    “Just because the NSA can find their way in your smartphone doesn’t mean we should throw our hands up or give up entirely,” Snowden said. “We don’t want to lock law enforcement agencies out of everyone’s computers around the world entirely – but at the same time we don’t want them to be able to look at everybody for no cause.”

    This article originally appeared at http://www.thewhir.com/web-hosting-news/snowden-urges-cloud-providers-take-action-mass-surveillance

    4:52p
    Apple Acquires NoSQL Database Startup FoundationDB

    Apple has acquired FoundationDB, a Virginia-based startup developing a NoSQL database, TechCrunch reported. The reasons behind the acquisition aren’t immediately apparent. What is apparent is the growing role databases play in modern cloud services.

    FoundationDB specializes in high-speed ACID-compliant transactions. “ACID” stands for Atomicity, Consistency, Isolation, Durability. It provides NoSQL and SQL access and is multi-model, meaning many types of data can be stored in a single database. It uses a distributed architecture that scales up and down and handles faults, while looking and acting like a single ACID database.
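
    The article doesn’t detail FoundationDB’s API, but the key idea behind ACID-compliant transactions over a key-value store can be sketched in a few lines. The store and all names below are hypothetical stand-ins, not FoundationDB’s actual interface:

```python
# Toy key-value store with atomic transactions. This is an illustrative
# sketch of the ACID "atomicity" property, NOT the FoundationDB API;
# all names here are hypothetical.

class ToyKVStore:
    def __init__(self):
        self.data = {}

    def transact(self, fn):
        """Run fn against a write buffer; apply all of its writes or none."""
        buffer = {}
        get = lambda key: buffer[key] if key in buffer else self.data.get(key)
        put = lambda key, value: buffer.__setitem__(key, value)
        try:
            fn(get, put)
        except Exception:
            return False           # failure discards the buffer: no partial writes
        self.data.update(buffer)   # commit: all writes become visible together
        return True

store = ToyKVStore()
store.data = {"alice": 100, "bob": 0}

def transfer(get, put):
    # Debit one account and credit another; both writes commit together.
    put("alice", get("alice") - 10)
    put("bob", get("bob") + 10)

committed = store.transact(transfer)   # True; balances are now 90 and 10
```

    The point of the sketch is the all-or-nothing guarantee: if any step of the transaction fails, no write is applied, which is what distinguishes a transactional store from a plain key-value cache.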

    These capabilities make FoundationDB suitable for web apps, and Apple could potentially use it with some of its services, such as iMessage. Apple is not discussing its plans publicly.

    FoundationDB has flown under the radar, though the company has raised a few rounds totaling over $20 million. Founded in 2009, the company set out to address the lack of transactional NoSQL database systems.

    The startup is no longer offering downloads of its software, meaning potential headaches for early adopters. A commercial version of its database was released in 2013.

    “FoundationDB has been popular by its capability to store any object given its key value store and then making it easy to access for mainstream DB developers with its SQL interface,” Holger Mueller, vice president and principal analyst at Constellation Research, said in an email. “It’s sad for the DB community that all external activity of FoundationDB will cease and to a certain point a mystery what Apple will be doing with the assets, but we expect an internal usage of the assets.”

    Apple has acquired 23 companies in the last 15 months.

    5:47p
    White House Poaches Facebook Engineering Director to Run its Own IT

    Further building out his U.S. Digital Service team, President Barack Obama appointed former Facebook Engineering Director David Recordon to the newly created position of Director of White House Information Technology. Recordon will be responsible for “modernizing the White House’s own technology,” the administration’s announcement explained.

    On a mission to optimize service delivery by the White House IT infrastructure, Recordon will build on past efforts to converge overlapping systems, modernize collaboration software, and bring use of new technologies in line with private-sector best practices. The presidential memorandum establishing the new position gives it the “authority to establish and coordinate the necessary policies and procedures for operating and maintaining the information resources and information systems provided to the president, vice president, and EOP (Executive Office of the President).”

    The USDS team was launched last year. It is tasked with building effective, modern digital services and optimizing public interactions with the various government digital channels. The team engages engineers, designers, and product managers to take on government IT challenges.

    “In our continued efforts to serve our citizens better, we’re bringing in top tech leaders to support our teams across the federal government,” Obama said in a statement. Recordon’s “considerable private sector experience and ability to deploy the latest collaborative and communication technologies will be a great asset to our work on behalf of the American people.”

    Recordon was a consultant to USDS for about one year prior to his recent appointment, according to his LinkedIn profile.

    At Facebook Recordon developed internal productivity tools and led open source, engineering education, and other teams. He was also a founding member of the OpenID Foundation, which promotes a single open standard for online authentication provided by third parties.

    The Obama administration has heavily recruited Silicon Valley talent, as it builds technology expertise in federal teams. Former Google X vice president Megan Smith was appointed U.S. CTO last year. Former VMware CIO Tony Scott was appointed as U.S. CIO earlier this year, and DJ Patil, whose resume includes senior roles at eBay and LinkedIn, was named the first U.S. Chief Data Scientist.

    10:52p
    IBM SoftLayer Adds Enterprise Cloud Storage Options

    IBM SoftLayer has expanded the variety of enterprise cloud storage flavors available to its customers, adding block and file storage services for bare-metal and virtual cloud servers. Until now, the company offered object storage, customizable SAN- or NAS-like mass storage servers, and cloud backup.

    Object storage is primarily used for data like emails, images, and other media files: objects that remain unchanged and can be accessed quickly by applications. Block storage volumes, which operating systems treat as individual drives, are better suited to high-performance applications that use dynamic data sets.
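
    The difference in access patterns can be sketched in a few lines. The “volume” and “bucket” below are simulated stand-ins for illustration, not SoftLayer’s API:

```python
import io

# Block-style update: the OS can seek to an offset and overwrite bytes in
# place, which suits dynamic data sets. (A BytesIO stands in for a raw volume.)
volume = io.BytesIO(b"\x00" * 4096)
volume.seek(512)                 # jump straight to byte offset 512
volume.write(b"updated-record")  # rewrite only those 14 bytes

# Object-style update: objects are typically replaced as a whole, so even a
# small change means re-uploading the entire object. (A dict stands in for
# an object store; keys play the role of object names.)
bucket = {"report.csv": b"old contents"}
obj = bucket["report.csv"]                          # GET the whole object
bucket["report.csv"] = obj.replace(b"old", b"new")  # PUT the whole object back
```

    For a small change inside a large object, the object-store path rewrites everything, which is one reason frequently changing data sets favor block volumes.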

    SoftLayer is offering two service tiers for each of the new enterprise cloud storage services: the high-durability Endurance tier and the maximum-IOPS Performance tier. Endurance starts at $0.15 per gigabyte. Performance starts at $0.10 per gigabyte plus network charges.
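
    At those rates, comparing the tiers is simple arithmetic. The 500 GB volume size and the assumption of a flat per-gigabyte charge below are illustrative, not SoftLayer’s actual billing terms:

```python
# Back-of-the-envelope tier comparison at the listed per-gigabyte rates.
# Billing granularity and network charges are assumed, not SoftLayer's terms.
ENDURANCE_RATE = 0.15    # USD per GB, high-durability Endurance tier
PERFORMANCE_RATE = 0.10  # USD per GB, maximum-IOPS tier (plus network charges)

volume_gb = 500  # hypothetical volume size
endurance_cost = volume_gb * ENDURANCE_RATE      # 75.0 USD
performance_cost = volume_gb * PERFORMANCE_RATE  # 50.0 USD, before network fees
```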

    SoftLayer CTO Marc Jones said the changes to the company’s cloud storage portfolio were meant to bring it in line with the diversity of workloads being deployed in the cloud today and their many unique storage needs. “When an application or data set is mission critical, it is important to be able to control as many dimensions of its storage as possible,” he said in a statement.

    Block storage is a well-established product in the enterprise cloud storage market, available from SoftLayer’s major rivals.

    Amazon Web Services has had its Elastic Block Store service since 2008. Google announced its Compute Engine Persistent Disk service at the same time it announced Compute Engine in 2013.

    Microsoft Azure does not offer block storage as such. It has its own approach to cloud storage, consisting of Blobs (for unstructured data), Tables (for structured NoSQL data), Queues (for messaging), and file storage.

    11:16p
    Software Defined Storage Startup Hedvig Raises $12.5M

    After three years of growing funding, software-defined storage company Hedvig wants to change the way enterprises approach and architect storage. The storage startup plans to spend the $12.5 million recently raised from investors that include Atlantic Bridge Capital, True Ventures, and Redpoint Ventures on further development of its distributed storage platform.

    The new platform lets cloud and data center professionals consolidate storage of any type, in any location (on-premises, and in private or public clouds), into a virtualized pool.

    It also enables complete protocol consolidation by collapsing several layers of the storage stack into a single software platform, resulting in faster provisioning time, lower costs and a more flexible storage picture. The company claims it provides the ability to scale from several terabytes to petabytes.

    The storage world continues to undergo a revolution with many startups looking to challenge legacy architectures. Software-defined storage startups are prepping platforms for next-gen storage. Data Center Knowledge recently looked at software-defined storage at a high level.

    Vendors like Hedvig aim to help enterprises avoid constant hardware refreshes by making commodity storage more future-proof through pooled resources. Software-defined storage in general means the answer for enterprises isn’t necessarily a hardware refresh, but rethinking storage as a distributed system and pooling commodity storage together with software. As a distributed system, it lets organizations expand capabilities as needs evolve without refreshing or resetting their storage architecture.

    “Storage platforms that were built just five years ago cannot handle modern workloads which are complex, heterogeneous and spread across internal data centers and the cloud,” said Brian Long, general partner at Atlantic Bridge in a press release. “To keep up with this rapid pace of growth and change, storage platforms must now be software defined. They must also be scalable, streamlined and most importantly, flexible enough to make instant provisioning changes.”

    Hedvig wanted to announce its platform earlier but decided to wait until the product was proven and being used by customers.

    “We considered launching the company at the seed round stage more than a year and a half ago, but many industry experts did not believe what we were doing could be done so it didn’t make sense to publicize our company until our platform was proven and in production at several customer sites,” said Hedvig founder and CEO Avinash Lakshman in a press release.

    Lakshman said that the Hedvig platform looks like infrastructure that Google, Amazon and Facebook run internally but is packaged in a way that makes it accessible to any enterprise.

    The big web-scale companies continue to lead the way in re-architecting IT, with many startups attempting to bring their innovations to the business world. Lakshman built distributed computing systems at Facebook and Amazon, including the Cassandra and Dynamo platforms. Hedvig takes that distributed systems approach and applies it to storage.
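
    The Dynamo and Cassandra lineage hints at how such pooling can work: data is spread across nodes by hashing keys, so any client can locate data without a central lookup. The sketch below illustrates that idea only; it is not Hedvig’s actual placement algorithm, and the node names and keys are hypothetical:

```python
import hashlib

# Illustrative sketch of distributing data across a storage pool by hashing,
# in the spirit of Dynamo/Cassandra-style placement. NOT Hedvig's actual
# algorithm; node names and keys are hypothetical.

def place(key: str, nodes: list) -> str:
    """Deterministically map a key to one node in the pool."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return nodes[int(digest, 16) % len(nodes)]

pool = ["node-a", "node-b", "node-c"]
blocks = ["vol1/blk0", "vol1/blk1", "vol2/blk0", "vol2/blk1"]
placement = {b: place(b, pool) for b in blocks}
# Each block lands on exactly one node, and the same key always maps to the
# same node, so reads can find data without consulting a central directory.
```

    A production system would use consistent hashing or virtual nodes so that adding a node moves only a fraction of the data, rather than reshuffling everything as plain modulo placement does.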

    “I think we can all agree that software-defined storage is the future, but Hedvig tackles the problem from a very unique, distributed systems approach,” said Puneet Agarwal, Partner at True Ventures in a press release. “We look forward to seeing the company grow and pave the future of modern storage.”

    Forrester Research believes the software-defined storage market will reach $13 billion by 2020. Four out of five enterprises today are looking at SDS.

    Other startups with software-defined storage platforms include Springpath, a company founded by VMware vets that recently raised $34 million, and Intel Capital-backed Maxta, which provides a storage platform that turns standard servers into a converged compute and storage solution for virtualized environments.

    EMC and IBM are investing heavily in software-defined storage, though these giants still often couple the platform with specific hardware. IBM recently committed to investing over $1 billion on software-defined storage. HP bundles SDS capabilities with servers.

    There are also open source approaches. SwiftStack raised $16 million for open source software-defined storage. Red Hat has made a big play, with SDS acquisitions such as Ceph specialist Inktank. Finally, Linux distro provider SUSE recently released SUSE Enterprise Storage. Based on open source Ceph, it is a distributed software-based storage solution for enterprise customers that leverages commodity off-the-shelf servers and disk drives.

