Data Center Knowledge | News and analysis for the data center industry
 

Thursday, December 17th, 2015

    Time Event
    1:00p
    Sentinel to Bring Direct Cloud Connectivity to Its Data Centers

    Sentinel Data Centers, a New York-based data center provider that caters to large enterprises for whom it builds custom high-capacity data halls in its large East Coast facilities, is planning to start offering customers direct private network links to large public cloud providers, such as Amazon Web Services and Microsoft Azure.

    Through a partner, Sentinel will be providing “capabilities to direct-connect to basically all of the major cloud providers,” Todd Aaron, the company’s co-president, said in an interview.

    Large enterprises prefer to access cloud services over private network connections rather than the public internet, for both security and performance reasons. The ability to plug directly into AWS, Azure, IBM SoftLayer, or Google Cloud Platform is growing in importance among the factors customers weigh when selecting a data center provider, and many providers have been expanding cloud connectivity options in their facilities around the world.

    Many decisions companies make about their IT infrastructure in general today are made “under the shadow of cloud” if not about cloud directly, Bob Gill, a Gartner analyst who covers the colocation space, said in a presentation at the market research firm’s data center conference in Las Vegas earlier this month. “Cloud and colo are natural-born allies,” he said.

    While enterprises can use network carriers to connect privately to cloud services from their on-premises data centers, it makes much more sense to take space in a colocation facility where they can get direct access to multiple cloud providers at once, Gill said. Large enterprises rarely use only one cloud provider, so having access to a variety of providers is important to them.

    For Sentinel, the decision to build a portal for direct connectivity to cloud providers was driven by both existing and prospective customers, Aaron said. Having the option is appealing even to companies that aren’t currently using public cloud services, he said.

    According to him, almost all Sentinel customers are Fortune 500 companies, including financial services firms, pharmaceutical and healthcare companies, as well as large IT service providers. Sentinel has also built a custom data center for Bloomberg in Orangetown, New York, together with Russo Development.

    The company will be using a partner to set up the cloud portal, but Aaron did not disclose who the partner would be. Cloud connectivity will be available at Sentinel’s multi-tenant data center sites in Durham, North Carolina, and Somerset, New Jersey.

    The cost of using data center services from providers like Sentinel in North Carolina recently improved, as the state lowered the investment threshold for qualifying for data center tax breaks that massive data center operators like Facebook and Apple have been enjoying there. Sentinel tenants will be able to take advantage of sales and property tax breaks on IT equipment and sales tax breaks on energy purchases, Aaron said.

    The new tax breaks will make North Carolina more competitive as a data center location against other states, especially neighboring Virginia, which has cultivated one of the world’s largest and most active data center markets. Considering energy rates and tax incentives, total cost of ownership in North Carolina can now be up to 15 percent lower than in Virginia, Aaron said.

    4:00p
    eBay May Build Reno Data Center Beside Leased Space at Switch SuperNap

    eBay is considering a $230 million investment in its own data center build near Reno, Nevada, in addition to the space its servers will occupy in the multi-tenant data center Las Vegas-based Switch is building in the area.

    “This potential expansion is separate from eBay’s decision to be an anchor tenant at Switch’s new facility in the Reno area,” company spokesman Ryan Moore wrote in an email. “We are also considering building an eBay-owned facility within that same Switch campus.”

    Like other major web-scale data center operators, the online auction giant, which recently separated from its former sister company PayPal, uses a mix of leased and owned data centers. eBay occupies a huge amount of space across several buildings on Switch’s SuperNap data center campus in Las Vegas, but it has also designed and built its own data centers in Utah and Arizona.

    Switch will be involved in the eBay-owned data center build in Reno if the internet company goes ahead with the plan. The data center provider will essentially build something similar to a powered shell on its campus, which eBay will fill with its own infrastructure, Switch CEO Rob Roy said.

    Rendering of the planned Switch Tahoe Reno SuperNap data center campus (Image: Switch)

    Switch expects to invest about $1 billion in its SuperNap Tahoe Reno campus. The company said it will be the world’s largest data center, providing 150 MW of power and 82,000 tons of cooling capacity.

    Until recently Reno was known primarily for its casinos – a smaller-scale version of Las Vegas – but over the past two years the city has attracted some big-name technology companies. With Switch building a massive multi-tenant data center campus there, more high-tech firms are likely to come to town.

    The Switch campus neighbors a construction site for a massive Tesla battery plant, which will reportedly also include a data center. Also in the vicinity is a quickly growing Apple data center campus.

    4:30p
    Why No One is Using Your API

    Deepak Singh is the president and CTO of Adeptia.

    Application Programming Interfaces (APIs), which are a natural evolution of the Service-Oriented Architecture of the early 2000s, are all the rage in connectivity and integration. Despite being quite useful for providing data access to mobile apps and creating layers of services that abstract underlying data sets, APIs do not seem to work well in the business-to-business integration (B2Bi) area, where companies create automated data connections with their customers, partners, and suppliers to exchange business information.

    A common refrain from clients and prospects is the frustrated, “No one is using our APIs!” These companies say that they created their APIs with a lot of fanfare – sometimes spending more than a million dollars to analyze, implement, document, and publish them – and then found that hardly any companies were connecting to them. As a result, they sometimes added more features (better security or better documentation) in an attempt to make their APIs more attractive, but even those enhancements didn’t seem to help the offering gain much traction. In other words, the “if we build it, they will come” approach does not work for APIs. The reason is that APIs were not the right choice in the first place. Here are the five main reasons why publishing APIs to create inter-company connections does not work well as a singular approach:

    APIs Are Hard To Use Because They Are For Developers

    Many companies don’t quite understand how much they’re asking of the companies they hope will connect with their API. Consuming an API is a difficult task that requires developers with extensive knowledge of programming and the ability to read technical documentation. The process involves requesting API keys for authentication, writing code to call the API, testing it, and then deploying it in solutions. This is not a simple task; it’s a full-fledged IT project, as the sketch below illustrates.
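    To make the point concrete, here is a minimal sketch in Python of what “writing code to call the API” already entails, assuming a hypothetical partner REST API with bearer-token authentication; the endpoint, header scheme, and response fields are illustrative, not any real vendor’s API.

    ```python
    # Minimal sketch: calling a hypothetical partner REST API with an API key.
    # The base URL, header scheme, and response shape are assumptions for illustration.
    import os

    import requests

    API_BASE = "https://api.example-partner.com/v1"   # hypothetical endpoint
    API_KEY = os.environ["PARTNER_API_KEY"]           # key issued during API onboarding


    def fetch_orders(updated_since):
        """Fetch order records from the partner's (assumed) /orders endpoint."""
        response = requests.get(
            API_BASE + "/orders",
            headers={"Authorization": "Bearer " + API_KEY},
            params={"updated_since": updated_since},
            timeout=30,
        )
        response.raise_for_status()  # auth failures, throttling, and outages surface here
        return response.json()["orders"]


    if __name__ == "__main__":
        print(len(fetch_orders("2015-12-01")))
    ```

    Even this toy version assumes someone has been issued credentials, knows the authentication scheme, and can test and deploy the script, which is exactly the developer effort the API publisher is asking for.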

    Using An API Still Requires Back-End Integration

    On top of this, many companies that acknowledge the necessity of a developer mistakenly believe that it is a simple enough process for an IT team from another company to connect to their API. They forget that APIs are all about moving data from one place to another, and that data has to originate from somewhere and end up somewhere. In order to connect with an API, IT teams and developers usually have to build connections to other applications and databases on the back end. Comprehensively integrating all of those back-end systems adds a huge amount of complexity and effort to the project of consuming an API.
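    As a rough illustration of that back-end work, the sketch below (Python, continuing the hypothetical example above) maps records returned by a partner API into a local database; the table layout and field names are assumptions, and a real integration would target production systems such as an ERP or CRM rather than a local SQLite file.

    ```python
    # Illustrative back-end step: map records pulled from a partner API into an
    # internal database. The schema and the partner's field names are assumptions.
    import sqlite3


    def load_orders(records, db_path="internal.db"):
        """Translate the partner's (assumed) field names into an internal orders table."""
        conn = sqlite3.connect(db_path)
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders ("
            "external_id TEXT PRIMARY KEY, customer TEXT, total_cents INTEGER)"
        )
        for rec in records:
            conn.execute(
                "INSERT OR REPLACE INTO orders VALUES (?, ?, ?)",
                (rec["id"], rec["customer_name"], int(round(float(rec["total"]) * 100))),
            )
        conn.commit()
        conn.close()
    ```

    Every additional source or destination system multiplies this mapping and testing work, which is where most of the real integration effort goes.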

    Connecting to One API is a One-Off Project

    To add insult to injury, a further problem with the difficulty of consuming APIs is that all this work is done only to connect with one company. If you have to connect with another company, then you have to go through the entire time-consuming and expensive process over again in order to use their API. This makes companies incredibly reluctant to go down the path of connecting with other companies’ APIs. They figure that if they’re going to do something, they might as well publish an API once and force other companies to connect with theirs, which leads to another problem with API integration…

    ‘I Won’t Use Your API, You Have To Use Mine’

    Another major issue with APIs involves deciding who will use which company’s API if both have their own in place. If a company has already published an API, they probably believe their connectivity job is done and everyone doing business with them should use the API they’re offering. But if they have a customer (or larger organization) on the other side that has also published an API, then who gets to dictate which API should be used? Nobody wants to initiate an IT project to connect with a particular API when the company already has one in place. This situation arises from time to time and is frustrating for both parties involved.

    Short-Lived Business Relationships

    One business reality that intrudes upon API integration is that relationships between companies and organizations these days exist for shorter and shorter periods of time. A generation ago, companies would work with the same suppliers or have the same customers for many years, and, at that point, it made sense for IT departments to spend time and effort creating automated connections with other companies. But these days business is so dynamic that companies may only do business with each other for a quarter or two and then move on. Because inter-company relationships are short-lived, doing a special project to connect with some other company’s API is often not worth the effort.

    Despite these obstacles, there is still hope for successful API integration, and the way to make APIs more attractive is by making them usable for business users. By democratizing IT, and making APIs accessible and easy to consume for business users without requiring developer skill-sets, organizations can more effectively ready themselves for the new business landscape and more efficient operations.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    6:12p
    Docker Encourages Container Log Analysis for Insights

    Article courtesy of theWHIR

    In an effort to recognize companies that provide a comprehensive view of distributed containerized applications, application containerization platform Docker this week announced its Ecosystem Technology Partners for log management of Dockerized applications.

    The first set of partners to demonstrate their logging expertise includes Amazon CloudWatch, elastic.co, Graylog, Rapid7/Logentries, Loggly, Papertrail, Sematext Logsene, Sumo Logic, and Treasure Data.

    Docker’s ETP program ensures that log management solutions not only integrate with a Docker environment, but also extend application portability across the platform.

    The streams of log data produced by distributed applications can be analyzed to provide operations teams insight into application health and help resolve issues. Operations teams need tools for collecting and interpreting log data to render an accurate picture of the availability and performance of these applications.

    Docker ETPs employ different methods to help collect this data. Some, like Amazon CloudWatch, contribute logging drivers directly to the Docker Engine (enabled in the 1.6 release), while others provide containerized agents that use Docker APIs to connect to external collection systems.
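    As a simple illustration of the logging-driver approach, the sketch below uses the Docker SDK for Python to start a container whose stdout and stderr are shipped through a non-default log driver; the image, the choice of the syslog driver, and the collector address are assumptions for illustration, and the drivers available depend on the Docker Engine version.

    ```python
    # Illustrative sketch: start a container with a non-default Docker logging driver
    # using the Docker SDK for Python. The image and syslog endpoint are assumptions.
    import docker
    from docker.types import LogConfig

    client = docker.from_env()  # requires a reachable Docker daemon

    # Route container stdout/stderr to a (hypothetical) remote syslog collector
    # instead of the default json-file driver.
    log_config = LogConfig(
        type="syslog",
        config={"syslog-address": "udp://logs.example.internal:514"},
    )

    container = client.containers.run(
        "nginx:latest",
        detach=True,
        log_config=log_config,
    )
    print("started", container.short_id)
    ```

    Agent-based partners take the other route described above: a dedicated logging container reads log streams via the Docker APIs and forwards them to the vendor's collection service.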

    David Messina, Docker’s VP of enterprise marketing, said that monitoring and logging are crucial for supporting applications but are often considered an afterthought. “Things like monitoring and logging tend to get overshadowed by other technologies.”

    He said that the focus in recent years has shifted from data centers and IT infrastructure to applications, which are seen as making the most impact for businesses. Yet it’s important not to forget about the underlying infrastructure, especially when it comes to business-critical Docker applications running in production.

    “Things like logging and monitoring are, in effect, at the heart of making sure your solutions are up and running to your expectations,” Messina said. “They’re also at the heart of providing what you need to know to resolve issues.”

    The integrations will also allow many organizations to integrate Docker logs into their existing logging solutions, making it less risky for organizations to experiment with containerization and microservice-based architectures.

    Messina notes ETPs will position themselves well among the “large number of vendors who are now coming out that are Docker-optimized or microservices-optimized.”

    The ETP program for logging is just one of the initiatives to make Docker containers easier for organizations to adopt. Just months ago, several announcements made during DockerCon EU were aimed at making it easier for hosting and cloud providers to offer Docker-based services, which is significant because service providers can make it easier for developers to experiment with containers.

    This first ran at http://www.thewhir.com/web-hosting-news/new-docker-program-recognizes-organizations-that-turn-container-logs-into-valuable-insights

    9:38p
    EU Regulators Agree on Pan-European Data Privacy Rules

    The European Union’s three regulatory bodies have reached an agreement on common rules for governing data privacy across all member states. Europe’s data privacy reform has been in the making for at least three years and now finally appears close to enactment.

    While addressing what businesses can and cannot do with users’ personal data and outlining rules for access to personal data by law enforcement, the packages do not address cross-border data flows. Until recently those flows were governed by a set of rules called Safe Harbor, which was struck down by the Court of Justice of the European Union, causing a stir in the cloud services industry, where the biggest players by their nature operate globally distributed data center infrastructure.

    “Our next step is now to remove unjustified barriers which limit cross-border data flow: local practice and sometimes national law, limiting storage and processing of certain data outside national territory,” Andrus Ansip, VP for the Digital Single Market, said in a statement on the agreement, reached earlier this week. The Digital Single Market is a European Commission (EC) initiative to promote a unified digital economy across the EU, governed by a common set of laws.

    The reforms the EC, the European Parliament, and the European Council agreed to consist of two sets of rules. Rules for personal data and businesses are in the General Data Protection Regulation, while law enforcement’s access to data is covered by the Data Protection Directive.

    The Directive aims primarily to protect the privacy of victims, witnesses, and criminals, while enabling police across the EU to exchange information during investigations and follow the same data access protocol regardless of where in the Union the data they seek resides.

    But it also outlines rules for transfer of personal data outside of the EU, an issue over which Microsoft and the US government have been battling in court. US law enforcement officials have requested that the company provide personal data of an investigation subject, but the data is stored in a Microsoft data center in Dublin, Ireland, and the company has declined to provide it on the grounds that the US government’s jurisdiction does not extend beyond US borders.

    The next step for the EU’s data privacy reform is for the European Parliament and Council to adopt the new rules starting in 2016. The laws will go into effect two years thereafter.

    More details on the proposed rules are available in the European Commission’s announcement.

