Data Center Knowledge | News and analysis for the data center industry
 

Tuesday, December 23rd, 2014

    Time Event
    1:00p
    Internap Completes Move Out of Google’s Manhattan Building

    Internap has successfully moved its New York data center out of the Google-owned building at 111 8th Avenue in Manhattan and into its brand new Secaucus, New Jersey, facility.

    Google, which uses the building for office space, has not allowed data center providers there to renew their leases since it bought the property four years ago. Internap was one of the casualties. The building is one of the most desirable data center locations in the world because of the large number of carrier networks it connects to. It is one of a handful of top carrier hotels in Manhattan.

    Internap has been working non-stop to move its New York data center customers to New Jersey. One of the customers, Outbrain, moved more than a thousand servers over a rainy Memorial Day weekend this year.

    Other data center providers that still have active leases in the building include Digital Realty Trust, Equinix, and Telx. Different providers have different lease terms in the building. Telx CEO Chris Downie, for example, told us earlier this year the colocation company’s lease there would not end for several more decades.

    An Undesired but Necessary Project

    Internap built the Secaucus data center with plans for long-term growth, the company’s senior vice president Mike Higgins said.

    “We went through quite an undertaking starting in 2013 – something in the data center space that no one likes to do,” he said. Internap operated out of 111 for over a decade, but its lease was up. “Customers had been with us a very long time, running production environments. The migration took a tremendous amount of effort for both Internap and the customers, but we’re happy to say the project was successful.”

    Apart from the physical move — complicated by Google employees moving in as Internap was moving out — network configuration was a big part of the puzzle. Customers couldn’t afford downtime, and the provider was somewhat between a rock and a hard place. The company put in temporary infrastructure so customers could keep network connections live.

    “Many customers took a maintenance window, a number of customers did not,” said Higgins. “We had other customers that needed to lease temporary infrastructure. Those customers took advantage of bare metal cloud.” Bare metal cloud is one of Internap’s services.

    For some tenants the move meant conducting inventory of their applications. “Some customers were one, two, three CIOs removed from their initial Internap deployment,” said Higgins. “A number of customers had to get external project management support to get a handle on the applications they were running.”

    Many customers didn’t migrate with Internap because their technical expertise was in Manhattan. Still, the company managed to migrate enough customers to Secaucus for the move not to affect revenue, Higgins said.

    Outbrain’s 1,000-Server Migration

    Content discovery platform provider Outbrain operates from three data centers in the U.S., one of which is now the Secaucus facility. The company’s infrastructure serves more than 72,000 links to content every second.

    “We were cognizant of customers with larger environments,” said Higgins. “We built a migration plan around customers like Outbrain to make sure we had the resources to accommodate everyone. All hands were on deck, and we provided full NOC support. We mapped out which weekends those customers would fit into.”

    “There are two approaches to migration,” said Orit Yaron, head of operations at Outbrain. “The first is to build everything new on the new side and slowly migrate it over the network. We took the second approach, the big bang approach. We moved very fast over the weekend, which meant more risk but a less intrusive process.”

    Outbrain performed a lot of project management and risk mitigation prior to the move. The company wanted to migrate over a long weekend, and Memorial Day was the only option. “We didn’t have any infrastructure in Secaucus,” said Yaron. “It was a very aggressive timeline that required a lot of communication between us and Internap.”

    The company used multiple trucks and planned what infrastructure specifically would go in what truck. “We did this for redundancy,” said Yaron. “We wanted to split the nodes on different trucks, so if one were to get stuck, we could continue to operate. People thought we were paranoid, but we took a lot of precautions.”
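
    The redundancy scheme Yaron describes can be sketched in a few lines. This is a hypothetical illustration, not Outbrain's actual tooling: spread each cluster's nodes across the trucks so that losing any single truck leaves every cluster with surviving members.

```python
def assign_to_trucks(clusters, num_trucks):
    """Round-robin each cluster's nodes across trucks so replicas of the
    same cluster never ride together (when a cluster has <= num_trucks nodes).

    clusters: dict mapping cluster name -> list of server IDs
    Returns: dict mapping truck index -> list of server IDs
    """
    trucks = {t: [] for t in range(num_trucks)}
    for nodes in clusters.values():
        for i, node in enumerate(nodes):
            trucks[i % num_trucks].append(node)
    return trucks

# Illustrative cluster names; a real inventory would come from asset tracking.
plan = assign_to_trucks(
    {"cassandra": ["c1", "c2", "c3"], "kafka": ["k1", "k2", "k3"]},
    num_trucks=3,
)
# plan -> {0: ['c1', 'k1'], 1: ['c2', 'k2'], 2: ['c3', 'k3']}
```

    With three trucks and three-node clusters, each truck carries at most one node per cluster, so a stuck truck costs each cluster only a single replica.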

    The company uses commodity servers that all look the same, so each one was tagged three times just in case a tag went missing.

    Outbrain uses a continuous integration methodology, meaning there are many production deployments on any given day. Maintaining that cadence during the move was very important, said Yaron. The company performed extensive tests to make sure its monitoring and test systems remained active and, to prepare, ran a drill simulating 111 8th Avenue being taken offline. Scripts were plugged in to prevent a flood of alerts during the move and to bring the system back to normal afterward.
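
    The alert-suppression scripts mentioned above might look something like the following minimal sketch. The names and logic are assumptions for illustration; the article does not describe Outbrain's actual implementation. The idea: silence a host's alerts for the planned maintenance window, then restore normal paging as soon as the move completes.

```python
import time

class AlertGate:
    """Minimal maintenance-window gate for an alerting pipeline."""

    def __init__(self):
        self.silenced = {}  # host -> silence expiry (epoch seconds)

    def start_maintenance(self, host, duration_s):
        self.silenced[host] = time.time() + duration_s

    def end_maintenance(self, host):
        # Called when the move finishes (possibly early, as in the article).
        self.silenced.pop(host, None)

    def should_page(self, host, now=None):
        now = time.time() if now is None else now
        expiry = self.silenced.get(host)
        return expiry is None or now >= expiry

gate = AlertGate()
gate.start_maintenance("web-01", duration_s=48 * 3600)  # planned 48-hour window
assert not gate.should_page("web-01")   # alerts swallowed during the move
gate.end_maintenance("web-01")          # done in 36 hours: re-arm alerting early
assert gate.should_page("web-01")
```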

    The weekend of the migration, it was raining hard, but the move was completed in 36 hours rather than the planned 48.

    “We were prepared for almost every scenario,” said Yaron. “The location of the cabinets, what equipment goes into what truck, printing out the phone numbers, we went into deep-level details in the planning.”

    ‘Most complex migration’

    Internap’s New York data center is now out of the Manhattan building, and the company is ready for its future in Secaucus.

    “This was by far the most complex migration I’ve been a part of,” said Higgins. “Think about New York, the number of customers in that multi-tenant environment. I’ve been a part of some enterprise migrations, but none this complex.”

    4:00p
    T5 Closes $55.5M Credit for Oregon Data Center Construction

    T5 Data Centers has secured a $55.5 million credit facility to finance its data center construction project in Hillsboro, a city in the Portland, Oregon, metro area. The secured credit facility was provided by CIT Bank, a subsidiary of the U.S. financial holding giant CIT Group.

    The announcement comes the same week QTS Realty Trust said it had raised $100 million in credit to finance data center construction, both deals signaling a healthy debt market for data center providers.

    T5 started construction of its two-building data center campus in Hillsboro in September – two years after it bought a 15-acre property there and announced plans to build the campus. This September, the company said it had secured an unnamed anchor tenant that would take one of the future buildings in its entirety.

    Joe Junda, managing director of CIT Corporate Finance, said T5 had a strong management team and a solid track record, and that Portland had strong data center demand.

    “T5 Data Centers is a leading and experienced wholesale data center owner and operator with a strong management team,” Junda said in a statement. “Portland has a proven demand for data centers and should be an excellent location for this new facility.”

    The Portland metro’s abundant low-cost electricity and network connectivity options have made it one of the hottest data center markets in the country.

    There are submarine cable landing stations in six towns along the coast, one of which is Hillsboro, according to TeleGeography. Cables that land there offer direct connectivity to countries in the Asia Pacific region, Alaska, Northern California, and Southern California.

    There are more landing stations in and outside of Seattle, which is about 170 miles north of Portland.

    Other data center wholesalers, Digital Realty Trust and Fortune Data Centers, have data centers in Hillsboro. Major retail colocation players in the market include Telx and ViaWest.

    Some high-profile data center users there are Intel, NetApp, and Adobe.

    T5’s other data centers are in New York, North Carolina, Atlanta, Dallas, and Colorado.

    4:30p
    The Sony Hack: A Bitter Multi-Motive Pill to C.H.E.W.

    Carl Herberger manages Radware’s security practice in the Americas. He has over a decade of experience and began his career at the Pentagon, evaluating computer security events affecting daily Air Force operations.

    Have you ever been in a noisy room when suddenly an unbelievable new sound manages to silence all other sounds immediately? Well, that’s how 2014 is leaving the world of information security professionals – with a piercing sound that received everyone’s attention.

    Already a watershed year for cyber security, 2014 closes with a story of unparalleled scope: the high-profile attack on the Sony Corporation. The attack reinforces much of what we have seen during the past 48 months: cyber-attacks continue to grow dramatically in their virulence, and when successful, they can result in major, potentially irreparable damage. It is far too early to know the long-term impact of this event on Sony, but when one considers lost sales and reputational damage, it is not out of the realm of possibility that the company never fully recovers.

    Who Was Behind it and What Were Their Motives?

    So, naturally, everyone wants to know who was behind this attack and what their motives were. Every day, coverage of the attack brings a healthy mix of news, rumor, and political rhetoric. At the time of writing, the U.S. government has issued statements confirming the involvement of North Korea, which, in turn, has denied involvement.

    Like any area of technology, information security has its own “acronym soup” that emerges as a language, of sorts, for practitioners and followers alike. In the case of the Sony hack and its motives, the acronym that comes to mind is C.H.E.W., popularized by Richard Clarke, former Special Advisor to the President of the United States on cyber security. Clarke outlined C.H.E.W. to categorize common cyber-attack motivation as follows:

    • Cyber Crime: an attack where the primary motive is financial gain
    • Hacktivism: attacks motivated by ideological differences. The primary focus of these attacks is not financial gain but rather to persuade or dissuade certain actions or voices
    • Espionage: an attack with the straightforward motive of gaining information on another organization in pursuit of political, financial, capitalistic, market share or some other form of leverage
    • Warfare: the notion of a nation-state or transnational threat to an adversary’s centers of power via a cyber-attack

    One of the interesting aspects of the Sony attack is that it blends aspects of multiple motive categories. It has been well covered how many of the attacks in 2014 constituted “multi-vector” attacks, i.e., they leverage a variety of protocols and technology vulnerabilities to create a more complex detection and mitigation scenario for the target. What is also now clear, and on display in the Sony attack, is the emergence of “multi-motive” attacks … a blurring of lines across the C.H.E.W. principle.

    Most would consider the Sony attack principally a hacktivist-driven attack. North Korea’s statements about “The Interview” representing “an act of war,” both prior to and following the attack, make it clear that the film represented a serious collision of ideological views. So would this attack be considered an act of warfare? It’s likely that most wouldn’t call the attack itself an act of war, but the escalating dialogue between the U.S. and North Korean governments exhibits the increasing interrelation between cyber security events and broader elements of domestic policy.

    Nobody is Immune in Today’s Complex Threat Landscape

    A reality of today’s cyber-attacks that is highlighted by the attack on Sony is that no organization is immune. While Sony may have its various detractors, they wouldn’t generally fit the profile of organizations that in the past would be considered a high risk target. In particular, Sony Pictures as an organization focused on entertaining the masses wouldn’t have fit that profile. But the situation around the subject matter in the film goes to show how one man’s comedy can be another man’s (or in this case nation’s) declaration of war.

    We recently launched our Global Application & Network Security Report, which covers a broad range of industries and organizations of varied size that have become targets of cyber-attack. Part of the report shares a representative view of industries and their tendency toward more frequent attacks in what we refer to as the “Ring of Fire.” One of the more notable changes to this year’s Ring of Fire is the addition of the healthcare industry as one more commonly attacked than in past years. Perhaps no organization’s situation highlights this better than the experience of Boston Children’s Hospital, which became the target of a serious Distributed Denial of Service (DDoS) attack in 2014. Who would want to target such a seemingly altruistic organization, you might ask? In this instance, Boston Children’s Hospital found itself tangentially involved in a controversial child custody matter, providing necessary care to the child in question. The case highlights how organizations that on the surface would appear to be odd targets can get pulled into broader hacktivist activities.

    It is clear that each year brings new challenges for IT and information security teams working to protect system and data availability and confidentiality. The DDoS attack on Boston Children’s Hospital and the breach that felled Sony act as stark reminders that no organization is immune and that a multi-motive attack can be highly effective. So effective, in fact, that several movie theaters became collateral damage, receiving threats of terrorism and physical violence if they were to show the film.

    Regardless of whether “The Interview” is ultimately released on the silver screen or finds its way to the public via video on demand, one thing is clear: the threats are real and the challenges are complex. The klaxon is sounding, and we must take meaningful action to prepare for emerging attack trends and techniques.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    6:23p
    Oracle Beefs Up Marketing Cloud With Datalogix Acquisition

    Oracle has acquired an ad-tech company called Datalogix for an undisclosed sum, continuing to strengthen its marketing cloud. Oracle bought BlueKai, also an ad-tech firm, earlier this year.

    Datalogix offers purchase-based audience targeting to drive online and offline sales. According to Oracle, the Datalogix data platform aggregates data on over $2 trillion in consumer spending from 1,500 data partners across 110 million households. The platform will become a part of the Oracle Data Cloud.

    Datalogix is a big addition to Oracle. Besides its analytics capabilities, the firm has partnerships with Google, Facebook, and other large Internet properties.

    Through its consumer identity graph, which combines behavioral, social, and purchasing data, Oracle Data Cloud seeks to provide a comprehensive view of the entire customer experience.

    When Oracle acquired BlueKai for $400 million in February, it made its CEO Omar Tawakol general manager of the Oracle Data Cloud. BlueKai and Datalogix were both competitors and partners.

    “The addition of Datalogix to the Oracle Data Cloud will provide data-driven marketers the most valuable targeting and measurement solution available,” Tawakol said in a statement. “Oracle will now deliver comprehensive consumer profiles based on connected identities that will power personalization across digital, mobile, offline, and TV.”

    Like many other Oracle acquisitions, Datalogix was an attractive addition to its marketing cloud because its solutions are used by Fortune 500 companies. Datalogix claims more than 650 customers across the top 100 U.S. advertisers and seven of the top eight digital media publishers, including Facebook and Twitter.

    7:00p
    Six Rules for a Modern Private Cloud

    To this day, a private cloud is still one of the most popular cloud delivery methods out there. Organizations looking to bring rich content to the ever-mobile end user are updating their infrastructure, creating a more scalable data center and trying to keep up with the market.

    We have a very different kind of user today. The modern user:

    • Is highly mobile
    • Wants to receive apps and data on a variety of devices
    • Wants to be productive from anywhere
    • Wants a seamless, secure, and powerful connection
    • Expects the best possible user experience, all the time

    It’s hard to meet all of these demands at once, but there are ways. Private cloud technologies allow organizations of all sizes to create their own environments. Let’s take a look at the critical steps for creating and controlling your own private cloud environment.

    1. Understand your end-user. Probably one of the most important first steps is to understand what kinds of devices your users are deploying, how they are accessing their apps, and what will make them most productive. This means administrators must maintain constant visibility into the end-user environment. Monitoring and controls should be established around the following:
      • Log-on times
      • Latency
      • The type of device being used
      • Location of the connection (external/internal and secured/unsecured)

      It’s also important — when it comes to the end user — to never become complacent. Much of the current evolution around cloud and content delivery is driven by how modern end users consume data. Are your users mainly on tablets or laptops? Are they in environments with poor network connections? The answers dictate how you deploy your private cloud environment and whether you’ll need more to enhance the end user’s experience.

    2. Create data center efficiencies. This is both physical and logical. Look for technologies that directly help your data center become more agile and efficient. Maybe you need to deploy a new virtual application environment on a hyper-converged infrastructure for optimal performance. Or, maybe you need a virtual WANOP appliance at a branch location to help with connectivity and latency. Little enhancements can go a long way for your users and data center.
    3. Start with your hypervisor. Really, this is the gateway to your private cloud. This is where you control your resources, create policies, incorporate automation, and optimize your virtual machines. New technologies now allow you to integrate directly with hypervisor platforms to make VM control even easier. The better you can control your hypervisor and the resources assigned to it, the better you can control your private cloud.
    4. Publish applications and data. This can be a bit tricky and sometimes takes the longest. Applications now come in all shapes and sizes. SaaS, HTML5, legacy, and cross-operating-system applications can all make the publishing process interesting. There are tools out there that allow you to analyze applications and make very intelligent deployment decisions. You can check on dependencies, cross-app interoperability, and OS support. The more you know about the app, the better. Plus, the great part about delivery is that you can aggregate all apps (public, private, SaaS, and HTML5) under one user portal. Administrators can manage all apps and data points from one location, and users find all their apps in one spot.
    5. Never forget about security. The cloud, including the private cloud, has evolved substantially. New security solutions bring options from enterprise to SME levels. You can deploy virtual security appliances, load balancers, and access gateways. The point is that you can now look at both physical and virtual options to enhance the overall security of your private cloud.
    6. Monitor and evolve. Cloud computing continues to evolve at a scorching pace. We’re seeing new technologies impact the way we compute on an almost daily basis. Just because you have a private cloud today does not mean you won’t one day move to a hybrid cloud platform. New hypervisor extensions, powerful APIs, and cloud integration methodologies allow you to quickly integrate with public cloud resources. Never get complacent with your cloud, and always be ready to evolve.

    At the highest level, that’s pretty much what you’ll need for a solid private cloud architecture. The most important part to understand is the need to create a very agile environment. You never know when you’ll be expanding your private cloud into a hybrid cloud.
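
    The monitoring points from rule 1 can be made concrete with a small sketch. The field names and thresholds below are illustrative assumptions, not a reference to any particular monitoring product: collect per-session metrics (log-on time, latency, device type, connection location) and flag sessions that violate policy.

```python
from dataclasses import dataclass

@dataclass
class Session:
    user: str
    logon_seconds: float
    latency_ms: float
    device: str          # e.g. "tablet", "laptop"
    location: str        # "internal" or "external"
    secured: bool

def flag_sessions(sessions, max_logon_s=30.0, max_latency_ms=150.0):
    """Return (session, reasons) pairs for sessions violating policy."""
    flagged = []
    for s in sessions:
        reasons = []
        if s.logon_seconds > max_logon_s:
            reasons.append("slow log-on")
        if s.latency_ms > max_latency_ms:
            reasons.append("high latency")
        if s.location == "external" and not s.secured:
            reasons.append("unsecured external connection")
        if reasons:
            flagged.append((s, reasons))
    return flagged

flagged = flag_sessions([
    Session("alice", 12.0, 80.0, "laptop", "internal", True),
    Session("bob", 45.0, 200.0, "tablet", "external", False),
])
```

    In a real deployment, these records would be fed from the VDI or application-delivery tier, and the flags routed into the monitoring loop described in rule 6.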

    7:30p
    Mac Hosting Provider MacStadium Raises $1 Million in Series A Funding


    This article originally appeared at The WHIR

    Atlanta-based Mac hosting provider MacStadium announced on Monday that it has raised $1 million in Series A funding from a private angel investor, according to a report by the Atlanta Business Chronicle.

    MacStadium provides dedicated Mac Mini and Mac Pro hosting, as well as an Apple-centric infrastructure-as-a-service for developers. The funding round will help it grow its iOS testing platform, as well as its product development and marketing, the report said.

    The funding will also help it accelerate expansion into Europe and APAC. MacStadium says that by bringing Mac minis closer to developers, it can reduce network latency and increase testing efficiency.

    MacStadium said it is expecting revenues of $2.4 million this year. It is seeing a 15 percent increase in demand every month, according to the report.

    The company hosts around 3,000 Apple Mac minis in a data center in Georgia. In an interview with Cult of Mac in November, MacStadium CEO Greg McGraw said that while it has seen great success with developers, the Mac Pro has opened up the opportunity to serve the enterprise market.

    This article originally appeared at: http://www.thewhir.com/web-hosting-news/mac-hosting-provider-macstadium-raises-1-million-series-funding

    8:01p
    EMC Wants to Build New Ireland Data Center

    Storage and cloud giant EMC is in the process of acquiring planning permission for a new data center and office in Ireland. The planning application would mean an expansion of EMC operations in the country.

    The planning application for an EMC data center was submitted to Cork County Council last week to develop a site next to its Ovens plant, the Irish Times reported. The company is planning a data center and office block, and if planning permission is granted, EMC will have 10 years to develop additional facilities at the site.

    Cork, and Ireland in general, has a booming IT scene. EMC is already a large employer there, with more than 3,000 staff. The new data center and office would represent a major expansion.

    Cork is located in the southwest of the country and is also home to the European headquarters of Apple and Logitech. Amazon has also set up shop in the Cork Airport Business Park. A new submarine cable is in the works that will greatly boost Ireland’s direct connectivity to North America, making it an even more attractive destination for data centers.

    The Ovens plant, where the future EMC data center will be built if the company gets official approval, is one of its eight global “centers of excellence.” It is used for research and development projects.

    EMC has two other plants in Cork: at Mahon and Ballincollig.

    DCK provided a photo tour of the Cork Internet Exchange in 2007 (to give you an idea of how long ago that was, Digg was popular).

    Cork is an alternative to the current data center capital in the country, Dublin. Microsoft has a data center there, and so do Google and Amazon. Digital Realty Trust launched a brand new facility in Dublin in September.

    9:59p
    Global Retail Data Center Colocation Market Reaches $25B

    The global colocation data center market has reached $25 billion in annual revenue run rate, more than half of that revenue generated by the top 60 service providers in the space, according to the latest report by 451 Research.

    The researchers pegged the total number of data center providers across North America, Latin America, EMEA, and Asia Pacific at 1,086. These companies collectively operate 3,685 individual data centers.

    The top 60 providers account for about six percent of the total number of players in the market. The top 10 players pull in 28 percent of the revenue.
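
    As a quick back-of-the-envelope check on the figures above (using only numbers quoted from the 451 report):

```python
total_revenue_b = 25.0      # $25B annual revenue run rate
providers = 1086            # total providers counted by 451 Research

top_60_share_of_players = 60 / providers       # ~0.055, i.e. "about six percent"
top_10_revenue_b = 0.28 * total_revenue_b      # top 10 pull in 28% -> ~$7B
long_tail_revenue_b = 0.5 * total_revenue_b    # 1,000+ smaller firms split roughly $12.5B
```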

    Many Smaller Players, Despite Consolidation

    While the colocation market has been consolidating, more than 1,000 companies, many of them regional players, account for the other half of the revenue.

    “At its heart, the multi-tenant data center business is a regional business,” Greg Zwakman, research director at 451, said in a statement. “So despite active consolidation and some concentration at the top, much of the market remains highly fragmented, with a mix of national and local players.”

    Some of the most recent big acquisitions in the colocation data center market were Colt’s acquisition of KVH in Asia Pacific in November, and Shaw Communications’ $1.2 billion ViaWest deal in July.

    Examples of smaller deals that took place this year were Zayo’s CoreXchange acquisition in March, DataBank’s acquisition of Arsalon in May, and the purchase of Colo5 in Florida by Cologix in September.

    Graph courtesy of 451 Research


    Interest in Second-Tier Markets Rising

    The bulk of multi-tenant data center construction over the past decade has taken place in big markets with lots of potential tenants, but, according to 451 research director Kelly Morgan, interest in markets outside of the big cities is growing.

    “This is for several reasons, such as to reduce latency, to target medium-sized local businesses, or because operating costs are lower,” she said in a statement. “We expect to see strong growth in several of these secondary markets over the next few years.”

    More Ongoing Construction in Major Markets

    Despite growing activity in second-tier data center markets, more square feet of data center space and megawatts of power capacity will come online over the next two years, according to 451, which has identified 176 known data center expansions and 134 new builds.

    More than 60 percent of this construction activity is taking place in North America, but expansion rates in Asia Pacific and EMEA are strong as well.

