Data Center Knowledge | News and analysis for the data center industry
 

Wednesday, December 31st, 2014

    1:00p
    Ten Huge Multi-Tenant Data Centers that Came Online in 2014

    There was no shortage of huge data center openings this year, the majority of them multi-tenant facilities, where owners lease space to companies. In November, market research firm IDC said huge data centers being built by service providers will account for more than 70 percent of all data center construction by 2018.

    Here are some of the biggest multi-tenant data center construction projects completed in 2014. The list isn’t exhaustive, but these are among the most prominent launches we covered this year.

    DuPont Fabros in Santa Clara and Ashburn

    The wholesale data center developer launched the sixth data center on its already massive Ashburn, Virginia, campus in September. The largest facility in the company’s portfolio, ACC7 is about 450,000 square feet and has the capacity to provide 41.6 megawatts of power.

    ACC7 and the 9 megawatt Phase IIA of its Santa Clara, California, data center, which was finished around June, were the first to use DuPont Fabros’ new electrical design, which enables the company to deploy capacity in smaller increments than it did with its earlier builds.

    An aerial view of the DuPont Fabros data center in Santa Clara, California. (Photo: DuPont Fabros Technology)

    Pacnet in China

    Hong Kong-based Pacnet brought online a 225,000 square foot data center in Tianjin, China, in October. Mainland China is a major growth market for data center service providers, and it has become important for them to have data centers within the country’s borders to serve customers there without services being impeded by the government’s massive Internet surveillance and censorship system known as the “Great Firewall of China.”

    QTS in Dallas-Fort Worth

    QTS Realty Trust has retrofitted a former semiconductor plant in Las Colinas, Texas, into a data center that can accommodate 700,000 square feet of raised floor and 140 megawatts of power, according to the company, which announced the launch of the behemoth facility in October.

    DataGryd in New York City

    DataGryd is the newest player in the Manhattan data center market. The company occupies four floors of the massive carrier hotel and data center building at 60 Hudson Street. In September, it announced the launch of the data center with anchor tenant Telx, which has leased about 70,000 square feet of build-to-suit space from DataGryd.

    The entrance to 60 Hudson Street, the Manhattan data center hub that previously served as the headquarters for Western Union. (Source: 60 Hudson)

    RagingWire in Ashburn

    RagingWire, now majority-owned by Japan’s NTT Communications, brought online an additional 8.1 megawatts of critical power at its Ashburn campus – the final phase at the site. The company said it has seen strong sales in the Ashburn market ever since it launched the site in 2012, and most of the capacity it has brought online there was pre-leased before launch.

    One of the three new RagingWire data center PODs now available in VA1 Ashburn (source: RagingWire)

    ViaWest in Denver and Minneapolis

    ViaWest, acquired by Canadian telco Shaw Communications this year, launched 140,000 square feet of raised floor in the Denver market in June. This is the fifth data center in the service provider’s home state of Colorado.

    Electrical gear at ViaWest’s Compark data center in Englewood, Colorado. (Photo by Paul Talbot, 23rd Studios)

    Another big expansion for ViaWest came in April, when the company announced the launch of a 9 megawatt data center in Chaska, Minnesota, which is just outside of Minneapolis.

    Cervalis in Connecticut

    IT services provider Cervalis opened a 168,000 square foot data center in Fairfield County, Connecticut, in February. The facility provides 75,000 square feet of raised floor and 16 megawatts of power.

    Virtus in London

    Virtus Data Centres brought online its London2 facility in Hayes, just outside of London, in October. The facility’s power capacity is 11.4 megawatts, delivered across 65,000 square feet of space. The data center had five customers on board when it was officially launched.

    Entrance to the new London2 data center by Virtus (Source: Virtus)

    4:30p
    Maintaining the Benefits of a Hybrid Cloud Environment

    Maurice McMullin is the Product Marketing Manager at KEMP Technologies.

    A hybrid cloud environment enables businesses to leverage the benefits of both on-premise infrastructure and public cloud to provide optimal application delivery to their clients. By combining internal resources with cloud services hosted by the likes of Amazon, VMware and Microsoft, organizations gain significant flexibility and resilience at a price point that can be extremely compelling.

    However, the benefits gained from hybrid implementations can be eroded if the environment leads to duplication of resources and processes. To mitigate this erosion of value, a strategy should be adopted that maximizes the consistency and compatibility of the data center and cloud environments. This drive for consistency should cover all aspects of the application delivery environment, including policies and processes as well as platforms and toolsets. A consistent approach will pay long-term dividends, as the hybrid environment can then be viewed as a single entity from an operations, consumption and compliance standpoint.

    Virtualizing Services and Infrastructure

    Pursuing an operations strategy that supports unified management and orchestration not only simplifies day-to-day operation but also makes migration and integration a less daunting prospect. Virtualizing services and infrastructure in both the data center and the cloud allows the use of a single set of tools, which makes management, migration and orchestration simpler and less costly.
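
    To make the single-toolset idea concrete, here is a minimal sketch in Python; the Node class, addresses and check routine are all hypothetical, invented for illustration rather than taken from any vendor's API. It treats data center and cloud machines as entries in one inventory, managed by one loop:

        from dataclasses import dataclass

        @dataclass
        class Node:
            name: str
            environment: str   # "datacenter" or "cloud"
            mgmt_address: str  # reached the same way in either environment

        def run_health_checks(inventory: list[Node]) -> None:
            # One loop, one toolset: the operator does not care where a node lives.
            for node in inventory:
                print(f"checking {node.name} ({node.environment}) at {node.mgmt_address}")

        run_health_checks([
            Node("app-01", "datacenter", "10.0.1.10"),
            Node("app-02", "cloud", "172.31.5.20"),
        ])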

    A cloud provider may offer a free or low-cost service such as application load balancing. In the long term, however, it may be more cost-effective and less painful to deploy a virtualized variant of the data center load balancer in the cloud. This approach leads to consistent application of policy and configuration, which means reduced operational complexity and greater flexibility to move workloads.
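
    As a rough sketch of that consistency (the policy fields and addresses below are assumptions made for the example, not any particular load balancer's configuration format), a single template can render the configuration for both environments, so that only the backend addresses differ and policy drift can be caught mechanically:

        import json

        def render_lb_config(backends: list[str]) -> str:
            # One policy definition, rendered identically for every environment.
            policy = {
                "algorithm": "round_robin",
                "health_check": {"path": "/healthz", "interval_s": 10},
                "backends": backends,
            }
            return json.dumps(policy, indent=2, sort_keys=True)

        onprem = json.loads(render_lb_config(["10.0.1.10:443", "10.0.1.11:443"]))
        cloud = json.loads(render_lb_config(["172.31.5.20:443", "172.31.5.21:443"]))

        # Apart from the backend addresses, the two configurations must match;
        # a drift check like this could run on every configuration change.
        assert {k: v for k, v in onprem.items() if k != "backends"} == \
               {k: v for k, v in cloud.items() if k != "backends"}, "policy drift"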

    The use of a virtualized solution from a specialist vendor will also deliver a higher level of functionality and greater control than a generic service provided by the cloud platform. Solution-focused vendors design explicitly for hybrid cloud use cases such as high availability and disaster recovery, and they increasingly integrate with the leading cloud orchestration and management platforms to give a single-pane-of-glass view.

    A Consistent Approach is Key

    Using common tools and services in a hybrid environment facilitates a consistent approach to security and service management that simplifies core tasks such as patching, monitoring and backup. Staff will not need to maintain skills across multiple vendors and will benefit from exposure to the latest technology and practices being employed by the cloud provider.

    Cloud providers are constantly innovating, and client organizations can improve efficiency by leveraging those innovations directly and by replicating them in their own data centers.

    The flexibility and scalability offered by a hybrid environment are best exploited when the underlying platforms facilitate easy migration of workloads between on-premise and cloud. Ease of migration simplifies business scenarios such as capacity on demand and disaster recovery, while also providing reassurance that workloads can be removed completely from the cloud should the business or regulatory environment change.
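
    A toy illustration of the capacity-on-demand scenario (the capacity units, threshold and workload names are all invented for the example): fill on-premise capacity first and burst the overflow to the cloud, with the reverse move available if circumstances change.

        ONPREM_CAPACITY = 100  # illustrative capacity units available in-house

        def place_workloads(demands: dict[str, int]) -> dict[str, str]:
            # Largest workloads first; whatever no longer fits bursts to the cloud.
            placements, used = {}, 0
            for name, demand in sorted(demands.items(), key=lambda kv: -kv[1]):
                if used + demand <= ONPREM_CAPACITY:
                    placements[name], used = "on-premise", used + demand
                else:
                    placements[name] = "cloud"  # can be pulled back when demand falls
            return placements

        print(place_workloads({"billing": 60, "analytics": 50, "web": 30}))
        # {'billing': 'on-premise', 'analytics': 'cloud', 'web': 'on-premise'}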

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    5:00p
    Strangest Data Center Stories of Past Few Years

    Let’s close out the year by looking at some of the strangest technology stories of the past few years, many of which flew under the radar. This started as a list for 2014 only, but stories from a few years back proved too weird to pass up.

    The Other Home Data Center

    In August, “The Home Data Center: Man Cave for the Internet Age” highlighted some interesting things hobbyists and IT professionals were doing in their homes with technology. It wasn’t the only case of servers invading homes. French company Qarnot and German cloud company Cloud&Heat took an interesting approach to the problem of excess server heat.

    Cloud&Heat offered free heating to homes and offices willing to house its servers. The heating, of course, comes from the servers themselves: a fireproof safety cabinet equipped with servers is installed in the building that is to be heated, while the company provides various cloud services.

    It’s the grassroots version of what Telus has been doing in Vancouver, where the carrier warms condos with heat from its servers. The difference is that Telus’ servers are housed in central data centers instead of homes.

    Imagine waking up to the smell of Doritos and Red Bull from the IT guy in your basement, sent to fix a problem at 3am.

    Qarnot Computing sells compute capacity to corporate clients and channels the resulting heat into central heating, according to a BBC article. The company invented the distribution system, which is tied into the thermostat: a small wall-mounted radiator installed in the home pumps in the heat the servers generate. Qarnot pays customers for the electricity its digital heaters use, so the heating is free. Company founder Paul Benoit said four servers are enough to heat a room.
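
    A quick back-of-the-envelope check on that claim, assuming a typical server dissipates on the order of 250 watts (our assumption, not a Qarnot figure):

        # Our assumption: a typical server dissipates roughly 250 W under load.
        watts_per_server = 250
        servers = 4
        print(servers * watts_per_server, "W")  # 1000 W

    That is roughly a kilowatt, about the output of a small electric space heater, so the claim is plausible.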

    Qarnot has distributed supercomputers into employee homes, schools and offices.

    Evidence Wiped from Federal Servers Because of High Costs

    A Miami doctor charged with selling fake prescriptions online had the charges dropped because it was costing too much money to keep the evidence on federal servers, Yahoo reported in 2012. That evidence amounted to around 2 terabytes of data. As a reference point, a consumer can buy a 2-terabyte external hard drive for under a hundred dollars today, compared with roughly 1 terabyte for a hundred in 2012.

    Perhaps the most amazing statistic is that those 2 terabytes represented 5 percent of the Drug Enforcement Administration’s worldwide storage network. To be fair, the case did start in 2003, and former doctor Armando Angulo was indicted in 2007. However, here’s hoping the DEA has done a tech refresh since then – or not, depending on your predilections.

    The overall investigation sent 26 people, including 19 doctors, to jail. Those people, evidently, either covered their tracks too well or simply didn’t generate enough evidence to make their prosecutions too expensive to pursue.

    WSJ: Higher Resolution Displays on Macs Wreak Havoc on Corporate Networks

    In 2012, The Wall Street Journal reported that the bigger, higher-resolution Retina displays on MacBook Pros would increase consumption of network bandwidth and slow the performance of corporate networks. The higher-resolution displays themselves would increase bandwidth usage. Let that sink in.

    The article was later corrected to say that the likelihood of users streaming HD movies would go up because of the nicer displays, and that this would increase bandwidth consumption. However, nothing truly dies on the Internet: Reddit documented the whole affair, and the comments section, of course, blew up.

    This one was painful to write, as we all make mistakes. The journalist’s mistake was to write “display resolution” instead of “content resolution.” It’s like writing that IBM’s new brain-inspired chip consumed 70 kilowatts instead of milliwatts on a site read by data center professionals.

    In 2012, Microsoft Reportedly Burned $70k Worth of Electricity to Avoid Fine

    Microsoft opened its zero-carbon, methane-powered data center this year, and it continues to commit to renewable energy in a big way. But the company may have done something not very “green” in 2012. The New York Times reported that Microsoft deliberately wasted electricity at its Quincy, Washington, data center to avoid penalties for under-consumption of electricity.

    Microsoft’s contract with the local utility included a penalty clause for under-consumption of power, and the company was facing a $210,000 fine. According to the report, it burned through $70,000 worth of electricity in three days to run up its usage, after which the fine was reduced to $60,000. But don’t get out the pitchforks: Microsoft has committed to 100 percent renewable energy for its data centers and is doing many interesting things to reduce its carbon footprint.

    Earlier this year, DCK looked at the strangest data center outages of all time. What strange, offbeat story do you remember?

