Data Center Knowledge | News and analysis for the data center industry - Industry's Journal
[Most Recent Entries]
[Calendar View]
Friday, November 21st, 2014
Time |
Event |
1:00p |
Telx to Add Capacity in Key New York Carrier Hotels There may be a glut of data center space in New Jersey, but the market across the Hudson is a different animal. As is customary, quality data center space in Manhattan is hard to come by and costs a lot of money.
Telx, a key player in the New York data center market, is getting ready to loosen the supply by bringing new capacity online in two of the three buildings it operates in, Telx CEO Chris Downie said.
The buildings – 32 Avenue of the Americas and 60 Hudson Street – are both key connectivity hubs not just for the U.S. but for the global Internet. New York carrier hotels like these are the reason companies are willing to pay the premiums that come with using data center space in Manhattan when there is plenty of new space ready to go in New Jersey.
“New York City – in the buildings that we run – hasn’t had that amount of available capacity in quite some time,” Downie said. “It’s a great time to be bringing on that capacity, because the demand on the island has to go somewhere.”
Telx is going to launch 15,000 square feet of data center space at 32 Avenue of the Americas and 17,000 square feet at 60 Hudson. The company is leasing one of the four floors at 60 Hudson operated by wholesale data center provider DataGryd.
Google Offices a Problem for the Market
The company also has eight suites at 111 8th Avenue, another major carrier hotel owned by Google. Those suites are filling up, Downie said.
The building on 8th Avenue has been a problem for the New York data center market ever since Google bought it in 2010. The company bought it primarily for office space, but it is home to a lot of data centers because of the high density of network carriers that interconnect there.
We’ve heard from multiple sources that Google has pretty much taken the building off the data center market. The new owner has not been letting data center providers renew their leases there once they expire.
One well-known casualty is Internap. The data center provider’s lease in the building runs out at the end of this year, and it has been moving its colocation operations to a new facility in Secaucus, New Jersey.
Downie said Telx’s presence at 111 is safe. He declined to say when its lease there runs out, but said it would be several decades before that happens.
What About Sabey?
The Telx CEO isn’t worried about competition from Sabey Data Center Properties’ skyscraper in Lower Manhattan, which already has a lot of space built out and a lot more ready for redevelopment.
“Sabey has space, but I would say there’s a different value proposition between our environments and Sabey,” he said. “It’s intended more for wholesale demand in the marketplace.”
Downie doesn’t deny that Sabey – which also happens to be Telx’s landlord in the Seattle market – may offer retail colocation too, but the former Verizon building doesn’t yet have the level of network density Telx offers at its New York facilities.
Rich connectivity is the reason data center space on the island is in demand. “Because why else would you position yourself in New York? It’s not an inexpensive place to do business,” he said.
Massive New Jersey Assets
To be sure, Telx has a huge play in the New Jersey market as well. The company has three buildings there, the newest of them the largest data center investment it has ever made, according to Downie.
Finished last year, NJR3 in Clifton was the first data center the provider built from the ground up. At 350,000 square feet total, it provides 100,000 square feet of sellable data center space.
Across its six facilities in the New York metro, Telx has about 750,000 square feet of inventory total and the ability to serve just about any data center need anybody may have.
No Geographic Expansion Plans
Downie has no plans to enter any new markets in the near future. The company has been taking space in major hubs like New York and San Francisco as it becomes available.
About 18 months ago it entered Silicon Valley with 55,000 square feet of sellable space at the Vantage campus in Santa Clara.
The company also has 55,000 square feet in Seattle and 10,000 square feet in Portland – a key landing point for submarine cables that connect the U.S. and Asia. The Portland site is important strategically, because Downie expects demand for overseas connectivity with Asia to grow in the near future. “We want to be part of that,” he said.
For now, the strategy is to focus on core markets – New York, Chicago, San Francisco, Dallas, and Atlanta. “There’s a lot going on in those markets, so that’s where you’re going to see us focused,” Downie said. | 4:00p |
Vantage Completes 4.5MW Data Center Expansion in Quincy Vantage Data Centers has completed the second phase of its Quincy, Washington, data center. An additional 4.5 megawatts brings the building’s total capacity to 6 megawatts. The build was accelerated to meet a Fortune 100 customer’s requirement, the company said without disclosing the customer’s name.
The data center campus is located in the heart of the Columbia River Valley and has access to cheap hydropower. The company’s long-term plan calls for four data center buildings totaling 560,000 square feet and 55 megawatts of critical load.
Vantage is working with the Uptime Institute on a possible Tier III Certification and anticipates receiving LEED Platinum, as it did in Santa Clara, California.
The data center has newly-built mechanical and electrical infrastructure. It has a custom-developed indirect evaporative cooling system designed to eliminate impact from outdoor conditions through a closed-loop delivery infrastructure.
The data center also has EPA Tier 4 generators that reduce emissions by 90-95 percent as compared to traditional generator deployments. The EPA’s Tier 4 standard is designed to reduce hazardous emissions from backup generators.
Most data center generators are exempt from Tier 4 standards through an exemption for gensets used for emergency backup. A few years ago, Quincy residents expressed concern about the number of generators resulting from a booming data center industry. Vantage went the extra mile.
Vantage launched in 2010 with backing from Silver Lake Partners. Silver Lake’s Sureel Choksi was named CEO in 2013, replacing the founding CEO Jim Trout.
The company’s first data center was in Santa Clara. The first facility in Quincy was announced in 2011 and opened in 2013. It has around 60,000 square feet of raised floor inside of a 133,000 square foot building.
“Our location in Quincy benefits from access to abundant, low-cost hydropower as well as significant tax incentives, driving industry-leading TCO for customers,” Choksi said in a statement. “We are pleased to support our customer’s faster-than-anticipated growth and look forward to building additional data centers on our Quincy campus to support more customers.”
With its low-cost hydro power, Quincy has been an attractive market for companies with web-scale operations, including Microsoft, Yahoo, and Dell.
In April Vantage boosted its credit line to $275 million in support of expansion. | 4:30p |
Friday Funny: Pick the Best Caption for Pies Kip and Gary have never let work get in the way of a good celebration! Join in on the Thanksgiving fun with this week’s Friday Funny.
Here’s how it works: Diane Alber, the Arizona artist who created Kip and Gary, creates a cartoon and we challenge our readers to submit a humorous and clever caption that fits the comedic situation. Then we ask our readers to vote for the best submission and the winner receives a signed print of the cartoon.
Several great submissions came in for last week’s cartoon – now all we need is a winner. Help us out by submitting your vote below!
Take Our Poll
For previous cartoons on DCK, see our Humor Channel. And for more of Diane’s work, visit Kip and Gary’s website!
| 5:49p |
SC14: DataDirect Previews IO Accelerator for Supercomputers At this week’s SC14 supercomputing conference in New Orleans, DataDirect Networks announced performance tests and customer adoption of its Infinite Memory Engine, an application-aware acceleration engine and buffer cache.
Jeff Sisilli, senior director of product marketing at DDN, said the software drastically accelerates IO in high performance computing environments by virtualizing commodity SSDs into a single pool of non-volatile memory-based fast data storage. This resource pool then sits between compute clusters and the parallel file system.
This software solution from DDN — known primarily as a hardware company — was developed over several years and directly targets improvements in checkpointing and IO-intensive HPC application performance. The company says that placing IME between compute and the parallel file system removes the POSIX semantics that bring cluster performance to its knees amid the proliferation of small and misaligned IO, which can often make up 90 percent of an HPC workload. By shielding parallel file systems from fragmented IO in cache, DDN says, users can run jobs at or near line rate, with more jobs in parallel, resulting in faster time to insight.
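DDN has not published IME’s internals, but the general burst-buffer idea it describes — absorb many small, misaligned writes in fast memory, then drain them to the parallel file system as a few large, aligned writes — can be sketched in miniature. Everything below (names, the 1 MiB stripe size, the dict-backed store) is illustrative, not DDN’s actual design:

```python
# Toy sketch of the burst-buffer idea behind IO accelerators like IME:
# absorb many small, fragmented writes in fast memory, then flush them
# to the backing (parallel) file system as large, aligned stripes.
# All names and sizes here are illustrative assumptions.

STRIPE = 1 << 20  # flush granularity: 1 MiB "aligned" stripes

class BurstBuffer:
    def __init__(self, backing):
        self.backing = backing   # stand-in for the parallel FS: offset -> bytes
        self.pending = {}        # stripe base offset -> bytearray being filled

    def write(self, offset, data):
        """Absorb a small write without touching the backing store."""
        for i, b in enumerate(data):
            pos = offset + i
            base = (pos // STRIPE) * STRIPE
            buf = self.pending.setdefault(base, bytearray(STRIPE))
            buf[pos - base] = b

    def flush(self):
        """Drain pending stripes as a few large, aligned writes."""
        for base, buf in sorted(self.pending.items()):
            self.backing[base] = bytes(buf)
        n = len(self.pending)
        self.pending.clear()
        return n                 # number of large writes actually issued

backing = {}
bb = BurstBuffer(backing)
for off in range(0, 4096, 64):   # 64 tiny fragmented writes...
    bb.write(off, b"x" * 64)
print(bb.flush())                # ...reach the backing store as 1 aligned write
```

The point of the sketch is the ratio: 64 small client writes become a single large write against the backing store, which is the kind of IO shape parallel file systems handle well.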
While it won’t be released until 2015, DDN has installed preview versions of its IME technology at top supercomputer sites where high bandwidth and high IOPS workloads were required.
Through a collaboration with Intel, DDN also launched the latest generation of its EXAScaler Lustre appliance. Built on its Storage Fusion Architecture (SFA), DDN added enhancements to the new EXAScaler for improving performance and simplifying deployment. DDN says this new release delivers over 4.8 petabytes of usable storage, 100 Mbps sustained scalable per-drive performance, up to 40 Gbps sustained throughput and up to 1.5 million IOPS. All of this is in a single rack and powered by Intel’s Enterprise Edition for Lustre.
Michael Vildibill, vice president of product management at DDN, said, “A massively scalable, efficient appliance designed for the HPC and Big Data markets, EXAScaler delivers extreme file system performance to help simplify and reduce the cost of building petascale computing solutions for data hungry applications across sectors including scientific research, trading simulation, climate modeling, and energy exploration.” | 6:52p |
Future Facilities Brings Data Center Modeling to Non-PhDs Future Facilities has launched a version of its data center modeling software with a new user interface that aims to take the “dark art” aspect out of computational fluid dynamics.
Called 6SigmaDCX, the software takes the user through the model building process in logical steps, simplifies access to objects in the model, has tool icons with detailed tool tips, drag-and-drop features, automatic object placement suggestions, and simplified presentation of object properties.
CFD modeling is useful in data center design and management because it helps predict what will happen if a change is introduced to a complex environment where lots of elements depend on and affect each other. It helps answer questions like: how will temperatures in different parts of a data hall be affected if a certain blade chassis is added to a rack?
A Future Facilities 3D model shows the entire data hall layout, including IT racks, air handlers, and power distribution units, and visualizes temperature levels around the room, making it easy to adjust the layout and cooling capacity to avoid hot spots or overcooling.
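Full CFD resolves where the hot air actually goes, but the first-order intuition behind such a model is a simple energy balance: the bulk temperature rise across an IT load is ΔT = P / (ṁ·cp), with ṁ the mass flow of cooling air. A minimal sketch, with all numbers illustrative rather than taken from any Future Facilities model:

```python
# First-order estimate behind thermal modeling: the bulk temperature rise
# across an IT load is an energy balance, delta_T = P / (m_dot * c_p).
# CFD refines this by resolving airflow paths; this is only the bulk number.
# All values below are illustrative assumptions.

RHO_AIR = 1.2      # kg/m^3, air density near sea level
CP_AIR = 1005.0    # J/(kg*K), specific heat of air

def exhaust_temp_rise(power_w, airflow_m3_per_s):
    """Bulk air temperature rise across a rack or chassis, in Kelvin
    (equivalently, degrees Celsius of delta)."""
    m_dot = RHO_AIR * airflow_m3_per_s   # mass flow of air, kg/s
    return power_w / (m_dot * CP_AIR)

# Adding a hypothetical 5 kW blade chassis served by 0.4 m^3/s of supply air:
print(round(exhaust_temp_rise(5000, 0.4), 1))  # ~10.4 C rise
```

A back-of-envelope number like this says whether a new chassis is even plausible for the available airflow; the CFD model then shows whether that exhaust heat recirculates into neighboring racks.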
6SigmaDCX is the new generation of the company’s 6SigmaDC software. With the new user interface, the company hopes to make CFD modeling easier for users who aren’t steeped in thermodynamics.
The data center modeling software has won accolades from several data center engineering experts, including Vartan Moskifian, critical facilities consultant at HP, Jose Ruiz, director of engineering at Compass Datacenters, and Stuart Hall, sales engineer at Digital Realty Trust.
In a statement, Mark Fenton, product manager at Future Facilities, said, “At a time when data center technology is changing faster than ever, giving more people access to CFD will drive cooling innovation and improve resilience, capacity utilization, and efficiency.” | 7:30p |
Teradata Hadoop Integration Advances With MapR Deal Teradata is primarily known as a data warehouse company, but it is looking to change that through its Unified Data Architecture and high-profile partnerships.
One of those partnerships is with MapR, an Apache Hadoop distribution provider. The two companies are working closely on Teradata Hadoop integration by bridging MapR’s distribution and UDA. Road maps are being aligned, and the two are working on a unified go-to-market offering.
For Teradata, the partnership is about offering customers choice and simplicity when it comes to Hadoop. To tighten Teradata Hadoop integration, the company has already partnered with several major distros, including a deal with Cloudera and another with Hortonworks (which recently filed for an IPO), so MapR is the latest addition to the list of heavyweights in the space.
The expanded partnership makes sense for MapR because it extends how customers can use it and taps into Teradata’s customer base.
The Hadoop space is of great interest to all those in the big data world and enterprise end users. Google Capital recently led a $110 million round for MapR. The company said its Q1 bookings were triple what they were in the first quarter of last year.
Teradata will resell MapR software, professional services and customer support, acting as point of contact for customers using both offerings. Teradata will also provide MapR education and training services.
UDA is a big part of Teradata’s plans. However, the company isn’t solely partnering. It continues to build out functionality into its platform, particularly around orchestration and analytics.
Most recently it introduced Connection Analytics, which is driven by data and able to perform against disparate data sets at massive scale. Connection Analytics is powered by another fairly recent addition, the Teradata Aster Discovery Platform, a tool that extends advanced analytics to business analysts for examining contextual relationships between people, products, and processes.
Orchestration capabilities Teradata QueryGrid and Teradata Loom will be integrated into MapR software. QueryGrid is a data fabric for transferring data between Teradata databases, Aster Discovery Platform, NoSQL databases, and other technologies.
“Customers who have invested in both MapR and Teradata solutions have requested integration, so now is the right time to expand our partnership,” Scott Gnau, president of Teradata Labs, said in a statement. “As customers continue to build out analytic architectures, they want flexibility and choice, and Teradata’s Unified Data Architecture is the most sophisticated and open big data ecosystem.” | 8:14p |
Telefónica Tapping Equinix Cloud Connectivity Telefónica is leveraging Equinix’s diverse cloud connectivity and offering it to customers. The big Spanish telecommunications company will provide enterprises with dedicated connectivity to multiple cloud service providers through the Equinix Cloud Exchange. The Cloud Exchange offers direct access to cloud providers and networks inside Equinix data centers.
Enterprises are moving to colocation, and a big reason is connectivity. Hooking up directly to cloud from a data center helps address data growth and reduce costs.
Telefónica instantly adds diverse cloud connectivity to its arsenal through its global IP MPLS network. In addition, the company is also providing secure access to cloud application providers.
“By connecting Telefónica IP MPLS services to Equinix’s Cloud Exchange, we are expanding our own cloud services offer giving our customers the ability to choose from the best combination of cloud infrastructure, platform and software service providers to ensure they can put the best possible hybrid cloud solution in place to meet their individual business needs,” said Jose Luis Gamo, CEO of the multinationals business unit of Telefónica Global Services.
Telefónica launched its own Infrastructure-as-a-Service solution in 2012 but recognizes that other clouds are a complement rather than competition. Telecoms are becoming increasingly neutral when it comes to connectivity because customers don’t like to be locked in, so the company is growing the diversity of cloud offerings available through its network.
“Carrier neutral” has become so commonplace that it’s almost assumed by default. Trying to lock customers into only your cloud or your network doesn’t keep them; it prompts them to find a provider with diverse options.
Verizon is also offering private IP service to cloud service providers out of Equinix, despite the fact that it has its own in-house cloud, formed through the acquisition of Terremark in 2011. Cloud pricing wars have suggested that cloud is becoming a commodity, but an increasing need to offer diverse options dispels this thinking.
Cloud is not one-size-fits-all, and multi-cloud usage is increasing. A RightScale survey last year and Equinix research earlier this month both point to the multi-cloud trend.
The announcement is also indicative of the power of ecosystems. Equinix recently noted that private links to cloud are now its fastest growing business segment. The company has turned itself into a major cloud hub, prompting companies like Telefónica to leverage this ecosystem.
Telecoms are also directly connecting to the big public clouds themselves to address these needs. | 9:00p |
Calligo Partners with Manx Telecom for Offshore Cloud Services 
This article originally appeared at The WHIR
Offshore cloud service provider Calligo announced on Thursday that it has partnered with Manx Telecom, a telecommunications company based in the Isle of Man, to provide pan-jurisdictional cloud services.
Calligo was founded in 2012 to bring virtualization and cloud services to offshore jurisdictions. Its CalligoCloud Network includes data centers in Jersey and Guernsey, two islands off the coast of Normandy, France.
In a statement, Calligo said the partnership will support the growing demand from developing markets for services such as offshore cryptocurrency. With privacy of paramount concern for companies hosting data in the cloud, offshore options could become more popular.
In a recent interview with the WHIR, Bahamas hosting provider Secure Hosting said that companies should consider carefully whether or not they want their data to be hosted in the US, and to do their due diligence.
“It was critically important for us that we partnered with an organization that shared our commitment to security and operational excellence,” Calligo chief commercial officer Andrew Wicks said. “Both organizations have world class in-house skills for cloud deployments, making the integration of our two clouds simple, straightforward and elegant. This partnership extends the reach of our offshore cloud, which can only be of benefit to existing and future clients.”
An existing Manx Telecom cloud client, Assured Communications, said that it is looking forward to benefiting from the pan-island services from Manx Telecom and Calligo.
“Since using Manx Telecom’s hosting and cloud services this year we have been really impressed with the level of service and the quality of the platform,” Nicholas Grove, head of Innovation and Strategic Development at Assured Communications said.
“We are now really keen to use the pan island services that Manx Telecom and Calligo have created through their partnership and subsequent cloud integration. We view this as a really strategic development for our offshore jurisdiction strategy and something that is way ahead of the other providers,” Grove said.
This article originally appeared at: http://www.thewhir.com/web-hosting-news/calligo-partners-manx-telecom-offshore-cloud-services | 9:30p |
DDoS Attacks of More than 10 Gbps Rise Significantly in Q3 
This article originally appeared at The WHIR
Large-scale DDoS attacks (defined as 10 Gbps and up) grew significantly in the third quarter of 2014, according to a report released Thursday by Verisign. Attacks of this nature represent more than 20 percent of all attacks from July to September 2014.
Excluding a couple of isolated second-quarter attacks in the 200-300 Gbps range, attack sizes in the third quarter of 2014 were 40 percent greater than in the second.
Although no specific industry was safe from DDoS attacks, media and entertainment continued to experience the largest volume of attacks. However, ecommerce was the target of the largest attack of the quarter, an incident of more than 90 Gbps: a pulsing UDP flood in bursts of 30 minutes or less.
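A pulsing flood like that one shows up in traffic telemetry as short windows of packet volume far above the baseline, separated by quiet periods. A minimal sketch of the rate-threshold idea — the function name, window model, and 5x multiplier are illustrative assumptions, not Verisign's actual detection method:

```python
# Minimal sketch of rate-based flood detection: flag time windows whose
# packet rate jumps far above the running baseline, which is how a
# "pulsing" UDP flood (bursts, then quiet) appears in telemetry.
# The multiplier and warmup values are illustrative assumptions.

def flag_bursts(pps_per_window, multiplier=5.0, warmup=3):
    """Return indices of windows whose packets-per-second rate exceeds
    `multiplier` times the average of previously seen normal windows."""
    flagged, baseline = [], []
    for i, pps in enumerate(pps_per_window):
        avg = sum(baseline) / len(baseline) if len(baseline) >= warmup else None
        if avg is not None and pps > multiplier * avg:
            flagged.append(i)      # burst: keep it out of the baseline
        else:
            baseline.append(pps)   # normal traffic updates the baseline
    return flagged

# Steady ~1,000 pps background with three flood pulses mixed in:
traffic = [1000, 1100, 900, 1050, 50000, 60000, 1000, 55000, 950]
print(flag_bursts(traffic))  # -> [4, 5, 7]
```

Real mitigation systems track many more signals (protocol mix, source spread, packet sizes), but the on/off pulse pattern is exactly what makes such attacks hard to filter with a static threshold alone.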
Hosting providers are also the target of recent attacks. Fasthosts customers experienced an outage this week due to a DDoS attack and Spark customers were down for an entire weekend after hackers used malware to gain access.
“This activity was aimed at disrupting the critical online commerce capability of the customer,” according to the report. “With the 2014 holiday season in full swing, the ecommerce and Financial industries must be particularly vigilant and prepared for increasing DDoS attacks during their peak revenue and customer interaction season. Historically, Verisign has seen an increase in DDoS activity against these verticals during the holiday season and anticipates that this trend will continue.”
Ecommerce is growing at an amazing rate, which explains why it is so attractive to criminals. For example, data released earlier this week shows the Indian ecommerce market is expected to grow from 35 million this year to 15 billion by 2016. The ecommerce market is bolstered by a strong increase in mobile payments, which are growing at a rate of 60 percent.
The report also noted that the number of attacks per customer was higher by 60 percent over the second quarter. Verisign speculated that this was due to “maturation of attackers, easier access to ready-made DDoS botnets and toolkits, and adversary observation of attack impact on their targets.” This trend is expected to continue as well.
The Shellshock vulnerability was used to deploy DDoS malware on Linux systems.
“The malware that leverages this vulnerability communicates with specific hard-coded C&C servers,” according to the report. Although the malware existed at least as early as Aug. 20, 2014, this was the first time Shellshock was utilized for attacks. In October, Shellshock was also used in malicious email campaigns. Verisign also discovered a variant of the DBOT backdoor used for DDoS attacks during this quarter.
This article originally appeared at The WHIR. |
|