Data Center Knowledge | News and analysis for the data center industry
Tuesday, July 14th, 2015
12:59a
Amazon Invests in Wind Energy for East Coast Data Center

Amazon Web Services announced another investment in renewable-energy generation, reportedly its biggest one to date, but did not disclose the size of the investment. The future wind farm in North Carolina will pump clean energy into the local utility grid, the same grid that feeds the Amazon data centers in the state that support its cloud-services business.
It has become common practice for operators of data centers at Amazon’s scale to invest in renewable energy projects on the utility grids that supply their facilities. Since environmentalists started drawing public attention to the fact that the ubiquitous “cloud” is powered mostly by coal, Facebook, Microsoft, Apple, and now Amazon have followed Google’s lead in making huge investments in clean-energy supply.
Most of these investments have been in the form of long-term power purchase agreements, or PPAs. A company commits to buying all the energy a future project is expected to generate for 10 or more years, which provides the funding needed to develop it.
The wind or solar farms usually don’t feed the data centers directly. Instead, the company continues to buy power for the data center from the grid, but sells the renewable energy on the wholesale market while keeping the renewable energy credits and applying them to the power consumed by the data center.
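For readers unfamiliar with how that accounting nets out, here is a minimal sketch of the arithmetic. Both figures are hypothetical and purely illustrative; they are not Amazon’s numbers:

    # REC accounting sketch; all figures are illustrative, not Amazon's.
    wind_farm_output_mwh = 600_000    # annual generation sold on the wholesale market
    data_center_usage_mwh = 500_000   # annual grid consumption of the data centers

    # One renewable energy credit (REC) is typically issued per MWh generated.
    recs_earned = wind_farm_output_mwh

    # RECs applied against consumption determine the claimed renewable share.
    covered_mwh = min(recs_earned, data_center_usage_mwh)
    renewable_share = covered_mwh / data_center_usage_mwh
    print(f"Renewable share claimed: {renewable_share:.0%}")  # 100% in this example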
It’s unclear whether Amazon is using this scheme for the most recent investment. A company spokesman did not respond to a request for comment in time for publication.
Greenpeace, which has been pressuring the world’s biggest data center operators to push utilities to increase the renewable share of their generation mix, welcomed the announcement but said Amazon also has to become more transparent about the energy use of its data centers.
“Will the power from this North Carolina wind farm be delivered to the utilities that provide electricity to Amazon’s data centers in Virginia?” Greenpeace spokesman David Pomerantz asked in a statement. “Without an answer, AWS customers cannot be certain that the wind energy is displacing the gas, coal, and nuclear energy powering those data centers.
“More information is needed especially because Amazon’s main utility provider in Virginia, Dominion, is pursuing expansions of gas and nuclear power plants, justified by the growth of data centers like Amazon’s.”
The future 208-MW Amazon Wind Farm US East in North Carolina’s Perquimans and Pasquotank counties is expected to generate 670,000 MWh annually, once it comes online toward the end of next year.
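Those two figures imply a capacity factor of roughly 37 percent, typical for onshore wind. A quick back-of-the-envelope check (the term and the math are ours, not part of the announcement):

    # Sanity check: implied capacity factor of the 208-MW project.
    nameplate_mw = 208
    expected_output_mwh = 670_000
    hours_per_year = 8_760

    theoretical_max_mwh = nameplate_mw * hours_per_year  # ~1.82 million MWh
    capacity_factor = expected_output_mwh / theoretical_max_mwh
    print(f"Implied capacity factor: {capacity_factor:.0%}")  # ~37%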
Amazon has a goal of eventually powering its operations with 100 percent renewable energy. The company said it was at 25 percent in April and expected the North Carolina wind farm to bring it to 40 percent.
The project brings the total annual renewable-energy generation Amazon has invested in to about 1.3 million MWh in the central and eastern US.
It announced a wind farm in Indiana in January and a solar farm in Virginia in June. In May, Amazon said it would use new Tesla batteries designed for critical infrastructure in a pilot project at a West Coast data center.

12:00p
Equinix in Aggressive Pursuit of SaaS Provider Partnerships

Now that private network links to nearly all major Infrastructure-as-a-Service providers are available at its data centers around the world, Equinix is aggressively pursuing similar partnerships with big Software-as-a-Service players.
Expect to see a lot more private-connectivity services to SaaS companies through Equinix in the coming months, Chris Sharp, the data center service giant’s VP of cloud innovation, said. The company is in talks with every major provider of cloud-based customer relationship management services and productivity suites, for example.
“Pretty much everything is on the table,” Sharp said.
Enabling customers to connect to public cloud services without going through the public internet has been one of the biggest growth drivers for Equinix, the world’s largest data center provider. For many security- and performance-conscious enterprise customers, this is the only way to use public cloud.
Customers can already access IaaS offerings by the likes of Amazon Web Services, Microsoft Azure, Google Cloud Platform, and IBM SoftLayer directly from a number of Equinix colocation facilities. SaaS partnerships are the next step, driven by demand from enterprise CIOs and CTOs, who want the benefits of SaaS but don’t want to rely on the internet to use those services, Sharp said.
Equinix already provides direct links to Office 365. Announced in May, that service was the colo company’s first direct-connectivity offering for SaaS. The company is now eyeing a similar deal with Google for its Google for Work suite, Sharp said.
He did not name any CRM providers Equinix is talking to, saying only that they would be “some of the larger CRMs.” The largest CRM providers by market share are Salesforce (18 percent), SAP (12 percent), Oracle (9 percent), Microsoft (6 percent), and IBM (4 percent), according to Gartner.
The number of Equinix customers using private connectivity to IaaS providers has been growing rapidly and driving a lot of new business for the company, Sharp said. It rolled out the service for Azure a little more than a year ago, and about 100 customers are using it today, he said.
The service has gone from being available in 10 markets initially to 16 markets today.
The newest customers include GE, HarperCollins, Hitachi Data Systems, and Red Lobster.
Equinix has seen the most rapid growth in direct connectivity to Azure and AWS, but Google’s IaaS has been picking up steam. “Amazon and Microsoft are definitely on a very solid growth trajectory,” Sharp said about the public cloud services consumed through Equinix data centers.
Access to cloud service providers has been “changing the dialogue” with customers about colocation services, according to Sharp. It has become an essential attribute of a multitenant data center, on par with the fundamentals of space, power, and cost, he said.
“Future-proofing” a data center deployment has traditionally meant having enough space and power capacity to expand infrastructure as demand grows. Today, it often means being able to pull as many applications “out of the basement” as possible, Sharp said: moving applications out of corporate on-premises data centers and consuming them as services.

2:45p
Digital Realty Acquiring Telx In $1.9B Deal

Digital Realty Trust announced it will acquire Telx for $1.9 billion. The combination is expected to double Digital Realty’s colocation footprint and give Digital Realty customers access to Telx’s interconnection platform.
Fitch Ratings, one of the three major credit rating agencies, accidentally leaked Digital Realty Trust’s $1.9 billion acquisition of Telx Monday morning. While the deal had been rumored for a few months, neither company had formally announced the transaction at the time. This morning, Digital Realty issued an official release announcing the deal.
Digital Realty Trust is a Real Estate Investment Trust (REIT) predominantly involved in large wholesale data center deals while Telx has made its name in retail colocation and interconnection. The deal greatly increases Digital’s play in retail colocation, with Telx’s strength in interconnection a large part of the appeal.
Telx manages over 1.3 million square feet of data center space across 20 facilities, two of which it owns. The company already leases around a dozen of its facilities from Digital Realty, with six facilities leased from third parties.
On Monday morning, Fitch Ratings put out an opinion affirming Digital Realty’s bond ratings in light of the takeover. Other details provided by Fitch included news that Digital Realty will issue $700 million of equity and $1.2 billion in debt to pay for Telx. The opinion was quickly retracted, reported Forbes, but nothing ever really leaves the Internet.
A report earlier this month by Reuters citing anonymous sources pegged the potential deal at more than $2 billion.
Consolidation is under way in the data center space, with the top three providers all active in M&A. Equinix announced a plan to acquire TelecityGroup in Europe in May, breaking up a planned merger between Telecity and another European provider, Interxion. Digital Realty’s acquisition of Telx means it now competes more directly with Equinix.
Also in May, Digital’s rival QTS acquired Carpathia Hosting, a provider that does a lot of business with US government agencies. In April, CyrusOne, another rival, acquired Cervalis for $400 million, expanding its presence in the New York market as well as its financial services customer base.
New York-based Telx is a private company owned by private-equity firms ABRY Partners and Berkshire Partners. The acquisition fits strategically with Digital’s recently renewed focus on services beyond its core wholesale data center business. However, Digital and Telx are different businesses at their cores.
“This transformative transaction is consistent with our strategy of sourcing strategic and complementary assets to strengthen and diversify Digital Realty’s data center portfolio and expand our product mix,” said Digital Realty chief executive officer A. William Stein in a press release.
Synergy Research recently discussed the potential acquisition, touching upon the different nature of each company’s business. “While there will always be something of a blurry demarcation line between retail and wholesale, at their heart, these are two market segments with differing characteristics which have different business metrics and require somewhat differing skill sets,” said Synergy chief analyst and research director John Dinsdale earlier this month.

3:00p
The Database Dating Game

Ken Krupa is the Enterprise CTO for MarkLogic.
Looking for the perfect data match? It depends on your likes and dislikes, of course, as well as your data needs. Data dating today is much more complicated than it was when there were just two options: mainframes and relational databases. And in this era, the stakes are higher.
It’s always been true that data is only as valuable as what you can learn from it. The problem today is its rapid growth: 90 percent of the world’s data was created between 2011 and 2013, and that was two years ago. Companies are still trying to rein in this ever-increasing data sprawl, let alone analyze it. And once they have captured this data, organizations need to manage increasingly varied data types like video and text.
Data is the new competitive weapon for companies today, but to make the most out of it, organizations need a powerful partner to meet their unique and individual needs. Numerous available options make this complicated, so take this database dating quiz to find your optimal data partner:
Potential Match #1: Relational Database
A relational database is tried and true, possibly loyal to a fault and steady as a drumbeat…mostly. It may become a hard habit to break and eventually cost you mentally, physically, and fiscally. Here are some character traits of relational databases to help you decide whether this match is the best fit for your organization.
- Plays matchmaker to tables, connecting them for more efficient and fast searching.
- Is the bouncer at the bar, the macho type that typically offers among the strongest levels of security.
- Very neat and ordered as all data fits neatly into rows and columns.
- Like a warm security blanket, this match is experienced and comforting.
- You have money because this is an expensive date.
- Lastly, you may already be invested in this long-term relationship and hesitant to venture into the dating data game.
If all of the above is true, then the relational database is for you.
Potential Match #2: Open Source NoSQL
Do you like peace, love and openness? A free, idea-sharing architecture where anyone and everyone can contribute to the greater good? Then take a look at some typical traits of open source NoSQL databases:
- Affordable scalability. Most open source NoSQL databases run on commodity hardware that can easily be added to increase resources and reduce the load.
- Schema-agnostic for a very open and flexible relationship (see the sketch after this list).
- Cheap date: features such as automatic repair and simple data models keep administration and tuning requirements low.
- Hit a bump in your relationship? There is no specific IT therapist: You need to look to the community for help.
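To make the schema-agnostic trait concrete, here is a minimal sketch contrasting the two styles. It uses Python’s built-in sqlite3 for the relational side and a plain list of dicts to stand in for a document store; it is not any particular NoSQL product’s API:

    import sqlite3

    # Relational: the schema is declared up front, and every row must fit it.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE users (name TEXT, email TEXT)")
    db.execute("INSERT INTO users VALUES (?, ?)", ("Ada", "ada@example.com"))
    # A new attribute requires a schema migration first:
    db.execute("ALTER TABLE users ADD COLUMN twitter TEXT")

    # Schema-agnostic: each document carries its own shape, so a new
    # field needs no migration at all.
    documents = [
        {"name": "Ada", "email": "ada@example.com"},
        {"name": "Grace", "email": "grace@example.com", "twitter": "@grace"},
    ]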
Potential Match #3: Enterprise NoSQL
Are you looking for someone who is strong, smart, has a great memory, and is open to new ideas? And if you change your mind, this match is adaptable and flexible enough to work with your desires in seconds, if not in real time. This may be the perfect match.
- A focus on ACID transactions versus the Brewer’s CAP theorem trade-offs typical of NoSQL (see the sketch after this list).
- Application and business flexibility as data and models can be changed quickly and easily, with no disruption to the application.
- Low server cost and scalability as Enterprise NoSQL runs on commodity servers.
- Flexibility accompanied with enterprise-proven features like high availability, security and disaster recovery.
- Data-driven features such as search, semantics, and bitemporal data management.
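The ACID point is easiest to see in code. The sketch below uses Python’s built-in sqlite3 purely to illustrate what an atomic transaction guarantees; Enterprise NoSQL products expose the same all-or-nothing semantics through their own APIs:

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
    db.executemany("INSERT INTO accounts VALUES (?, ?)",
                   [("alice", 100), ("bob", 0)])

    try:
        with db:  # the with-block is one atomic transaction
            db.execute("UPDATE accounts SET balance = balance - 150 "
                       "WHERE name = 'alice'")
            (balance,) = db.execute("SELECT balance FROM accounts "
                                    "WHERE name = 'alice'").fetchone()
            if balance < 0:
                raise ValueError("overdraft")  # triggers a full rollback
            db.execute("UPDATE accounts SET balance = balance + 150 "
                       "WHERE name = 'bob'")
    except ValueError:
        pass

    # Atomicity: neither update took effect.
    print(db.execute("SELECT * FROM accounts ORDER BY name").fetchall())
    # -> [('alice', 100), ('bob', 0)]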
Finding the perfect data mate can be a confusing process with so many options out there. Take this test and find your match made in IT heaven.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

4:29p
CA Wants to Blend IT Management and Application Management

CA Technologies today released two updates to its IT management software portfolio, meant to further unify infrastructure and application management.
Release 10 of CA Application Performance Management software sports a revamped user interface and support for access rules based on roles inside the organization. The company also added support for topology maps that make it simpler to correlate role- and task-based views using attributes such as location, application type, business service, or owner name.
Kieran Taylor, senior director for solutions marketing for DevOps at CA, said the goal is to make it easy for IT organizations to visually identify the root cause of any particular problem or issue. To that end, release 10 of CA APM also provides an ability to visually chart changes to application topology, status or other attributes alongside historical performance metrics.
“IT organizations are always looking for patient zero when something goes wrong,” Taylor said. “We can now make it easier to isolate any changes made to the environment.”
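Conceptually, that kind of root-cause isolation boils down to correlating recent changes with the moment a metric degraded. Here is a minimal illustration of the idea; it is our sketch of the concept, not CA APM’s actual API:

    # "Patient zero" sketch: find configuration changes made shortly
    # before a performance metric degraded. Data is invented for illustration.
    from datetime import datetime, timedelta

    changes = [
        {"component": "checkout-api", "change": "config update",
         "at": datetime(2015, 7, 14, 9, 41)},
        {"component": "inventory-db", "change": "index rebuilt",
         "at": datetime(2015, 7, 13, 22, 5)},
    ]

    degradation_started = datetime(2015, 7, 14, 9, 45)
    window = timedelta(minutes=30)

    suspects = [c for c in changes
                if degradation_started - window <= c["at"] <= degradation_started]
    for s in suspects:
        print(f'{s["component"]}: {s["change"]} at {s["at"]}')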
Version 8.3 of CA Unified Infrastructure Management, meanwhile, provides both tighter integration with CA APM along with additional support for Hadoop, Cassandra, and MongoDB platforms. Also included is integration with CA Network Flow Analysis software that makes it simpler to correlate events between server environments and the underlying network.
Taylor said the biggest challenge confronting most IT organizations today is finding a way to scale their expertise in a way that allows the organization to be more agile.
It’s clear that, thanks to convergence, IT management is getting more complicated across the data center. As a result, IT teams are flooded with false positives: alerts that don’t actually signify a real IT problem.
Faced with a steady stream of alerts, Taylor said, IT organizations clearly need a better way to identify the real signal amid all the IT noise. The latest updates from CA aim to solve that problem by making it possible to visually mine the data generated inside data center environments that today span everything from mobile applications to the mainframe.
What’s still unclear is the degree to which application and infrastructure management will ultimately merge.
There appears to be a growing requirement to correlate more data across applications and infrastructure. But many application management teams may not choose to standardize on management software from the same vendor as the IT infrastructure team.
In the absence of any real consensus, the next best thing is to make sharing data between management environments as clean and simple as possible.

4:30p
Top Tips for Preventing IT Heat and Cooling Disasters

With technology so closely tied to the entire business process, IT outages can be extremely costly. Many organizations today are looking for ways to create more resiliency within their own IT walls, especially when it comes to avoiding IT heat disasters. Many large data center environments already contain redundant systems and plenty of resiliency to combat outages caused by equipment overheating. Unfortunately, these approaches don’t fit small to mid-sized server rooms and network closets.
Without some kind of backup air cooling or conditioning unit, the “smaller guys” are almost certainly looking at some kind of outage. However, there’s good news.
In this whitepaper, you will learn how next-generation portable air conditioners are creating powerful backup systems for small to mid-size server environments. You now have the capability to deploy a flexible air-cooling unit at the exact time and place it’s needed. This technology really takes the uncertainty out of having only one primary cooling system by allowing your smaller environment to create true cooling redundancy.
As the paper outlines, there are important considerations when working with a portable air cooling unit. Chief among them: whether to buy or rent. Your decision will depend on your use cases and server room requirements. For example, if a heat wave is likely in your area, it might make sense to buy a unit in case rentals become scarce. The selection process itself is also critical; you don’t want to be “stuck with what’s left” when you need a portable AC unit during an IT heat disaster.
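Whichever route you choose, the unit has to be sized to the room. A common rule of thumb converts the IT electrical load to BTU/hr (1 W ≈ 3.412 BTU/hr) and adds headroom for other heat sources; the sketch below uses illustrative numbers, not figures from the whitepaper:

    # Rough portable-AC sizing sketch; figures are illustrative.
    WATTS_TO_BTU_PER_HR = 3.412

    it_load_watts = 5_000    # hypothetical server-room IT load
    overhead_factor = 1.3    # allowance for lighting, people, solar gain, etc.

    required_btu_hr = it_load_watts * WATTS_TO_BTU_PER_HR * overhead_factor
    print(f"Look for a unit rated around {required_btu_hr:,.0f} BTU/hr")
    # ~22,000 BTU/hr in this example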
Download this white paper today to learn more about preventing IT heat disasters and the critical considerations around what you should look for when selecting a portable air cooling system.

5:05p
Suse Throws Enterprise Linux Weight Behind 64-bit ARM Servers

Now that 64-bit ARM processors are starting to generate some interest inside the data center, providers of Linux distributions like Suse are starting to get in line.
Today Suse announced that version 12 of Suse Enterprise Linux will be supported on 64-bit ARM server processors from AMD, AppliedMicro, and Cavium that power servers from Dell, HP, Huawei, and SoftIron.
Gerald Pfeifer, senior director of product management and operations for Suse, said that while ARM server chips are not set to usurp Intel’s x86 architecture inside the data center any time soon, Suse is starting to see enough momentum to warrant providing official support.
“ARM isn’t ready to rule the world,” said Pfeifer. “But interest is very high.”
Much of that interest is driven both by the improving performance of 64-bit ARM processors and by the fact that they require much less power than rival processor architectures. As such, Pfeifer said, Suse views support for 64-bit ARM processors as an opportunity to expand the base of IT organizations running Suse Enterprise Linux across multiple processor platforms.
Version 12 of Suse Enterprise Linux already runs on 64-bit x86 servers, IBM Power Systems, and IBM System z mainframes.
As part of that strategy, Suse has added ARM and AArch64 support to its openSUSE Build Service, which makes it possible to build packages against real 64-bit ARM hardware running SUSE Linux Enterprise 12 binaries. The goal is to reduce the time required to build, test, and ship products based on 64-bit ARM architectures.
Initially, 64-bit ARM server processors are expected to show up in appliances and in hyperscale computing environments. Those are market segments every processor manufacturer covets, which means support from Linux distributors is critical to the long-term viability of any processor technology.
Another critical factor is the degree to which the cost of supporting a new processor architecture is offset by actual power savings. The cost of power in Europe, for example, is a major issue; but in parts of the world where power is relatively cheap, power costs outside of hyperscale cloud environments are not nearly as big a motivator to consider alternative processor architectures.
In fact, with the advent of Big Data applications and containers, the next big fight over processors in the data center may come down to the amount of memory made available rather than the speed of the processor or the amount of energy it consumes.
Naturally, it will be several years before this latest round in the processor wars plays out. Server vendors clearly have a vested interest in using ARM to counter Intel’s dominance in the data center. But even if ARM servers were widely adopted inside the data center starting tomorrow, ARM couldn’t hope to account for half of the server install base before the next decade.

5:19p
Web Hosts GoDaddy and Papaki Promise No Interruptions for Greek Customers 
This article originally appeared at The WHIR
Facing growing pressure from financiers, Greece is constricted by a government-imposed bank holiday and capital controls that prevent Greeks from accessing many necessary goods and services, including foreign cloud services.
The bank holiday has been continually extended as talks between Greece and its creditors continue. On Saturday, Economy Minister George Stathakis told media that even if banks reopen capital controls will remain in place for at least another two months.
For two weeks, capital controls have imposed a €60-per-day cash limit on ATM withdrawals, blocked credit card payments to foreign vendors, and prohibited money transfers outside Greece without approval from a Ministry of Finance commission. All of this has made it extremely difficult for companies to pay for cloud services and web hosting from foreign providers.
GoDaddy, a large international web host and the world’s largest domain registrar, has been manually renewing expiring subscriptions for its Greek web hosting customers free of charge, though only subscriptions set to auto-renew will be extended.
Given that more than half of companies in Greece are “micro-enterprises” with fewer than 10 employees, many Greek businesses are likely to use services like GoDaddy for their domains and web hosting.
“A website and digital presence is not just a prerequisite for doing business. For an SMB, it’s also a criteria for survival,” GoDaddy EMEA VP Stefano Maruzzi told The WHIR. “We simply can’t accept that our customers in Greece will cease to exist as a consequence of the banking system shutdown. Helping our customers in any way, particularly in these tough circumstances is a moral obligation. Every subscription with auto-renewal will be extended to ensure their online presence will continue, supporting their businesses and families.”
Papaki, a major Greek web hosting provider, has also promised that its services will not be interrupted by the capital controls. All Papaki hosting services and email addresses that expire during a bank holiday are automatically extended. Customers can still pay for services via credit or debit card (including Visa, Mastercard, and Euroline), existing Papaki account credits, their remaining PayPal cash balance, or a PaySafe card purchased before capital controls were imposed.
Meanwhile, individuals with .gr domain names should not be worried, since the Hellenic Telecommunications and Post Commission (known as EETT) has announced that .gr domains will not expire during the bank holiday period.
Consumer-oriented cloud service providers have also reacted to the capital controls. For instance, Apple is giving Greek iCloud users 30 days of free service, and Dropbox is offering free service until the situation is resolved.
Tech.eu is maintaining a crowdsourced list of how cloud service providers are dealing with capital controls. This ranges from offering free services, to providing a grace period (Mediatemple), to payment extensions on request (Microsoft Azure), to continuing billing but not suspending customers (Digital Ocean), to providing no special treatment to Greek customers (Adobe).
This first ran at http://www.thewhir.com/web-hosting-news/web-hosts-godaddy-and-papaki-promise-no-interruptions-for-greek-customers

6:00p
Rackspace Launches Fanatical Support for Microsoft Azure 
This article originally appeared at The WHIR
Rackspace on Monday launched Fanatical Support for Microsoft Azure, a managed cloud offering that provides customers with application and infrastructure guidance for Azure cloud environments. The launch coincides with Microsoft’s Worldwide Partner Conference in Orlando this week.
The offering is generally available to US-based customers initially and will be available for international customers in early 2016.
The announcement comes as Rackspace has been named Microsoft’s 2015 Hosting Partner of the Year for the fifth time, recognition it considers directly related to its commitment to Fanatical Support. The company has hundreds of employees trained as Microsoft Certified Professionals. Recently, Rackspace introduced Fanatical Support for Office 365.
Rackspace’s Fanatical Support for Azure will include managed services and expertise available 24/7/365 from Microsoft certified employees, proactive monitoring of customer environments, on-demand access to database specialists, and guidance around architecting applications and optimizing databases.
Customers can access Fanatical Support for Azure in one of two ways, Rackspace says: Fanatical Support with Azure Bundle (Infrastructure and Support), or Fanatical Support for Azure (Support Only), designed for customers who already use Azure infrastructure and want support from Rackspace.
“Fanatical Support for Azure and Azure Stack adds Rackspace’s industry-leading support to Microsoft’s deep experience with the hybrid cloud, creating a win-win for customers,” Microsoft Cloud Enterprise group EVP Scott Guthrie said in a statement. “With this relationship, our mutual customers will have even more options for migrating their diverse IT workloads to the cloud.”
Earlier this year, Datapipe extended support to Microsoft Azure public cloud, combining its managed services and tools.
This first ran at http://www.thewhir.com/web-hosting-news/rackspace-launches-fanatical-support-for-microsoft-azure