Data Center Knowledge | News and analysis for the data center industry

Friday, December 21st, 2012

    2:02p
    Red Hat Buys ManageIQ in Hybrid Cloud Play

    Red Hat (RHT) announced that it has entered into a definitive agreement to acquire ManageIQ, a leading provider of enterprise cloud management and automation solutions that enable organizations to deploy, manage and optimize private clouds, virtualized infrastructures and virtual desktops.

    The approximately $104 million acquisition will allow Red Hat to expand the reach of its hybrid cloud management solution for enterprises. As an existing member of the Red Hat Enterprise Virtualization Certified Partner program, ManageIQ has worked closely with Red Hat to provide customers with unified monitoring, management and automation solutions. With ManageIQ, the Red Hat hybrid cloud management portfolio will include CloudForms, Red Hat Enterprise Virtualization, and ManageIQ’s hybrid cloud operations management tools.

    “Industry and customer response to Red Hat’s vision for the open hybrid cloud has been overwhelmingly positive because it offers the best of both worlds: the ability to tap into the public cloud when and where it makes sense, while leveraging existing investments for cloud infrastructure,” said Paul Cormier, president of Products and Technologies at Red Hat. “For enterprise cloud initiatives, effective cloud management is critical. ManageIQ offers robust features, including orchestration, policy, workflow, monitoring and chargeback, that deepen Red Hat’s cloud management capabilities and bring the promise of open hybrid cloud a step closer for the industry.”

    Third Quarter Earnings

    Red Hat also announced financial results for the third quarter of its fiscal year 2013. Total revenue was $343.6 million, an 18 percent increase over the same quarter last year; subscription revenue was $294.2 million, up 19 percent.

    “Strong execution, industry leading solutions and our ability to deliver a compelling ROI to our customers, all contributed to continued momentum and strong third quarter revenue growth in the face of a challenging global economic environment. Red Hat is benefiting from our position as a trusted vendor for IT,” stated Jim Whitehurst, President and Chief Executive Officer of Red Hat. “Since October of last year we have completed three acquisitions, and are announcing a fourth today to expand our portfolio of open source solutions and enlarge our addressable market. As our enterprise customers move to open, hybrid cloud architectures, we are addressing their needs with a clear roadmap based on industry-leading open source technologies.”

    2:08p
    IBM To Acquire StoredIQ for Big Data Capabilities

    IBM announced it has entered into a definitive agreement to acquire StoredIQ Inc., a privately held company based in Austin, Texas.

    Adding to IBM’s big data initiatives, StoredIQ’s capabilities will help clients respond more efficiently to litigation and regulations, dispose of information that has outlived its purpose, and lower data storage costs. The acquisition will add to IBM’s Information Lifecycle Governance portfolio and enhance its ability to govern the vast majority of data, including efficient electronic discovery and timely disposal, eliminating unnecessary data that consumes infrastructure and elevates risk.

    StoredIQ software provides scalable analysis and governance of disparate and distributed email as well as file shares and collaboration sites. This includes the ability to discover, analyze, monitor, retain, collect, de-duplicate and dispose of data. In addition, StoredIQ can rapidly analyze high volumes of unstructured data and automatically dispose of files and emails in compliance with regulatory requirements.
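
    To make the automated-disposal logic concrete, here is a minimal sketch of the kind of retention-policy check this class of governance software performs. The policy names, retention periods, and the disposition() helper are hypothetical illustrations, not StoredIQ’s actual configuration or API:

    ```python
    from datetime import datetime, timedelta

    # Hypothetical retention windows by document type -- illustrative only.
    RETENTION_POLICIES = {
        "email": timedelta(days=3 * 365),             # keep 3 years
        "financial_record": timedelta(days=7 * 365),  # keep 7 years
        "working_file": timedelta(days=365),          # keep 1 year
    }

    def disposition(doc_type, last_modified, on_legal_hold, now=None):
        """Return 'retain' or 'dispose' for a single document."""
        now = now or datetime.utcnow()
        if on_legal_hold:
            return "retain"   # litigation holds always override policy
        limit = RETENTION_POLICIES.get(doc_type)
        if limit is None:
            return "retain"   # unknown types are kept pending review
        return "dispose" if now - last_modified > limit else "retain"

    print(disposition("email", datetime(2008, 1, 15), on_legal_hold=False))
    # -> dispose (older than the assumed 3-year email window)
    ```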

    “CIOs and general counsels are overwhelmed by volumes of information that exceed their budgets and their capacity to meet legal requirements,” said Deidre Paknad, vice president of Information Lifecycle Governance at IBM. “With this acquisition, IBM adds to its unique strengths as a provider able to help CIOs and attorneys rapidly drive out excess information cost and mitigate legal risks while improving information utility for the business.”

    IBM intends to incorporate StoredIQ into its Software Group and its Information Lifecycle Governance business. Building on prior acquisitions of PSS Systems in 2010 and Vivisimo in 2012, IBM adds to its strength in rapid discovery, effective governance and timely disposal of data.

    2:44p
    Why Data Center Managers Shouldn’t Feel Threatened by Colocation

    Colocation has been a viable option for a long time, yet some data center managers fear that it can be a threat to their job security. I’d like to spend a few minutes listing some reasons why colocation should be considered a part of any healthy data center strategy for all enterprises.

    Colocation is not outsourcing. There are common elements, but colocation offers a more flexible approach, in my opinion. At a minimum, with colocation you lease space with power and HVAC. The equipment you use and how you architect it is up to you. That includes your control of lifecycle management. In a traditional outsourcing arrangement the assets are part of the deal and management of those assets is the responsibility of the outsourcing partner. Outsourcing is a valid option, but one that requires much more attention to the contract’s implications for the lifecycle of the IT infrastructure.

    Colocation is not an all-or-nothing decision. In reality, colocation is part of a complete data center strategy that might include outsourcing, cloud computing, disaster recovery and your own private facility. If you are just beginning to contemplate colocation, think of it as a relief valve for your data center needs. What will make a colocation project successful is having a clear line of sight to what you want to put in the facility and why. There is no one-size-fits-all approach, so understand your own enterprise security and data management issues and carve out something that works. Once you’ve built comfort with colocation, you can easily expand and diversify, two things that are harder to do in an outsourcing arrangement.

    Innovation is another great reason for colocation. Transitioning to a new compute platform in an existing data center usually requires a migration plan involving retrofits of existing space and reconfiguration of the legacy environment, so there is seldom a simple set of changes that will make the migration happen. With colocation, you can find a facility that matches your needs, build in the estimated capacity and start the application and data migration with fewer dimensions of change.

    The second innovation benefit of colocation is that you typically own the assets. They are not tied to an outsourcing contract or to a physical facility that you own. Upgrades and refresh/replacement can happen at your tempo and within your cost structure. If you had a long-term outsourcing commitment to a specific spend or number of units managed, you might be hindered in upgrading. The bottom line is that you have more flexibility to manage the ebb and flow of technology because you control the key elements that allow you to innovate.

    Owning a data center can also be a hindrance to innovation. Many of us have legacy infrastructure. If it is more than 10 years old, it probably needs serious upgrades to bring power and HVAC per square foot up to snuff, as well as to ensure that all safety and fire codes are met. Negotiating with another department to spend their budget dollars on your data center can be difficult. Even if you are successful, that upgrade will take time and be disruptive. You might even miss a full cycle of technology innovation while you wait. With colocation you can find a facility that is ready to meet current and future demands.

    To be fair, there are a number of items you need to be accountable for in a colocation project:

    1) Pay attention to what you are signing up for. Colocation providers come in all shapes and sizes.
    2) Think about how the provider will operate in a disaster.
    3) Plan for 3-5 years out, including at least one major technology refresh during that time.

    At the end of the day, the business is looking to you for uptime, innovation and cost management. Colocation may be just the right option.

    To get more useful enterprise class data center management strategies and insight from Nemertes Research download the Q3 Data Center Knowledge Guide to Enterprise Data Center Strategies, compliments of Vantage Data Centers.

    2:47p
    Facebook Tests Immersion Cooling

    Facebook engineers immerse one of the company’s servers in dielectric fluid in a recent test. (Photo: Facebook)

    Will submerged servers come to Facebook? The social network is the latest company to conduct tests in which servers are submerged in dielectric fluid. Facebook engineers Tin Tse and Veerendra Mulay put together an “infrastructure hack” to test an immersion cooling method in which they submerged one of the company’s recent server designs in mineral oil and turned it on.

    The liquid cooling test was included in a roundup of the Facebook engineering team’s favorite hacks from 2012. “The test was successful; we were able to run the server at temperatures above 110 degrees Fahrenheit,” the engineering team reported.

    There’s no indication that Facebook is ready to convert its data centers to immersion cooling, a technique that its advocates claim can produce large savings on infrastructure, allowing users to operate servers without a raised floor, computer room air conditioning (CRAC) units or chillers. But 2012 was a year of progress for “swimming servers” and other liquid cooling techniques. Here’s a recap:

    Intel Embraces Submerging Servers in Oil: Intel has just concluded a year-long test with immersion cooling equipment from Green Revolution Cooling, and affirmed that the technology is highly efficient and safe for servers. The testing, conducted at an Intel data center in New Mexico, may mark a turning point in market readiness for submerged servers, if recent experience with Intel’s embrace of emerging data center designs is any indication. “We continue to explore server designs, and we’re evaluating how (immersion cooling) can change the way data centers are designed and operated,” said Mike Patterson, senior power and thermal architect at Intel. “It’s obviously quite a change in mindset.”

    3M Demos New Immersion Cooling Technique: 3M demonstrated a data center cooling concept called “open bath immersion cooling,” which it says is simpler and less expensive to implement than other pumped liquid cooling techniques. The system is an example of passive two-phase cooling, which uses a boiling liquid to remove heat from a surface and then condenses the liquid for reuse, all without a pump. The servers are immersed in 3M’s Novec, a non-conductive chemical with a very low boiling point, which easily condenses from gas back to liquid. Each processor is capped with a copper plate coated with a material that enhances boiling, improving the efficiency of the heat transfer. The vapor generated by the boiling Novec rises to a condenser integrated into the tank and cooled by water, and then condenses back to liquid for reuse.
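
    The physics behind two-phase cooling is easy to estimate: at steady state, the heat leaving a chip equals the latent heat carried away by the boiling fluid, Q = ṁ · h_fg. A minimal back-of-the-envelope sketch follows; the latent-heat figure is an assumed, illustrative value for a Novec-class fluid, not a 3M specification:

    ```python
    # At steady state, chip heat output equals latent heat carried off by
    # the boiling fluid: Q = m_dot * h_fg. Values below are assumptions.
    H_FG = 88e3          # J/kg, assumed latent heat of vaporization
    CHIP_POWER = 200.0   # W, assumed heat dissipated by one processor

    boil_off_rate = CHIP_POWER / H_FG          # kg/s of fluid vaporized
    print(f"{boil_off_rate * 1000:.2f} g/s")   # ~2.27 g/s, recondensed for reuse
    ```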

    Closer Look: Iceotope Liquid Cooling: Iceotope describes its solution as “free cooling anywhere.” It encapsulates servers in heat pipe modules containing 3M’s Novec fluid as its heat removal medium. Each server motherboard is completely immersed in a sealed bath of liquid coolant which passively transfers heat away from the electronics to a heat exchanger formed by the wall of the module, where water is continuously re-circulated and cooled. The company says the system can work with water supplies of up to 50 degrees C and still keep servers cool.

    Hardcore Becomes LiquidCool, Eyes Server Market: In July, Hardcore Computer retooled its business to focus on licensing its liquid cooling technologies for servers and high performance computing. As part of that shift, the company is changing its name to LiquidCool Solutions and will discontinue manufacturing PCs, workstations and servers.
    Hardcore, best known in the data center sector for its Liquid Blade immersion cooling system, has adopted a contract manufacturing model, with research and development and prototype work still done in house. The company plans to license its technology and intellectual property to server makers, with an eye toward other markets down the road.

    U.S. Defense Department to Cool Servers With Hot Water: The U.S. Department of Defense (DoD) will soon begin cooling its servers with hot water. The DoD said this week that it will convert one of its data centers to use a liquid cooling system from Asetek Inc. The move could clear the way for broader use of liquid cooling in high-density server deployments at the DoD, which says it will carefully track the efficiency and cost savings from the project.

    Hot Water Cooling? Three Projects Making it Work: The phrase “hot water cooling” seems like an oxymoron. How can hot water possibly help cool servers in high-density data centers? Although the data center community has become conditioned to think of temperatures between 60 and 75 degrees Fahrenheit as the proper climate for a server room, there are many ways to keep equipment running smoothly with cooling technologies featuring significantly higher temperatures. Three recent projects illustrate the trend.

    3:07p
    The 12 Days of Christmas in the Data Center

    Jeffrey S. Klaus is the Director of Data Center Solutions at Intel Corporation, where he has managed various groups for more than 12 years. Klaus’s team is pioneering data center power and thermal management solutions, which are sold through an ecosystem of data center infrastructure management (DCIM) software and hardware companies around the world.

    JEFF KLAUS
    Intel

    Another year’s end, and we’re in the midst of another holiday season. Besides anticipating time off, family celebrations, and gift giving, every IT professional should be anticipating—and planning for—the challenges relating to data center energy management in 2013.

    On the First Day of Data Center Christmas: IT Transformation

    The data center has moved from a support function to a mission-critical resource. Next year, I could argue, the data center will become the most critical resource. The elevation of the data center is being driven by demands for transaction speed and exploding numbers of devices and applications used for sales, service, operations, HR, and practically every functional area. Business users will continue to expect more from the data center. They want to improve their productivity with increasingly self-service capabilities, customization, on-demand services, and, above all, reliability that translates to highly available data center services.

    Second Day: Organizational Disconnects

    Historically, the various IT and facilities teams worked separately. Rarely did hardware, software, networking, and facilities teams come together, and if they did, they rarely understood each other. The 2013 outlook, with escalating energy costs and a continued sluggish global economy, calls for increasing focus on power optimization, and that means providing tools that not only work for all of the various teams, but encourage cooperation among the teams.

    Third Day: Affordability of Servers and Storage Drives Up Demand

    Dramatic server/storage price reductions over the last decade have led to mass migrations of tasks to online and automated platforms, thus driving up energy consumption in the data center. Power and cooling have become significant portions of the budget; some argue power has become the single biggest expense.
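
    The arithmetic behind that claim is simple, as a rough sketch shows. All of the inputs below are illustrative assumptions, not figures from any particular facility:

    ```python
    # Rough annual energy cost for one rack: IT load, multiplied by the
    # facility overhead (PUE), hours in a year, and the utility rate.
    IT_LOAD_KW = 10.0       # assumed IT load of one rack, in kW
    PUE = 1.8               # assumed overhead for cooling and power losses
    RATE_PER_KWH = 0.10     # assumed utility rate, $ per kWh
    HOURS_PER_YEAR = 8760

    annual_cost = IT_LOAD_KW * PUE * HOURS_PER_YEAR * RATE_PER_KWH
    print(f"${annual_cost:,.0f} per rack per year")   # ~$15,768
    ```

    Multiply that by hundreds of racks and power quickly rivals, or exceeds, every other budget line.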

    Fourth, Fifth and Sixth Days: Virtualization, Clouds, and Mobility Change Energy Profiles

    Rapid change is nothing new in the data center, but 2013 will see several major technology trends gaining wide-scale acceptance. Virtualization is expanding from servers into desktop infrastructure, and users are demanding the flexibility and rapid provisioning that is only possible within a private or public cloud environment. Mobility adds another layer of complexity, as employees bring their own smart devices to work, thus driving up network traffic and server workloads with apps and anytime, anywhere access to data center resources. The data center is being bombarded with service requests, and large companies are already hitting the power restrictions of their facilities as well as the limits of some local utility companies to meet their needs.

    Seventh Day: Natural Disaster Preparedness

    The 2011 earthquake and tsunami in Japan and this year’s hurricane season that included Sandy’s devastation of New York and surrounding states are vivid reminders that every data center should be continually refining its disaster plans. The 2013 challenge will be to ensure that disaster plans include prolonging operation with backup power supplies. Disaster recovery should be elevated to a data center best practice, supported by a management solution that offers on-the-fly server adjustments to minimize power draw.
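
    Why do on-the-fly power adjustments matter in a disaster plan? Because lowering draw stretches backup runtime. A minimal sketch, with hypothetical UPS and load figures:

    ```python
    # Capping server power extends UPS runtime until generators or utility
    # power return. All figures are hypothetical.
    UPS_CAPACITY_KWH = 500.0    # assumed usable battery capacity

    def runtime_hours(load_kw):
        return UPS_CAPACITY_KWH / load_kw

    normal_load = 250.0                # kW under normal operation
    capped_load = normal_load * 0.6    # cap or shed non-critical servers

    print(f"uncapped: {runtime_hours(normal_load):.1f} h")  # 2.0 h
    print(f"capped:   {runtime_hours(capped_load):.1f} h")  # 3.3 h
    ```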

    Eighth Day: Battling Methodologies and Tools

    Natural disasters are one of the driving forces fueling growth of co-location (colo) facilities. Since many colo companies position their services as insurance for any power outage situation, some are among the early adopters of intelligent energy management solutions. Others have developed their own power management tools, and these will increasingly impact off-the-shelf DCIM solutions.

    Ninth Day: The Search for Holistic DCIM Solutions

    The ongoing debates about energy management approaches are driving the demand for, and evolution of, holistic DCIM platforms. Data center teams should look for solutions based on real-time data collection rather than less-accurate predictive models. With fine-grained thermal and power monitoring, a DCIM solution should enable data collection that feeds into holistic analysis and, ultimately, control of energy behaviors.
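
    In code, that collection loop is conceptually simple. The sketch below is a toy version: poll_sensor() stands in for a real DCIM agent or IPMI query, and the server names are invented:

    ```python
    import random
    import statistics

    # Toy real-time collection loop: poll per-server power and inlet
    # temperature readings, then aggregate for holistic analysis.
    def poll_sensor(server_id):
        # Stand-in for a real agent query; returns simulated readings.
        return {"server": server_id,
                "watts": random.uniform(150, 400),
                "inlet_c": random.uniform(18, 27)}

    def collect(server_ids, samples=3):
        readings = []
        for _ in range(samples):
            readings.extend(poll_sensor(s) for s in server_ids)
        return {
            "avg_watts": statistics.mean(r["watts"] for r in readings),
            "max_inlet_c": max(r["inlet_c"] for r in readings),
        }

    print(collect(["srv-01", "srv-02", "srv-03"]))
    ```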

    Tenth Day: Budget-Restricted Technology Roll-Outs

    Of course, even the best solution doesn’t automatically override the budget restrictions stemming from global economic uncertainty. Therefore, data center managers will likely aim for smaller-scale trials and proofs of concept than originally planned. A phased deployment should still be designed to achieve the same results over the long term, with each phase essentially self-funding the next through proven gains in energy efficiency.
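
    A minimal sketch of that self-funding model, with invented cost and savings figures, shows how quickly the phases can compound:

    ```python
    # Each phase's annual energy savings pay for the next phase's rollout.
    # All figures are invented for illustration.
    phase_cost = 50_000.0                 # $ to instrument one phase
    annual_savings_per_phase = 60_000.0   # $ saved per instrumented phase

    funded = 1      # phase 1 comes out of the original budget
    budget = 0.0
    for year in range(1, 4):
        budget += funded * annual_savings_per_phase  # savings accrue yearly
        while budget >= phase_cost:                  # reinvest in new phases
            budget -= phase_cost
            funded += 1
        print(f"year {year}: {funded} phases deployed, ${budget:,.0f} banked")
    ```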

    Eleventh Day: Vendor Consolidation

    DCIM will continue to mature, and the rapid rate of change, along with economic pressures, will likely lead to vendor consolidation. This will include large vendors buying up smaller tool vendors to accelerate the development of their platforms. Maturation ultimately benefits the customer, however, and so the challenge here will be to avoid investments in solutions that may get swallowed up by competitors.

    Twelfth Day: Inability to Predict the Future

    As the year comes to a close, we are left with many unknowns about the DCIM market and how energy management in the data center will look a year from now. How will the market size compare to the 2013-2014 predictions? What will it take to move the technology to the next level?

    We will all be watching and analyzing market movements, but ultimately data center demand will drive the technology. And this demand is growing at a healthy pace. Slow economy or not, energy costs are not going to suddenly plummet. More likely, energy demand will drive up prices, and governments will continue to increase energy taxes. DCIM solutions that build in proactive, fine-grained energy management capabilities are the best—and perhaps only—way to keep the data center sufficiently supplied without breaking the budget.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    3:22p
    Data Center Jobs: CyrusOne

    At the Data Center Jobs Board, we have a new job listing from CyrusOne, which is seeking a Vice President, Datacenter Systems in Saratoga, California.

    The Vice President, Datacenter Systems is responsible for leading CyrusOne’s efforts to lead the industry in data center systems automation, furthering the long-term vision of global deployment of the industry’s most innovative, massively modular data centers and data center colocation services. This includes global leadership for systems development and customer automation scenarios, reporting into the office of the CTO. To view full details and apply, see the job listing details at http://jobs.datacenterknowledge.com/a/jbb/job-details/778040.

    Are you hiring for your data center? You can list your company’s job openings (http://jobs.datacenterknowledge.com/a/jbb/post-job) on the Data Center Jobs Board (http://jobs.datacenterknowledge.com/a/jbb/find-jobs?sb=1&sbo=1), and also track new openings via our jobs RSS feed (http://jobs.datacenterknowledge.com/a/jbb/find-jobs-rss?sb=1&sbo=1).
    5:00p
    Friday Funny: New Year in the Data Center

    It’s Friday! We’ve made it to the end of the week. And that means it’s time for our Friday caption contest, with cartoons drawn by Diane Alber, our favorite data center cartoonist! Please visit Diane’s website Kip and Gary for more of her data center humor.

    This time, we are featuring “Happy New Year in the DC.” Diane writes: “The new year is coming up, so I thought Kip and Gary would celebrate it in the data center.”

    The caption contest works like this: We provide the cartoon and you, our readers, submit the captions. We then choose finalists and the readers vote for the funniest suggestion. Scroll down and add your suggestion in the comments below.

    Also, congrats go out to Jason Woodrum for the winning caption — “I would strongly advise you stay out of the hot aisles this year, Frosty.” — for “Snowman Surprise.”

    CSC offers Diane’s 2013 Kip and Gary Data Center Comic Calendar for free. You can pre-order one through Diane’s site.

    May you have a great holiday season and a New Year filled with laughter! And for the previous cartoons on DCK, see our Humor Channel.

    5:01p
    The Year in M&A: The Top 10 Data Center Deals of 2012

    Equinix expanded its presence in the Frankfurt market with the acquisition of ancotel GmbH, which includes the Kleyer90 facility shown above. It was one of the year’s top 10 deals. (Photo: ancotel)

    In the data center world, there are two factors that drive acquisitions: geography and technology. Deals can help providers enter new markets and expand their footprint, or help them keep pace in areas where technology is evolving rapidly. Not surprisingly, 2012 was a busy year for mergers and acquisitions (M&A). Here are 10 deals that capture the year in M&A:

    Digital Realty’s Ongoing Acquisition Spree

    Data center developer Digital Realty Trust has a history of growing through acquisition, and was busy on the M&A front this year. The big deal was paying about $1.1 billion (715 million pounds) to acquire three large data centers in the London market from UK provider Sentrum. Digital Realty was also busy expanding in the Dallas market, acquiring the Convergence Business Park, an 819,000 square foot data center and office campus in suburban Dallas and 400 South Akard Street in Dallas, also known as The Databank Building. Also in Texas, it acquired 8025 North Interstate 35, a fully-leased data center facility in Austin, Texas, for $12.5 million. Another notable deal was the acquisition of a 575,000 square foot redevelopment property in Franklin Park, Ill. for $22.3 million, marking its entrance into the suburban Chicago market. The three-building property has an existing tenant and considerable land to develop data centers.

    Equinix acquires Ancotel and AsiaTone

    After years of building out its data center footprint in the U.S., Equinix is using acquisitions to extend its “Platform Equinix” across the globe. In 2012, Equinix acquired ancotel GmbH of Frankfurt, Germany in a deal that further boosts the strength of its network in Frankfurt, one of the world’s most important meeting points for Internet traffic and financial trading. Equinix also completed its acquisition of Hong Kong-based data center provider Asia Tone, an all-cash transaction valued at $230.5 million that strengthens the colocation specialist’s position in the Asia-Pacific region, including China.

    365 Main Buys 16 Sites From Equinix

    Even as Equinix was making deals to grow internationally, it pruned its U.S. network by divesting 16 of the facilities it acquired in its 2010 purchase of Switch & Data, a provider known for its focus on second-tier markets. The buyer was a familiar name. 365 Main came off the sidelines, teaming with Crosslink Capital and Housatonic Partners to pay $75 million for the facilities, which total 280,000 square feet of data center space. As Equinix sharpened its focus on its core high-connectivity markets, the 365 Main team joins a number of emerging players in targeting growth in second-tier markets.

    VMware Acquires Nicira in $1.2 Billion Embrace of SDN

    Amid growing buzz about the transformative power of software-defined networking (SDN), VMware made a bold move with its $1.2 billion acquisition of Nicira, a leading player in the SDN world. Nicira’s software platform manages a network abstraction layer, which lets users create virtual networks that operate independently of the underlying physical network hardware. The sizeable acquisition sparked a software-defined networking frenzy, as Oracle acquired Xsigo, a specialist in I/O virtualization, just a week later.

    AMD Buys SeaMicro

    AMD’s deal to acquire SeaMicro for $334 million was a disruptive one for the low-power server landscape, as SeaMicro had been using Intel chips in next-generation servers offering dramatic reductions in power and space usage. The deal strengthened AMD’s position in the push toward low-energy servers, which involves a crowd of newcomers (most notably Calxeda and Tilera) in addition to Intel, which has since adapted its Atom mobile chips for use in servers. The crown jewel in the acquisition is SeaMicro’s networking fabric, which allows hundreds of low-power processors to work together.

    Dell acquires Quest Software

    Data center management is a key focus for a growing number of players who want to provide a “single pane of glass” that allows companies to track and manage their data center assets and capacity. This was the theme of one of 2012’s larger industry deals, as Dell acquired Quest Software for $2.4 billion in a transaction that strengthened Dell’s offerings for data center management software. The purchase is part of Dell’s broader strategy to increase its enterprise IT business, which offers better growth and profit margins than the consumer PC business. Software is a key part of that effort, as data center managers seek improved tools to manage their increasingly complex infrastructure.

    Cisco acquires Meraki

    In November, Cisco announced its intent to acquire privately held cloud networking company Meraki for $1.2 billion. Cisco (CSCO) will gain new cloud-based network offerings with the acquisition. Meraki’s wireless, switching, and security solutions, delivered on edge and branch networks, will expand Cisco’s network offerings by providing scalable solutions for mid-market businesses. Meraki will also strengthen Cisco’s Unified Access platform, which seeks to unite wired and wireless networks, policy and management into one integrated network infrastructure.

    Bell Canada Buys Q9 Networks

    Bell Canada partnered with a group of investors to buy data center provider Q9 Networks for $1.1 billion ($1.06 billion US). It was the latest in a series of deals in which telcos have bought up data center companies to expand into the growing market for cloud computing services. The deal is also a huge win for private equity firm ABRY Partners, which bought Q9 Networks in 2008 for $361 million. The transaction marks the second time ABRY has acquired a data center provider and sold it to a telecom provider for a substantial premium, a model previously followed in ABRY’s 2007 purchase of Texas provider CyrusOne.

    SAP’s $4.3 Billion Cloud Bet on Ariba

    If this list were ranked by price, this deal would sit near the top. SAP America acquired Ariba, a cloud-based business commerce network, for $45.00 per share, or about $4.3 billion, to boost its cloud offerings. Ariba’s buyer-seller collaboration network will be combined with SAP’s existing solutions to create new models for business-to-business collaboration in the cloud. The acquisition makes the list because it shows SAP’s bet on the cloud, a move the company was hesitant to make five or so years ago as Salesforce.com rose through the ranks. This deal shows that there is no escaping the enterprise momentum to shift workloads to the cloud.

    iNET Interactive Buys DCK, AFCOM

    We might be biased here, but it’s been great at our new home. iNET Interactive acquired Data Center Knowledge in April, continuing to build its focus on the data center sector. In late 2011 iNET acquired AFCOM, the leading industry association for data center managers, along with the Data Center World conference series. iNET Interactive is a web-centric media company serving special interest communities through prominent online properties, events, and publications.

    8:40p
    PEER 1 Bought by Cogeco in $635 Million Deal

    PEER 1 Hosting has been acquired by Cogeco Cable in a $635 million deal that will combine two Canadian service providers. The transaction is the latest example of a telecommunications company acquiring a data center service provider in hopes of boosting its revenue from cloud computing and other hosting services.

    Cogeco has offered to buy shares of PEER 1 for $3.85 per share, or about a 32 percent premium to their previous market value. The board of directors for PEER 1 has unanimously supported the offer, following the advice of a special committee created to evaluate the offer.

    The Web Host Industry Review looks at the strategy behind the sale of PEER 1:

    According to Phil Shih, managing director at Structure Research, the deal might actually see Cogeco push PEER 1 further in the direction of providing managed hosting in Canada.

    “From a Canadian market perspective, this acquisition is mostly about colocation,” says Shih. “Managed hosting revenue is mostly generated from the US and now the UK. PEER 1 only began to sell hosting in Canada a few years back. So it will be very interesting to see what Cogeco’s plans are for the international operations. My sense is that Cogeco is going to take advantage of the PEER 1 brand and platform to make a stronger push into managed hosting in Canada. Managed hosting is where Cogeco wants to go and where it is getting stronger growth.”

    As with similar acquisitions in the past, there may be a significant opportunity in addressing the existing base of Cogeco customers with PEER 1’s hosting services. Cogeco serves a substantial number of small businesses with its telephone and connectivity offerings.

    For full coverage, see Liam Eagle’s story at The WHIR.

