Data Center Knowledge | News and analysis for the data center industry
 

Wednesday, September 24th, 2014

    12:19a
    Report: Another Ohio Town Competing for Amazon Data Center

    Officials in Hilliard, Ohio, have approved property tax incentives, worth up to $5.4 million total, to make the town attractive for a future Amazon data center construction project.

    The officials expect Amazon to invest between $225 million and $300 million in the local economy if the company’s subsidiary Vadata decides to build there, Columbus Business First reported. Dublin, another central Ohio town, is also actively pursuing the project.

    A large data center construction project can give a major boost to a small town’s economy, and government officials in cities and states around the country have been using generous offers of tax exemptions to lure such projects.

    A recent Facebook-funded study concluded that Facebook’s decision to build a data center in Forest City, North Carolina, for example, generated an economic impact of $707 million on the local economy. The figure includes direct investment by the company and economic activity in the area related to the project.

    Officials in Dublin (about six miles north of Hilliard) are offering Vadata a free 70-acre piece of land, worth about $7 million, to build the Amazon data center there. Dublin City Council approved the incentive Monday, according to the report.

    Hilliard is offering a 100 percent property tax abatement, a benefit worth $5.4 million to Amazon over 15 years.

    In addition to local incentives, the State of Ohio is dangling $81 million worth of tax breaks in front of the company in hopes of attracting the build.

    Amazon has not confirmed whether it is planning to build a new data center at all, or whether it would build one in Ohio. A company spokeswoman said only that Amazon is always searching for good places to build data centers.

    12:30a
    Report: Google to Build 120 MW Data Center in Netherlands

    Google will invest $773 million in a data center project in Eemshaven, a seaport in Groningen, Netherlands. The Google data center will have about 120 megawatts of power capacity, and construction is set to begin in 2016.
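    For context, a quick back-of-the-envelope calculation, assuming the reported investment covers the full 120-megawatt build-out (the report does not say so explicitly), puts the project at roughly $6.4 million per megawatt of IT capacity:

    ```python
    # Rough cost-per-megawatt estimate from the reported figures alone.
    # Assumes the full $773M maps to the full 120 MW, which is an assumption.
    investment_usd = 773_000_000
    capacity_mw = 120

    print(f"${investment_usd / capacity_mw / 1e6:.1f}M per MW")  # ~$6.4M per MW
    ```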

    The Netherlands’ technology profile continues to grow with continued data center investment. The region has strong infrastructure, cheap property prices and supportive legislation, as well as a climate well suited to keeping cooling costs down. Google will leverage renewable energy for the data center.

    The Eemshaven facility will span 44 hectares and was chosen because of stable Dutch energy supplies, spokesman Mark Jansen told Reuters, confirming the project. Employment in the area will get a boost, with Google expected to create 150 jobs.

    Google has operated out of a data center in Eemshaven, owned by TCN, for over six years. The company has three other large European data centers, in Ireland, Finland and Belgium. It has been searching for a suitable location for the new facility under a secret code name since the summer of 2012.

    Apple, another tech giant big on renewable energy, is said to be considering a data center in the area, with similar reasons for choosing the region.

    Listing the Netherlands as a strategically important market, Digital Realty Trust partnered with the Dutch telecom KPN last summer to build a data center in Groningen. Colt operates a data center a few hours south of the city and recently expanded it with 1.65 megawatts of power and about 110,000 square feet of space.

    It has also been reported that Microsoft is building a 17.5-megawatt data center in Noord-Holland, west of Groningen and north of Amsterdam.

    3:30p
    HP Offers Free Software-Defined Storage to Server Buyers

    HP and Intel have partnered on an incentive program that gives a 1TB HP StoreVirtual Virtual Storage Appliance (VSA) license at no cost to all purchasers of servers powered by Intel’s latest Xeon E5 v3 processors.

    Looking to push adoption of software-defined storage, HP will distribute licenses for more than 72 petabytes of total capacity, giving customers the ability to download HP StoreVirtual VSA software and obtain a license for 1TB of capacity each.

    “The cost of shared storage is still a common server virtualization roadblock for SMBs and enterprise remote office sites,” said David Scott, senior vice president and general manager of storage at HP. “By offering no-cost VSA software with Intel Xeon E5 v3–based processor servers, HP and Intel are making SDS available to the world for free, giving customers access to hypervisor-agnostic, hardware-independent rich data services while preserving choice and lowering costs.”

    HP says its StoreVirtual VSA software taps unused compute and storage capacity within a server to provide resilient shared storage for virtual servers running VMware, Microsoft Hyper-V or Linux Kernel Virtual Machine. With the release of its Gen9 servers, HP adds automated VSA deployment as well, with a single-click configuration for a solution supporting virtualized applications and shared storage.

    VSA can scale out to dozens of nodes and 1.6 petabytes of capacity, the company said.
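    Taken at face value, the stated figures sketch the scale of both the giveaway and a single cluster: 72 petabytes at 1TB per license implies more than 72,000 server licenses, while a maxed-out 1.6-petabyte cluster corresponds to 1,600 such licenses. A minimal arithmetic check, assuming decimal units:

    ```python
    # Back-of-the-envelope arithmetic from HP's stated figures:
    # 1TB per license, 72PB total giveaway, 1.6PB maximum cluster capacity.
    giveaway_tb = 72 * 1000          # 72 PB in TB (decimal units assumed)
    max_cluster_tb = 1.6 * 1000      # 1.6 PB in TB

    print(f"licenses distributed: {giveaway_tb:,.0f}")                    # 72,000
    print(f"1TB licenses per maxed-out cluster: {max_cluster_tb:,.0f}")   # 1,600
    ```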

    HP is also offering a free trial of its StoreOnce VSA software for 10TB of hardware-independent, software-based backup and deduplication.

    3:30p
    View from Velocity 2014: Scaling Teams Is Key

    Colleen Miller is a journalist and social media specialist with more than two decades of writing and editing experience. Colleen has covered the data center industry, including cloud, big data and storage.

    In my four-plus years with Data Center Knowledge, I’ve been to my fair share of technology and data center conferences. The sessions and panelists are often future-focused, advancing high expectations for automated, faster, bigger and stronger technology solutions.

    The Velocity: Web Operations & Performance Conference, which convened its 2014 East Coast event in midtown Manhattan last week, certainly had its share of projections about the future. (“Infrastructure is code” is one example of that kind of bold stance.)

    Yet, this conference was a bit different because of its focus on DevOps, the growing class of professionals who understand “both sides of the fence.” For more background on this trend, see DevOps stories on DCK.

    However you think about infrastructure and cloud and “as a service,” there still is a strong people component to it all. I caught a thread that wove through the sessions that I attended: the faster, stronger Web depends on people as much as it does infrastructure and applications.

    For example, Mikey Dickerson, U.S. Digital Service Administrator, who was on the team that got HealthCare.gov working, explained that the issue with the website was that no single company or person was responsible for ensuring that it worked.

    So, in addition to fixing bugs and troubleshooting, the team set up a basic “war room” with the multiple vendors present (55 official vendors of record, though only 20 or so interacted with Dickerson) and held twice-daily meetings to keep human communications flowing.

    Clearing communications logjams

    To me, it seemed the biggest accomplishment of the government team was not technology, but clearing communications obstacles so that professionals could do their jobs.

    The process was career-changing for Dickerson, who has left Google to stay with the government to unravel more IT and bureaucracy challenges. He urged others to join in the important work.

    Another fascinating keynote was delivered by Justin Arbuckle, Chief Enterprise Architect at Chef and formerly of GE Capital. The focus of his talk was cybernetics (no, not Scientology’s “dianetics”): the science of the laws governing how organisms, machines and organizations maintain their identity and fulfill their purposes within their environment.

    Arbuckle said that DevOps lets the organization respond quickly by having immediate information for feedback. He quoted management theorist Stafford Beer, who said “organizations should maximize freedom of their participants, within practical constraints of the requirement for the organization to fulfill their purpose.” And that’s what operations and performance professionals are bringing to their organizations: vital information and feedback, not just the infrastructure and applications that help customers, and everyone else in the corporation, accomplish the task at hand.

    There were a number of sessions discussing management and employment challenges, such as being the “one woman DevOps team” and the challenge of scaling a team through hiring.

    Building teams without cloning or unicorns

    Camille Fournier, CTO of Rent the Runway, asserted that “cloning yourself at work” is impossible, so the manager’s challenge is to be a “multiplier.” While tech skills are foundational, as teams grow in the expanding DevOps arena, leaders need to focus on appropriate hiring. She noted, “You are not leading computers around, you are leading people.”

    Bethanye McKinney Blount, a production engineer at Facebook, also delivered a pointed presentation on hiring. To put it simply, “you can’t scale systems if you can’t scale teams,” Blount said.

    Her talk’s title was telling — “Build a Better Unicorn: Scaling Production Engineering Teams in the Real World.” She debunked the concept of hiring the “unicorn” or “rockstar” for your team.


    The unicorn is an individual who is unique and perfect and has rarefied status. Unfortunately, “unicorns decide what they like to do and they forget about everything else,” she said.

    She discussed how leaders need to focus on key questions: What does the ideal team look like for me and my organization? What tech skills does our team already have, and what skills does it still need?

    Considering the human factors

    One panel focused on gender bias in the Operations field, aptly titled, “A Woman’s Place is at the Command Prompt,” and discussed workplace issues such as equal treatment of women in the work environment, on-call rotations and performance reviews.

    It was noted that while more women are “learning to code,” many opt out of technical positions because of biased work environments, thus creating a “leaky pipeline.”

    The panel advised hiring managers to look at the language used in their position advertisements. The terms “ninja, guru or rockstar” and perks like “beer and foosball” are red flags that the workplace is “like a frat house,” and qualified women candidates will likely avoid your organization.

    Generally, it was a quite engaging and thought-provoking conference!

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    4:18p
    RagingWire Lights Up Final Phase of Ashburn Data Center

    RagingWire, an NTT Communications Group company, has made an additional 39,000 square feet of floor space and 8.1MW of critical IT power available at its VA1 data center in Ashburn, Virginia. This is the final phase of the facility’s build-out.

    The Sacramento, California-based company has seen solid sales in Ashburn since it launched the facility in 2012. Most of the capacity fitted out previously has been leased, and a big chunk of the new phase has been presold, reflecting strong demand for data center space in Northern Virginia, one of the country’s biggest data center markets.

    The final phase consists of three PODs, known as Vaults 3, 4 and 5. About one-third of the phase has already been sold, with one of the vaults going to an unnamed global IT company.

    The previous phases, the 18,000-square-foot Vault 1 and the 13,000-square-foot Vault 2, are sold out.

    “RagingWire has established itself as a top competitor in the Northern Virginia market,” said Michael Levy, senior analyst for data centers at 451 Research. “The firm’s new capacity will enable it to continue to deliver wholesale and retail data center colocation solutions to meet the strong market demand there.”

    RagingWire has dubbed its 150,000-square-foot VA1 facility “The Bolt” because of its lightning bolt shape. Its five vaults are supplied with 14.4 megawatts of critical IT power. Each of them, save for Vault 1, has 13,000 square feet of high-density raised floor space and 2.7 megawatts of critical power.
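    The reported figures are internally consistent, which a quick back-of-the-envelope check confirms: three 13,000-square-foot, 2.7-megawatt vaults account exactly for the final phase’s 39,000 square feet and 8.1MW, leaving an implied 3.6 megawatts for Vault 1. A minimal sketch of that arithmetic:

    ```python
    # Consistency check of the reported RagingWire VA1 numbers.
    final_phase_vaults = 3
    sqft_per_vault = 13_000        # Vaults 2 through 5
    mw_per_vault = 2.7             # Vaults 2 through 5
    facility_mw = 14.4

    print(final_phase_vaults * sqft_per_vault)          # 39000 sq ft, as announced
    print(round(final_phase_vaults * mw_per_vault, 1))  # 8.1 MW, as announced

    # Power implied for Vault 1 once Vaults 2-5 are accounted for:
    print(round(facility_mw - 4 * mw_per_vault, 1))     # 3.6 MW

    # Raised floor across all vaults: 18,000 + 4 x 13,000 = 70,000 sq ft
    # of the 150,000-square-foot building.
    print(18_000 + 4 * 13_000)                          # 70000
    ```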

    “With a healthy stream of capital from its parent company (NTT owns 80 percent), RagingWire is slated to pursue an aggressive pipeline in multiple new markets over the next year,” said Levy.

    Northern Virginia, and especially Loudoun County, was made for data centers, with abundant fiber, cheap and reliable power from Dominion Virginia Power and attractive tax incentive programs. According to 451 Research, the region is poised to overtake the New York metro as the top data center market in the U.S. by the end of this year.

    4:30p
    Data Center Jobs: Power Design Inc

    At the Data Center Jobs Board, we have a new job listing from Power Design, Inc., which is seeking a Senior Electrical Designer – Mission Critical in Florida.

    The Senior Electrical Designer – Mission Critical is responsible for coordinating the best design solution by applying industry knowledge while considering customer needs and budget requirements; acting as the primary lead and providing technical assistance for multiple projects during bidding, design, permitting and construction; providing efficiency reviews and/or value-engineering options for existing designs; and preparing detailed drawings and models for a project using theoretical and practical design knowledge. To view full details and apply, see the job listing details.

    Are you hiring for your data center? You can list your company’s job openings on the Data Center Jobs Board, and also track new openings via our jobs RSS feed.

    4:49p
    Broadcom Intros StrataXGS Tomahawk Switches for Scale-Out Data Centers

    Broadcom announced a new line of StrataXGS Tomahawk data center switches that deliver 3.2 Terabits per second (Tbps) switching capacity, extreme port density and SDN-optimized engines in a single chip (SDN stands for Software Defined Networking).

    Catering to next-generation cloud fabrics, mega data centers and high-performance computing environments, the new switches pack more than 7 billion integrated transistors and enable migration to all-25Gbps-per-lane interconnects and a 2.5x increase in link performance, the company said.

    “Our StrataXGS Tomahawk Series will usher in the next wave of data centers running 25G and 100G Ethernet, while delivering the network visibility required to operate large-scale cloud computing, storage and HPC fabrics,” said Rajiv Ramaswami, executive vice president of Broadcom’s Infrastructure and Networking group. “This is the culmination of a multi-year cooperative effort with our partners and customers to prepare for this transition.”

    Standard developed by Broadcom, Google, Microsoft, others

    Broadcom defined and co-founded the 25G/50G Ethernet specification as an industry standard, according to Ramaswami.

    With 3.2 terabits per second of raw speed, the latest data center switch is the only one on the market to support 32 ports of 100 gigabit-per-second Ethernet links. Christian Plante, senior product line manager for the StrataXGS product lines, points out in a blog post that this single chip, no bigger than the back of your hand, can switch the equivalent of 1.5 million Netflix streaming movies at the same time.
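    Both headline numbers hold up to simple arithmetic: 32 ports at 100Gbps is exactly 3.2Tbps, and spreading that capacity across 1.5 million concurrent streams leaves roughly 2.1Mbps per stream, a plausible rate for a standard-definition movie stream in 2014. A quick check:

    ```python
    # Sanity-checking the Tomahawk's headline claims with the article's numbers.
    ports = 32
    gbps_per_port = 100
    capacity_tbps = ports * gbps_per_port / 1000
    print(capacity_tbps)                        # 3.2, matching the stated 3.2 Tbps

    concurrent_streams = 1_500_000
    mbps_per_stream = capacity_tbps * 1e6 / concurrent_streams
    print(round(mbps_per_stream, 2))            # ~2.13 Mbps per movie stream
    ```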

    Industry heavyweights formed the 25G Ethernet Consortium this past summer to collaborate on an industry-standard, interoperable Ethernet specification for 25 Gbps and 50 Gbps (dual-lane) Ethernet links. Google, Mellanox Technologies, Arista Networks and Microsoft joined Broadcom in the effort.

    Deep instrumentation, new packet processing engines

    Featuring extensive application flow and debug statistics, Broadcom’s new BroadView instrumentation feature set gives data center operators full visibility into network and switch-level analytics, the company said. The Tomahawk also offers operators link health and utilization monitors, streaming network congestion detection and packet tracing capabilities.

    The new data center switch line also features new FleXGS packet processing engines, which enable operators to adapt to changing workloads and control their networks. Broadcom offers an extensive suite of user-configurable functions for flow processing, security, network virtualization, measurement and monitoring, congestion management and traffic engineering.

    These engines provide in-field configurable forwarding and classification database profiles and more than 12 times greater application policy scale compared to previous-generation switches.

    5:05p
    Verizon Adds Direct Links to AWS for Enterprise Clients

    Verizon Enterprise Solutions has added Amazon Web Services (AWS) to the list of cloud providers available via its Secure Cloud Interconnect (SCI) service in an attempt to make an Amazon enterprise cloud more palatable for security and network-performance-conscious companies.

    SCI enables clients to manage a multi-cloud environment that allows for dynamic bandwidth allocation, application performance throughput, quality of service and usage-based billing.

    Verizon’s SCI is aiming for Amazon enterprise cloud customer dollars, and has now added the largest public cloud provider to the list of options.

    SCI uses AWS Direct Connect to enable pre-integration of networking and data center interfaces to create a secure, direct link to AWS.

    Enterprises can register and select the desired AWS region and service to quickly provision dedicated high-speed connections.
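    Verizon provisions these links through its SCI portal, but the AWS side rests on the public Direct Connect API. As a rough illustration of what requesting a dedicated connection looks like at that layer, here is a minimal sketch using boto3; the location code and connection name are placeholder values, not anything Verizon actually uses:

    ```python
    # A minimal sketch of ordering a dedicated AWS Direct Connect port with
    # boto3. This illustrates the underlying AWS API only; Verizon SCI
    # customers provision through Verizon's portal instead.
    import boto3

    dx = boto3.client("directconnect", region_name="us-east-1")  # N. Virginia

    # "EqDC2" and the connection name are placeholders; valid location
    # codes come from dx.describe_locations().
    connection = dx.create_connection(
        location="EqDC2",
        bandwidth="1Gbps",
        connectionName="sci-example-link",
    )
    print(connection["connectionState"])  # "requested" until AWS provisions it
    ```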

    SCI with AWS Direct Connect is available immediately in Northern Virginia and Silicon Valley. London and three Asia-Pacific locations (Sydney, Singapore, Tokyo) are planned for late October.

    Verizon has also started to let other carriers sell services in more of its data centers, recognizing that it needs to take an open approach when it comes to serving enterprise infrastructure and network needs.

    Managed service providers like Verizon, Level 3 and Datapipe are recognizing the benefits of embracing AWS rather than viewing it as competition to their services. Data center providers like Equinix, Telx and CoreSite are also enabling Direct Connect in order to appeal to enterprises that desire hybrid infrastructure.

    The ability to connect directly to multiple clouds brings more agility to the enterprise.

    “As more organizations look to hybrid computing, SCI offers enterprises a very viable option,” said Thierry Sender, director of technology at Verizon. “The service offers enterprises the right blend of security, private connectivity, performance, simplicity and efficiency, while enabling a wide range of applications and use cases for organizations.”

    5:54p
    Startup Saisei Unveils Real-Time Network Performance Management Solution

    Silicon Valley networking startup Saisei has unveiled software-based Network Performance Enforcement (NPE) solutions for network analysis and control.

    The company describes its technology as an edge solution that provides full network traffic visibility and policy control of physical and virtual workloads, enabling network “micro segmentation.”

    The Saisei product family consists of FlowCommand, FlowEnforcer and FlowVision software, which can be distributed as virtual machine images to run on hypervisors or packaged on commodity x86 hardware. FlowCommand raises routed IP link utilization from 50 percent to more than 95 percent, the company said.

    The company claims FlowCommand can concurrently monitor up to 5 million flows on a 10G network link 20 times per second and control every flow based on user-defined policies.

    Saisei also says its patented flow processing engine enables support for the management and control of up to 1 billion external hosts.
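    Taken at face value, those figures imply a substantial evaluation rate: 5 million flows sampled 20 times per second is 100 million policy evaluations per second, and a saturated 10G link divided among 5 million flows averages only about 2Kbps per flow. The arithmetic:

    ```python
    # What Saisei's stated monitoring figures imply, taken at face value.
    flows = 5_000_000
    checks_per_second = 20
    print(f"{flows * checks_per_second:,} policy evaluations/sec")  # 100,000,000

    link_bps = 10e9                                  # one 10G link
    print(f"{link_bps / flows:,.0f} bps average per flow")          # 2,000
    ```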

    Saisei is a member of Intel’s Network Builders program and uses the Intel Data Plane Development Kit software libraries. This plays a critical role in software-defined networking and network functions virtualization as its customers migrate to SDN and NFV.

    Earlier this summer Saisei joined the Open Networking Foundation to contribute and collaborate with the community on networking innovations.

    6:44p
    DataTorrent Data Stream Processing Marries Pivotal Hadoop Distro

    DataTorrent, a startup with a data stream processing platform built on Apache Hadoop, has joined Pivotal’s network of partners, and Pivotal has certified the solution for its Hadoop distribution, Pivotal HD.

    DataTorrent is a major win for Pivotal’s One Partner Program, which was set up to cross-sell between Pivotal’s and its partners’ prospects and customers. The startup has heavyweight venture capital backing and a founder who was involved in Hadoop’s development in its early stages.

    DataTorrent RTS, the company’s core product, went into general availability less than four months ago. Phu Hoang, its CEO and co-founder, started at Yahoo in 1996, when the company had only six engineers, and stayed for more than 11 years.

    Hoang was involved in developing Hadoop during his time at Yahoo. Doug Cutting, one of Hadoop’s forefathers, was working at Yahoo when the open source framework was born in 2005; he named it after his son’s toy elephant.

    DataTorrent’s $750,000 in seed funding came in 2012 from a group of investors that included Jerry Yang, a Yahoo co-founder and its former CEO. The company raised $8 million in a Series A round in June 2013, led by August Capital.

    Unlike batch processing, which Hadoop was originally developed for, data stream processing enables Big Data analytics in real time. The use of data stream processing has been on the rise as demand grows for decision making on the fly.
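    To make the distinction concrete, here is a minimal conceptual sketch in plain Python: results are emitted per time window as events arrive, instead of after a complete data set has landed. It illustrates the idea only and is not DataTorrent’s API, which expresses such logic as operators in a DAG running on Hadoop:

    ```python
    # A conceptual sketch of stream processing: per-window counts are emitted
    # as soon as each window closes, rather than after the whole data set lands.
    import time
    from collections import Counter

    def tumbling_window_counts(events, window_seconds=1.0):
        """Yield per-window event counts as soon as each window closes."""
        window_end = time.monotonic() + window_seconds
        counts = Counter()
        for event in events:
            now = time.monotonic()
            if now >= window_end:
                yield dict(counts)            # emit results in real time
                counts.clear()
                window_end = now + window_seconds
            counts[event] += 1
        if counts:
            yield dict(counts)                # flush the final partial window

    # Usage with a simulated event stream:
    def simulated_stream():
        for i in range(40):
            time.sleep(0.05)
            yield "click" if i % 3 else "purchase"

    for window in tumbling_window_counts(simulated_stream(), window_seconds=0.5):
        print(window)
    ```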

    DataTorrent RTS is designed to accelerate data ingestion and processing but also has enterprise-friendly aspects, such as scalability and availability.

    7:14p
    Oracle Releases OpenStack for Oracle Linux Into General Availability

    Oracle has released Oracle OpenStack for Oracle Linux into general availability. It is based on OpenStack Icehouse, the ninth release of the open source cloud-building software, and allows users to control Oracle Linux and Oracle VM through OpenStack in production environments.

    Technology incumbents continue to play friendly with OpenStack. More enterprises are interested in it, so enterprise-centric vendors are releasing integrated solutions, as well as ensuring their products and services work with OpenStack in general.

    Another recent example is VMware’s announcement of its own OpenStack distribution. Oracle wants to make it easy for enterprises to begin using OpenStack with its flavor of Linux and Oracle VM, which competes with VMware’s server virtualization software.

    Oracle’s distribution is available as a free download from the Oracle Public Yum Server and Unbreakable Linux Network (ULN). An Oracle VM VirtualBox image of the product is also available to help customers get started with OpenStack easily.

    The distro can integrate with third-party software and hardware and supports any guest operating system supported by Oracle VM, including Oracle Linux, Oracle Solaris, Microsoft Windows and other Linux distributions.

    Other integration and support include:

    • Integration with MySQL Enterprise Edition
    • Integration with Oracle ZFS Storage Appliance: Oracle OpenStack for Oracle Linux includes the Oracle ZFS Storage Appliance Cinder plugin, which provides customers with an enterprise-grade storage option (see the Cinder sketch after this list)
    • Support for Ceph storage software: Ceph provides object, block and file storage from a single distributed computer cluster on commodity hardware. Ceph has been gaining in popularity since its creation by DreamHost co-founder Sage Weil. Last April Red Hat purchased Inktank, a company that spun out of DreamHost to offer professional support for Ceph.
    • Customers can use Oracle Clusterware in Oracle Linux Support subscriptions to protect OpenStack services, ensuring deployments remain fully functional in case of hardware failure.
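    As a rough illustration of where a storage plugin such as the ZFS Storage Appliance driver slots in, here is a minimal sketch of requesting a volume through the standard Icehouse-era Cinder API, which a configured backend would then serve. The endpoint and credentials are placeholders, and nothing here is specific to Oracle’s plugin:

    ```python
    # A minimal sketch, assuming a reachable Icehouse-era OpenStack cloud with
    # placeholder credentials. The Cinder API is backend-agnostic: the same
    # call works whether the volume lands on a ZFS Storage Appliance, Ceph
    # or plain LVM.
    from cinderclient import client

    cinder = client.Client(
        "1",                                     # classic volume API version
        "admin",                                 # username (placeholder)
        "secret",                                # password (placeholder)
        "demo",                                  # tenant name (placeholder)
        "http://keystone.example.com:5000/v2.0", # Keystone endpoint (placeholder)
    )

    volume = cinder.volumes.create(size=10, display_name="oracle-linux-data")
    print(volume.id, volume.status)              # "creating" until the backend finishes
    ```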
    7:46p
    As US Government Agencies Continue Cloud Push, Revenue Opportunity for Service Providers Grows


    This article originally appeared at The WHIR

    On Tuesday, Department of Defense (DoD) acting CIO Terry Halvorsen announced that the agency will perform a database consolidation that could lead to $10 billion to $20 billion in savings within a few years. The agency intends to evolve from a legacy system to a newer one that will better serve the organization. A cloud-based database may be the solution, as other government services are being shifted into commercial clouds.

    “As we do those reviews, we’ll begin to isolate what is the best database…what has high value, what is accurate and [whether] its cost is good,” Halvorsen said in a conference call with reporters. “And we will start collapsing the other databases into the single database.”

    He also said that the Defense Information Systems Agency (DISA) will continue as the DoD’s cloud broker. However, there will be changes to how the DoD uses outside computing: DISA will approve security plans for military branches seeking to use outside cloud services.

    Other governments such as Australia and the UK are moving to public cloud services.

    “Moving into the commercial cloud will be less expensive while enabling enhanced network storage and protection in a more agile environment,” Halvorsen said.

    The acting CIO seems to be proceeding in accordance with the policies of the previous CIO, Teri Takai. In April, Takai said the DoD could save money by moving to the cloud, but that it needed to set security requirements outlining the details of such a move.

    This was in contrast to previous information that the DoD had no need for public cloud services. The WHIR previously reported that the Defense Information Systems Agency wasn’t moving forward with a $450 million cloud computing contract due to lack of interest from the DoD.

    Earlier this month, the E-Commerce Times reported on a pilot program in which the DoD has been moving more services to the cloud. DISA approved a protocol that allows the use of outside cloud vendors for higher-security-level data: Levels 3 to 5 of its Cloud Security Model are being stored on AWS GovCloud. AWS is the only company approved for higher-level data storage; CGI Federal and Autonomic Resources are only authorized for Level 1 and 2 data.

    “DISA is working with the services to implement several commercial cloud pilots in the very near term while continuing to work with other cloud providers on the provisional authorization process,” Mark Orndorff, DISA’s program executive officer for mission assurance, told the E-Commerce Times. “The AWS authorization is limited to only approved pilots using the AWS Infrastructure as a Service offering at this time, and is not currently extended to include other shared services.”

    While AWS for the moment has a lock on government cloud services in this pilot program with DoD and DISA, the potential revenue opportunities for other cloud providers going forward are huge.

    “Assessments of FedRAMP-compliant offerings from providers such as HP, Lockheed Martin, AT&T, Akamai, Microsoft and Oracle — along with a cloud solution offered by the U.S. Department of Agriculture — are under way. We continue to work closely with the FedRAMP program office and cloud providers to add to the list of approved cloud offerings,” Shawn McCarthy, research director at IDC Government Insights, told the Times. “We predict that DoD will spend about US$165.7 million on cloud efforts in federal fiscal year 2014 — but that is for all types of cloud, not just the security levels recently approved for AWS.”

    This article originally appeared at: http://www.thewhir.com/web-hosting-news/us-government-agencies-continue-cloud-push-revenue-opportunity-service-providers-grows

    9:43p
    Gartner Names Schneider, Emerson, CA, Nlyte DCIM Leaders

    Gartner has released its first Magic Quadrant (MQ) report on Data Center Infrastructure Management, laying out the market and positions for several DCIM providers across the four quadrants of leaders, challengers, visionaries and niche players.

    While there is a lot of interest in DCIM, it’s difficult for customers to determine where to start. Understanding where a DCIM provider’s strengths are is a good thing in an often-confusing market.

    The report adds some clarity to the market, analyzing strengths and weaknesses for 17 players. Gartner isn’t the first to tackle the fairly young DCIM space. Its competitors 451 Research and TechNavio both have taken a stab at defining and segmenting the space.

    Gartner defines the DCIM market as one encompassing tools that monitor, measure, manage and control data center resources and energy consumption of both IT and facility components. The market research house forecasts that by 2017 DCIM tools will be deployed in more than 60 percent of larger data centers in North America.

    Providers often offer different pieces of the overall infrastructure management picture and use different and complicated pricing models. All vendors in the MQ must offer a portfolio of IT-related and facilities infrastructure components rather than one specific component. All included vendors must enable monitoring down to the rack level at minimum. Building management systems are not included.

    The four companies in the Leaders Quadrant (those proven to be leaders in technology and capable of executing well) are Schneider Electric, Emerson Network Power, CA Technologies and Nlyte Software. All but Nlyte are major vendors that offer several other products and services outside of DCIM, putting Nlyte, a San Mateo, California-based startup, in the company of heavyweights.

    Here is Gartner’s first ever Magic Quadrant for DCIM vendors:

    [Figure: Gartner DCIM Magic Quadrant 2014]

    IO, the Arizona data center provider best known for its modular data centers, was named a visionary in the report for the IO.OS software it developed to manage its customers’ data center deployments.

    “We are very pleased with the findings articulated in the Gartner Magic Quadrant for DCIM,” said Bill Slessman, CTO of IO. “IO customers have trusted the IO.OS to intelligently control their data centers since 2012.”

    The other three quadrants are for challengers, visionaries and niche players, and being listed in any portion of the MQ is a win in itself. Challengers stand to threaten leaders; visionaries stand to change the market; and niche players focus on certain functions above others, though a narrow focus can limit their ability to outperform leaders.

    DCIM value, according to Gartner:

    • Enable continuous optimization of data center power, cooling and space
    • Integrate IT and facilities management
    • Help to achieve greater efficiency
    • Model and simulate the data center for “what if” scenarios
    • Show how resources and assets are interrelated

    The report is available here.

    10:00p
    Digital Realty Partners With Carpathia to Sell Hybrid Infrastructure Solutions

    Digital Realty Trust has signed a partnership with Carpathia Hosting, a managed hosting and cloud infrastructure services provider and its long-time customer. The two will jointly market infrastructure solutions that will combine colocation space in Digital Realty’s data centers with the gamut of services Carpathia provides.

    San Francisco-based Digital Realty has been actively trying to get away from its traditional business model of relying largely on leasing wholesale data center space. The company announced early this year that it was going to be using partners to provide more complete solutions to customers, and the Carpathia deal is the latest example of that effort.

    Carpathia pursues customers in the enterprise, government, digital media and healthcare solutions markets. Headquartered in Dulles, Virginia (just outside of Washington, D.C.), it is very active in pursuing cloud services business with the U.S. federal government and has a partnership with VMware to jointly provide VMware’s vCloud services to federal agencies.

    Not a reseller agreement

    Michael Bohlig, director of global alliances at Digital Realty, said the two companies will market combined solutions jointly and align sales teams to refer customers to each other when there is an opportunity.

    “We’re not going to resell Carpathia services,” he said. “We expect to generate revenue together, but we won’t split it.”

    Carpathia isn’t the only service provider that uses Digital Realty’s data centers, and Digital Realty will be helping market services of a company that competes with some of its other customers.

    Commenting on this dynamic, Bohlig said the partnership was about providing more choice to customers. “Ultimately, it’s up to the customer to decide what is best for them,” he said.

    While the majority of its 100-plus properties around the world are occupied by wholesale data center clients, Digital Realty has colocation facilities in California, Texas, Virginia and New Jersey, as well as outside of the U.S., in the U.K., Australia and Singapore, among other markets.

    Former AWS man building partner ecosystem

    Digital Realty brought Bohlig on board about six months ago specifically to pursue partnerships with service providers like the one with Carpathia. He joined the company after three years at Amazon Web Services, where he did business development for its CloudSearch product.

    Since he came on board, the company has partnered with other cloud providers and managed services companies, as well as companies that build out data center space and do racking and cabling for customers.

    The data center landlord has struck deals with network carriers Level 3 and tw telecom to offer direct private network connections between its data centers and the big public cloud service providers Amazon Web Services and Microsoft Azure.

    In June, Digital Realty launched a “cloud marketplace,” a web portal where users can buy cloud services from providers that use its data centers.

