Data Center Knowledge | News and analysis for the data center industry
Thursday, March 19th, 2015
| 12:00p |
New Singapore Data Center Equinix’s Largest in Asia
Equinix has brought online its third data center in Singapore – the company’s largest to date in the Asia Pacific region.
European telecommunications giant Orange and U.S. cloud and hosting service provider Datapipe became the data center’s first two tenants, according to Equinix, a Redwood City, California-based colocation provider.
It is one of the five data centers Equinix said it would open as part of a global expansion program, announced earlier this month. The other locations are New York, Toronto, London, and Australia.
Singapore is one of the fastest-growing Asian data center markets. But data center demand is booming across all Asian business centers as foreign companies expand infrastructure to serve local customers and as local companies grow together with the market.
Singapore is the network connectivity and business hub for Southeast Asia and one of the key hubs for the region at large.
This is the third Singapore data center announcement this week. Monroe, Louisiana-based CenturyLink announced Tuesday the launch of a cloud data center there – its first cloud location in Asia. Local company Singapore Technologies Telemedia announced plans to build a 150,000-square-foot data center.
The new Equinix data center is 385,000 square feet total and can accommodate about 5,000 IT cabinets at full build-out. The first phase that was recently launched has enough space for 1,000 racks.
The company invested $53 million in building out and launching the first phase, bringing its total investment in Singapore data centers to more than $300 million.
Demand on the island comes from a variety of industry verticals, including financial services, network operators, and cloud service providers, according to Equinix.
Jim Poole, a vice president at Equinix, said in an earlier interview with Data Center Knowledge that Singapore was second only to Ashburn, Virginia, in the level of network connectivity.
The new data center will eventually be connected to another Equinix data center on the island with a dedicated fiber link, which will enable customers in one facility to easily extend their infrastructure to the other. Direct access to an already established data center, which according to the provider has a bustling ecosystem of businesses, makes the new facility more valuable.
Creating interconnection hubs for a wide variety of players in the market inside its data centers has always been a big part of Equinix’s business model.
| 3:30p |
How CIOs Become Hybrid Cloud Heroes
Oded Haner is the CTO at HotLink. He is an accomplished, strategic, collaborative technology leader with extensive experience in bringing innovative IT technologies to market.
The role of the CIO is a difficult one. Traditionally, CIOs face three main issues: ever-shrinking budgets, ever-growing expectations, and an overall lack of predictability. The CIO has too many responsibilities thrown at him and has to field requests from every part of the business for the tools users need to make informed decisions. On top of his regular duties, the CIO must also deal with surprises.
Predicting the Unpredictable
It is nearly impossible for the CIO to know exactly what the company will need six, 12 or 18 months from now, but budgets need to be set anyway. Beyond mission-critical operations that are easier to predict, things can change quickly and without much notice. This forces the CIO to keep a safety buffer within his budget to take care of those surprises. In growth years, it is wise to use an even higher buffer, as the company is likely to experience a lot of unpredictability. However, at times when budgets shrink and only must-have projects get funded, projects that cannot be easily explained get left out of the budget, which can stall innovation and growth. This conservative stance can mean missing out on new technologies that deliver real value, so the CIO must constantly evaluate and recognize which risk presents more danger – spending or stagnation.
This perennial challenge is particularly vexing now, as CIOs consider cloud-based computing technologies. Whether they are leading enterprises with significant virtual infrastructure footprints or small data centers, CIOs now have to support diverse applications and end-user requirements with a range of technical, service level, regulatory and cost constraints. A variety of needs usually results in a variety of solutions, which requires a hybrid IT infrastructure – one size rarely fits all.
A CIO who takes a step forward into the hybrid cloud does so while dealing with the same budgetary and complexity obstacles he has always faced. And yet, the IT herd is clearly moving together in this direction. Gartner states that 74 percent of enterprises are now pursuing hybrid IT.
The Makeup of a Leading CIO
CIOs that embrace innovation are those who recognize their business interests and needs before everyone else and often months or years before the business does. This means taking control of the budget and pushing the company ahead before it realizes the need is there. So, who are the formidable leaders of hybrid IT deployments, which are poised to take off in 2015? Think of the same people who implemented virtualization years ago, those who saw a new technology that had not yet been adopted by the majority, but which presented huge potential value. These leaders strive to make their data centers more agile and cost-effective years in advance, knowing their businesses will require it.
Because of these IT innovators, virtualization eventually became the standard. Now, these same people are looking to the future for the next advancements, and these include hybrid cloud deployments. These CIOs understand that a part of what they have on premise is no longer within their core strengths because of constantly changing end-user and cost requirements, and that the cloud can help them.
Furthermore, they see that their investments in hybrid IT don’t require heavy-handed capital expenditures and can be phased into their environments using an operational expenditure approach. The pay-as-you-go model of the public cloud is ideal for this, as it opens up the faucet to enable IT as it’s needed, always letting the CIO and IT teams say “yes” and quickly provision the infrastructure and applications as needed. The CIO does not need to predict budgeting months in advance, and, more importantly, he can quickly respond to the business as it grows without ever constraining innovation, hence solving some of his ongoing challenges.
Overcoming the Management Risks
After the CIO considers the hybrid approach and starts to phase it into his environment, he must turn his thoughts to how to manage and fully achieve a cloud transformation. To get there, the CIO must manage the greater ecosystem head-on. After deploying a hybrid environment, the CIO does not want to introduce new silos, so it is critical that his IT team can see, manage, administer, and maintain the new hybrid environment just as well as it did on-premise virtualization. The management approach should offer the same benefits virtualization did a decade ago, when it removed the physical silos and improved the data center.
IT transformation is never a trivial endeavor. However, armed with the ideas above, CIOs can embrace cloud-based innovation to deploy hybrid environments and demonstrate both economic and business value to the company, as well as provide the strategic leadership entrusted to the position.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
| 4:00p |
How to Maximize Primary Storage ROI and Data Protection
“Do more with less” seems to be the dogmatic decree confronting today’s IT professionals as they struggle to reconcile constrained budgets with seemingly limitless growth in unstructured and structured data. With costly primary storage rapidly filling up, the need to migrate less active data onto a more cost-effective storage tier is clear.
Storage optimization — matching storage to data’s performance, capacity, and accessibility needs — entails moving infrequently-accessed data off expensive, high-performance primary storage to less costly, lower-performance secondary storage.
In this white paper from Nexsan, we learn how this maximizes primary storage ROI: it frees up capacity, boosts performance, shrinks the backup window, and reduces backup cost by ensuring that all of the data remaining on primary storage actually needs to be there.
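As a rough, back-of-the-envelope illustration of the tiering principle (not Nexsan’s mechanism; the mount points and the 180-day inactivity threshold below are assumptions), a minimal Python sketch that sweeps files untouched past a cutoff from a primary volume onto an archive tier might look like this:

```python
import os
import shutil
import time

AGE_LIMIT_DAYS = 180        # assumed definition of "inactive": untouched for ~6 months
PRIMARY = "/mnt/primary"    # hypothetical primary-storage mount point
ARCHIVE = "/mnt/archive"    # hypothetical secondary/archive mount point

def migrate_inactive_files(primary=PRIMARY, archive=ARCHIVE, age_days=AGE_LIMIT_DAYS):
    """Move files whose last access time is older than age_days onto the archive tier."""
    cutoff = time.time() - age_days * 86400
    for root, _dirs, files in os.walk(primary):
        for name in files:
            src = os.path.join(root, name)
            if os.path.getatime(src) < cutoff:
                dest = os.path.join(archive, os.path.relpath(src, primary))
                os.makedirs(os.path.dirname(dest), exist_ok=True)
                shutil.move(src, dest)  # frees primary capacity for active data

if __name__ == "__main__":
    migrate_inactive_files()
```

A production archive product would layer policy controls, transparent recall, and integrity checks on top of a basic sweep like this.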
The sheer number and variety of ways in which data can be lost or damaged is daunting: silent data corruption due to hardware or software failure, malicious attacks from cybercriminals, and human error such as accidental file deletion or overwriting by employees. Yet for all too many data center managers, the response to these threats seems to be a strategy based on … hope.
Unless they can answer the following four critical questions, their data protection strategy is tantamount to just hoping that nothing goes wrong:
- How do you know if all your files are in your backup or archive?
- How do you know if there is a second copy of all your files at your remote site?
- What is the health (integrity) of your files at each site?
- If the files are different, which file is the correct one?
The answers to these questions — and indeed the answer to this dilemma — come from purpose-built secure archive solutions, which are specifically designed to provide maximum data security, integrity, and privacy from the moment a file is ingested into the archive.
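To make the integrity question concrete, here is a minimal sketch (not Nexsan’s implementation; the paths are placeholders) of the kind of check a purpose-built archive automates: hashing every file and comparing the primary archive against a remote replica.

```python
import hashlib
from pathlib import Path

def fingerprint(path, chunk_size=1 << 20):
    """Return the SHA-256 digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()

def compare_sites(primary_root, replica_root):
    """Report files missing from the replica or whose contents differ from the primary."""
    primary_root, replica_root = Path(primary_root), Path(replica_root)
    for src in primary_root.rglob("*"):
        if not src.is_file():
            continue
        copy = replica_root / src.relative_to(primary_root)
        if not copy.exists():
            print(f"MISSING at replica: {src}")
        elif fingerprint(src) != fingerprint(copy):
            print(f"INTEGRITY MISMATCH: {src}")

# Example usage (hypothetical paths):
# compare_sites("/mnt/archive", "/mnt/remote_replica")
```

A purpose-built archive would typically run such checks continuously, fingerprinting files at ingest and repairing from a known-good copy when a mismatch is found.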
Remember, explosive data growth is highlighting the inefficiencies of housing huge quantities of unstructured and structured data on costly primary storage. The vast majority of this data is seldom accessed, and doesn’t require the high performance (and high costs) that primary storage entails. Not all data is equal, and IT managers are applying storage optimization principles to better match storage to data’s performance, capacity, and connectivity needs.
But as these storage optimization initiatives migrate less-active data from primary to archive storage, a crucial issue is frequently overlooked—the risk of data loss. Download this white paper today to learn how Nexsan Assureon secure archive solutions are purpose-built to deliver unrivaled data protection for archive data, whether it’s stored for days or decades.
| 5:11p |
Cloud Decision-Making Moves Up the Corporate Ladder with CIOs, CEOs Weighing in: Microsoft
This article originally appeared at The WHIR
Cloud is now a mainstream technology for most companies, with 70 percent of the cloud service market now beyond infrastructure hosting, according to a new report released Wednesday by Microsoft. The report, Beyond Infrastructure: Cloud 2.0 Signifies New Opportunities for Cloud Service Providers, shows that cloud technology has moved past the cloud discovery phase for 75 percent of customers, and the maturing market is driving cloud decision-making up the corporate ladder to C-level executives.
The study was compiled by 451 Research for Microsoft, based on 1,700 survey responses from enterprises of different sizes in 10 countries, collected in late 2014 and early 2015. This year’s study shows a continuation of trends from last year, when Microsoft identified a tipping point, with 45 percent of companies surveyed moving from cloud discovery to the cloud implementation phase of adoption.
Microsoft’s report shows not only a greater opportunity for cloud service providers beyond the basic elements of hosting, but also that those providers must win over more than just an IT manager, as was common in the past: a CIO or CTO is the primary decision-maker in the majority of cases (52 percent), and the CEO is the primary decision-maker at 44 percent of companies.
“Hybrid cloud infrastructures are becoming the norm for customers,” said Michelle Bailey, senior vice president, Digital Infrastructure and Data Strategy, 451 Research. “As new decision-makers emerge, so too does the criteria for selecting cloud service providers. Trust, uptime, security, performance and technical expertise are today’s differentiators for a business-ready cloud. It’s not just about having datacenters everywhere at the lowest price. Providers need to build a business that aligns to who they are as a company and who they are supporting. Cloud 2.0 is really about value, redefining cloud computing from a technical specification to a business-ready environment. Enterprises are looking for a trusted end-to-end solution, and ultimately this will involve multiple partners.”
The value-adding opportunity for service providers comes from application hosting, managed services, and security services; the products and services available are the single largest decision-making factor for 23 percent of respondents. Other major factors include company qualities (22 percent), customer service (21 percent), and price (only 19 percent).
Microsoft expanded its cloud partnerships earlier this week by creating a product package for service providers with Cisco.
The study also notes a looming opportunity for service providers in Windows Server 2003 migration ahead of the end of support in July. This migration will result in an increase in cloud use, Microsoft says, echoing the findings of a Cloud Industry Forum report last summer. Shortly after that report was issued, HP launched a program to take advantage of the Windows Server 2003 migration opportunity.
The report is available for download from Microsoft’s website.
This story originally appeared at http://www.thewhir.com/web-hosting-news/cloud-decision-making-moves-corporate-ladder-cios-ceos-weighing-microsoft
| 6:17p |
UK Government Forms Data Center Business With Ark
Taking an unorthodox approach to the problem of public-sector overspending on data center capacity, U.K. Prime Minister David Cameron’s office, the Cabinet Office, has formed a joint venture with a private data center builder and operator that will provide data center services to government agencies. The Cabinet Office and its JV partner, Ark Data Centers, announced the deal Thursday.
The U.K. government spends more on IT than any other public-sector organization in Europe. Runaway data center costs are a problem different governments have tried tackling in different ways. The U.S. executive branch has been pushing agencies to consolidate government data centers since at least 2010, and the Australian government has taken a similar approach.
The U.K. government’s new deal with Ark is a big departure from what others are doing. The joint venture, called Crown Hosting Data Centres, aims to satisfy all government data center needs. The company will “deliver increased efficiency, improved value and transparency of service utilization across all of the public sector,” the announcement read.
As the new company’s CEO Steve Hall explained in a statement, the move will simplify the data center selection process for agencies and “drive the unbundling of large legacy contracts.”
“It provides publicly funded, mandated and regulated organizations with a pre-approved contract that leverages the buying power of the whole of government for the fastest, simplest access to secure data center services,” he said.
Its first customers are the Department for Work and Pensions, the Highways Agency, and the Home Office.
The deal amounts to a £700 million contract for Ark, according to a report by The Register. The same report said the Cabinet Office would own 25 percent of the joint venture, and Ark would own the remainder.
Ark had been doing business with government agencies prior to the most recent agreement; it holds data center contracts with the Ministry of Defence and the Ministry of Justice.
The company has two data centers in the U.K. It has been around for 10 years but was “recapitalized” in 2012, when a new senior leadership team came in. Its current CEO is Huw Owen, who was formerly a president at BT.
Ark’s board of directors includes Right Honourable Baroness Manningham-Buller, life peer in the House of Lords and former director general of the Security Service (MI5), and Brian Fitzpatrick, CEO of Vodafone Carrier Services.
| 7:08p |
Amazon Web Services Beefs Up Elastic Block Store
Amazon Web Services has announced that faster, larger Elastic Block Store volumes, designed to deliver much higher and more consistent baseline performance with burstable IOPS, are now available in every AWS region.
Originally announced by the company’s CTO Werner Vogels at AWS re:Invent back in November, the beefed-up EBS volumes can also reduce complexity by letting customers use fewer volumes to accomplish more, making them easier to configure and back up.
Used with Amazon EC2 instances, EBS is designed for I/O-intensive or “bursty” workloads and supports two volume types: Provisioned IOPS (SSD) and General Purpose (SSD). Both are designed for single-digit-millisecond latencies and five-nines (99.999 percent) availability.
General Purpose SSD, the default EBS volume type for Amazon EC2, serves small to medium databases, development and test environments, and boot volumes. Users can now create volumes that store up to 16 terabytes (up from 1 terabyte) and deliver up to 10,000 baseline IOPS (up from 3,000), with bursting to higher performance levels and maximum throughput of 160 megabytes per second. Provisioned IOPS SSD volumes, meanwhile, can deliver up to 20,000 IOPS and throughput of up to 320 megabytes per second.
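For teams that provision EBS programmatically, a minimal boto3 sketch of creating volumes at the new limits might look like the following; the region, availability zone, and credential setup are assumptions for illustration, not part of AWS’s announcement.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

# General Purpose (SSD): baseline IOPS scale with size, now up to 10,000,
# with bursting and throughput of up to 160 MB/s.
gp_volume = ec2.create_volume(
    AvailabilityZone="us-east-1a",  # placeholder availability zone
    Size=16384,                     # GiB; the new 16 TB maximum
    VolumeType="gp2",
)

# Provisioned IOPS (SSD): IOPS are requested explicitly, now up to 20,000,
# with throughput of up to 320 MB/s.
piops_volume = ec2.create_volume(
    AvailabilityZone="us-east-1a",
    Size=16384,
    VolumeType="io1",
    Iops=20000,
)

print(gp_volume["VolumeId"], piops_volume["VolumeId"])
```

Either volume can then be attached to an EC2 instance in the same availability zone with ec2.attach_volume.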
“With this release, you no longer need to stripe together several smaller volumes in order to run applications requiring large amounts of storage or high performance, including large transactional databases, big data analytics, and log processing systems,” wrote AWS chief evangelist Jeff Barr. “The volumes also make backing up your data easier, since you no longer need to coordinate snapshots across many striped volumes.”
EBS has seen several enhancements in the last year. In addition to the performance boost, added features include seamless encryption of EBS data volumes and snapshots, and the ability for customers to create and manage their own volume encryption keys.
| 7:16p |
HP Updates Hybrid Flash, Other Data Center Storage Wares
At its Global Partner Conference this week in Las Vegas, HP announced a range of data center storage updates, new hybrid flash storage, and enhanced file solutions.
Looking to bring new features and fast SSDs into the entry-level market, HP has released the new StoreVirtual 4335 hybrid flash array. The company said the array’s Adaptive Optimization tiering can deliver 12 times the performance with more than 90 percent power and footprint savings over a spinning-disk configuration. A two-node 4335 SAN bundle with 12.4TB of capacity starts at $59,000, according to HP.
Catering to small to mid-sized businesses and filling a gap in the portfolio, HP added a new model to the StoreOnce Backup family. With 31.5TB of usable capacity, the new StoreOnce 2900 backup appliance offers twice as much capacity as previous versions and is compatible with HP StoreOnce Recover Manager for hypervisor-managed data protection of 3PAR StoreServ Storage arrays. HP also noted that in a 12-hour window, the 2900 can protect up to 70TB of data and restore up to 41TB.
HP also refreshed its StoreEasy Storage NAS line with new 1450, 1650, and 1850 models based on HP ProLiant Generation 9 servers. HP said it has enhanced the StoreEasy Dashboard and added support for RAID rebuilds that are up to 25 times faster. Again courting the entry-level market, HP prices a StoreEasy 1450 with 4TB of raw capacity starting at $5,497. HP StoreEasy also now includes Vision Solutions Double-Take Availability, factory-installed software that protects critical data with real-time, byte-level replication.
HP Storage Marketing Vice President Craig Nunes said the “demand for rapid response to changing conditions is crucial for businesses of all sizes, but for mid-sized companies, access to required storage capabilities can be just out of reach.” Nunes added that “at HP we’re working with channel partners to simplify storage infrastructure and accelerate growth for customers by bringing hot technologies such as flash storage down into the entry segment.”
| 7:43p |
Latest FieldView Release Features ‘What If?’ Tool, API for Third-Party Software
FieldView announced availability of its latest data center management software suite, FieldView 2015 DCIM, this week. The company claimed it is a big step forward for improving data center resilience.
Data center resilience comes from capturing and analyzing multiple, wider sets of data and putting that data in broader context. FieldView accomplishes this through modules, or plug-ins, tailored for specific functions. Modules allow customers to extend their data center infrastructure management software into the areas they want and tune the suite to their priorities.
In addition to an enhanced look and feel, three new optional data center management modules are available: PowerView, LiveView, and DataView.
PowerView is a power chain analysis tool which helps simulate “what if?” scenarios, such as power chain failures, scheduled maintenance downtime, and load tolerance variations.
LiveView is an API that lets third-party software integrate with the platform to display live temperature, power, and other monitored data. A few potential examples of how it extends the DCIM platform: control systems, dynamic adaptive cooling, computational fluid dynamics modeling, and IT service management.
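FieldView does not document the LiveView API in this announcement, so the endpoint, token, and response fields in the sketch below are purely hypothetical; it only illustrates the general pattern of a third-party tool (say, a cooling controller or ITSM system) polling live readings over a REST-style interface.

```python
import requests

# Hypothetical values: FieldView's real URL scheme, authentication, and
# payload format are not described in the announcement.
BASE_URL = "https://dcim.example.com/api/liveview"
API_TOKEN = "replace-with-a-real-token"

def fetch_live_readings(rack_id):
    """Poll a LiveView-style endpoint for one rack's live monitored data."""
    response = requests.get(
        f"{BASE_URL}/racks/{rack_id}",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # e.g. {"temperature_c": 24.1, "power_kw": 3.7}

if __name__ == "__main__":
    reading = fetch_live_readings("rack-42")
    # A cooling-control loop or ITSM tool would react to these values here.
    print(reading)
```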
DataView is an enhanced data warehouse. FieldView said in a statement that it integrates easily with big data applications for IT service management, capacity planning, and third-party reporting.
DCIM is expanding in scope and, as a result, is playing a big role in the convergence of facilities and IT. By showing both types of data and how they relate to one another, it enables acting on the bigger picture based on granular systems data from both sides of the house.
“Data center resiliency is improved by gaining in-depth views into all areas of Facilities and IT operations,” Rhonda Ascierto, senior analyst at 451 Research, said. “The new offering represents a high level of software capability that will appeal to managers of mission-critical data centers.”
The three new modules enter general availability at the end of March.
| 10:42p |
Web Host GoDaddy Worth Up to $2.87B Based on Expected IPO Share Price
This article originally appeared at The WHIR
In its latest SEC filing, GoDaddy has set an initial public offering price range of $17 to $19 per share for its debut on the New York Stock Exchange, valuing the web hosting company at between $2.57 billion and $2.87 billion.
GoDaddy stated it will be offering 22 million shares of Class A common stock, which would raise between $374 million and $418 million at the anticipated share price – significantly more than the roughly $100 million the company originally aimed to raise, according to its S-1 filing in June 2014.
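The raise range follows directly from the share count and the price band; a quick check of the arithmetic:

```python
shares = 22_000_000      # Class A shares offered
low, high = 17, 19       # expected price range per share, in dollars

print(f"Low end:  ${shares * low:,}")   # $374,000,000
print(f"High end: ${shares * high:,}")  # $418,000,000
```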
Earlier this year, GoDaddy announced plans to list on the NYSE under the symbol “GDDY” and that its IPO would be underwritten by Morgan Stanley, JPMorgan, Citi, Barclays, Deutsche, RBC, Stifel, and Piper.
In 2011, GoDaddy was acquired by a private equity consortium led by KKR & Co LP and Silver Lake Partners for $2.25 billion. KKR and Silver Lake each own 27.9 percent stakes, Technology Crossover Ventures (TCV) owns 12.6 percent, and company founder Bob Parsons owns approximately 28 percent. Following the IPO, the stakes held by KKR, Silver Lake, and Parsons will drop to 23.9 percent each, and TCV will hold 10.7 percent.
On the subject of the GoDaddy IPO, market research firm IPO Candy has noted that the company’s organizational structure could be of concern to investors. “Generally speaking, all the controlling entities of GoDaddy have been keen to take money out of the company, not in a bad way so far, but in the form of interest and distributions. They have put incentives in place for existing management, which are also self-serving. It remains to be seen how external equity investors will be treated.”
As Reuters notes, 2014 was a huge year for IPOs (with U.S. IPOs raising more money than in any year since 2000), while 2015 has been relatively lukewarm so far, with Box Inc. among the few technology companies to go public. But GoDaddy, which has been in business since 1997 and currently manages around a fifth of all domains, is far more established than newer technology upstarts.
In 2014, GoDaddy reported $1.39 billion in revenue, a 23 percent increase from 2013. It also added 1.1 million customers, bringing its total to 12.7 million clients, who bring in $114 per customer on average. Yet it also reported ending the year with $143.3 million in net losses, albeit less than the $200 million it lost the previous year.
This article originally appeared at http://www.thewhir.com/web-hosting-news/%E2%80%8Bweb-host-godaddy-worth-2-87b-based-expected-ipo-share-price