Data Center Knowledge | News and analysis for the data center industry
Wednesday, January 13th, 2016
1:00p
Linear Programming Helps Groupon Optimize Data Center Design

This month, we focus on data center design. We’ll look into design best practices, examine in depth some of the most interesting recent design trends, explore new ideas, and talk with leading data center design experts.
Groupon may be the future of merchant discounts, but it uses a mathematical problem-solving method formulated in the 1930s to optimize the data center design that supports its popular service.
Linear programming models are used to maximize (or minimize) a specific outcome given numerous variables and constraints. The word “linear” refers to the linear relationships between the variables.
The approach is common in other industries, such as transportation, energy, and telecommunications, but it also applies well in data center design, the Groupon team found, since there are clearly desirable outcomes and lots of variables.
The goal was to maximize space and power utilization at the rack level and optimize “striping,” which is distributing servers that perform similar functions across multiple racks to improve reliability, Harmail Chatha, Groupon’s director of global data center operations, said.
At the end of 2015, Groupon launched a new data center in Sacramento, California, leasing wholesale space in a new building on RagingWire’s massive data center campus there. This was the first site where Groupon used the approach.
Read more: RagingWire Takes Its Massive-Scale, Luxury-Amenities Data Center Model to Texas
Chatha’s team studied the IT environment over the course of several months, trying to understand power utilization levels by the various types of servers that support its service and to generate variables for the model, including day, time, and seasonality. “The output [of the model] was how many servers we should deploy per rack,” he said.
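For illustration, here is a minimal sketch in Python (using SciPy) of what such a rack-packing linear program might look like. The server types, power figures, and rack limits below are assumptions made for the example, not Groupon’s actual model or data:

```python
# A hypothetical rack-packing linear program: choose how many servers
# of each type to place in a rack so power is used as fully as possible
# without exceeding space or power budgets. All figures are assumptions.
from scipy.optimize import linprog

# Server types: [web, database, cache]
power_draw = [350.0, 500.0, 200.0]  # watts per server (assumed)
rack_units = [1, 2, 1]              # rack units per server (assumed)

RACK_POWER_W = 8000.0               # assumed per-rack power budget
RACK_SPACE_U = 42                   # standard rack height

# linprog minimizes, so negate the power-utilization objective.
c = [-p for p in power_draw]

# Constraints: total power and total space must fit within the rack.
A_ub = [power_draw, rack_units]
b_ub = [RACK_POWER_W, RACK_SPACE_U]

# A crude stand-in for "striping": cap each server type per rack so a
# single rack failure can't take out an entire function.
bounds = [(0, 10)] * len(power_draw)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")

# LP yields fractional counts; a production model would use integer
# programming. Flooring keeps this solution feasible.
servers_per_rack = [int(x) for x in res.x]
used = sum(n * p for n, p in zip(servers_per_rack, power_draw))
print("Servers per rack (web, db, cache):", servers_per_rack)
print(f"Power used: {used:.0f} W of {RACK_POWER_W:.0f} W")
```

In a real deployment the variable set would be much larger, covering the day, time, and seasonality factors Chatha describes, but the structure of the model is the same: a linear objective maximized under linear constraints.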
The approach replaces intuition and prior knowledge, which is what data center operators usually rely on when making such decisions, Chatha said.
From Cloud to Wholesale Data Center Leases
Groupon started in 2008 and for its first several years relied on public cloud services for its infrastructure. Around 2011, the company’s user base reached a size where it made more sense to switch to an on-premises data center model.
Today, it operates primarily out of its own data centers, leased from wholesale data center providers on both coasts of the US, as well as in Europe and Asia. Chatha declined to specify how many data centers the company had or where. The company also has a retail colocation footprint, which comes into the fold as it acquires other firms, but that footprint usually gets consolidated into the wholesale facilities over time.
Read more: Need for Speed: How Groupon Migrated to Node.js
Keeping up with growth is always a moving target for its data center team. At the time of Groupon’s IPO in 2011 – which was then billed as the biggest IPO by a web company since Google – Groupon had fewer than 1,000 deals available, Nicholas Halliwell, the company’s spokesman, said. At the end of the third quarter of 2015, the company was advertising 550,000 active deals globally, with about 290,000 of them in North America.
Much of the capacity planning work Chatha and his team do revolves around supporting the next holiday season, he said. That’s when demand is highest. They also track new features and their impact on data center demand.
Containment, White Infrastructure to Optimize Efficiency
At RagingWire, a data center provider majority-owned by Japan’s NTT Communications, Groupon has leased a 5,000-square-foot data hall with 1 MW of power capacity, much of which is meant to accommodate future growth, Chatha said. Its deployment is within the data center provider’s CA3 building, which was completed last year.
The Groupon deal is an example of RagingWire’s transition from a retail colocation model to a mix of retail and wholesale, the data center provider’s VP of marketing, Jim Leach, said.
To optimize for energy efficiency, the Groupon team used containment pods, rather than curtains, and took extra measures to make sure there are no air leaks in its cold aisle containment system. “We’ve taken it to the next level,” Chatha said about the meticulous sealing exercise his team went through.
They also deliberately used all-white infrastructure in the environment, which also helps save energy, according to Chatha. “Everything within the environment is white,” he said. “There’s about a 3 percent savings on energy costs when you’re doing white-on-white infrastructure.”
White infrastructure means you need less lighting. It also helps with cooling. “Black color tends to hold heat, where white doesn’t,” he said.
While these may seem like relatively small efficiency wins, they add up to substantial savings as web companies grow their data center infrastructure. Small energy savings across a row of racks can turn into millions of dollars in energy cost savings at scale.
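A back-of-the-envelope calculation shows how such percentages compound. The load and utility-rate figures below are assumptions for illustration, not numbers from Groupon or RagingWire:

```python
# What a 3% energy saving means at scale. The load and rate figures
# are illustrative assumptions.
HOURS_PER_YEAR = 8760

def annual_energy_cost(load_mw: float, usd_per_kwh: float) -> float:
    """Annual utility cost of running `load_mw` megawatts continuously."""
    return load_mw * 1000 * HOURS_PER_YEAR * usd_per_kwh

cost_1mw = annual_energy_cost(1.0, 0.10)      # one 1 MW wholesale data hall
print(f"1 MW for a year: ${cost_1mw:,.0f}")   # ~$876,000
print(f"3% saving:       ${cost_1mw * 0.03:,.0f} per MW per year")  # ~$26,000
# Across tens of megawatts and multiple years, savings of this size
# do indeed run into the millions of dollars.
```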
5:14p

The Colliding and Complementary Worlds of DevOps, Big Data and Data Management

Nitin Donde is CEO of Talena, Inc.
To succeed in today’s data-rich and data-centric world, companies are building new, high-value applications on top of NoSQL, Hadoop, and other modern data platforms. According to IDC, the big data market will reach $48 billion by 2019. At the same time, DevOps processes are rapidly penetrating the Global 2000, impacting the very companies that are adopting these new data platforms. These teams and their processes are now responsible for managing data infrastructures that are orders of magnitude larger than anything companies have dealt with previously. As a result, big data, DevOps, and data management are rapidly intersecting, and the speed at which groups are expected to support this new world order and launch new applications raises a new set of challenges, considerations, and questions, including:
- How do data management principles change in the world of Big Data?
- How can agility and security co-exist in modern data environments?
Let’s address each of these issues in more detail.
The Implications of Scale
Big data applications run on scale-out architectures that can reach thousands of nodes and petabytes of data. For example, Apple deploys Apache Cassandra across at least 75,000 nodes, alongside other big data platforms, to power a number of its consumer-facing applications. This scale has a number of implications for data management principles, including those around backup and recovery. A single human error that accidentally deletes tables can result in the loss of hundreds of terabytes of data, not just hundreds of gigabytes. At this scale, data sets take dramatically longer to rebuild, proportionately increasing the opportunity cost of time spent on this activity, not to mention the business impact of the lost data itself. To put it another way, an accidental data loss means that dozens of engineers have to halt other business-critical projects to rebuild the lost data set – a multi-million-dollar bill in revenue loss and opportunity cost.
DevOps teams often have service level agreements (SLAs), whether internal or external. As a result, they often have to rethink their assumptions about data recovery time, which were built around traditional application data sets, and change the underlying recovery architecture to support these SLAs.
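A simple back-of-the-envelope sketch makes the point. The data set sizes and the sustained-throughput figure below are assumptions for illustration:

```python
# Why recovery-time assumptions built for traditional data sets break
# at big data scale: restore time grows linearly with volume. The data
# set sizes and sustained throughput are assumed for illustration.
def restore_hours(data_tb: float, throughput_gbps: float) -> float:
    """Hours needed to restore `data_tb` terabytes over a link
    sustaining `throughput_gbps` gigabits per second."""
    bits = data_tb * 8e12                    # terabytes -> bits
    return bits / (throughput_gbps * 1e9) / 3600

# A 500 GB traditional database vs. a 300 TB big data cluster,
# both restored over a sustained 10 Gbps pipe:
print(f"500 GB: {restore_hours(0.5, 10):.2f} hours")  # ~0.11 hours
print(f"300 TB: {restore_hours(300, 10):.1f} hours")  # ~66.7 hours, nearly three days
```

An SLA written around the first number simply cannot survive contact with the second without a different recovery architecture.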
The Importance of Self-Service
Companies that employ DevOps principles focus on rapid deployment frequency. In a big data world, this means figuring out how best to give engineering and data science teams self-service access to production data sets to facilitate rapid application iteration: the concept of waiting for data, or for custom scripts to be written every time data is needed, is completely antithetical to the DevOps movement.
On the other hand, data often contains confidential or personally identifiable information, and data breaches remain common and costly. A 2015 report by the Ponemon Institute put the average consolidated total cost of a data breach at $3.8 million, a 23 percent increase since 2013. Consumer trust is also important, and consumers are increasingly aware of how easily hackers can access their data.
Self-service access, therefore, has to be paired with appropriate protection for personally identifiable or other confidential information, whether that’s in the form of data masking or data encryption, or both. Most people don’t naturally think about data masking as part of their “always-on” data strategy, but without it you run the risk of compromising the trust of your users. Even a “small” breach of data can significantly impact the reputation and viability of a business.
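As a concrete sketch of the data-masking idea, the Python snippet below deterministically pseudonymizes PII fields before a production record is cloned into a self-service environment. The field names, key handling, and record shape are illustrative assumptions, not a description of any particular product:

```python
# Deterministic masking of PII before data is exposed for self-service
# access. A keyed hash preserves joinability across tables while making
# the original values unrecoverable. Field names and key handling are
# illustrative assumptions.
import hashlib
import hmac

MASK_KEY = b"fetch-from-a-secrets-manager"   # assumed; never hard-code in practice
PII_FIELDS = {"email", "phone", "full_name"}  # assumed schema

def mask_value(value: str) -> str:
    """Replace a value with a stable, irreversible pseudonym."""
    digest = hmac.new(MASK_KEY, value.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

def mask_record(record: dict) -> dict:
    """Mask only the PII fields, leaving analytical fields intact."""
    return {
        key: mask_value(val) if key in PII_FIELDS and isinstance(val, str) else val
        for key, val in record.items()
    }

# Usage: applied to each record as it is copied into the test cluster.
prod_row = {"user_id": 42, "email": "jane@example.com", "city": "Chicago"}
print(mask_record(prod_row))  # email becomes a 16-character pseudonym
```

Because the pseudonym is deterministic, engineers can still join masked data sets on the masked field, which is what makes this approach compatible with the self-service workflows described above.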
Conclusion
At the heart of innovations across markets like IoT and mobile, and industries such as retail, banking, and healthcare, is data. Refreshingly, data is also increasingly understood as the currency that drives the value of the companies that use it optimally. As companies continue to migrate off legacy systems in favor of platforms designed to support today’s application needs, they must also plan accordingly to ensure issues around scale and security are fully considered and addressed. These are top-of-mind issues for DevOps teams, and a focus on the entire application lifecycle is key to modern data management. The right planning has a big upside, and the risks related to lost or compromised data are far too great to ignore.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
7:07p

Amazon Plans First Cloud Data Centers in Canada

Once again following in its rival Microsoft’s footsteps, Amazon Web Services is gearing up to launch a cloud region in Montreal. These will be Amazon’s first cloud data centers north of the US border.
The news comes one week after the announcement of the launch of the first AWS data centers in South Korea.
AWS is still the leader in cloud infrastructure services in terms of revenue, while Microsoft Azure appears to be a distant second, although Microsoft doesn’t break out its cloud services unit’s revenue. But Microsoft has been ahead of its rival in the physical reach of its cloud, an increasingly important part of competing for the dollars of cloud customers around the world, who care more and more about the location of their data.
Read more: The Billions in Data Center Spending Behind Cloud Revenue Growth
Twenty-two Azure regions are generally available today, and four more are awaiting general availability, currently open to select customers only. There are 12 AWS regions, with five more in the works.
Microsoft said in June it was building cloud data centers in Toronto and Quebec City, expecting to launch them this year. Chief AWS evangelist Jeff Barr didn’t mention a target launch date for Amazon’s first Canadian data centers in his blog post announcing the project.
Read more: Safe Harbor Ruling Leaves Data Center Operators in Ambiguity
In some geographies, Microsoft has been building and bringing online cloud data centers faster than Amazon. In India, for example, one of the most important emerging markets for cloud services, Microsoft announced the launch of data centers in three distinct locations last September. Amazon said last year it plans to bring cloud data centers to India sometime in 2016.
The two Washington State-based tech giants went nearly head-to-head in announcing plans to launch data centers in the UK, a crucial market where physical presence has become more important for cloud providers since last year’s annulment of the rules that governed storage of European users’ data in data centers outside of Europe. Microsoft said it was building cloud data centers in the UK about one week after AWS said it was doing the same.
7:46p

EU Hands Huge Government Cloud Contracts to BT

BT, the UK telecoms giant also known by the name of its subsidiary, British Telecommunications, has landed two government cloud contracts with the European Union, together worth more than €24 million over four years.
BT will provide public and private cloud services to more than 50 European government institutions, agencies, and bodies, from the European Parliament and European Council to the European Defense Agency. The company will serve as the European government’s private cloud provider and will be one of a handful of companies allowed to compete for public cloud business.
This is not the first time BT has won government business in Europe. In 2015, for example, it secured a €15.2 million voice services deal with the European Commission, and a €55.7 million contract to provide internet access to all major institutions, agencies, and bodies in all EU member states.
The government cloud will be hosted in numerous data centers around the EU. The company has a mixed data center strategy, using data center service providers in some cases and owning and operating its own facilities in others, Jason Cook, CTO of BT Americas, told Data Center Knowledge in an earlier interview.
Read more: BT Americas CTO on Data Center and Cloud Strategy
Although it offers its own public cloud infrastructure services, BT’s business strategy in that space is to be an aggregator, giving customers access to a variety of cloud and data center providers, including itself. Through this “Cloud of Clouds,” BT customers can access Amazon Web Services, Microsoft Azure, Salesforce, and Cisco’s cloud services, among others, or colocation providers Equinix and Interxion.
BT will be one of five providers competing for the EU government’s public cloud business.
7:57p

Seven Reasons to Feel Good About Cloud Services in 2016
By Talkin’ Cloud
2016 could provide many growth opportunities for cloud services providers. Here’s why:
1. Total Cloud Infrastructure Spending Could Grow
International Data Corp predicted total spending on cloud IT infrastructure (server, storage and Ethernet switch, excluding double counting between server and storage) would grow by 24.1 percent to $32.6 billion in 2015. In addition, IDC noted it expected cloud IT infrastructure spending to expand at a compound annual growth rate (CAGR) of 15.1 percent through 2019.
Read more: The Billions in Data Center Spending Behind Cloud Revenue Growth
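As a quick sanity check on that projection, here is the compounding arithmetic. The base figure and growth rate come from the IDC numbers cited above; the straight-CAGR extrapolation is only illustrative:

```python
# IDC's projection: $32.6B of cloud IT infrastructure spending in 2015,
# compounding at a 15.1% CAGR through 2019. Straight compounding is an
# illustrative simplification.
base_2015 = 32.6   # $B, IDC's 2015 estimate
cagr = 0.151

for year in range(2015, 2020):
    value = base_2015 * (1 + cagr) ** (year - 2015)
    print(f"{year}: ${value:.1f}B")
# 2019 comes out to roughly $57B under straight compounding.
```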
2. Cloud Security Will Remain a Top Priority
Cloud application security provider Elastica recently found that the cost of exposed data in software-as-a-service (SaaS) applications may total up to $13.85 million per incident. However, CSPs can resolve security issues for businesses, helping these companies protect the sensitive data they store in the cloud.
3. Cloud Storage Is Expected to Become More Widespread
A recent Soliant Consulting study revealed 36 percent of all data could be stored in the cloud by the end of this year. CSPs, meanwhile, could capitalize on the rising demand for cloud storage services by adding these offerings to their portfolios.
Read more: Seven Biggest Cloud Outages of 2015
4. Containers Are Becoming Increasingly Popular
Cloud app containerization could become increasingly popular in 2016 and beyond, thus providing many new growth opportunities for CSPs. In fact, a recent StackEngine study indicated hybrid cloud was one of the top motivators for using Docker containers, and 70 percent of respondents said they are already using Docker or evaluating it within their organizations.
5. There Is Rising Demand for Cloud-Based Video Conferencing Services
Entering the cloud-based video conferencing services market could deliver long-lasting benefits for CSPs. A recent Global Industry Analysts (GIA) report indicated the cloud-based video conferencing services market is expected to grow and could be worth $2.9 billion by 2020.
6. Many SMBs Want Cloud Solutions
Many small and medium-sized businesses (SMBs) continue to explore ways to utilize cloud solutions, which could give CSPs an opportunity to support these companies. Plus, a recent study from Carbonite (CARB) and market researcher IDC indicated SMBs are increasingly turning to cloud or hybrid solutions to achieve business continuity, which could drive growth in the backup-as-a-service (BaaS) and recovery-as-a-service (RaaS) segments.
7. Cloud Opportunities Extend Beyond Infrastructure
Today’s CSPs can offer customers support beyond traditional hosting capabilities. A recent study from Microsoft (MSFT) and 451 Research showed nearly 70 percent of the opportunity for CSPs now centers on application hosting (email and business applications), managed services (backup and disaster recovery) and security services (threat management).
This first ran at http://talkincloud.com/cloud-computing/7-reasons-feel-good-about-cloud-services-2016#slide-0-field_images-51551
10:55p

Time Warner Pitches Direct Cloud Connectivity via Equinix Data Centers
By Talkin’ Cloud
Time Warner Cable Business Class is offering connectivity to the Equinix Cloud Exchange, according to an announcement by the Time Warner Cable division this week.
The Equinix Cloud Exchange allows businesses to connect to several cloud service providers including AWS, Microsoft Azure, and Rackspace.
TWCBC delivers Ethernet connectivity to the Equinix Cloud Exchange. With the cloud exchange, businesses get a private network for accessing their cloud and physical IT infrastructure from anywhere in the US. Connections to the exchange are available at speeds from 10 Mbps to 10 Gbps.
Read more: Equinix CEO Unveils Aggressive Plan to Court Enterprises
“The demand for secure and high-capacity connectivity to cloud providers continues to increase as businesses migrate to the cloud to take advantage of the cost, agility, and adaptability of cloud services,” Satya Parimi, GVP, Product Management, Time Warner Cable Business Services, said in a statement. “TWCBC already offers secure and reliable Ethernet connectivity to our NaviSite cloud platform. With the announcement today, we can now meet connectivity requirements for customers who use other cloud service providers to meet their IT needs.”
“With Equinix Cloud Exchange, customers will have on-demand, direct access to multiple cloud providers, enabling them to select the cloud or clouds that best meet their needs. Working with TWCBC, we can offer enterprise customers reliable interconnection to multiple cloud service providers over a secure private connection,” said Jim Poole, Vice President of Service Provider Marketing, Equinix.
This first ran at http://talkincloud.com/cloud-computing/time-warner-cable-business-class-offers-connectivity-equinix-cloud-exchange