Data Center Knowledge | News and analysis for the data center industry
Monday, January 4th, 2016
1:00p
Top 10 Data Center Stories of the Month: December 2015
Here are some of the most popular stories that ran on Data Center Knowledge in December.
How the Colo Industry is Changing – Customers are getting smarter about what they want from their data center providers; enterprises use more and more cloud services, and the role of colocation data centers as hubs for cloud access is growing quickly as a result; technology trends like the Internet of Things and DCIM are impacting the industry, each in its own way.
Hot Data Center Startup Vapor IO Raises First Round of Funding – Vapor IO, which came out of stealth earlier this year with a radical new design of the data center rack and sophisticated rack and server management software, has closed a Series A funding round, led by Goldman Sachs, with participation from Austin’s well-known VC firm AVX Partners.
How Microsoft Plans to Win in Enterprise Hybrid Cloud – While Microsoft is behind Amazon in public cloud, it has no need to play catch-up inside the enterprise data center. That, combined with the second-largest public cloud business, puts it in a good position to dominate in hybrid cloud, which is widely touted as the cloud strategy of choice among enterprises.
Forget Hardware, Forget Software, Forget the Infrastructure – Enterprise IT has to forget about hardware, forget about the infrastructure, forget about software, and think more about getting its job done, which is delivering services or applications.
 David Cappuccio, VP and distinguished analyst at Gartner, speaking at the firm’s data center management conference in Las Vegas in December 2015
The Problem of Inefficient Cooling in Smaller Data Centers – The data center on campus operated by a university IT department; the mid-size enterprise data center; the local government IT facility. These facilities, and others like them, are data centers hardly anybody ever hears about. But they house the majority of the world’s IT equipment and use the bulk of the energy consumed by the data center industry as a whole.
Why Hyperconverged Infrastructure is so Hot – Hyperconverged infrastructure did not exist as a concept two or three years ago. Today, it is one of the fastest-growing methods for deploying IT in the data center, as IT departments look for ways to adjust to their new role in business and new demands that are placed on them.
IBM to Take Over AT&T’s Managed Hosting Business – IBM is taking over AT&T’s managed application and managed hosting services business, acquiring the equipment used to support those services and access to AT&T data centers where that equipment sits.
What (Hardware) You Need to Build an Azure Cloud in Your Data Center – Microsoft is preparing to launch the first preview release of Azure Stack – a private Azure cloud environment a company can stand up in its own data center that will look exactly like the public version of Azure to users and be seamlessly integrated with the public cloud.
Understanding the Different Kinds of Infrastructure Convergence – As company computing demands change, what will the architecture that supports modern businesses and their cloud initiatives look like?
DuPont Fabros Planning Massive Toronto Data Center – DuPont Fabros Technology, the wholesale data center provider that leases lots of space and power capacity to the likes of Microsoft, Facebook, and Yahoo, is expanding into Toronto, a growing data center market the company says is underserved by data center providers.
 At nearly 450,000 square feet, ACC7 in Ashburn, VA is the largest data center in DuPont Fabros Technology’s portfolio. (Photo: DFT)
5:48p
T5 Raises $70M to Fund Charlotte Data Center Construction
T5 Data Centers, an Atlanta-based data center provider that specializes in offering wholesale data center space, has closed a new $68.5 million credit facility to finance new data center construction at its Kings Mountain, North Carolina, campus just outside of Charlotte.
Joe Junda, a managing director at CIT Communications and Technology Finance, which arranged the credit, said Charlotte was a strong data center market. “The combination of favorable tax incentives, low power costs, dependable access to power, abundant fiber connectivity, and a strong base of corporate headquarters position Charlotte as a viable location in the data center market,” he said in a statement.
The latest 130,000-square-foot project is CIT’s second investment in a data center construction project by T5. The financial holding company financed T5’s Portland build last year.
T5, which also has data centers in the Atlanta, Colorado, Dallas, Los Angeles, and New York markets, partnered with private equity firm Iron Point Partners on the new Charlotte build.
While providing wholesale data center capacity is its bread and butter, in recent years, T5 has been expanding the variety of services it offers to its tenants. Other wholesale providers, such as the giant Digital Realty Trust, have been making similar moves, recognizing that enterprise customers increasingly want more from their data center providers.
In late 2014, T5 partnered with Carpathia Hosting, which has since been acquired by QTS Realty Trust, a major T5 competitor, to offer custom infrastructure solutions, such as managed hosting, cloud, and colocation. It’s unclear how the acquisition affected the partnership, but in September 2015, T5 rolled out colocation services of its own and direct network connections to cloud service providers at its Atlanta and Los Angeles data centers.
6:00p
Cloud Services are Eating the World
Shlomo Kramer is the Co-Founder and CEO of Cato Networks.
The cloud revolution is impacting the technology sector. You can clearly see it in the business results of companies like HP and IBM. To be sure, legacy technology providers are embracing the cloud, transforming their businesses from building and running on-premise infrastructure to delivering cloud-based services. The harsh reality is that this is a destructive transformation: for every dollar that exits legacy environments, only a fraction comes back through cloud services. That fraction is, in fact, the great promise of the cloud – maximizing economies of scale, efficient resource utilization, and smart sharing of scarce capabilities.
It is just the latest phase of the destructive force that technology applies to all parts of our economy. Traditionally, technology vendors touted benefits such as personnel efficiencies and operational savings as part of the justification for purchasing new technologies – a politically correct way to refer to fewer people, offices, and the support systems around them. This has now inevitably impacted the technology vendors themselves. Early indicators were abundant: Salesforce.com has displaced Siebel Systems, reducing the need for costly and customized implementations; and Amazon AWS is increasingly displacing physical servers, reducing the need for processors, cabinets, cabling, power, and cooling.
Cloud is Eating the World
Marc Andreessen argued in his 2011 Wall Street Journal article that “software is eating the world.” In my view, this observation is now obsolete. Today, cloud services are eating the world. Cloud services encapsulate and commoditize the entire technology stack (software, hardware, platforms, and professional services). This model is so impactful and irresistible that even capturing only a part of the value is a big win. This is why cloud services now include platforms, from the likes of Google, Microsoft, and Salesforce.com, as well as infrastructure, provided by vendors such as Amazon AWS, Microsoft Azure, and IBM SoftLayer.
Making IT Simple Again
Customer focus is increasingly shifting toward simplification of complex business and IT infrastructure, because complexity is both a technical and a business risk. It was a simpler world in the past: one provider delivered a total solution for all IT needs. This paradigm was deemed too rigid, so it was replaced by horizontal integration of best-of-breed components. Complexity was a side effect of this change, as customers had to integrate and then run these heterogeneous environments.
We are now seeing the pendulum swing again. Cloud services now offer a vertically integrated solution to multiple business problems. Choice is reduced in the sense that customers can’t dictate the compute, storage, or software cloud services will use, but complexity and cost are eliminated en masse. Ultimately, the proof is in the pudding: if the business value is delivered in a consistent fashion, with the right third-party validation for quality of service, the details don’t really matter.
Changing of the Guard
The era of cloud requires a new type of company that is agile and lean, like the cloud itself. Very few companies have the courage or the will to cannibalize their legacy revenue streams and embrace a new reality where there is simply less money and there are fewer resources available to get things done. Building a startup for the cloud era requires designing the company for the cloud economic model. That means investing in a high-quality software stack, customer support and success teams, and self-service, low-friction service delivery models. The modern cloud startup must do more with less, because its customers are doing the same.
The Future of Cloud: Security
Network security has yet to be extensively impacted by the cloud. Security technology is considered sensitive by large enterprises, limiting the sharing of data in the cloud, and regulations place constraints on customer data handling in the cloud. These forces may slow down the adoption of cloud technologies, but they will ultimately give way to the immense value the cloud offers to businesses.
Security will uniquely benefit from vertical integration with distinct domains, such as networking. Here is how: if you can monitor all networking activity (for example, in cloud-based networks), you can more easily detect anomalous activity within the infrastructure (e.g., DNS queries, session initiation and authentication, data exchange) and create a complete profile of the threat. The agility of the cloud enables rapid deployment of countermeasures without the need to perform slow and risky software updates on on-premise equipment.
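To make the idea concrete, here is a minimal sketch (not from the article; the log layout and host identifiers are hypothetical) of the kind of baseline-versus-current check a cloud-based network service could run over aggregated DNS query counts: flag any host whose query volume in the current window is far above its historical average.

```python
from collections import defaultdict
from statistics import mean, stdev

# Assumption: each "window" is a dict mapping a host identifier to its
# DNS query count for that time window, pulled from cloud-side network logs.

def baseline_counts(history):
    """Group per-window query counts by host."""
    per_host = defaultdict(list)
    for window in history:
        for host, count in window.items():
            per_host[host].append(count)
    return per_host

def flag_anomalies(current, history, sigma=3.0):
    """Return hosts whose DNS query volume in the current window is far
    above their historical mean - a crude stand-in for real threat profiling."""
    flagged = []
    for host, counts in baseline_counts(history).items():
        if len(counts) < 2:
            continue  # not enough history to build a baseline
        mu, sd = mean(counts), stdev(counts)
        observed = current.get(host, 0)
        if sd > 0 and observed > mu + sigma * sd:
            flagged.append((host, observed, mu))
    return flagged

# Example: host 10.0.0.7 suddenly issues far more DNS queries than usual.
history = [
    {"10.0.0.7": 40, "10.0.0.8": 55},
    {"10.0.0.7": 38, "10.0.0.8": 60},
    {"10.0.0.7": 42, "10.0.0.8": 58},
]
current = {"10.0.0.7": 900, "10.0.0.8": 61}
print(flag_anomalies(current, history))  # flags 10.0.0.7
```

A real service would profile many more signals (session initiation, authentication, data exchange volumes) and correlate them, but the principle is the same: because the provider sees all network activity, baselines are cheap to build and countermeasures can be pushed centrally rather than through on-premise software updates.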
Thanks to the innovations and elasticity of the modern cloud, network security can finally become less of a burden on staff and budgets, while the quality of service and the customer experience can dramatically improve. This is not a shot across the bow at network security incumbents. It is a recognition that the transformative power of the cloud will ultimately reach every business in the world, and IT security vendors, like all other IT vendors, will have to make a choice – embrace it or wither.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
6:36p
CyrusOne Winds Down Ties to Cincinnati Bell
Data center provider CyrusOne has wound down business connections with its former parent company, the US telecom Cincinnati Bell, which spun it off in an IPO in 2013.
The telco has sold all remaining stock in an operating partnership with the data center provider, as noted by Seeking Alpha, which tracks the stock market. CyrusOne has been gradually buying back its own stock from its former parent since 2014 and recently bought the last 7 million shares the telco still owned.
Cincinnati Bell acquired Carrollton, Texas-based CyrusOne in 2010 for $525 million from Abry Partners, a Boston private equity firm active in the data center business. CyrusOne thrived, but the parent company struggled to maintain growth and started exploring a spinoff in 2012.
Currently, Cincinnati Bell owns a 9.5-percent common-stock stake in CyrusOne but none of the operating partnership, which was set up following CyrusOne’s IPO. The telco owned more than 70 percent of CyrusOne after the float.
CyrusOne has a massive data center portfolio that stretches across the US, as well as locations in the UK and Singapore.
7:05p
Ian Murdock’s Role in Free and Open Source Software History
Via The VAR Guy
Ian Murdock, one of the unsung heroes of the free/open source software revolution and founder of the semi-eponymous Debian GNU/Linux operating system, has died. He leaves behind an important but little appreciated legacy as a programmer who helped bridge the gap between the Free Software Foundation and the Linux kernel in its early days.
Docker and Debian reported Murdock’s death on Dec. 30. Murdock was 42.
Murdock founded Debian in 1993, when he was an undergraduate at Purdue University. At the time, the Linux kernel, which Linus Torvalds first released in the fall of 1991, remained quite new. Most existing GNU/Linux systems prior to Debian lacked either professionalism or a commitment to the principles of software sharing and openness that the Free Software Foundation promoted.
“You’d flip through Unix magazines and find all these business card-sized ads proclaiming ‘Linux,'” Murdock wrote of the situation before Debian appeared. “Most of the companies were fly-by-night operations” that did little to advance free software.
In this context, Debian stood out as the first major GNU/Linux distribution that was created expressly to promote free software. As Murdock put it in the 1994 Debian Manifesto, “Debian Linux is a brand-new kind of Linux distribution. Rather than being developed by one isolated individual or group, as other distributions of Linux have been developed in the past, Debian is being developed openly in the spirit of Linux and GNU.”
Debian was also important because it formed the basis for the first major collaboration between the Free Software Foundation and the Linux community. Prior to Debian’s launch, other programmers had used GNU software in conjunction with the Linux kernel to build operating systems, but they had not received the official endorsement of the GNU project or the Free Software Foundation.
Indeed, GNU developers at the time were paying relatively little attention to Linux. They dismissed Torvalds’s kernel in the summer of 1992 as a kernel that “runs only on 386/486 AT-bus machines.” They added, “porting to non-Intel architectures is likely to be difficult as the kernel makes extensive use of 386 memory management and task primitives.”
Their stance was unsurprising. Linux was a monolithic kernel developed “just for fun” by an amateur programmer in Finland. Meanwhile, GNU’s professional programmers were working in Massachusetts on their own kernel, Hurd. Hurd was a microkernel that was supposed to be much more sophisticated than Linux. In 1993, they had little reason to see Linux as anything more than a hobbyist project that would prove no more enduring than other free Unix-like kernels, such as Minix, that long ago passed into oblivion.
Yet their attitude toward Linux changed with Debian. Starting in June 1994 GNU programmers promoted Debian GNU/Linux as “a complete, full-featured system based on GNU and Linux that is easy to install and configure.” By the spring of 1995 the GNU project was distributing Debian on CDs, which marked the first time that it shipped a Linux-based operating system as part of its official software offerings.
Murdock also coined the term GNU/Linux to refer to operating systems based on GNU software and the Linux kernel. It was an alternative to Lignux, the name Richard Stallman had proposed.
Murdock left his leadership position with the Debian project in March 1996, when he was replaced by Bruce Perens. He subsequently worked for Sun Microsystems, Salesforce and Docker, where he was employed at the time of his death.
This first ran at http://thevarguy.com/open-source-application-software-companies/ian-murdocks-significance-free-and-open-source-software-h
7:45p
Toyota to Build Data Center for Connected-Car Data
Toyota, the world’s largest automaker, is planning to build a data center specifically to collect and analyze data from cars equipped with a new type of Data Communication Module (DCM), an upcoming feature that will enable the company’s next-generation connected-vehicle framework, which will transmit data over cellular networks.
“To build the IT infrastructure needed to support this significant expansion of vehicle data processing, the company will create a Toyota Big Data Center (TBDC) in the Toyota Smart Center,” the company said in a statement. “TBDC will analyze and process data collected by DCM, and use it to deploy services under high-level information security and privacy controls.”
Connected cars are one of the quickly growing sources of data, collectively referred to as the Internet of Things, that are expected to drive growth in demand for data transmission, storage, and processing capacity.
Toyota did not provide details about the new data center it is planning to build for connected car data, such as size, power capacity, or location. The company made the announcement in conjunction with CES 2016, the big consumer electronics trade show taking place this week in Las Vegas.
Toyota plans to start offering the new DCM as a feature in 2017 models on the US market and to expand to other markets in the future. The company said connecting cars over cellular networks expands their ability to transmit data for products and services.
Toyota wants to build a DCM architecture that will be uniform around the globe by 2019. Today, the devices are different from region to region or from country to country. It is also planning to consolidate global DCM communications.
The next-gen DCM will do things like provide an emergency notification when an airbag is deployed during a traffic accident. The emergency notification system will come as a standard feature, according to the automaker.
Many of the services that use DCM will be provided by companies other than Toyota. The automaker has partnered with a company called UIEvolution, which will build an application that will provide vehicle data to Toyota-authorized third-party service providers.
8:00p
Web Hosting and Cloud Computing Predictions for 2016 and Beyond: Part 1
Via theWHIR
The evolution of cloud and the web touches nearly all aspects of our businesses and our lives. The past few years have provided some indications about how things can change, and how important it is to be looking for the next trends.
As we think about 2016, cloud and web hosting will continue to influence the world we live in.
Some of the trends outlined in this post reflect how clouds will become places where applications and data converge for greater efficiency… but places of convergence are valuable targets for criminals.
E-Commerce Moves Further Into the Real World
Retail will converge onto a single platform; online retailers might venture into real-world locations
We’ve already seen over the past few years how retailers have had to adopt e-commerce in order to compete. But in 2016, cloud technologies promise to get rid of the barrier that separates e-commerce from brick-and-mortar shops.
Some of the features that will come are inventory management and logistics, as well as further integration with back-of-house operations like ordering and accounting.
Many of the e-commerce providers like Shopify are offering their own physical Point-of-Sale systems. And companies whose business model is largely based around POS (like Square) are broadening their offerings with useful integrations with e-commerce platforms like Bigcommerce and Ecwid.
Eventually, it will be impossible to provide e-commerce without offering a wide spectrum of integration between accounting, online sales, and physical POS. There are further opportunities for integration with CRM, ticket support, invoicing, etc.
There’s also an interesting trend in which successful online retailers like Montreal’s Frank & Oak have used their online success as the basis for a recent move into real-world stores. So, just as traditional retailers had to adopt e-commerce, online retailers might have to open physical locations to take their operations to the next level.
Website and App Builders Come into Their Own
Web hosts will compete to have the best building utilities; tools to build sites and apps lower the barrier to entry, but specialists will likely continue to be relevant
Consumer-oriented site builders have a long history, and they’ve come a long way from things like Netscape Composer. The current line-up of tools are becoming more sophisticated, and many have integrations that make it easy to add content from social networks like Facebook, Instagram and Twitter.
Most shared web hosts have had obligatory (and usually quite bad) site building utilities for customers to clumsily build their own sites. The trend for 2016 is that every web host will have to have a modern and sophisticated site builder.
In many ways, the bar has risen for website building tools. Web hosts like Squarespace, Wix and, most recently, PageCloud have put their website builder capabilities at the forefront of their business, showing this is a feature customers care about.
Also, while it’s not exactly a site builder, the WordPress Content Management System essentially provides much of the same functionality needed to build a website. WordPress themes (like Divi from Elegant Themes) make complicated design and functionality easy to implement. Web hosts like WP Engine base their entire business around hosting WordPress users, and mass-market hosts like GoDaddy have come out with managed WordPress products.
In 2016, many web hosts without a good site builder will either code their own or acquire companies with the capabilities they need, or even develop services to support open-source tools.
As web hosts begin to take website building more seriously, companies like Squarespace and Wix will face more competition from traditional web hosts.
While PageCloud has boasted about how its drag and drop interface is revolutionary, it’s doubtful it will catch on because it creates sloppier code than its competitors. In an early preview of PageCloud, it seemed that:
- Its main focus is on the desktop experience, and the mobile experience is an afterthought;
- Element styles are applied inline to each element rather than through CSS stylesheets that can change many elements at once and encourage uniformity;
- Limitations in navigation menu tools (combined with the inline styles) make it difficult to build anything more than a single-page site.
PageCloud’s idea is that regular users just want to make simple pages and don’t care about these details. Despite PageCloud’s ability to sign up many initial users, its logic is ultimately flawed, because developers would find this a frustrating tool to use – unlike something like WordPress, which does a better job of complying with current web conventions and whose functionality can be extended through plugins.
While you might think website building tools are ideally aimed at DIY consumers who want to bypass specialists, many of these building platforms are used by professionals to simplify their jobs. Web development shops tend to use applications like WordPress as the basis for their client sites rather than coding sites from scratch. Any non-standard functionality can be added through readily available plugins or custom-coded as a last resort.
Increasingly, the role and value of digital agencies will shift from their coding abilities towards becoming experts at assembling all the elements of a website (images, text, videos) and arranging them for the client using a variety of tools and seldom touching code.
In 2016, we’ll also be closer towards a reality where more people can build mobile apps, but the biggest developments are really aimed at making things easier for professionals.
This is very clear when it comes to an app development tool like the WaveMaker rapid development platform for hybrid iOS and Android apps. The platform takes a lot of the work out of coding native device functionality (like GPS or camera access) into an application, but it also simplifies application maintenance so that, for instance, an app continues to work with new OS updates that may scrap some APIs.
New Security Risks for Organizations from IoT and Cloud
IoT devices could be an easy entry point for attackers; New evidence suggests that malware can trick sandboxes and break free from VM isolation; “Ghostware” infiltrates systems and goes undetected
Just as it seems organizations have gotten used to their security perimeter extending beyond the walls of their data center to external clouds and smartphones, more “things” are about to connect to the network – and this could increase risk. In 2015, we saw proofs of concept for attacks where a compromised IoT device network could provide a foothold within corporate networks to “land and expand” or malware could exploit communication protocols between the devices and the network for a man-in-the-middle attack. Worms and viruses targeting IoT devices could also propagate (and persist) among millions or billions of IoT devices expected to come online in coming years.
According to a report from FortiGuard Labs, the threat research division of cybersecurity provider Fortinet, IT will be dealing with more sophisticated malware in the coming years directed at cloud technologies.
Virtualization might not be able to provide the isolation needed to keep threats within VMs. Vulnerabilities like Venom suggest that malware could escape from a guest virtual machine through a flaw in the hypervisor and access the host operating system, meaning that an infected client system could compromise an entire public or private cloud.
FortiGuard also predicts a variety of malware it calls “ghostware” because it will erase all indications of compromise, so that organizations can’t track the extent of data loss or which systems are compromised.
Blackhat hackers are also finding ways to deceive application sandboxes (the bomb-disposal containers where suspicious code can be detonated to reveal a malicious payload). Researchers have found “two-faced malware” that behaves differently during a sandbox inspection, so it passes inspection and only delivers its payload when executed on the system proper.
2015 was unusual because “mega breaches” that cost organizations millions in damages became commonplace, and these mega breaches are likely to continue well into the future, according to a recent poll of Chief Information Security Officers.
And it’s not just businesses like Sony and Target in the cross hairs. Joe Adornetto, CISO of Quest Diagnostics, noted that healthcare continues to be subject to attacks attempting to steal health records. “No other single type of record contains so much Personally Identifiable Information (PII) that is often linked to financial and insurance information and can be used for various attacks,” he stated.
Anything with value, essentially, is fair game for criminals and worth their efforts. This is unlikely to change in the foreseeable future.
This first ran at http://www.thewhir.com/web-hosting-news/web-hosting-and-cloud-computing-predictions-for-2016-and-beyond-part-1