Data Center Knowledge | News and analysis for the data center industry
Thursday, February 4th, 2016
1:00p
RagingWire Bets on Cloud, Big Data with New Focus on Wholesale
Much of the future IT infrastructure will be split between massive wholesale data centers and public cloud services, a big part of the latter sitting in those wholesale data centers. Doug Adams, who was recently appointed as president of data center service provider RagingWire (he was previously a senior VP at the company), believes there will be little room for anything between those two models, things like hosting or managed services.
There will be pure cloud on one side and pure wholesale data centers on the other. Managed services, he says, is not a winning model. This is why RagingWire, majority-owned by the Japanese telecommunications giant NTT Communications, is betting on wholesale data center services, its new strategic focus.
The data center services industry is starting to mature and commoditize, and in a mature and commoditized market, pricing is king, Adams explains. The most effective way to slash prices is to reduce cost by building at massive scale, which is why RagingWire builds massive data center campuses.
Read more: RagingWire Takes its Massive-Scale, Luxury-Amenities Data Center Model to Texas
Role in NTT’s Global Data Center Expansion
NTT has big global data center services ambitions, and RagingWire plays a key role in achieving its parent company’s goals, charged with expanding NTT’s presence in North America, the world’s biggest data center services market. NTT’s other acquisitions – Gyron and e-shelter in Europe and NetMagic in India – are tasked with doing the same in their respective markets.
The Japanese giant plans to spend up to $3 billion on data center expansion between now and 2020, and more than half of that will be spent on US data centers, Adams says.
NTT’s goal is to have substantial data center capacity in the top 15 cities that together represent 75 percent of the global data center colocation market, he says. Today, the company covers only nine of them.
RagingWire’s immediate expansion plans are to establish data centers in New York, Silicon Valley, Chicago, and one other West Coast market, which will end up being Los Angeles, Phoenix, or eastern Washington. e-shelter will build in Paris and Amsterdam in the near future, Adams says.
Cloud, Big Data, IoT Driving Wholesale Demand
Cloud services, Big Data, and Internet of Things applications are exploding, and companies behind them are gobbling up data center space.
Microsoft alone leased nearly 30MW of wholesale data center capacity across three locations last year, according to a report by the commercial real estate firm North American Data Centers. Oracle signed two wholesale leases in North America last year, about 5MW each. Apple signed two 6MW leases, and Amazon signed a 2MW lease in Canada and a 130,000-square-foot one in Silicon Valley, although it’s unclear what the power capacity of the latter was, the firm said.
Read more: Who Leased the Most Data Center Space in 2015?
Companies that don’t provide cloud infrastructure but offer other widely used internet services – the likes of Uber, eBay, Apple, or Twitter – use massive amounts of data center capacity for their Big Data analytics engines, in addition to supporting their end-user web or mobile applications.
Uber, for example, signed at least three wholesale data center leases last year, totaling 14MW, according to North American Data Centers. eBay is preparing for another big data center capacity expansion with Switch in Reno, Nevada.
Car makers and consumer electronics companies, which are investing a lot of money into IoT applications, are also taking down wholesale data center space in big chunks to aggregate and process device data, Adams says. “I can’t tell you how many huge databases we have sitting in our data centers now,” he says.
Competition for these companies’ business among data center providers is tough, and cost decides a lot. “We’re a commoditizing and maturing market, and pricing becomes very important.”
RagingWire, of course, is not alone in going after the opportunity to provide data center space to cloud providers and other major Big Data and internet-driven businesses, and there are as many approaches to pursuing it as there are data center providers. DuPont Fabros Technology also recently re-focused on pure-play wholesale, while Digital Realty Trust pursues a hybrid strategy that combines wholesale with retail colo and interconnection, saying big customers find the ability to connect to the rich ecosystem of players in retail colo attractive.
Those are just a couple of examples. There are also companies like Equinix, CoreSite, Vantage, Infomart, and Server Farm Realty, among others, all pursuing the big opportunity to house infrastructure for the cloud’s biggest brands.
4:00p
Open Source or Open Architecture? Big Data Needs Both
Chris Selland is VP of Business Development, Big Data Platform, at Hewlett Packard Enterprise.
The act of publishing source code, in and of itself, doesn’t necessarily make a platform more useful. Making that source code extensible matters at least as much, especially in the era of open application programming interfaces (APIs), where many of the most useful apps are made so by other apps. Modern enterprises need both open source software and open architectures to take full advantage of Big Data.
This article will focus on how we reached this point, and provide a blueprint for CIOs who are evaluating open source and Big Data tools.
The Advantage of Extensibility
Think about the applications and services people use every day and how many of them integrate with one another seamlessly: Google Maps extends Uber to provide location tracking, Uber extends OpenTable to facilitate meal delivery, Netflix extends the Apple TV interface to broaden viewers’ entertainment options, and so on. Openness is crucial for apps that exist in a connected world, creating and consuming information at a pace never before seen. According to a Domo survey of social media usage, each minute around the world:
- Instagram users like 1.7 million photos
- Tinder users swipe over 590,000 profiles
- Vine users play 1 million videos
- Facebook users like 4.1 million posts
- Twitter users send just over 347,000 tweets
Turning all this data – and other data like it – into useful capabilities for customers and actionable intelligence for the enterprise is a massive task that modern enterprises can’t afford to ignore. As a result, Big Data platforms are growing more popular by the day. However, it is important to keep in mind that we’re early in the lifecycle of this technology, and tomorrow’s platforms could look dramatically different from what we have today.
Growth will come in two ways: from open source development and from the flexible APIs that comprise an open architecture.
The Evolving Conversation
Think of the potential intelligence that exists in social media. When customers willingly reveal to you what they want and how they’ll consume it – without commissioning a survey – it can pay to listen. Social data affords us just such an opportunity – if we can extract the signal from the noise. This is just one reason research firm IDC says the overall market for Big Data technology and services is on track to grow 26.4 percent annually through 2018. By that point, the firm expects Big Data spending to top $41.5 billion annually.
Chief executives and their immediate subordinates could drive a big chunk of that growth – as long as it yields a better understanding of the factors that drive their businesses, as well as of their customers, supply chains, operations, and competitive environments. In January, the Economist Intelligence Unit (EIU) surveyed 395 such executives and found that 48 percent believe Big Data is a useful tool, while 23 percent say the technology will revolutionize the way businesses are managed.
But that can’t be done when data is siloed, or when increasingly massive volumes of data make it harder to extract the insight that lies inside. What these leaders often fail to realize is that it takes a fully open Big Data system – built on open source software, yet boasting an open architecture – to deliver the value they crave.
Why Digital Hoarding Isn’t as Bad as it Sounds
Call it the price of never throwing anything away. Sure, storage is getting cheaper. Computers are getting more powerful. Networks are getting faster. None of it matters if your Big Data platform is closed off from accepting information from useful apps – or siloed in a way that prevents you from seeing the interrelationships and cross-correlations between different data sources. However, improving data center economics is making this a less costly, but more complex, problem.
Look at the storage market. The industry is moving toward affordable solid-state flash, which can now be purchased on a per gigabyte basis for $1.50 or less. Companies are cashing in by adopting flash storage in greater quantity as it scales to heretofore unheard-of capacities. On the other side, new applications and devices are generating ever larger volumes of data. In a market where near-infinite storage can be had for so little, there’s little to fear from generating too much data – especially with so many startups basing their business models on monetizing that data.
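To put that price point in perspective, here is a tiny back-of-the-envelope calculation; the 500 TB archive size is a hypothetical example, not a figure from the article.

```python
# At roughly $1.50 per gigabyte for flash (the price cited above), even a large
# analytics archive is a modest line item. The capacity below is hypothetical.
PRICE_PER_GB = 1.50
capacity_gb = 500 * 1000  # 500 TB expressed in GB (decimal units)
print(f"${PRICE_PER_GB * capacity_gb:,.0f}")  # $750,000
```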
Couple that affordable capacity with the rapid advance of the Internet of Things (IoT) – machines that can create gigabytes in microseconds and networks, wired or wireless, that transmit that data just as quickly – and you have the infrastructure of an Idea Economy.
The opportunity is growing, but so are the challenges. Those who can most effectively turn raw data into intelligence at scale, and faster than rivals, will be the large-cap winners of tomorrow.
The Future of Big Data is Open
Next-generation, data-centric businesses (such as Uber, Facebook, Airbnb, and many others) that turn information into products are successful because they’re able to quickly gather and process data from a wide variety of sources. Open source development has given us the lowest-cost processing platform in history (i.e., Hadoop), while open architectures ensure that the right data gets to the right place at the right time – and is used in the right capacity to optimize insight.
Think of an open Big Data architecture as a standard combustion engine. Both have intakes. For the engine, it’s air, gasoline or diesel fuel, and electric power delivered by a battery. Raw fuel becomes motion. In the same way, Big Data platforms have intakes called data sources: traditional enterprise data (ERP, CRM, EDW), machine data (Internet of Things), and human data (social data, audio, video, text). Analytics are what turn the raw material into intelligence – just as a powertrain turns air, fuel, and electricity into motion.
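To make the “intakes feeding a powertrain” picture concrete, here is a minimal Python sketch of the three data-source types flowing into a single analytics step. All names, record shapes, and values are hypothetical illustrations, not a description of any particular HPE product.

```python
from dataclasses import dataclass
from typing import Dict, Iterable, List


@dataclass
class Record:
    source: str   # "enterprise", "machine", or "human"
    entity: str   # e.g., a customer ID the signal is about
    value: float  # a normalized signal extracted from the raw data


def enterprise_source() -> List[Record]:
    # Stand-in for ERP/CRM/EDW extracts (values are made up).
    return [Record("enterprise", "cust-42", 0.8)]


def machine_source() -> List[Record]:
    # Stand-in for IoT/device telemetry.
    return [Record("machine", "cust-42", 0.6)]


def human_source() -> List[Record]:
    # Stand-in for social, audio, video, or text signals.
    return [Record("human", "cust-42", 0.9)]


def analytics(records: Iterable[Record]) -> Dict[str, float]:
    # The "powertrain": blend all intakes into one score per entity.
    totals: Dict[str, float] = {}
    counts: Dict[str, int] = {}
    for r in records:
        totals[r.entity] = totals.get(r.entity, 0.0) + r.value
        counts[r.entity] = counts.get(r.entity, 0) + 1
    return {entity: totals[entity] / counts[entity] for entity in totals}


if __name__ == "__main__":
    intake = enterprise_source() + machine_source() + human_source()
    print(analytics(intake))  # roughly {'cust-42': 0.77}
```

Running it prints one blended score per entity, the code-level analogue of several intakes producing a single output.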
Final Thought
After years of optimizing infrastructure to handle modest volumes of information, enterprises now find that affordable storage and compute, combined with powerful analytics software, make the very idea of tossing away or ignoring data unthinkable.
We need Big Data platforms to process it all, separate the wheat from the chaff, and turn activity into insight, and we need these architectures to be both open and integrated. Only by ingesting the widest range of information, processing it at breakneck speed, and delivering the insight enabled by open and integrated systems can businesses hungry for new sources of profit find the answers they seek. For the growing number of CIOs evaluating Big Data platforms, that’s both a cautionary tale and a call to action. Is your organization ready to heed it?
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
8:37p
Nlyte Acquires FieldView in DCIM Software Consolidation Move
Nlyte Software, one of the top data center infrastructure management (DCIM) software providers, has acquired FieldView Solutions, another DCIM software company, which has developed advanced real-time monitoring and data analytics capabilities for data center managers.
Each company’s strengths complement the other’s, and many data center operators have deployed both. The combination of the two makes for a stronger rival to the big data center infrastructure vendors that also sell DCIM suites, because customers generally prefer to get all the functionality they need in a single suite from a single vendor.
FieldView’s unique strengths are real-time data collection and predictive analytics. Its software simulates the entire electrical infrastructure chain in a data center and analyzes it to uncover single points of failure or to predict what would happen if one or multiple devices in the chain failed.
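As a rough illustration of that kind of power-chain analysis (not FieldView’s actual implementation), the sketch below models a simplified electrical chain as a graph and flags devices whose individual failure would cut power to a rack; the device names and topology are invented.

```python
# Simplified power chain as a directed graph: each device lists what it feeds.
# Device names and topology are hypothetical.
POWER_CHAIN = {
    "utility": ["ups-a", "ups-b"],
    "ups-a": ["pdu-1"],
    "ups-b": ["pdu-1", "pdu-2"],
    "pdu-1": ["rack-1"],
    "pdu-2": ["rack-1", "rack-2"],
    "rack-1": [],
    "rack-2": [],
}


def reachable(chain, source, failed):
    """Devices still fed from `source` when the devices in `failed` are down."""
    seen, stack = set(), [source]
    while stack:
        node = stack.pop()
        if node in failed or node in seen:
            continue
        seen.add(node)
        stack.extend(chain.get(node, []))
    return seen


def single_points_of_failure(chain, source, loads):
    """Devices whose individual failure cuts power to at least one load."""
    spofs = {}
    for device in chain:
        if device == source or device in loads:
            continue
        lost = [load for load in loads
                if load not in reachable(chain, source, {device})]
        if lost:
            spofs[device] = lost
    return spofs


print(single_points_of_failure(POWER_CHAIN, "utility", ["rack-1", "rack-2"]))
# {'ups-b': ['rack-2'], 'pdu-2': ['rack-2']} -- rack-2 has no redundant feed
```

The same reachability check, run against combinations of devices rather than one at a time, is how a tool could also predict the impact of multiple simultaneous failures.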
Read more: Who is Winning in the DCIM Software Market?
Nlyte’s existing real-time monitoring capability is limited in scalability, the company’s CEO Doug Sabella said. FieldView, whose technology had its start in a custom software project for one of the world’s largest financial services giants, was built for massive scale from the beginning.
The way Sabella put it, Nlyte understands the organs, while FieldView is really good at monitoring blood flow. FieldView can ingest 300,000 to 400,000 data points per minute, FieldView’s founder and CEO Fred Dirla said.
“I would call Nlyte and FieldView best-of-breed products in their respective field,” Rhonda Ascierto, a data center technologies research director at 451 Research, said. “This is a good deal. They absolutely complement each other.”
Nlyte has been one of the pioneers in building on DCIM software to create a comprehensive solution that helps IT managers provide services to their users, she said. It integrates with a wide variety of IT service management solutions, server virtualization platforms, and configuration management databases, combining its sophisticated IT asset management capabilities with the tools IT managers rely on day to day.
Read more: Why CA Stopped Selling its DCIM Software Suite
FieldView monitors temperature and power consumption at the individual device level in real time and can either feed that data into third-party systems or deposit it into a data warehouse for further analysis. Its power-chain risk analysis is a separate but substantial capability.
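A minimal sketch of that pattern, assuming a hypothetical schema and made-up readings rather than FieldView’s real interfaces, might batch device-level temperature and power samples into a warehouse table like this:

```python
import sqlite3
import time

# A local SQLite table stands in for the data warehouse (hypothetical schema).
warehouse = sqlite3.connect(":memory:")
warehouse.execute(
    "CREATE TABLE readings (ts REAL, device TEXT, metric TEXT, value REAL)"
)


def collect(device_id):
    """Return made-up temperature (C) and power (W) readings for one device."""
    now = time.time()
    return [
        (now, device_id, "temp_c", 24.5),
        (now, device_id, "power_w", 412.0),
    ]


def ingest(devices):
    """Batch one polling cycle's readings into the warehouse in one transaction."""
    rows = [row for device in devices for row in collect(device)]
    with warehouse:
        warehouse.executemany("INSERT INTO readings VALUES (?, ?, ?, ?)", rows)
    return len(rows)


print(ingest(["rack-1-server-01", "rack-1-server-02"]), "readings stored")
print(warehouse.execute("SELECT device, metric, value FROM readings").fetchall())
```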
Industry analysts have consistently placed Nlyte in the small group of leaders in the DCIM software market, where it competes head to head with the giants Schneider Electric and Emerson Network Power.
One of the things that differentiates it from the two giants is its pure-play DCIM business model. In other words, Nlyte is not interested in selling air conditioning units or power distribution equipment. “We have no hardware agenda,” Sabella said.
Nlyte has about 220 customers, including federal agencies, financial services firms, technology companies (Cisco, for example), telecommunications firms, such as Verizon, and companies in consumer products and healthcare, according to Sabella. FieldView serves a similar mix of industries but a smaller customer base, Dirla said.
By 451’s estimate, Nlyte has raised about $40 million since its inception in 2003. FieldView has raised about $5.5 million, Ascierto said.
Terms of the transaction were not disclosed. But, according to Sabella, “it was not a fire sale. It was real money.”
9:18p
Identifying Opportunities in Data Center Sales
By Rob DeVita, via The WHIR
We’ve all been an hour into the latest trending Netflix documentary when the dreaded red circle enters center screen, pausing the show right at its pivotal moment. We sigh, roll our eyes, and the show resumes. This all-too-common pause is a sign of the massive change we are seeing in the data center industry. The way end users experience media is shifting the needs of the providers of that content.
Today, service providers must become smarter and more agile to keep up with these changes. They must be a resource that helps customers connect better. And as cloud computing continues to grow, those clouds still have to physically reside somewhere: in secure facilities known as data centers.
Industry trends are key drivers of this demand, but you need to understand the fundamental components in order to find success. According to a report by Cisco, by 2019 online video will be responsible for 80 percent of global Internet traffic, with 85 percent of US traffic coming from video.
Sign up for Rob DeVita’s online course, How to Increase Data Center Sales, Make Clients Happy and Keep Them Coming Back for More. More details on the course below.
So what does this mean for a data center owner and customer? How can you be agile in a world that is experiencing rapid growth?
Uncovering data center opportunities doesn’t require advanced analytics or a deep understanding of technology. You only need to ask a few simple questions and have a basic understanding of how the market works to identify new sales opportunities for every customer.
Facility Type
First you must identify what type of facility a platform should operate in. Is it a retail, wholesale, purpose-built, or carrier-hotel environment? To answer that question you must look at what your customer is trying to accomplish.
Do they simply need to colocate in a location where all major networks interconnect? Are they looking for their own secure environment to host data? What about a hybrid approach? If certain compliance requirements must be met, that also needs to be factored in.
Power Requirements
After selecting a facility type, it will be time to identify power requirements. Scalable, ‘pay as you grow’ UPS architectures are more common today, but a company must still understand how much power it can consume. The power requirements of every element, from the cooling system to the UPS to the critical IT load, must be examined; through a needs assessment, power requirements can be determined and a plan put in place.
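As a rough sketch of what such a needs assessment can look like, the back-of-the-envelope calculation below sizes facility draw and UPS capacity from an assumed rack count, per-rack load, and PUE; every figure is a placeholder, not a recommendation.

```python
# Back-of-the-envelope power needs assessment (illustrative figures only;
# every number here is an assumption, not a sizing standard).

RACKS = 20            # hypothetical deployment size
KW_PER_RACK = 5.0     # average critical IT load per rack, in kW
PUE = 1.5             # assumed Power Usage Effectiveness of the facility
UPS_HEADROOM = 1.25   # 25% growth/redundancy margin on the UPS

critical_it_kw = RACKS * KW_PER_RACK             # 100 kW of IT load
total_facility_kw = critical_it_kw * PUE         # IT + cooling + losses: 150 kW
ups_capacity_kw = critical_it_kw * UPS_HEADROOM  # UPS sized for 125 kW

print(f"Critical IT load: {critical_it_kw:.0f} kW")
print(f"Facility draw:    {total_facility_kw:.0f} kW at PUE {PUE}")
print(f"UPS capacity:     {ups_capacity_kw:.0f} kW with 25% headroom")
```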
Pricing
The fundamentals of how to price, and which pricing models are best for customers, differ from data center to data center and from opportunity to opportunity.
You can deliver significantly greater value to your customers by opening up the data center conversation.
Even if they are already established in a third-party data center, the rules and pricing have changed even from just three short years ago. Competition in each market has grown and with that so have the services that are available for your customers.
As you look to increase the value you deliver to your clients, the data center and associated services will help cement you as the go-to resource for all of their IT needs.
About the Author
Rob DeVita leads the sales and marketing strategy for 1547 Critical Systems Realty. He has an extensive background in management, direct and indirect sales, and development for data center technology. He has presented on new technologies and applications at many top-tier conferences, with sessions including “Cloud and Mobility” at 7×24 Exchange; “Patriot Act & NSA: Protecting Your Data” at HostingCon; and “Connectivity Challenges & Opportunities in 2014, including Adoption of the Open IX Model in the U.S.” at CRE’s Texas Data Center Summit & Peer 2.0 Conference. He has been a panelist at many IMN events as well.
Rob served as the President of the Dallas-Fort Worth AFCOM Chapter and currently serves on the Board of Directors for the Metroplex Technology Business Council.
Sign up here for his upcoming online course, called How to Increase Data Center Sales, Make Clients Happy and Keep Them Coming Back for More, on Feb. 24, 2016.
The class will arm you with the skills and knowledge to sell data center services more capably and persuasively and to close deals, by deepening your understanding of how data centers work, the latest trends driving the industry, how to position and price your company’s services in the marketplace, and what to look for in contracts to avoid pitfalls.
This first ran at http://www.thewhir.com/blog/identifying-opportunities-in-datacenter-sales
10:11p
The Top 3 Cloud Deployment Pitfalls to Avoid
During cloud deployments, enterprises encounter common pitfalls that can prevent them from fully leveraging the benefits of the cloud. This webinar will delve into some of the best practices employed in successful cloud deployments and will also present some actionable tips on how to ensure that your enterprise is best prepared to maximize these benefits.
In this webinar, you’ll learn about three of the top cloud deployment mistakes, as well as advice about how to avoid them. These pitfalls include:
- Improper planning, particularly shortsightedness that prevents future proofing
- Choosing the wrong cloud model for your enterprise: public vs. private vs. hybrid
- Security and data governance oversight—and why it’s important to know your data and how to properly secure it
Register Now
Meet the Presenters
Narendra N. Narang
Senior Storage Architect
Red Hat
View bio here
Frank Ohlhorst
Award-winning Technology Journalist
Penton
View bio here
11:18p
IT Innovators: Turning to Hybrid Cloud to Reduce Time-to-Market and Cost
By WindowsITPro
For more than 50 years, FICO has provided analytics software and tools across many industries to manage risk, build more profitable customer relationships, optimize operations, fight fraud, and meet strict government regulations. But just a couple years ago, the company reevaluated its portfolio and decided to prioritize a shift to the hybrid cloud. “Many customers liked the functionality of our products, but didn’t necessarily like the upfront effort needed to build out their data centers,” says Mike Trkay, vice president of cloud services for FICO. “So we set out to simplify access and lower the barrier of entry to FICO products.”
The company immediately knew that, because of strict compliance requirements, it would have to build its own internal infrastructure to deliver Software-as-a-Service (SaaS), with the ability to securely manage resources in a private data center. The firm also faced the challenge of taking existing products and moving them to the hosted model on that infrastructure. Meanwhile, the firm developed other products that would run as cloud products from day one. “We had this understanding that all of these moving pieces would have to come together and work in unison,” Trkay explains.
Like any endeavor that has the potential to be positively disruptive and worthwhile, the path was certainly met with some resistance. For starters, there was the technical challenge of piecing it all together. “We had to determine how to take products not originally meant for the cloud and make them ready for the cloud,” Trkay says. This also involved restructuring how FICO licensed the products and software.
A lesson was learned throughout the process: although trying to “cloudify” existing products seems practical in theory, it is sometimes more efficient to build them from scratch just for the cloud. Trkay compares the process to remodeling a house. “Sometimes it’s just easier to build from scratch than having to tear everything down to studs and then build it back up,” he explains.
On a team level, there was also the challenge of finding employees that had the skill sets needed to navigate this new technology. “We were moving to a platform that was a leading-edge technology, and we quickly found that not a lot of people with that skill set were readily available on the market,” Trkay explains. He adds that FICO didn’t have anybody internally with that skill set, so the company had to hire new team members. Meanwhile, existing staff members needed to be trained to operate and support the new platform. Still, the threat of losing new team members loomed, Trkay explains.
“When you pick leading-edge technologies that are very hot in the market, it becomes challenging to maintain a team because there’s a lot of other companies interested in the technology,” Trkay says. “They often want to come in and poach employees that you just trained.” As a result, FICO focused on creating a strong culture where employees could feel comfortable developing new technologies as a team and know that talent would be retained.
Overall, Trkay and his team are confident that FICO’s shift to the hybrid cloud was most certainly a move in the right direction. “At FICO, our sole purpose is bringing predictive analytics to our customers to aid them in mitigating risk and making better decisions,” Trkay says. “This move is enabling us to make our predictive analytics products easier for potential and existing customers to access.”
Renee Morad is a freelance writer and editor based in New Jersey. Her work has appeared in The New York Times, Discovery News, Business Insider, Ozy.com, NPR, MainStreet.com, and other outlets. If you have a story you would like profiled, contact her at renee.morad@gmail.com.
The IT Innovators series of articles is underwritten by Microsoft, and is editorially independent.
This first ran at http://windowsitpro.com/it-innovators/it-innovators-turning-hybrid-cloud-reduce-time-market-and-cut-cost