Data Center Knowledge | News and analysis for the data center industry
Tuesday, February 16th, 2016
1:00p
CoreSite Reports Strong Q4 but Shell Capacity in Key Markets Short
Shares of colocation data center provider CoreSite Realty spiked over 5.5 percent following its strong fourth-quarter 2015 earnings report last week and upbeat 2016 guidance.
CoreSite signed more rent in 2015 than in any previous year and generated the highest investor returns in its sector, but the company is close to maxing out its existing building capacity in several key markets.
On the 2015 earnings call with analysts, CEO Tom Ray avoided addressing this concern in detail, saying only that the team was aware of it and “thinking accordingly.”
Record Results
While the entire data center REIT sector performed well last year, interconnection-focused CoreSite Realty delivered a sector-leading 50 percent total return to investors.
The company saw record leasing in 2015, signing more than 500 leases totaling about 400,000 square feet of data center space, senior VP of sales and marketing Steve Smith said. Those leases add close to $50 million in rent revenue – a 40 percent increase over 2014 and the highest level of rent signed in company history.
Read more: Who Leased the Most Data Center Space in 2015?
During 2015 CoreSite added 95 new customer logos, a 16 percent year-over-year increase, including 41 net new logos for the quarter ended Dec. 31, 2015.
During Q4, CoreSite played leasing “small-ball,” as Smith pointed out. Most of the 155 new and expansion leases signed in the quarter were for requirements of less than 1,000 square feet — a 36 percent increase over the previous four-quarter average.
High Occupancy Generates High Margins
CoreSite’s mid-to-high-teens return on invested capital (ROIC) is derived in no small part from building out existing powered shell space. An analyst pointed out on the call that CoreSite is now building out the final phases in both the Chicago and Northern Virginia markets, where capacity will soon be maxed out.
Ray confirmed that there has been a noticeable increase in interconnections between enterprise customers and cloud providers. These interconnections also grew at a faster rate in 2015 than in 2014.

CoreSite’s monthly recurring revenue per cabinet over the last five quarters (Source: CoreSite Q4 2015 Supplemental)
Something Investors Should Consider
When it comes to allocating capital to drive strong near-term results, the correct answer will almost always be to build out existing properties. CoreSite will continue to deliver great numbers leasing to sub-1,000-square-foot tenants in existing powered shell space. However, it is currently developing the final phases in two of the strongest Tier I data center markets.
Notably, the highest returns are usually associated with the final phases of any data center development — especially when compared with the costs associated with delivering the first phase of a new project. The question investors should be asking is: What happens when most of the low-hanging fruit has been harvested?
While this is a challenge facing the company moving forward, it comes as no surprise that the CoreSite management team refused to spill any candy or tip its hand on the call.
Read more: Things Data Center Investors Should Know
Key Earnings Call Takeaways
Best Markets: The strongest leasing markets in Q4 for CoreSite were Los Angeles, the Bay Area, Chicago, and Northern Virginia. Together, these four markets accounted for 80 percent of new and expansion leases signed in Q4 and 95 percent of the annualized GAAP rent signed during the quarter.
Booked, not Billed: CoreSite reported signed but not yet commenced leases of $15.9 million as of December 31, 2015, or $27.2 million on a cash basis.
Interconnection Trends: Interconnection revenue growth for the fourth quarter was 26 percent over the same quarter in 2014, with calendar 2015 reflecting 25 percent growth over 2014 results. Notably, while growth in copper cross-connection revenue weakened, growth in fiber cross connections remained strong in 2015.
CoreSite reported 50 percent growth in its “higher-value” logical interconnection products, composed predominantly of its Any2 Internet exchange, blended IP, and the CoreSite Open Cloud Exchange.
Development: In the fourth quarter, CoreSite had a total of 370,000 square feet of capacity under construction consisting of both turnkey data center and powered shell space. This included projects underway in Santa Clara, Northern Virginia, Boston, and Los Angeles.
Deferred Expansion Capital: CoreSite now reports the capital expenditure required to fully build out existing data center space to its planned capacity as a separate line item. Notably, the total budgeted amount of $30 million to $40 million may not actually be spent.
Investor Takeaway
CoreSite scored points with investors when it came to the REIT metrics that really matter. The data center REIT reported year-over-year Funds from Operations growth of 31 percent and grew the annual dividend by 26 percent.
CoreSite introduced 2016 FFO guidance of $3.37 to $3.47 per diluted share, which implies a 20 percent increase over 2015 FFO of $2.86 at the midpoint.
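For clarity, the implied growth is simple arithmetic on the guidance range; a quick back-of-the-envelope check in Python:

```python
# Quick check of the guidance math: midpoint of the 2016 FFO range vs. 2015 FFO.
low, high = 3.37, 3.47      # 2016 FFO guidance range, $ per diluted share
ffo_2015 = 2.86             # reported 2015 FFO, $ per diluted share

midpoint = (low + high) / 2                 # 3.42
implied_growth = midpoint / ffo_2015 - 1    # ~0.196, i.e. roughly 20 percent
print(f"midpoint: {midpoint:.2f}, implied growth: {implied_growth:.1%}")
```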
Denver-based CoreSite has now closed out the books on a “winning season” for 2015, both for the company and for the Denver Broncos, the Super Bowl 50 champion.
The focus now should be on acquiring what these teams need to succeed next season.
4:00p
Harnessing the Rise of Hybrid IT
Karyn Jeffery is Head of End User Services in Fujitsu’s Managed Infrastructure Services Global Delivery Unit.
Hybrid IT is a pragmatic response to the challenge of merging legacy systems with new environments and platforms. It’s about finding a path to seamless governance, process, and management frameworks, so that new and old can be balanced in harmony. It’s an inevitable consequence of the clash between the pressure to innovate and the need to leverage existing systems.
Markets & Markets predicts that the global hybrid cloud market will grow from $25.28 billion in 2014 to $84.67 billion by 2019. The research study revealed that 48 percent of enterprise respondents plan to adopt hybrid cloud systems and services in the near future.
Why Hybrid?
The simple truth is that many large companies are not willing or able to make a sweeping wholesale change. They may feel that they have too much time, expertise, and money invested in on-premise systems that deliver critical business value. The transition to the cloud can be hugely disruptive and expensive. It’s also vital to consider compliance and regulatory requirements.
Public cloud is not always cheaper than running your own internal IT infrastructure. The ISG cloud comparison index highlighted that prices between major public cloud providers vary by up to 35 percent. Blending old and new is often the most practical strategy. It’s possible to drive innovation through new, agile, customer-facing apps in the cloud while retaining your ERP suite in your own data center.
The Right Tools for the Job
Leading organizations get ahead of competitors by getting the best new tools in place first. It’s essential to understand the latest tools and the value they can deliver to your business. That means constantly scanning the horizon for technology that’s currently under development, and consulting IT professionals with a deep understanding of business processes to advise on potential applications for them.
There are huge advantages to having a focused lens on hybrid IT. A service orchestrator can provide end-to-end visibility and accountability. You can shape the service levels and expectations across a broad spectrum of capabilities, and secure a consistent experience. You can find the integration and visibility that’s required for a successful blend of public and private infrastructure through innovative market leading workplace technologies, but that’s only part of the puzzle.
You’ll also want an agile methodology for trialing new tools in the cloud, which provides greater flexibility and keeps costs low. When applications meet business requirements they can be admitted into the organization’s app store, but they can also be quietly dropped when they don’t. Cloud providers will free you from long-term contracts, making it easier to monitor application access and retire anything that isn’t being used with minimal disruption.
Breaking Down Silos
Hybrid IT does not, and cannot, work without a major restructuring and breakdown of traditional boundaries and attitudes within your organization. There must be a culture shift for developers, operations, managers, and all your other internal teams. It will be necessary to create new roles, and empower leaders to bring everyone together with a shared set of goals.
An agile, collaborative workflow, pulling in developers, testers, operations, and other experts can drive an aligned vision forward. Cross-functional teams will come together in this new DevOps environment to focus on a common goal, but it won’t happen all at once, and it won’t happen overnight. Breaking down those silos will take time and determination. A well-picked pilot project with the right team and charismatic leadership is an effective way to sell everyone on the merits of a hybrid approach.
Seamless Integration
Virtualization can enable both cloud-based and on-premise business apps to be presented when and where they’re needed in a secure fashion. According to CIO Insight, virtualization has surpassed 50 percent of all server workloads and will reach 86 percent by 2016. Virtualization allows easy scalability and offers a way to run valuable legacy applications independently from local operating systems, enabling your IT infrastructure to handle the increased workload being generated by emerging technology trends.
It can be difficult to integrate new technologies with existing business processes, new applications, and, of course, legacy IT infrastructure. Taking inspiration from the consumer world, enterprise app stores can be a good route to provisioning apps. Windows, web, SaaS, and mobile apps can be made available to users based on a range of factors, such as roles within the organization, device types and status, and even network conditions.
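As a rough illustration of that idea, here is a minimal sketch of what such an entitlement check could look like. The roles, device types, and network labels are hypothetical, not any particular vendor's API:

```python
# Hypothetical entitlement check for an enterprise app store: which apps a user
# sees depends on role, device type, and current network conditions.
CATALOG = {
    "erp-client":   {"roles": {"finance"}, "devices": {"managed-laptop"}, "network": "corporate"},
    "crm-web":      {"roles": {"sales", "finance"}, "devices": {"managed-laptop", "byod-mobile"}, "network": "any"},
    "design-suite": {"roles": {"engineering"}, "devices": {"managed-laptop"}, "network": "corporate"},
}

def available_apps(role: str, device: str, network: str) -> list[str]:
    """Return the app names this user is entitled to in the current context."""
    apps = []
    for name, policy in CATALOG.items():
        if role not in policy["roles"]:
            continue
        if device not in policy["devices"]:
            continue
        if policy["network"] == "corporate" and network != "corporate":
            continue
        apps.append(name)
    return apps

# A salesperson on a personal phone over a public network sees only the CRM web app.
print(available_apps(role="sales", device="byod-mobile", network="public"))  # ['crm-web']
```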
IT adds value by curating the selection of hosted applications, all of which have already been validated with service providers. New tools can drive fresh process innovation, sometimes in unexpected ways. Rather than applying your business process to new technologies, let the tools guide the development of improved processes that seek to place the information employees really need at their fingertips.
The transition may be painful in the short term, but a truly agile hybrid IT system can lead to real integration, a renewed focus on process and system improvement, and faster innovation that will deliver real business value.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
6:46p
Is Spark Streaming the next Big Thing in Big Data? StreamAnalytix Thinks It Is 
By The VAR Guy
Are streaming big data analytics ready for prime time? StreamAnalytix, which has expanded its open source analytics platform to include support for Apache Spark Streaming, thinks so. And there is a lesson here for the big data ecosystem generally.
As its name implies, StreamAnalytix focuses on real-time data analytics. Its product portfolio previously centered on Apache Storm, an open source platform for real-time distributed data processing.
With the addition of Spark Streaming support, StreamAnalytix now offers an enterprise-level solution for processing streaming data as well. That means handling the types of continuous, rapidly flowing data generated by things like IoT devices and hardware sensors.
StreamAnalytix is only one of a number of companies and organizations working in this area. But it hopes to stand out by focusing on solutions that are based on open source tools (even if they are not themselves open source). The Spark Streaming support makes StreamAnalytix’s platform “the industry’s first open-source based, enterprise-grade, multi-engine platform for rapid and easy development of real-time streaming analytics applications,” the company says.
From a channel perspective, the company’s decision to add Spark Streaming support to its portfolio is significant because it highlights streaming data’s emergence as an increasingly important part of the big data scene. Distributed data analytics via platforms like MapReduce have been around for years. Real-time data analytics of the type handled by Apache Spark in general are also not new.
But the market is now introducing solutions that combine distributed and real-time analytics. That change makes it easier to process very large volumes of data, in real-time, in a highly scalable fashion.
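To make that concrete, here is a minimal sketch using the open source Spark Streaming (DStream) API, not StreamAnalytix's product. It assumes sensor readings arrive as "sensor_id,temperature" lines on a local TCP socket, a purely illustrative setup:

```python
from pyspark import SparkContext
from pyspark.streaming import StreamingContext

# Local Spark context with two worker threads; process data in 5-second micro-batches.
sc = SparkContext("local[2]", "SensorStreamDemo")
ssc = StreamingContext(sc, 5)

# Read newline-delimited readings ("sensor_id,temperature") from a TCP socket.
lines = ssc.socketTextStream("localhost", 9999)

# Parse each line and compute the maximum temperature per sensor for each batch.
readings = lines.map(lambda line: line.split(",")) \
                .map(lambda parts: (parts[0], float(parts[1])))
max_per_sensor = readings.reduceByKey(max)

# Print each batch's results to stdout as they are computed.
max_per_sensor.pprint()

ssc.start()
ssc.awaitTermination()
```

The same job scales out across a cluster without code changes, which is the combination of distributed and real-time processing described above.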
This first ran at http://thevarguy.com/open-source-application-software-companies/spark-streaming-next-big-thing-big-data-streamanalytix-sa
7:13p
Designing Data Centers for the Future
In January, we focused on data center design. We looked into design best practices and examined some of the most interesting new design trends. Here are the stories we ran as part of our data center design month:
Data Center Design: Which Standards to Follow? – Codes must be followed when designing, building, and operating your data center, but “code” is in most cases the minimum performance requirement to ensure life safety and energy efficiency. A data center is probably going to be the most expensive facility your company ever builds or operates. Should it have only the minimum required by code?
Startup Envisions Data Centers for Cities of the Future – The Project Rhizome team is thinking of ways to design small urban data centers so they fit in urban environments functionally, economically, and aesthetically.
 One of Project Rhizome’s concepts is a community swimming pool heated by server exhaust heat from an integrated data center (Image/Concept: Project Rhizome)
Linear Programming Helps Groupon Optimize Data Center Design – Groupon may be the future of merchant discounts, but it uses a mathematical problem-solving method formulated in the 1930s, linear programming, to optimize the data center design that supports its popular service (a simple linear programming sketch follows at the end of this list).
ICTroom Unchains Capacity from Size in Modular Data Centers – The more you standardize, the faster you can deliver product and save on upfront costs usually sunk in high-capacity facilities that spend long periods of time underutilized.
 An integrated modular data center by ICTroom (Image: ICTroom)
Equinix Turns to Fan Walls for Data Center Cooling – There’s been a lot of debate over the years about the various pluses and minuses of using raised floors for data center cooling versus simply dropping cold air onto the data center floor from ducts on the ceiling. But there’s a third option: the fan wall.
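Returning to the Groupon item above: to show what the technique looks like in practice, here is a minimal linear programming sketch using SciPy. The numbers and constraints are entirely hypothetical and are not Groupon's actual model:

```python
from scipy.optimize import linprog

# Hypothetical capacity-planning LP: choose how many racks of two server types
# to deploy, maximizing compute capacity subject to power and budget limits.
# Decision variables: x0 = racks of type A, x1 = racks of type B.
# linprog minimizes, so the capacity coefficients are negated to maximize.
capacity = [-10, -14]            # compute units per rack (type A, type B)
A_ub = [[5, 8],                  # kW of power drawn per rack
        [20, 35]]                # cost per rack, in thousands of dollars
b_ub = [400,                     # total power available, kW
        1500]                    # total capital budget, thousands of dollars
bounds = [(0, None), (0, None)]  # rack counts cannot be negative

result = linprog(capacity, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("racks of each type:", result.x)
print("max compute units:", -result.fun)
```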
7:39p
The Data Center Cloud Built
This month (February), we focus on data centers built to support the cloud. As cloud computing becomes the dominant form of IT, it exerts a greater and greater influence on the industry, from infrastructure and business strategy to design and location. Web-scale giants like Google, Amazon, and Facebook have perfected the art and science of cloud data centers. The next wave is bringing the cloud data center to enterprise IT… or the other way around!
Here’s a collection of stories that ran on Data Center Knowledge in February, focusing on the data center and the cloud:
Telco Central Offices Get Second Life as Cloud Data Centers – As AT&T and other major telcos, such as Verizon, upend their sprawling network infrastructure to make it more agile through software, most of those facilities will eventually look less like typical central offices and more like cloud data centers.
 AT&T switching facility (Photo by John W. Adkisson/Getty Images)
Cloud Underwater? Microsoft Tests Submarine Data Center – Continuing its long tradition of data center experimentation in the name of efficiency, Microsoft announced it has been testing an unusual new data center concept: placing servers underwater out in the ocean.
 Project Natick, Microsoft’s experimental underwater data center, being deployed off the coast of California (Photo: Microsoft)
Next-Generation Convergence is the Future of Cloud and Data Center – Today, the data center is tasked with supporting more users, who are accessing more applications and resources. All of this translates to creating better data center controls and enabling even greater levels of multi-tenancy.
LinkedIn Designs Own 100G Data Center Switch – Following examples set by other web-scale data center operators, such as Google and Facebook, the infrastructure engineering team behind the professional social network LinkedIn has designed its own data center networking switch to replace networking technology supplied by the major vendors, saying the off-the-shelf products were inadequate for the company’s needs.
 Pigeon, LinkedIn’s first-generation 100G data center switch (Image: LinkedIn)
What’s Behind Rackspace’s Private OpenStack Cloud Partnership with Red Hat – OpenStack is hard. It’s hard to take the conglomeration of about 20 open source projects, each at its own stage of maturity, collectively referred to as OpenStack, and turn it into a functioning cloud. This has created a whole services market for companies that can help users stand up their own OpenStack clouds, and Rackspace is going after this market hard.
Red Hat corporate headquarters in Raleigh, North Carolina (Photo: Red Hat)
10:37p
Three Reasons AWS Just Bought Italian SaaS Firm NICE 
By The WHIR
Amazon Web Services has signed an agreement to acquire NICE, a software-as-a-service company based in Italy that helps customers optimize and centralize their HPC, cloud and visualization resources. The terms of the deal were not disclosed, but it is expected to close in Q1 2016.
According to NICE’s sparse website, it will continue to operate under its existing brand, and continue to support and develop EnginFrame and Desktop Cloud Visualization (DCV) products.
AWS didn’t drone on about the acquisition, instead opting for a short blog post written by AWS Chief Evangelist Jeff Barr to briefly sum up the news. While not a lot may be known about the acquisition at this point, it is clear there are three main reasons why AWS pulled the trigger on the deal.
- Customers
While certainly not the core of the deal, AWS will have access to quite the list of customers with its acquisition of NICE. Customers include: Airbus, Alcatel, Audi, Bridgestone, Bosch, CERN, Chevron, ConocoPhillips, Ericsson, Ferrari, Fiat, Harvard Business School, Hess, Honda, Jaguar Land Rover, Lear, Magellan Aerospace, McLaren, MD Anderson Cancer Center, Motorola, Northrop Grumman, Raytheon, Red Bull, Siemens, Suzuki, Toyota, and Yale University.
Though some of NICE’s customers are already AWS customers in some capacity (Ericsson, for example), there are likely some who are not, and these could provide a great opportunity for AWS in the future, even lending themselves to deployments of large-scale HPC jobs on AWS servers, as VentureBeat points out.
“Like AWS, we are a customer obsessed company and we are globally appreciated for the excellence of our support and professional services,” Bruno Franzini, Support and Professional Services Manager at NICE said in a statement. “With the backing of AWS, we will pamper our customers even more.”
- High Performance Computing
In a statement, NICE said that it will work closely with AWS to “create innovative and exciting technologies and services for high performance and technical computing with the goal to help customers to accelerate and grow their business.”
While AWS already offers HPC clusters as a service through CfnCluster, its acquisition of NICE is expected to strengthen that offering with the ability to target clusters at specific workloads, according to a report by CBR Online.
Read more: New AWS Tool Offers Free SSL Certificates
- 3D Capabilities
NICE’s other main product, Desktop Cloud Visualization, is a 3D visualization technology that lets users connect to DirectX and OpenGL applications hosted in a data center, according to CBR Online.
This allows 3D developers and game designers to work remotely from any computer, ITProPortal said. The technology could bring more 3D game developers on board AWS while extending its 3D capabilities, which AWS recently built on with the launch of Lumberyard, a 3D game engine.
This first ran at http://www.thewhir.com/web-hosting-news/three-reasons-aws-just-bought-italian-saas-firm-nice