Data Center Knowledge | News and analysis for the data center industry - Industr's Journal
 

Friday, August 5th, 2016

    Time Event
    6:20p
    Data Center REIT CyrusOne Sprints Ahead In Cloud Leasing Race, Yet ‘No Respect’

    By any measure, data center REIT CyrusOne (CONE) just knocked the ball out of the park last quarter, and this leasing momentum continued into the third quarter.

    “This was the strongest leasing quarter in the Company’s history, and we believe it is also a record for the industry,” said Gary Wojtaszek, CyrusOne president and CEO. He added: “These results reflect continued strong operational and financial performance, and our ability to deliver data centers at the fastest time to market has enabled our hyper-scale customers to keep pace with their increasing capacity requirements.”

    Since speed to market was a major factor in winning these large-scale cloud deployments, hitting an inside-the-park home run — where a swift runner beats the throw to home plate — is a better analogy.

    It is a real “head-scratcher” how a 34 percent earnings growth rate can disappoint investors.

    Record Leasing

    CyrusOne leased 282,000 colocation square feet (CSF) and 40 MW of power in the second quarter, representing approximately $58 million in annualized contracted GAAP revenue.

    Subsequent to the end of Q2 2016, CyrusOne purchased a 130,000 SF building shell located in Sterling, Virginia near its existing campus. The building supports 12 MW of critical load, and some data halls are scheduled to be complete prior to year-end.

    CyrusOne added another $14 million in annualized GAAP leasing revenue during the month of July, boosting the CyrusOne booked-not-billed backlog to a record $96 million.

    CONE – Q2 2016 Operating Results

    CyrusOne was the last of the publicly traded REITs involved in the wholesale side of the business to report results for the quarter ended June 30, 2016. Highlights included:

    • Normalized FFO per share (a measure of REIT earnings) was $0.67 in the second quarter of 2016, an increase of 34 percent year-over-year.
    • Revenue was $130.1 million for the second quarter, compared to $89.1 million for the same period in 2015, an increase of 46 percent.
    • The increase in revenue was driven by a 44 percent increase in leased colocation square feet, additional interconnection services, and lease termination fees ($5.0 million of the increase).
    • The weighted average lease term (WALT) of the new leases based on square footage is 112 months (9.3 years), increasing CyrusOne’s portfolio WALT to 53 months.
    • CyrusOne pre-leased 2 MW, or over 75 percent of the CSF under construction, at the Aurora I data center in Chicago, recently acquired from CME Group.
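    The portfolio figures above are square-footage-weighted averages: the weighted average lease term is each lease's term weighted by the colocation square footage it covers. A minimal sketch of the calculation, using made-up leases rather than CyrusOne's actual book:

```python
# Weighted average lease term (WALT): lease terms weighted by the
# colocation square footage each lease covers.
def walt(leases):
    """leases: list of (square_feet, term_months) tuples."""
    total_sf = sum(sf for sf, _ in leases)
    return sum(sf * months for sf, months in leases) / total_sf

# Hypothetical leases: 100k SF for 120 months, 50k SF for 96 months.
leases = [(100_000, 120), (50_000, 96)]
print(round(walt(leases), 1))  # 112.0
```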

    Carrollton, Texas-based CyrusOne focuses its business strategy on the enterprise market and counts 177 Fortune 1000 logos among its 950 customers.

    Read more: CyrusOne Plans Huge Expansion at CME Data Center Campus in Chicago

    Notably, during the last quarter, CyrusOne’s energy sector bookings increased by 55 percent compared to the trailing 12-month average.

    Tale Of The Tape – Shares Fell?

    In its Investor Day 2016 presentation, management guided to a doubling of the company in five years, roughly a 15 percent compound annual growth rate (CAGR). It appears that CyrusOne is considerably ahead of plan.
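    As a sanity check, the compound annual growth rate implied by any growth target is (ending/beginning)^(1/years) − 1; for a doubling over five years:

```python
# CAGR implied by a growth multiple over a number of years:
# (ending / beginning) ** (1 / years) - 1
def cagr(multiple, years):
    return multiple ** (1 / years) - 1

print(f"{cagr(2, 5):.1%}")  # doubling in five years -> 14.9%
```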

    [Chart: CONE share price following the record Q2 2016 quarter. Source: Finviz.com]

    During the past 52 weeks, CyrusOne shares have traded in a range of $28.81 – $57.00 per share. As of this writing, CONE shares have lost ~6 percent of their value since the recent earnings call.

    Read more: CoreSite Realty: Strong Q2 Overshadowed By CEO Tom Ray’s Departure

    Wall Street analysts must have built some pretty aggressive models to be disappointed by CyrusOne’s Q2 2016 results. During the conference call, it became obvious that analysts were disappointed the high-end of FFO per share guidance for FY 2016 was not increased.

    A hefty increase in G&A, which management felt was necessary to handle rapid growth and operate at a larger scale, was at least partially responsible. However, long-term investors should reap rewards from these moves in 2017 and beyond.

    Portfolio Occupancy and Utilization

    During the second quarter, CyrusOne completed construction on approximately 395,000 SF and 61 MW of power capacity in Northern Virginia, San Antonio, Phoenix, Dallas, and Houston.

    This increased total computer room space across 35 data centers to approximately 2,006,000 SF, which represents an increase of 652,000 SF, or 48 percent year-over-year.

    Data center utilization at the end of the second quarter was 92 percent for stabilized properties and 84 percent overall.

    CyrusOne has development projects underway in Northern Virginia, San Antonio, Phoenix and Chicago that will add approximately 259,000 SF and 50 MW of critical power capacity.

    CyrusOne has an additional 851,000 square feet of powered shell available for development as well as 230 acres of land across its markets.

    Read more: Another Huge Quarter for Data Center REITs: What’s Next?

    Investor Takeaway

    According to the company, the Massively Modular design responsible for the speedy large data hall build-outs and accelerated ground-up development is delivering space at well below $7 million per MW and consistently produces development yields of 16 to 19 percent.
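    A development yield is simply stabilized annual net operating income divided by invested cost, so the cited figures imply a rough NOI per megawatt. A quick back-of-the-envelope sketch (the cost and yield inputs come from the company's figures above; the arithmetic is illustrative):

```python
# Development yield = stabilized annual NOI / total development cost.
# At ~$7M per MW of capacity, a 16-19% yield implies this much NOI per MW:
cost_per_mw = 7_000_000
for dev_yield in (0.16, 0.19):
    noi = cost_per_mw * dev_yield
    print(f"{dev_yield:.0%} yield -> ${noi:,.0f} NOI per MW per year")
```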

    Rodney Dangerfield was a comedian famous for the tag-line, “I don’t get no respect!”

    After the Q1 2016 earnings announcements, I wrote: “Investors and analysts will be eagerly listening to upcoming earnings calls for clues regarding lease signings and other catalysts not currently reflected in earnings models. Despite all the recent success, management will be under pressure to answer to the age-old question: What have you done lately?”

    Evidently, record leasing, a huge backlog, and a development pipeline de-risked by build-to-suits and pre-leasing weren’t enough for Wall Street, given the huge gains in CyrusOne shares year-to-date.

    Another Dangerfield line was: “I remember the time I was kidnapped and they sent a piece of my finger to my father. He said he wanted more proof.”

    CyrusOne is delivering ROIC in the mid-to-high teens. Clearly, FFO per share in 2017 will continue to grow at a phenomenal pace. Unlike some of its REIT peers, CyrusOne has plenty of land and powered shell capacity in its key markets.

     

    7:05p
    Extending the Edge: New Thinking on Data Security
    brought to you by AFCOM


    Security, like most aspects of IT infrastructure, has historically been a siloed function. Focused on protecting data, applications, network connections, and, with the advent of BYOD (bring your own device) policies, network endpoints, it is a practice that, for most companies, evolved in a reactive fashion: new technology acquired and implemented to meet a specific need.

    It is not uncommon for a medium-to-large company to have 50 or more different security technologies in place. While fiscally inefficient, this approach has been somewhat effective up to this point in dealing with the types of attacks launched against it.

    The threat landscape is currently changing more rapidly than ever, forcing businesses to shift to a more forward-thinking security model. The need to effectively address attackers who constantly evolve focus, attack approaches, and targets has never been greater. The need calls for a proactive approach and an overarching security plan.

    Cybersecurity Today

    Cybercriminals, buoyed by jurisdictional issues and anonymity, today have greater expertise and more resources at their disposal than ever before. The internet brings everyone one mouse click closer to these bad actors, and some observers comment that they seem bolder and more aggressive than ever. As a result, we are seeing an uptick not only in the number of threats, but also in the complexity of attacks and the severity of the damage they inflict.

    Fortunately, companies are noticing this change in the security landscape and are taking steps to address it. A recent study conducted by IDG Research Services and commissioned by IT services and solutions company Datalink polled more than 100 IT executives at large U.S. companies and found that, when asked for their companies’ top five considerations in deciding where to invest IT dollars, respondents most frequently mentioned improving IT security. Three-quarters of respondents (75 percent) also indicated that security is a more important issue than it was two years ago.

    Clearly companies that don’t want to fall victim to cyberattacks need to craft and implement a comprehensive IT security strategy that ensures the confidentiality, integrity, and availability of their data. However, the dozens of legacy security technologies they must contend with make this a significant challenge. Adding to the difficulty is a worldwide cybersecurity workforce shortage estimated by Cybersecurity Ventures to reach 1.5 million by 2019. Mirroring this staffing shortage but on a micro level, IT security is becoming a specialization for many organizations, and the members of this typically small group are asked to shoulder a major burden with limited resources.

    A Paradigm Shift: The New Security Perimeter

    Companies have traditionally set up a security perimeter around their network. However, it is increasingly clear that the more appropriate focus – especially for companies that may not have the resources to secure every aspect of their operations initially – is the data center. This is where an organization’s applications and data reside, and they are typically its most valuable digital assets and the ones most frequently targeted by cybercriminals.

    Of course, creating a security perimeter around the data center is more easily said than done. The “perimeter” that may have been easy to identify five years ago becomes ever more nebulous with each new advance in network technology and morphology. Technology changes such as cloud computing, hybrid cloud, and elastic networks all make it much more difficult to determine where your company ends and the rest of cyberspace begins. The picture grows murkier still when you add in increasingly complex business relationships with partners, vendors, and service providers. All things considered, it is a significant challenge, but it can be done.

    Establishing a new perimeter does not, however, mean abandoning the concept of local security. Ideally the measures currently in place should remain, but as elements of a comprehensive strategy.

    Rethinking Data Center Design

    If there is one thing that is clear about the surge in cybercrime in recent years, it is that combatting it will require changes at the foundational level. Visibility into the transactions taking place across the network is the key to better security. Unfortunately, the current approach to segmenting the infrastructure makes tracking system requests and the ultimate destination of the data returned very difficult.

    In rethinking the way their data center operates, organizations must consider a number of critical questions:

    Where will monitoring take place? Where information is gathered is important, as addresses change along the path. If you are only looking at ingress and egress points, you will miss the east/west interactions between applications that reside on the server infrastructure.

    What devices and events should be monitored for maximum effect? Application level? System level? User level? All of these? None of these? The “right” answer depends on the business and sensitivity of the data flowing through the “veins” of the infrastructure.

    What security capabilities are required from the technology? Does the organization need log information, packet capture, and external threat intelligence? Does it have the ability to collect, store, process, and correlate the information?

    How are alerts processed? Everyone must be on the same page about how issues are triaged, escalated, and remediated.

    The answers are different for every organization. A “one size fits all” approach definitely does not work for cybersecurity. Instead, a company’s strategy must be tailored around its business objectives, its risk tolerance, and its capabilities. That’s why organizations should consider getting assistance in crafting a comprehensive plan.
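    The triage question above – how issues are escalated and remediated – often boils down to a rules pipeline that scores each event against the business's risk tolerance. A minimal sketch, with invented severity thresholds and event fields standing in for whatever an organization actually monitors:

```python
# Minimal alert-triage sketch: classify events into remediation tiers
# based on severity and asset sensitivity (thresholds are illustrative).
def triage(event):
    """event: dict with 'severity' (0-10) and 'asset' ('crown_jewel'|'standard')."""
    score = event["severity"] + (3 if event["asset"] == "crown_jewel" else 0)
    if score >= 9:
        return "escalate"     # page the on-call security engineer
    if score >= 5:
        return "investigate"  # queue for analyst review
    return "log"              # record and move on

print(triage({"severity": 7, "asset": "crown_jewel"}))  # escalate
print(triage({"severity": 3, "asset": "standard"}))     # log
```

    Whatever the thresholds, the point of writing them down as code is that everyone really is on the same page: the escalation policy becomes reviewable and testable rather than tribal knowledge.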

    Replacing Outdated Perceptions

    There was a time when in order for security technology to do its job, it had to slow network traffic to a crawl. Consequently, companies had to choose between being well defended and being productive. Since productivity pays the bills, it often won out. The perception that security functionality will put a stranglehold on data flow and application performance persists today.

    In order to get buy-in on security initiatives, IT must make it clear to the business that new technology is powerful, fast, and accurate. It can quickly inspect incoming traffic, detect and deal with suspicious queries, and allow the rest of it to pass freely.

    Another misperception is that disparate security technologies cannot be brought together into a cohesive plan. The best systems today have open APIs (application program interfaces) that simplify integration with other technology.
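    In practice, such API-level integration usually means normalizing each product's alert format into a common schema before correlation. A hedged sketch, with two entirely invented vendor payload formats:

```python
# Sketch of API-level integration: map two hypothetical vendors' alert
# payloads into one common schema so downstream tooling sees one format.
def normalize(vendor, payload):
    if vendor == "fw":   # hypothetical firewall alert format
        return {"source": payload["src_ip"], "kind": payload["event"],
                "severity": payload["prio"]}
    if vendor == "ids":  # hypothetical IDS alert format (0-100 score)
        return {"source": payload["attacker"], "kind": payload["signature"],
                "severity": payload["score"] // 10}
    raise ValueError(f"unknown vendor: {vendor}")

a = normalize("fw", {"src_ip": "10.0.0.5", "event": "port_scan", "prio": 4})
b = normalize("ids", {"attacker": "10.0.0.5", "signature": "port_scan", "score": 40})
print(a == b)  # True: both normalize to the same record
```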

    Looking Ahead

    It’s clear that the next-generation data center must have information security architected into the design, not treated as an afterthought. But in order for data security to be truly effective, a company’s strategy must address people, processes, and technology.

    People are and will remain the weakest link. When implementing new technology, companies must ensure the personnel tasked with supporting it are properly trained and comfortable with their responsibilities. Similarly, processes must be reviewed and augmented to take full advantage of new capabilities and functionality. Too often, customers acquire and deploy the latest technology but never use its capabilities to the fullest – akin to buying a premium automobile loaded with options and never bothering to learn how the options work. Defining and measuring success with metrics is critical.

    Engaging business stakeholders early and often during appropriate security technology evaluation and selection processes is important in helping build support and acceptance for the technology. Assurance that appropriate policy is in place is also crucial. Executive leadership must know how their security directives are being translated into protective action. Cross-linking the technology solutions back to organizational policy and standards is imperative.

    Extending security to the perimeter, taking a new approach to data center design, training personnel, and updating processes to account for new security technology – all significant challenges – are just the beginning, however. Staying ahead of cybercriminals requires collaboration among IT security experts, who must share their experience and expertise with their peers. The more quickly information on trends and specific threats can be collected, digested, and shared, the faster countermeasures can be implemented, and the more effective IT will be in preventing any negative impact on the business.

    The post originally appeared at AFCOM.com.

    7:14p
    As Rackspace Mulls Private Equity, We Ask: Why Do Public Companies Go Private?
    Brought to You by The WHIR


    On Thursday news broke that managed cloud services provider Rackspace is in advanced talks with a private equity firm. If the deal goes through, Rackspace will be the latest tech company to go private after being a publicly traded company, following Solarwinds, Dell, and others before it.

    IPOs have not been kind to tech companies this year, and the slowdown is hitting investment banks hard. According to a report by the San Francisco Chronicle this week, revenue for U.S. investment banks dropped 20 percent year-over-year to $16.1 billion in the first half of 2016. Investment banks’ IPO revenue fell 58 percent from $1.1 billion in the first half of 2015 to $450 million in the first half of 2016.

    This trend, according to Bulger Partners managing director Doug Melsheimer, in an interview with Fortune, is not too surprising “given how much everyone complains about the burden of being a public company and how much money is swirling around the private equity landscape.”

    We asked Structure Research founder and managing director Philbert Shih to provide some insight into why a company that is already public would want to go private.

    “There are obvious financial benefits for management and shareholders given that a buyout typically involves a very healthy premium on the current stock price,” Shih says. “One of the primary benefits of going private is to focus on a long-term strategy and spend less time meeting quarterly expectations and complex regulatory and compliance requirements. This is a unique point in Rackspace’s history and going private will allow it to execute on some of the big decisions it has made – i.e. the shift to a managed third party cloud model – without the pressure from shareholders to hit numbers and continually drive immediate value.”

    For firms like Solarwinds, going private is the best choice for future growth of the company. Solarwinds CEO Kevin Thompson told NetworkWorld earlier this year: “It is never an easy decision to go private because it’s a change in the strategy and course you were on, and ultimately you need to get 100 percent alignment with your board and your management team.”

    This post originally appeared at The Whir.

    7:20p
    Data Centers’ Water Use Has Investors on High Alert

    (Bloomberg) — Data centers, used by governments and large corporations to house their computer systems, have one big environmental problem: They get hot.

    To keep them from overheating, large data centers can pump hundreds of millions of gallons of water a year through the facilities, according to company reports. That high demand for water has some investors concerned, especially in places where natural water resources are becoming ever more precious, like tech-heavy California.

    “We definitely want our portfolio companies to be cognizant of their water use and take the appropriate steps to minimize their water use and recycle water,” said Brian Rice, portfolio manager at the California State Teachers’ Retirement System, which manages about $189 billion in assets as of June 30. He cited water usage as a concern at data centers as well as at other portfolio companies, such as those in agriculture.

    Golden State

    California—home to companies running some of the world’s biggest data centers—houses more than 800 of the facilities, the most of any U.S. state, according to Dan Harrington, research director of 451 Research LLC, a technology consulting firm.

    Water usage there is especially a concern as the state’s drought pushes into its fifth year. California Governor Jerry Brown issued an executive order in May to extend statewide emergency water restrictions, establishing long-term measures to conserve water.

    The water risk to investors of California-based companies operating data centers will not affect them gradually, said Julie Gorte, senior vice president of sustainable investing at Pax World Management LLC. “It will probably come in one big splashy moment,” she said.

    As a result, some sustainable-minded investors are trying to enhance their understanding of water risk before it becomes a liability, said Cate Lamb, head of water at investor environmental advocacy group CDP. The group held a series of workshops this year for investors to discuss their most crucial water reporting needs, such as isolating water risk of individual assets. The number of institutional investors committed to its water engagement program with companies has grown to 617 from 150 in 2010.

    Operational efficiencies at data centers have a direct link to companies’ profitability and pose an increasing risk for investors in a “tense” climate change environment, said Himani Phadke, research director at the Sustainability Accounting Standards Board, a non-profit that writes corporate sustainability reporting guidelines for investors.

    Companies, like investors, are trying to get ahead of the risk.

    Corporate Response

    Bill Weihl, director of sustainability at Facebook Inc., said the company uses a combination of fresh air and water to cool its data centers. In 2015, Facebook said it used a total of 221 million gallons of water, with 70 percent of that consumption at its data facilities. “We designed our data centers to use about half the water a typical data center uses,” he said in e-mailed answers to questions.
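    A quick back-of-the-envelope check on those figures: 70 percent of 221 million gallons puts data center consumption at roughly 155 million gallons for the year, or well over 400,000 gallons a day (the per-day split is my arithmetic, not a company disclosure):

```python
# Facebook's reported 2015 water use, per the figures above.
total_gallons = 221_000_000
dc_share = 0.70  # share consumed at data facilities

dc_gallons = total_gallons * dc_share
print(f"{dc_gallons / 1e6:.1f} million gallons at data centers")  # 154.7
print(f"{dc_gallons / 365:,.0f} gallons per day")
```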

    Around Facebook’s Prineville, Oregon, data center in particular, water efficiency has become “a big issue,” Weihl said. The center is east of the Cascade Mountains, in a region that tends to be drier than the western side of the state, and businesses must compete with farmers and a growing local population for water.

    Weihl said rainwater capture and reuse, which is used for irrigation and toilet-flushing at the center, saves 272,000 gallons of municipally treated water per year. Facebook is also working with the City of Prineville and its engineers on the town’s water plan, which includes water mitigation and recycling “grey water” from buildings, he said.

    Water consumption at eBay Inc.’s Salt Lake City-based data center rose 14 percent in 2014 to 31,354 gallons, according to the online retailer’s sustainability report, while its Phoenix facility saw usage drop 3 percent to 57,421 gallons. A company spokeswoman declined to comment.

    Google declined to say how much water the company’s data centers use, but said that the company redesigns its cooling technology on average about every 12 to 18 months. The company has also designed data centers that use air instead of water for cooling, it said.

    “There is no ‘one size fits all’ model — each data center is designed for the highest performance and highest efficiency for that specific location and we’re always testing new technologies to further our commitment to efficiency and environmental responsibility,” vice president of data center operations Joe Kava said in an e-mail adapted from an earlier blog post.

    Growing Issue

    The environmental impact of data centers is poised to grow as the world produces more data each day. Carbon emissions from data centers already represent 0.2 percent of the world’s total carbon dioxide emissions, compared to 0.6 percent for airlines, according to a 2010 McKinsey & Co. report. And more companies are developing larger data centers as they transition to cloud computing, increasing the demand for water needed for cooling their data servers, said Pax World’s Gorte.

    The need to boost water efficiency at data centers is driving some companies to open locations near water sources and in cooler climates. Menlo Park, California-based Facebook, for example, began operations at its overseas data center in Lulea, Sweden, near the Arctic Circle, in 2013. Mountain View, California-based Google operates a total of 15 data centers, with four located in northern Europe.

    Investor concern about corporate water use will only continue to grow, said William Sarni, director and practice leader of Water Strategy at Deloitte Consulting LLP.

    “Over the past few years, we have seen a dramatic increase of interest in water as a business risk and also as a business opportunity issue,” said Sarni. “I see it accelerating.”

    8:17p
    Switching the Switch Gear: Schneider Electric’s Push for SSIS

    It is not a radical or even new idea:  If every part of a power distribution system were adequately shielded, none of the common elements that can gum up such a system — air, humidity, corrosion, insects, rodents, the occasional exploding beer can — would degrade its performance.  Systems whose average maintenance cycles had shrunk to only about three years could suddenly go ten years between servicings.

    Schneider Electric’s term for the concept makes a sound that the system itself, when it runs properly, should not make: SSIS.  Its Shielded Solid Insulation System has been a key element of its Premset medium-voltage switchgear for a few years now. . . just not in the United States, not until last May.

    Now, as SE product launch manager Joe Richard told Data Center Knowledge, the process is under way to get SSIS approved for use in data centers stateside.

    “We have thousands of cubicles sold and installed worldwide in a variety of markets,” said Richard, “most importantly, in the data center market.  So now, we’re bringing this switchgear technology into the U.S., having it ANSI-tested and U.L.-listed, and ready for the U.S. market.”

    Premset comprises medium-voltage 5 kV and 15 kV switchgear, delivering up to 1,200 amps of continuous current and 2,500 amps of interrupting current, which Richard described as fitting neatly within the sweet spot of what data centers are looking for in power delivery.

    “Because data centers use so much power, they’re bringing power in anywhere from 15 kV voltage class up to 38 kV voltage class,” he explained.  “Bringing it anywhere above 15 kV, from a utility, they almost immediately step it down to bring into their facility.  But once it’s inside of a data center facility, distributing at 15,000 volts or 5,000 volts is a good way to decrease the amount of cables that you’re using, save some money on cable usage, and also be able to better control your total fault value available in your facility.”
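    The cable-savings argument follows directly from the power equation: for a three-phase load, line current is I = P / (√3 × V × pf), so raising the distribution voltage cuts the current, and with it the required conductor cross-section, proportionally. A rough sketch (the 5 MW load and 0.95 power factor are illustrative assumptions, not figures from Schneider):

```python
import math

# Three-phase line current for a given load: I = P / (sqrt(3) * V * pf).
# Higher distribution voltage -> proportionally less current per phase,
# hence fewer/smaller cables for the same power.
def line_current(power_w, volts, power_factor=0.95):
    return power_w / (math.sqrt(3) * volts * power_factor)

load = 5_000_000  # hypothetical 5 MW load
for volts in (480, 5_000, 15_000):
    print(f"{volts:>6} V -> {line_current(load, volts):>7.0f} A per phase")
```

    Stepping the same load from 480 V up to 15 kV cuts the per-phase current by a factor of about 31, which is the whole case for distributing at medium voltage inside the facility.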

    But what differentiates this product line is its total insulation.  In a typical power delivery system, air is the principal dielectric between all of the current-carrying components.  What prevents faults or crossovers between phases in such a system is the physical separation between bus bars, which isn’t really “separation” if you count air as real molecules.

    SSIS uses dielectric epoxy as a solid insulating material around each bus bar and current-carrying component.  Layered on top of that epoxy is a conductive layer that is grounded, the benefit of which is a positive incidental touch-safe rating.

    “It means that, from the inside of the switchgear, there’s no exposed, medium-voltage live part,” said Richard.  “So this in itself increases the safety value of the gear.  It means that, during maintenance, even when the gear is open, there’s still no exposed, live parts anywhere inside of the gear.”

    The shielding provides not just adequate protection for current-carrying components from the environment at large but, frankly, any protection at all, he continued.  Moisture, humidity, dust, chemicals, and things with legs, tails, and/or antennae are known to degrade the service lifetime of switchgear.  And the shielding, in turn, protects the insulation from water tracking and creepage, extending its service lifetime as well.

    “Shielded Solid Insulation System extends the life of the switchgear,” he argued, “makes it more reliable over a long period of time; and incredibly, increases the safety of switchgear for anybody who may be around it; and protects the assets of the data center. . . I like to refer to Premset as the ‘Model T’ of SSIS technology.  Right now, this is a brand new technology, and we’re focused on this individual, medium-voltage switchgear offer, and trying to enhance this in the medium-voltage world.”

