Data Center Knowledge | News and analysis for the data center industry

Thursday, February 25th, 2016

    1:00p
    On the Bleeding Edge: the Future of Processors

    While the server market remains firmly in the steel grip of Intel’s x86 architecture, there are serious challengers, including from the company that has beaten Intel in the smartphone market. Here is a collection of in-depth reporting from around the web on the latest and biggest developments in processor technology and the global processor market.

    As the Sunset of Moore’s Law Nears, What’s Next?

    Intel co-founder Gordon Moore, 2007 (Photo by Justin Sullivan/Getty Images)

    The upcoming release of the International Roadmap for Devices and Systems, the biennial forecast of future progress in processors, will for the first time not be centered on Moore’s Law. The physical limit of how small process technology can get is now well within sight. Judging by chipmaker predictions, once five-nanometer process technology arrives, sometime around 2021, the physics that governs the way chips behave today will no longer apply. At that scale, we enter the unpredictable realm of quantum mechanics, and it’s unclear which way technological progress will turn at that point. Will we finally get viable quantum computers, or will engineers and scientists focus on optimizing other elements of computing systems? This piece by re/code’s Arik Hesseldahl summarizes where we are now and asks some fundamental questions about the future of computing: Global Chip Industry Readies for a Future beyond Moore’s Law

    Is Google Flirting with Qualcomm behind Intel’s Back?

    When Bloomberg reported early this month that Google was going to publicly endorse Qualcomm’s server chips, it was huge news. There was so much there: one of the world’s biggest buyers of server processors had allegedly found an alternative chip supplier to Intel, the company that has pretty much monopolized the server processor market and has been relying on data center processor sales to compensate for shrinking PC sales.

    Google data centers in The Dalles, Oregon, 2006 (Photo by Craig Mitchelldyer/Getty Images)

    And that alternative supplier would be Qualcomm, the world’s largest maker of smartphone chips and the company that beat Intel at its own game in this space. But, as Wired later reported, Google seems to have pulled out of its scheduled appearance at a Qualcomm event at the last minute. The Wired piece explains why this was such a big deal and where things are headed in the world of processors inside some of the world’s largest data centers: Google’s Hardware Endgame? Making Its Very Own Chips

    China’s Server Chip Ambitions

    A Chinese man wears a mask as he waits to cross the road near the CCTV building during heavy smog on November 29, 2014 in Beijing. (Photo by Kevin Frayer/Getty Images)

    Growth of the Chinese server market is vastly outpacing growth of the global server market, while Chinese server vendors are taking an increasingly bigger share of that global market. The Chinese government and Chinese tech companies want to continue on this trajectory, ideally using Chinese technology. This piece in The Next Platform describes in depth China’s server processor ambitions: China Lays the Chip Foundation for Its Next Platform

    The Man behind the Chips in Apple Devices

    The Apple logo hangs in front of an Apple store in New York City. (Photo by Spencer Platt/Getty Images)

    While people like Apple CEO Tim Cook or Jony Ive, the company’s design chief responsible for the way Apple products look and feel, get the bulk of attention, the company generally keeps the more technical side of things tightly under wraps, and little is known about people who make sure the legendary devices actually work. In a rare glimpse under the hood, Apple, apparently in the hopes of lifting up its struggling shares, gave Bloomberg Business a tour of its processor development facilities and an interview with Johny Srouji, a Christian Arab from Israel who’s in charge of Apple’s chip design. The article is a fascinating look at the hidden world inside Apple and the team that makes Apple possible: The Most Important Apple Executive You’ve Never Heard Of

    4:30p
    The New Business Relationship Manager

    Brian Cohen is CEO of StrataCloud.

    We’ve been talking about business-IT alignment for as long as the term “IT” has been around. Back in the day, before the cloud and mobile apps were central to IT strategy, it was easy to pay lip service to the IT department’s mandate to “work with the business.” After all, IT still made most of the decisions and users didn’t have much choice other than to accept them.

    My, how things have changed. Now business people have the know-how and the power to go it alone when it comes to information technology. Once shadow IT began to proliferate at many companies, CIOs had a choice: they could rein it in (an unpopular decision) or just roll with it (introducing security, complexity and cost risks). Yet savvy CIOs know that there’s a middle ground in which business units can still have a healthy dose of freedom and flexibility. IT should still play a central role in guiding and managing technology decisions in light of business risks, existing technology infrastructure, budgets and overarching business goals.

    The CIO’s job is too big to give enough attention to the day-to-day workings of business alignment in a large company. IT departments can hire, or promote from within, a business relationship manager (BRM) who can effectively bring business requirements and goals back to the IT department for prioritization and delivery. He or she also presents ideas and strategies to the business to help meet objectives in a services-oriented fashion. The BRM helps manage requests in the context of standardization and integration requirements. While not a new concept in IT, the BRM is a role that has never been more important than today, as many IT organizations are leading their companies on a journey of digital transformation.

    Finding and Nurturing BRMs

    The BRM is someone who is a promoter—of IT and the capabilities it can bring to help the business achieve its goals. That means that the individual is someone who should have a solid understanding of the business – its vertical, its competition, its challenges and its customers. BRMs are tenacious and diplomatic, with a sales orientation to their jobs. In some respects they are similar to a product manager – possessing a deep understanding of the technology portfolio, along with an equally strong grasp of customer needs. We have seen BRMs come from sales, accounting, operations and IT departments alike. While the individual does not need technical skills per se, analytical skills and the ability to translate business requirements into product or service specifications are highly valuable. BRMs can help guide a company through change in a balanced and objective way.

    BRMs can even help navigate politically charged environments. Several years ago, a Fortune 50 company needed to reorganize multiple business units and go through an overall downsizing exercise during a tough business climate. In order to do so, the company required access to sales performance data across multiple business units that wasn’t readily available. Without the good fortune of having BRMs, the business unit leads would have delivered arbitrary advice on how they should spend the new budget and which projects should be shuttered. Instead, the BRMs, who understood the priorities of the business units and also understood how IT worked in regard to dependencies, the cost of shutting down inflight projects and so on, came up with the most efficient and achievable plan. This involved prioritizing internal IT resources and third-party software and consultants to gain access to data and complete the projects deemed most important to the business. The net result was a data-driven corporate reorg that took place in approximately 120 days, and which would not have been possible without the liaison work of the BRMs.

    Helping the BRM Succeed

    As mentioned above, a BRM can be highly effective through being opportunistic and anticipating business needs. A great way to get started is to develop a plan for each business unit or group, containing goals, measures and deadlines. Both IT and the business sponsor need to sign off on the plan, and the BRM should be accountable for helping facilitate and document progress. Metrics should relate squarely to the business: for instance, the success of a new customer self-service portal can be measured by tracking whether it improves customer satisfaction and reduces costs, and by how much.

    The role of the BRM is an exciting, visible one. He or she could be closely involved with revenue-driving initiatives, such as loyalty programs or developing new technology-based services. The BRM may even help drive the business in a new direction that had not been envisioned previously, through their in-depth knowledge of the latest and upcoming technologies. BRMs may also help build best practices across an organization by piloting a service for one business unit, such as a new customer communications tool, which later expands into other divisions.

    Increasingly, the IT department as a whole should be focused on partnering with the business, and re-aligning all processes, skills and services toward discrete business and customer needs. Yet the reality is, IT culture is hard to change and technology-specific tasks are all-consuming. Most IT staffers have plenty on their plate in regard to analyzing quickly changing technology trends and optimizing results in areas such as cloud computing and software-defined infrastructure. Through hiring or appointing a BRM for key lines of business, the CIO has the help of a uniquely qualified team who can straddle both sides of the fence while keeping a constant pulse on business needs and marketplace opportunities.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.


    6:27p
    CyrusOne Reports Record 2015, Plans Big New Jersey Expansion

    The parade of data center REITs reporting exceptional Q4 and full-year 2015 results has just become even more impressive.

    CyrusOne (CONE) crushed results across the board during 2015, including record leasing of 30MW across more than 200,000 square feet of data center space in the fourth quarter alone. The company is expanding capacity across six markets, but its biggest expansion plans are in New Jersey.

    CyrusOne CEO Gary Wojtaszek said the flexibility for his customers to lease anywhere from a single rack to 10MW of capacity was a key reason for success in 2015. He also pointed to the company’s ability to deliver data halls in just a few months’ time at less than $7 million per megawatt.

    Leasing Overview

    Leasing activity was split fairly evenly between existing customer expansions and more than 170 new logos added last year. The signing of more than 1,400 leases contributed to a record $42 million backlog for future customer deployments. CyrusOne expects to book between $19 million and $24 million of annualized revenue in the second half of 2016, with the balance recognized in 2017.

    The company has historically focused its sales and marketing on Fortune 1000 enterprises. It has evolved from a Texas-based data center landlord for the energy sector to add financial firms, connectivity, and large cloud providers to the mix. Last year, about 80 percent of leases it signed included an interconnection product. CyrusOne now leases to eight of the largest public cloud providers.

    During 2015, 84 percent of signed contracts contained escalators for monthly recurring revenues. Most notably, the average lease term for deals executed in Q4 2015 was just under nine years. These leases boosted the weighted-average lease term for the entire portfolio by 12 months, up to 40 months, which is an all-time high.

    Big New Jersey Expansion Plans

    Last year CyrusOne closed its $400 million acquisition of Cervalis, a colocation and disaster recovery data center provider with extensive operations in the New York market. Many Wall Street firms were key customers, with financial services representing two-thirds of its revenue.

    On the earnings call, Wojtaszek revealed that CyrusOne is now looking to expand its operations into “God’s country,” aka Northern New Jersey. Wojtaszek is agnostic about how the expansion happens: he is willing to purchase land for ground-up development if a suitable existing facility can’t be bought at an attractive price.

    New Jersey has not exactly been a hotbed of data center leasing activity, with only 2MW of leasing reported for last year. Many financial sector clients have moved operations to Northern Virginia, which has data center tax incentives and much lower power costs.

    CyrusOne competitor DuPont Fabros Technology earlier this year said it was exiting the New Jersey market and selling its large data center there. CyrusOne may be one of the potential buyers.

    But there are more forces at play than just New Jersey market dynamics. DuPont Fabros is changing its business strategy, which is now radically different from CyrusOne’s, and its execs say New Jersey just isn’t a good fit for its new pure-play wholesale data center focus.

    Read more: Why DuPont Fabros is Getting Out of New Jersey

    Tough Times for Core Energy Vertical

    On the call, CyrusOne acknowledged that this was a “tumultuous period” for its energy-sector customers. However, management pointed out that over 80 percent of its energy-sector leases are with companies whose annual revenues exceed $5 billion.

    Notably, one top-20 energy customer accounts for about 1 percent of CyrusOne’s revenue and has a lease coming due in 2016.

    The vertical has continued to grow within CyrusOne’s footprint at a steady 7 percent on average, despite the drop in oil prices. CyrusOne operates 255,000 square feet in Houston, where space utilization grew to 88 percent last year, up from 85 percent in 2014.

    Outstanding 2015 Performance

    By almost any measure, 2015 was a banner year for CyrusOne.

    Source: CONE – Q4 2015 Earnings presentation

    Big Dividend Increase

    Strong earnings growth has resulted in an incredible 138 percent average growth rate in the quarterly dividend distribution to shareholders since the CyrusOne IPO.

    Source: CONE – Q4 2015 Earnings presentation

    What to Expect in 2016

    Predictably, strong leasing activity has accelerated CyrusOne construction spending in 2016. CyrusOne has development projects underway in Dallas, San Antonio, Houston, Phoenix, and Northern Virginia that will add approximately 355,000 square feet of data center space total.

    Key 2016 guidance metrics (the implied 2015 baselines are worked out in the sketch after this list):

    • Adjusted EBITDA: 2016E of $258-$268 million, an increase of 24 percent vs 2015 at the midpoint
    • NFFO per share: 2016E of $2.45-$2.55, an increase of 15.2 percent vs 2015 at the midpoint
    • Capital Expenditures: 2016E of $320-$345 million, an increase of 41.5 percent vs 2015 at the midpoint
    • New Development: 2016E of $316-$337 million, an increase of 40 percent vs 2015 at the midpoint
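
    To make the guidance arithmetic concrete, here is a minimal sketch (in Python) of how the midpoints and year-over-year growth figures above fit together. The 2015 baselines it prints are implied by the guidance ranges and the stated growth rates, not separately reported figures.

        # Back out the implied 2015 baselines from CyrusOne's 2016 guidance ranges
        # and the stated year-over-year increases at the midpoint.
        guidance = {
            # metric: (2016E low, 2016E high, stated increase vs. 2015 at midpoint)
            "Adjusted EBITDA ($M)":      (258, 268, 0.24),
            "NFFO per share ($)":        (2.45, 2.55, 0.152),
            "Capital expenditures ($M)": (320, 345, 0.415),
            "New development ($M)":      (316, 337, 0.40),
        }

        for metric, (low, high, growth) in guidance.items():
            midpoint = (low + high) / 2
            implied_2015 = midpoint / (1 + growth)  # implied prior-year figure
            print(f"{metric}: 2016E midpoint {midpoint:,.2f}, implied 2015 baseline {implied_2015:,.2f}")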

    An alliance agreement has recently been signed with Australia-based Megaport to deliver SDN-enabled elastic cloud interconnection services in CyrusOne data centers.

    Management guided analysts to model a more normalized churn rate of 6 to 8 percent for 2016.

    Investor Takeaway

    Notably, CyrusOne’s quarterly interconnection revenue is growing at over 50 percent on average, albeit off of a small base of just 7 percent of revenues. The company is in the early stages of building its cloud ecosystem but already has more than 11,000 cross-connects. Meanwhile, it has only 23 customers in Northern Virginia, which is the most active US data center market and one of the most active interconnection markets.

    CyrusOne appears to be well-positioned to benefit from industry trends, including C-level technology execs wanting more flexibility in their IT stacks, the rise of hybrid cloud, and accelerating trends in third-party data center outsourcing.

    CONE shares traded up 4 percent after the earnings call on February 24 to close at $38.64 per share, or 15.4x 2016E FFO at the midpoint. The forward yield based on the new $0.38 per share quarterly dividend is a very attractive 3.93 percent, given the high growth rate. I remain constructive on CONE shares for 2016, despite the fact that they are trading near their 52-week high.
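
    For readers who want to check the valuation math, the sketch below (Python) reproduces the forward multiple and yield from the figures cited in this article; any small difference from the 15.4x quoted above comes down to rounding and the exact FFO measure used.

        # Reproduce the forward FFO multiple and dividend yield cited above.
        share_price = 38.64          # closing price after the Feb. 24 earnings call
        nffo_2016e_midpoint = 2.50   # midpoint of the $2.45-$2.55 guidance range
        quarterly_dividend = 0.38    # new quarterly dividend per share

        forward_ffo_multiple = share_price / nffo_2016e_midpoint  # roughly 15.5x
        forward_yield = quarterly_dividend * 4 / share_price      # roughly 3.93%

        print(f"Forward FFO multiple: {forward_ffo_multiple:.1f}x")
        print(f"Forward dividend yield: {forward_yield:.2%}")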

    7:00p
    IBM, VMware Partner to Combine On-Prem VMware with IBM Cloud
    By The WHIR

    IBM and VMware have partnered to help companies more easily extend their applications running on VMware’s software to the IBM Cloud, making it easier for enterprises to have hybrid environments consisting of on-premise infrastructure and public cloud.

    At the IBM Interconnect cloud and mobile conference this week in Las Vegas, the companies announced an architecture and cloud offering they jointly designed that will enable VMware Software-Defined Data Center environments, consisting of VMware vSphere, NSX and Virtual SAN.

    The VMware SDDC architecture underpins the company’s unified hybrid cloud approach, which works across public, private, and managed clouds.

    Under the partnership, IBM will use its “CloudBuilder” tools and workload automation capabilities to automatically provision pre-configured and custom workloads, validated against VMware’s SDDC architecture design patterns, to the cloud. Additionally, VMware has extended its vRealize Automation and vCenter management tools to deploy and manage environments on the IBM Cloud as if they were part of a customer’s local data center.

    The two companies will also be jointly marketing and selling new offerings for hybrid cloud deployments, including workload migrations, disaster recovery, capacity expansion, and data center consolidation. Customers will be able to quickly provision new workloads on the IBM Cloud or scale existing ones to it. Through IBM’s international network of data centers, they will also have the reach and scale to start locally and grow globally while complying with data residency and other regulatory mandates.

    Many enterprises faced with the difficulty of moving their on-premise applications to a public or private cloud are pinning their hopes on a hybrid cloud model that promises the best of both worlds. Hybrid cloud adoption is set to triple over the next few years, according to a recent report.

    “We are reaching a tipping point for cloud as the platform on which the vast majority of business will happen,” IBM Cloud senior vice president Robert LeBlanc said in a statement. “The strategic partnership between IBM and VMware will enable clients to easily embrace the cloud while preserving their existing investments and creating new business opportunities.”

    To boost its hybrid cloud capabilities, IBM has been engaging in partnerships and acquisitions, having picked up hybrid cloud specialist Gravitant late last year.

    VMware and IBM, however, are competing for hybrid cloud leadership against a host of other alliances, such as the one between Microsoft and Red Hat. When it comes to the hybrid cloud market, it seems that going it alone isn’t always the best option.

    This first ran at http://www.thewhir.com/web-hosting-news/new-partnership-will-help-hybrid-clouds-span-vmware-sddc-and-the-ibm-cloud

    8:02p
    Starbucks Sustainability Czar to Lead Microsoft’s Green Data Center Strategy

    It’s no secret that Microsoft already has a lot of cloud data centers around the world. And the company is planning to build a whole lot more as it attempts to loosen Amazon’s stranglehold on the cloud services market.

    As it continues to build out its global cloud data center empire, Microsoft has to make sure it’s doing it in the most environmentally responsible way it can. It is one of tech’s biggest names and as such, it is under a lot of scrutiny by environmentalists and the public.

    To help the cause, Microsoft has created a new role, dedicated specifically to data center sustainability. Not corporate sustainability, not energy strategy, not data center strategy, but data center sustainability. This week, the company announced it has hired Jim Hanna, who until recently led environmental affairs at Starbucks, to fill that role.

    “Microsoft is committed to building the most hyperscale public cloud that operates around the world in more regions than anyone else,” Rob Bernard, Microsoft’s chief environmental strategist, wrote in a blog post announcing the appointment. “This focus on growing the cloud means that we are making big investments in our data centers, where we are increasingly focused on sustainability.”

    The company’s cloud is currently served out of more than 20 regions around the world. Each cloud region usually consists of more than one data center. Microsoft says it has already invested $15 billion in this global data center network, and it’s not stopping there.

    Like other hyperscale cloud giants, Microsoft’s data center sustainability strategy consists of a combination of efficiency improvements and investment in renewable energy.

    The company spends a lot of resources on research focused on building a more efficient infrastructure. Some of the innovations to come out of those efforts include its ITPAC data center modules, which maximize the use of outside air for cooling, and more experimental ideas, such as installing small gas-powered fuel cells directly into data center racks, or converting methane from a waste processing facility into energy to power servers.

    Microsoft has been investing in renewable energy development with the goal of cleaning up its data center power supply for about three years.

    A lot of the money that pays for Microsoft’s renewable energy purchases and energy efficiency research comes from its unusual practice of charging its internal divisions for carbon emissions they are responsible for. The company instituted its internal carbon fee in 2012 and so far has found it very effective.
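
    The article does not spell out the mechanics of that fee, but conceptually an internal carbon chargeback reduces to a single formula: energy consumed, times the emission factor of the power it came from, times an internal price per ton of CO2. The sketch below (Python) illustrates the idea; every number in it is a hypothetical placeholder, not a Microsoft figure.

        # Illustrative internal carbon chargeback (all numbers are hypothetical).
        # chargeback = energy used x grid emission factor x internal carbon price
        annual_energy_mwh = 50_000     # hypothetical divisional energy use (MWh)
        emission_factor = 0.45         # hypothetical grid emission factor (tCO2 per MWh)
        internal_fee_per_ton = 10.0    # hypothetical internal carbon price ($ per tCO2)

        emissions_tons = annual_energy_mwh * emission_factor
        chargeback = emissions_tons * internal_fee_per_ton
        print(f"Emissions: {emissions_tons:,.0f} tCO2; internal chargeback: ${chargeback:,.0f}")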

    Any major web-scale cloud data center operator’s sustainability efforts are complicated by one factor: they cannot build their own data centers fast enough to keep up with growth, so they end up leasing a lot of capacity from commercial data center providers, many of whom have their own view of sustainability and renewable energy.

    Last year, for example, Microsoft leased more than 27MW of data center capacity total from three different wholesale data center providers: DuPont Fabros Technology in Northern Virginia, Vantage Data Centers in Silicon Valley, and Digital Realty Trust in the Chicago area, according to research by the commercial real estate firm North American Data Centers.

    Read more: Who Leased the Most Data Center Space in 2015

    More data center providers than ever go to great lengths to provide renewable energy options to their customers, but far from all of them do, which means major cloud providers don’t have full control over the way their cloud infrastructure is powered. There are other constraints too, such as lack of direct access to renewable energy in many places around the world, including in many American states.

    Read more: Cleaning Up Data Center Power is Dirty Work

    What all this means is that Jim Hanna’s new job may be exciting or any number of other adjectives, except one: simple.

    11:16p
    IT Innovators: Enabling Quick Iterations Based on Customer Feedback
    By WindowsITPro

    Not too long ago, mobile users considered themselves lucky to find a random WiFi connection that was fast enough to check an email or two. Today, users are surprised when they can’t connect to the Internet, and they expect to be able to safely conduct all manner of business and transactions using an Internet connection and their mobile device of choice.

    Providing the Internet connection is the (relatively) easy part; providing a secure connection that is accessible from anywhere in the world—along with apps that will meet each customer’s specific needs—is the challenge.

    ViaSat is a provider of satellite broadband and wireless services, infrastructure and technology that securely connects consumers, businesses, governments, and the military to the Internet anywhere in the world. To meet customers’ connectivity needs of today and tomorrow, the company turned to the hybrid cloud.

    “We have deployed a hybrid cloud infrastructure and platform that allows our engineers to quickly iterate on services and applications for our customers,” says Josh Barry, director, cloud, ViaSat. “These apps provide everything from login and authentication, network function virtualization and metering to billing and other backend services.”

    Barry says ViaSat made the strategic decision to shift to a hybrid cloud infrastructure several years ago.

    “Our company’s senior leadership understood the need for greater agility in responding to customer needs through software innovation, and the cloud was the way to make this happen,” he says. “We’ve built a substantial part of our platform with OpenStack cloud software as the foundation. We’re also working with a few partners to ensure we have closed loop monitoring and remediation for our growing infrastructure footprint.”

    Barry says ViaSat has realized a number of benefits from the shift to a hybrid cloud model across its base of stakeholders, including improving the resilience and responsiveness of services the company develops and providing end users with a better experience.

    “Our in-house software developers now have the ability to self-provision resources and design their software to scale infrastructure resources automatically via APIs as needs change,” he says. “That makes our developers more productive and responsive to the next set of customers, who are the companies, consumers and governments who use our services. Finally, there are the end users, who are the individuals using our services to access applications and data they need via the Internet.”
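
    Barry does not describe ViaSat’s tooling beyond naming OpenStack, but developer self-provisioning through an API generally looks something like the sketch below, which uses the standard openstacksdk Python client. The cloud entry, image, flavor, and network names are placeholders for illustration, not ViaSat’s actual configuration.

        # Minimal self-service provisioning sketch using the OpenStack SDK.
        # The cloud entry, image, flavor, and network names are placeholders.
        import openstack

        conn = openstack.connect(cloud="example-private-cloud")  # credentials come from clouds.yaml

        image = conn.compute.find_image("ubuntu-lts")
        flavor = conn.compute.find_flavor("m1.medium")
        network = conn.network.find_network("app-tier")

        server = conn.compute.create_server(
            name="billing-api-01",
            image_id=image.id,
            flavor_id=flavor.id,
            networks=[{"uuid": network.id}],
        )
        # Block until the instance is ACTIVE, then print its network addresses.
        server = conn.compute.wait_for_server(server)
        print(server.name, server.addresses)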

    With the move to the hybrid cloud model, and all the functionality that move enabled, ViaSat had to fundamentally rethink how the IT department would support developers.

    “We had to [give IT] the tools it needed to improve agility while maintaining our governance and policy structures,” says Barry. “So, the effect on our IT department is evolving, but overall it’s been very positive because our entire company is better equipped to deliver new value to our customers and compete.”

    Deb Donston-Miller has worked as a tech journalist and editor since 1990. If you have a story you would like profiled, contact her at Debra.Donston-Miller@penton.com.

    The IT Innovators series of articles is underwritten by Microsoft, and is editorially independent.

    This first ran at http://windowsitpro.com/it-innovators/it-innovators-enabling-quick-iterations-based-customer-feedback

