Data Center Knowledge | News and analysis for the data center industry

Thursday, March 10th, 2016

    Time Event
    1:00p
    Will Hybrid Cloud Vanish Into Thin Air? Data Center Execs Weigh In

    Corporate CIOs and CTOs are wrestling with how to balance cost, performance, security, and compliance, while retaining flexibility and control of increasingly complex IT stacks.

    These are crucial questions that investors, corporate decision makers, and industry professionals must consider carefully before deploying the next round of capital.

    A Real World Example

    In the case of Juniper Networks, the company set a target for a zero data center footprint for corporate IT in order to innovate faster and improve how corporate applications functioned.

    Read more: Going to the Cloud: Stories from the Frontlines

    Between 2011 and January 2016, Juniper went from 18 data centers to just one with only 50 racks, resulting in significant cost savings and increased efficiency.

    Is this an anomaly, or a glimpse into the not too distant future?

    Industry Experts Weigh In

    Will the hybrid cloud fade away over time as enterprise customers continue to migrate existing workloads and launch new applications specifically designed for the public cloud?

    Data center REIT analyst Vincent Chao asked the question and received three different answers from presenters at the recent Deutsche Bank Media, Internet, and Telecom conference.

    Don’t Bet the Farm On Hybrid

    DuPont Fabros Technology CFO Jeff Foster felt that the hybrid cloud is just a temporary solution along the way to public cloud adoption.

    The DuPont Fabros management team is now under the leadership of CEO Chris Eldredge, who took the reins last year after coming over from NTT Communications. Eldredge and the board concluded that a 100 percent wholesale data center approach was the right answer for the company after it briefly tried a mixed business model, offering both wholesale and retail colocation space.

    Foster shared that retail colocation was a business that could fade away over time, as more workloads continue to migrate to the public cloud. However, he was also quite candid in pointing out that not pursuing a retail strategy was partially a function “of the hand that was dealt.”

    DuPont Fabros runs a lean operation with just 113 employees, which contributes to sector-leading operating margins. Foster estimated the company would have to hire four to five times the headcount in order to provide retail product and service offerings. That just isn’t in the cards.

    Read more: DuPont Fabros Firing on All Cylinders: Time to Step on Gas?

    Notably, DuPont Fabros has also tweaked its business model away from 100 percent triple-net leases in order to compete for large cloud deployment “RFPs” from the likes of Amazon Web Services and Oracle.

    Hybrid Isn’t Going Away

    QTS Realty Trust CIO Jeff Berson believes enterprise hybrid cloud architecture will be a permanent part of the IT landscape. He shared a well-known public cloud CEO’s view that “…the only enterprise businesses that could be 100 percent hosted in the cloud were the ones that were designed from scratch to operate in the cloud.”

    However, Berson is actually agnostic as to how IT architecture will evolve over time. QTS offers a full range of products and services, including a suite of in-house cloud and managed services to go along with wholesale and colocation offerings.

    Cloud options help to create an inflection point, which drives legacy deployments toward third-party data center environments, according to Berson. QTS employs engineers to help enterprise customers migrate from legacy data centers or integrate with public cloud providers.

    Berson mentioned that QTS’s leasing success has been driven by cloud deployments, including the top eight public cloud providers. Additionally, some cloud providers look to wrap products around the FedRAMP-compliant QTS Federal Cloud in order to access that market.

    Read more: QTS Managing for Long-Term Growth, Expanding in Seven US Markets

    Berson sounded quite confident that hybrid cloud architecture is here to stay, and enterprise customers will always need customization, compliance, and support.

    Solutions Vary By Vertical

    Digital Realty Trust presenters included CEO Bill Stein, CFO Andy Power, and CTO Chris Sharp. This team of executives felt that whether hybrid cloud solutions are permanent or just a stepping stone was highly dependent on the enterprise vertical market.

    Content and social are at one end of the spectrum, where a large percentage of customer deployments are outsourced to public cloud providers.

    Source: Digital Realty – March 2016 presentation

    On the other end, financial services and healthcare customers have significant regulatory and compliance hurdles. Stein felt that customers in the financial vertical will never trust 100 percent of their data to the cloud.

    Digital believes that storage underpins the hybrid cloud environment. The Telx acquisition allows Digital to provide colocation solutions next to existing large-footprint storage and compute nodes.

    Digital currently has all of the product offerings that it needs, because it partners with customers like IBM, AT&T, and Equinix to provide hybrid solutions for enterprise customers.

    Read more: Digital Realty Leans on IBM, AT&T to Hook Enterprises on Hybrid Cloud

    Stein felt that Digital’s global footprint and the consistency of its product offerings and documentation help give it a leg up on most of the competition.

    International Update

    Singapore remains Digital’s hottest international market, with 100 percent of existing inventory leased at attractive margins, and considerable interest in the next phase from existing customers. Stein described the London metro as being “pretty strong,” with Amsterdam “picking up,” as well.

    Digital is seeing interest from all of the public cloud providers in expanding across the EU geography. Stein was asked about a possible Interxion acquisition, and the likelihood of acquiring any of the eight Telecity data centers that EU regulators require Equinix to divest prior to closing.

    He responded that three parameters must be met: 1) strategic fit; 2) all acquisitions must be accretive to earnings in year one; and 3) an acquisition must be “leverage neutral” (meaning that no games will be played with using more debt to make a potential acquisition look more attractive).

    Want to know how data center providers did in the fourth quarter? Curious about investing in data center companies in general? Visit the Data Center Knowledge Investing section for everything you need to know about this high-performing sector.

    4:00p
    Cloud IaaS – One Size Does Not Fit All

    Tim Beerman is Chief Technology Officer for Ensono.

    “Do more with less.” It’s an increasingly common phrase heard by CIOs as they watch their budgets remain flat or shrink, and the scope of their IT responsibilities grow.

    IaaS cloud services are here to the rescue. Well, possibly. Established and new cloud offerings promise to increase productivity, reduce hardware and software costs, and provide scalability. Of course, these services come at a cost, both in money spent on services and resources and in training to leverage them appropriately. Optimally implementing a new cloud service to meet specific enterprise requirements can be challenging and time consuming. What’s more, leveraging cloud IaaS often requires CIOs to hire and/or train new staff to understand the capabilities and limitations of the cloud platform. Understanding this, CIOs are beginning to look to experienced IT service providers as a cost-effective way to implement and manage the cloud service infrastructure.

    An experienced IT service provider that understands best practices aligned with a particular enterprise IaaS can make an enormous difference in the ROI of the entire endeavor. Unfortunately, a service offering that fits one client may not fit another. CIOs will generally be well served if their IT partner can provide the insight required to help establish a meaningful baseline based on the client’s vision and cloud service capabilities. This helps ensure the client’s short-term goals and future needs are met, and that the service can be effectively managed and its value determined.

    With customization options available in many complex IaaS offerings, how can you leverage a service provider to determine the best combination of capabilities? The following approach could be a great place to start:

    1. Ensure your provider gets an accurate understanding of the current business process and the applications that drive that process. Leverage the service provider’s experience: providers have typically solved similar issues in the past. By understanding how business processes are being implemented and looking for similarities across current and past engagements, they can often apply best practices that speed up the process. This may take some investment; however, it ensures the service provider can best meet the business needs.
    2. Identify gaps in the current process and understand which elements are best served by cloud IaaS. Reviewing current processes with internal stakeholders and the client allows for a collaborative effort to pinpoint the exact processes that are working and, most importantly, those that are not working or are working less than optimally. Cloud services implementation can be tricky, and initial needs and requirements frequently evolve over time. In some cases the best approach is not to move an entire application to the cloud, or at a minimum there are likely dependencies on other applications or data that won’t transition to an IaaS platform. It can be very beneficial to choose a service provider that not only has expertise in designing and managing cloud IaaS, but can also provide similar services for those parts of the infrastructure that cannot reside in cloud IaaS.
    3. Forecast and plan. Anticipating the future is critical to ensuring objectives are consistently met over the long term. Business needs inevitably change, particularly as companies grow or make operational changes due to dynamic market conditions, and cloud services and infrastructure evolve over time as well. A service provider can help shepherd you through the rapidly expanding cloud services environment. With the right services and capabilities, they can help you quickly navigate the evolving cloud landscape and provide insight into how the cloud is projected to shift in the coming years. The right service provider should offer the flexibility, both technically and commercially, to help you leverage cloud IaaS in the right ways to achieve performance, scalability, and cost efficiencies as your company evolves.

    Cloud IaaS providers build services to cover a wide range of current and future needs. While broad offerings may seem efficient, they can leave many clients with less reliability, or with more services and cost, than they actually need. By using the approach above, clients are optimally positioned to get the most out of their cloud service. Having an experienced partner, one that understands the client’s needs and infrastructure as well as the IaaS offering, can speed up implementation, improve results, and save money.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    5:09p
    How Juniper IT Went from 18 Data Centers to One

    Mainly because it is a technology company, going all-in with cloud was not something Juniper Networks’ IT leaders had to sell to the board. It was almost the other way around.

    “It was a fundamental understanding,” Juniper CIO Bob Worrall said about the decision several years ago to shut down the company’s then sprawling data center footprint and transition to a cloud-centered IT model. “We have a very tech-savvy board, most of them from firms here in the [Silicon] Valley. Four years ago, they all saw the writing on the wall.”

    The San Francisco Bay Area native joined Sunnyvale, California-based Juniper about eight months ago, after four and a half years as CIO of Nvidia. He replaced Juniper’s former CIO Bask Iyer, who left for VMware.

    From 18 Data Centers to 60 Racks

    When he arrived at Juniper, the transition had been mostly completed. Juniper IT had gone from 18 data centers to one with fewer than 60 racks in it.

    Most applications were moved to cloud services of all flavors: Infrastructure-as-a-Service, Platform-as-a-Service, and Software-as-a-Service; public and private.

    A lot of the data center footprint was also reduced by taking a hard look at all the applications that were running in Juniper data centers, getting rid of redundancies, and replacing some legacy apps with more modern ones, such as SAP’s enterprise software.

    Worrall’s team completed the final phase of a large SAP implementation in January. It is a private cloud service running in a T-Systems data center in Houston.

    The small data center footprint that’s left is running legacy apps that haven’t found a new home in the cloud yet. While the goal is to phase the ones that are close to “end of life” out, moving the rest to the cloud, and shutting down the last data center by next summer, Worrall accepts the possibility that he and his colleagues may not find a cloud solution for all of them. “That will be the long tail of that story,” he said.

    Worrall will be speaking about Juniper IT’s transition to a cloud-first model at next week’s Data Center World Global conference in Las Vegas (more details about the event below).

    Engineering Data Centers Kept In-House

    But that’s not all. Besides supporting corporate needs, there’s a whole other side of any major tech company IT team’s job, and that is providing the infrastructure for building and testing products.

    The strategy for the engineering side is to have two physical, on-premises data centers for testing and simulation. One of them is already up and running on Juniper’s new campus in Bangalore, and the second one is currently being built out in Quincy, Washington.

    In addition to supporting the engineering function with “the plumbing,” as Worrall calls things like networking and management of critical application assets, such as source code repositories, his team participates “heavily” in the engineering and product release processes.

    “Juniper IT is also a customer of Juniper,” he said. His organization tries to behave just like a normal customer would, which was a big part of the reason business leaders pushed so hard for the transition to cloud. They saw their customers make the transition and wanted to stay on the same page with them.

    Worrall’s team uses Juniper products, makes engineering requests, and gets services from the company’s service organization just like other customers do, Worrall said. “That whole customer-number-one notion makes IT very important at Juniper, and I think they take it very seriously.”

    Join Juniper CIO Bob Worrall and 1,300 more of your peers at Data Center World Global 2016, March 14-18, in Las Vegas for a real-world, “get it done” approach to converging efficiency, resiliency and agility for data center leadership in the digital enterprise. More details on the Data Center World website.

    6:49p
    Rackspace Unveils Second-Gen Bare Metal Cloud Server Based on Open Compute
    By The WHIR

    Rackspace has updated its OnMetal Cloud Server line this week to deliver enhanced connectivity and performance to customers with intensive data processing, compute power, and quick scaling and deployment needs. OnMetal v2 went into general availability Thursday, the company announced.

    The next generation of OnMetal servers is still bare-metal, single-tenant, and API-provisioned in minutes. The servers run Microsoft and Linux workloads with OpenStack-powered public cloud flexibility and bare-metal speed and security, Rackspace said.
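
    Because OnMetal machines are provisioned through the OpenStack Compute (“Nova”) API, a server-create request for bare metal looks like any other cloud server request. A minimal sketch of the request body follows; the flavor name and image UUID are hypothetical placeholders for illustration, not real Rackspace identifiers:

    ```python
    import json

    # Request body for POST /servers in the OpenStack Compute v2 API.
    # The flavor name and image UUID below are hypothetical placeholders.
    payload = {
        "server": {
            "name": "onmetal-demo-01",
            "flavorRef": "onmetal-general2-small",  # hypothetical OnMetal flavor
            "imageRef": "11111111-2222-3333-4444-555555555555",  # placeholder image
        }
    }

    print(json.dumps(payload, indent=2))
    ```

    The point is that switching a workload from a VM to bare metal is, from the client’s side, just a different `flavorRef` in the same API call.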

    Built on second generation Open Compute servers with Intel Xeon E5-2600 v3 processors, OnMetal v2 Cloud Servers now integrate with Cloud Networks, and are slated to integrate with RackConnect 3.0 in the second quarter, to improve networking security. They now offer international availability with decreased latency, and regional coverage in the US and UK. They are also 250 percent faster writing and 40 percent faster in reading, and top out at 800 GB of local boot drive storage. RAID 1 mirrored storage with two hot-swappable discs help improve uptime and reduce the risk of data loss.

    Read more: Rackspace Intros Dedicated Servers That Behave Just Like Cloud VMs

    “The demands of modern cloud architecture and workloads like Cassandra are pushing the industry to find performance anywhere it can,” Jonathan Ellis, co-founder at DataStax said in a statement. “Running core infrastructure on hardware like Rackspace’s OnMetal is the closest thing we have to an assured advantage: lower latency and more requests served with no changes to the code.”

    Rackspace has also added to the selection of guest images, and Windows support will become available in the second quarter of 2016.

    Political social networking app Brigade reports that using first-generation OnMetal Cloud Servers cut its query times by a factor of 36.5, from 7.3 seconds to 0.2 seconds.
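
    For reference, the reported factor is simply the ratio of the two latencies:

    ```python
    # Query times reported by Brigade, in seconds
    before_s, after_s = 7.3, 0.2

    speedup = before_s / after_s
    print(round(speedup, 1))  # 36.5
    ```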

    Rackspace has been shifting employees from its public cloud department to its private and hybrid cloud computing departments as the pace of new signups slows. The company is pursuing the private OpenStack cloud market, however, through a new partnership with Red Hat.

    This first ran at http://www.thewhir.com/web-hosting-news/rackspace-onmetal-cloud-server-updates-target-performance-security

    11:47p
    Bill Gates: Quantum Cloud Computing is Coming (Soon)

    While you’re busy explaining to some clients why cloud is the best option for their data, Bill Gates is predicting that quantum cloud computing could arrive in as soon as six years.

    In an Ask Me Anything (AMA) session on Reddit, Microsoft co-founder Gates said it isn’t clear when quantum computing will work or become mainstream, but he does think it will happen in the next decade.

    “Microsoft and others are working on quantum computing…There is a chance that within 6-10 years that cloud computing will offer super-computation by using quantum. It could help solve some very important science problems including materials and catalyst design,” Gates said.

    As ZDNet explains, “quantum computers use subatomic quantum bits, or qubits, which can be in multiple states at once.”

    “This means they can carry out more calculations in parallel and could offer new ways of solving problems that traditional digital computers find very hard to do,” according to the report.
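
    That parallelism has a concrete cost on the classical side: an n-qubit register is described by 2^n complex amplitudes, which is why simulating even modest quantum machines quickly becomes intractable. A minimal sketch of the standard textbook model, not tied to any vendor’s hardware or API:

    ```python
    import math

    # A single qubit is a pair of amplitudes (a, b) with |a|^2 + |b|^2 = 1.
    # Measuring it yields 0 with probability |a|^2 and 1 with probability |b|^2.
    plus = (1 / math.sqrt(2), 1 / math.sqrt(2))  # equal superposition of 0 and 1

    def prob(state, outcome):
        """Probability of measuring `outcome` (0 or 1) from a one-qubit state."""
        return abs(state[outcome]) ** 2

    # An n-qubit register needs 2**n amplitudes -- the source of both the
    # parallelism described above and the cost of classical simulation.
    def register_size(n_qubits):
        return 2 ** n_qubits

    print(prob(plus, 0))      # ≈ 0.5
    print(register_size(50))  # over a quadrillion amplitudes
    ```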

    Cloud companies including Alibaba’s Aliyun have invested in quantum computing; last year Alibaba announced that it had co-founded a quantum computing lab with the Chinese Academy of Sciences.

    This first ran at http://talkincloud.com/cloud-computing/bill-gates-quantum-cloud-computing-coming-soon

