Data Center Knowledge | News and analysis for the data center industry

Friday, July 28th, 2017

    12:00p
    Digital Realty Buys Stake in Cloud Connectivity Enabler Megaport

    Digital Realty Trust has acquired a small stake in Megaport, the Australian technology firm that makes it easy for companies to connect their servers to the networks of top cloud providers, such as Amazon Web Services, Microsoft Azure, and Google Cloud Platform.

    Digital Realty paid $8 million for a 4.9 percent stake in Brisbane-based Megaport in June, the data center REIT revealed in its second-quarter earnings report Thursday. The two companies have been partners since last year, when Megaport started offering its cloud connectivity services to companies inside Digital Realty’s facilities.

    It’s not uncommon for companies to seek ownership stakes in strategically important partners, and the Megaport partnership is especially important for Digital, which made its serious move into the retail colocation and interconnection services market only recently, acquiring Telx in the US in 2015 and an eight-data-center portfolio in Europe from Equinix last year.

    See also: Digital Realty Study: Direct Connects to Cloud Bring 50X Less Latency

    All players in this space have been beefing up their capabilities for providing private network links to cloud providers; these capabilities are seen as crucial to enabling the enterprise IT architecture of the future, where companies have a mix of physical IT assets they control and virtual infrastructure in cloud providers’ data centers, all interconnected as hybrid clouds.

    A colocation data center where many enterprise customers and service providers can interconnect their networks is an ideal place to enable such hybrid architectures.

    That the fiscally conservative Digital is spending millions on a stake in Megaport underscores how important interconnection has become in this market. The San Francisco-based REIT has integrated Megaport’s software-defined networking platform into its own interconnection platform, launching the Digital Realty Service Exchange, a cloud connectivity platform for its customers, in November.

    Read more: Digital Realty Challenges Equinix With Cloud Connectivity Platform

    “We wanted to really maintain close ties [to Megaport as a] software development partner, because it’s integrated in our interconnection platform,” Jarrett Appleby, chief operating officer at Digital Realty, said on the company’s earnings call Thursday. “Staying close in terms of service development roadmap and go-to-market.”

    Instead of requiring a customer to buy a physical interconnect between its network and AWS, for example, and to do all the engineering work needed to implement the actual interconnection, Megaport’s SDN platform sets up the link automatically.
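
    To make that concrete, here is a minimal sketch of what API-driven interconnection provisioning can look like. The endpoint, field names, and authentication scheme are hypothetical illustrations of the general pattern, not Megaport’s actual API.

    ```python
    # Hypothetical sketch of API-driven cloud interconnection, in the spirit
    # of an SDN platform like Megaport's. The endpoint and fields below are
    # assumptions for illustration, not Megaport's real API.
    import requests

    API_BASE = "https://api.sdn-provider.example/v1"  # hypothetical endpoint
    TOKEN = "my-api-token"                            # obtained out of band

    def order_cloud_link(port_id: str, provider: str, region: str, mbps: int) -> str:
        """Request a virtual circuit from an existing colo port to a cloud on-ramp."""
        resp = requests.post(
            f"{API_BASE}/virtual-circuits",
            headers={"Authorization": f"Bearer {TOKEN}"},
            json={
                "portId": port_id,          # customer's physical port in the facility
                "cloudProvider": provider,  # e.g. "AWS", "Azure", "GCP"
                "region": region,
                "bandwidthMbps": mbps,      # software-defined, so resizable later
            },
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["circuitId"]

    # The platform takes care of the physical cross connect and routing setup
    # that the customer would otherwise have to engineer manually.
    print("provisioned:", order_cloud_link("port-123", "AWS", "us-east-1", 500))
    ```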

    Interconnection deals signed in the second quarter will generate $7.6 million in annualized revenue for Digital Realty, the company said. But the benefits of interconnection go beyond the direct revenue associated with it; the ability to interconnect with other companies’ networks is seen as making a colocation facility more attractive to a company looking for somewhere to keep its servers.

    See also: Digital Realty — How to Survive a Market Meltdown and Come Out on Top (Digital Realty CEO Bill Stein on The Data Center Podcast)

    3:00p
    Visa Plans Its First Non-US Data Centers — In London and Singapore

    Credit card company Visa announced this week the first international expansion of its data center footprint. Two new facilities will double the number of server farms the company operates, adding London and Singapore to its current data center locations in Virginia and Colorado.

    As recently as a couple of years ago, Visa’s stated policy was to keep all data processing facilities within US borders, a policy that was evidently made obsolete when the company began its acquisition of Visa Europe in 2015. According to Visa, the completion of that deal in June 2016 began a multi-year process to combine the two companies into a single global organization with a shared technology platform, as it brings 3,200 European clients onto VisaNet — the company’s global transaction processing network.

    The new London facility will be a retrofit of Visa Europe’s legacy data center in the UK. Visa says that when completed, the 10,000-square-foot facility will bring increased resiliency for clients in the region and accelerate speed-to-market for new payment innovations in the European market.

    See also: Visa’s Data Center Has a Moat

    “The launch of our state-of-the-art data center in Europe is a critical milestone, enabling all our clients and partners to take advantage of Visa’s global technical resources and assets,” Bill Sheedy, CEO of Visa’s European operations, said in a statement.

    The Singapore data center will also be a 10,000-square-foot facility; it will represent the company’s first transaction processing center in Southeast Asia. According to Visa, it will serve clients, cardholders, and merchants both in the region and in Visa’s global network.

    “As home to our Asia Pacific headquarters, Singapore is already a major hub for the Visa business,” said Chris Clark, a group executive of Visa’s Asia Pacific operations. “With our new processing facility in Singapore, we’re strengthening our ability to meet rising demand for digital payments, while driving the pace of payment innovation across the Asia Pacific region.”

    The new data center in Singapore will be Visa’s third major investment in the Asian city-state in the last two years. In September, Visa launched at its Singapore headquarters the first international campus of Visa University — a purpose-built learning center with a “digital-at-the-core” curriculum.

    See also: Land-Constrained Singapore to Study Underground, High-Rise Data Centers

    Last year Visa introduced the Singapore Innovation Center as something of a think tank to bring together the company’s technologists with its clients, partners, and the Asia Pacific tech community “to develop the next big idea in payments.”

    The upcoming Singapore data center is likely to support the infrastructure back-end for some of those big ideas.

    3:30p
    Cloud Provider Spend on Intel Data Center Chips up by One-Third in Q2

    Ian King (Bloomberg) — Intel Corp. has a message for investors who are betting the semiconductor maker’s dominance is waning: Not so fast.

    The company’s Data Center Group — which sells server chips, a business where Intel has almost 100 percent market share — posted sales of $4.4 billion in the second quarter, a gain of 9 percent. Revenue in the personal-computer processor division rose 12 percent even as the overall PC market shrank. The company also gave an upbeat forecast for third-quarter and annual revenue, sending the shares up as much as 4.5 percent in extended trading.

    Though most of Intel’s sales come from PC chips, the more-lucrative server business has propelled profit and accounted for most of the company’s revenue growth since 2011. Under Chief Executive Officer Brian Krzanich, Intel has been branching out into new markets such as memory chips that could help make up for the PC market’s persistent decline. At the same time, Intel’s newest products for servers and personal computers may be driving a spike in demand.

    “You have to take your hat off to Intel in terms of the ability to foster growth in a very flat space,” said Daniel Morgan, a fund manager at Synovus Trust Co., which owns Intel shares. “They are definitely bucking the trend.”

    Intel shares rose as high as $36.55 following the announcement, after closing at $34.97 in New York trading. The stock has lagged behind peers this year with a 3.6 percent drop, compared with a 21 percent advance in the Philadelphia Stock Exchange Semiconductor Index.

    “The market, both consumer and business, continue to look for higher performance,” Chief Financial Officer Bob Swan said in an interview. “The higher the performance the consumer and business needs, the better for us. We generate higher average selling prices from those units.”

    In the data center business, while demand was down from companies and governments, revenue from cloud providers increased 35 percent. Makers of networking equipment are also buying more Intel chips, Swan said. Sales in the company’s memory-chip division jumped 58 percent, and the unit is on course to be profitable this year, he said.

    Second-quarter net income climbed to $2.8 billion, or 58 cents a share, from $1.3 billion, or 27 cents, in the same period a year earlier. Sales rose 9 percent to $14.8 billion. Excluding certain items, profit was 72 cents a share. On that basis, analysts had projected 68 cents in profit on revenue of $14.4 billion.

    Third-quarter revenue will be about $15.7 billion, the Santa Clara, California-based company said Thursday in a statement. Analysts had projected $15.3 billion, according to the average of estimates compiled by Bloomberg. Annual sales will be as much as $61.3 billion, topping the average prediction of $60.2 billion.

    Intel is facing a revived challenge from longtime straggler Advanced Micro Devices Inc. The smaller chipmaker, which for years has tried and failed to make a significant dent in Intel’s dominance in either the PC or data-center markets, predicted a jump in sales of as much as 26 percent in the current quarter as it begins offering new server chips it says can compete with Intel for the first time in nearly a decade.

    Earlier this month, market researchers said the PC market lost ground again in the second quarter after showing some growth earlier this year, signaling that computer makers are struggling to retain the interest of consumers. Worldwide shipments totaled 60.5 million in the second quarter, a decline of 3.3 percent from the same period a year ago, according to IDC.

    4:00p
    Selecting the Best Database for Your Organization, Part 2

    Franco Rizzo is Senior Pre-sales Architect at TmaxSoft. 

    Editor’s Note: The first part of this article examined enterprise database strategies for on-premise/private cloud and hybrid cloud environments. Next up: how to approach your strategy for public cloud, appliance-based, and virtualized environments.

    Public Cloud

    The main advantage of the public cloud is its almost infinite scalability. Its pay-as-you-go cost model is another advantage. It also offers faster go-to-market capability and favors newer applications, since running legacy applications in the cloud can be challenging.

    As in a hybrid cloud, sprawl can be a problem in the public cloud. Without a strategy to manage and control a public cloud platform, costs can spiral and negate the expected savings and efficiency gains. Keep in mind, too, that the public cloud may open the door to shadow IT, creating security issues.

    Data visibility is another downside; once data goes into a cloud, it can be hard to determine where it actually resides, and sovereignty laws can come into play for global enterprises. Trust in the public cloud is an issue for CIOs and decision makers, which is why hybrid – the best of both worlds – is such a popular deployment option.

    Public clouds also are often homogeneous by nature; they are meant to satisfy many different enterprises’ needs (versus on-premise, which is designed just for one company), so customization can be a challenge.

    While a public cloud is Opex-friendly, it can get expensive after the first 36 months. Keep TCO in mind when deploying a workload: its lifecycle and overall cost benefit, as well as how the true cost of that application will be tracked.

    Latency issues can occur depending on how an enterprise has architected its public cloud and deployed applications or infrastructure, and they can greatly affect the quality of the user experience. To improve performance, distributing apps and data close to the user base is a better solution than the traditional approach, where everything sits in one data zone.

    Disaster recovery will be built in, so there is no need for the enterprise to architect it on its own. Security in a public cloud is always a challenge, but it can be mitigated with proper measures such as at-rest encryption and well-thought-out access management tools and processes.
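
    As one concrete illustration of such measures, the sketch below uses AWS’s boto3 SDK to enforce default at-rest encryption and block public access on an S3 bucket. The bucket name is a hypothetical example, and the same ideas apply to other providers.

    ```python
    # Example at-rest encryption and access management measures for a public
    # cloud deployment, using AWS's boto3 SDK. The bucket name is
    # illustrative; credentials come from the usual AWS config chain.
    import boto3

    s3 = boto3.client("s3")
    bucket = "example-enterprise-db-backups"  # hypothetical bucket

    # Every object written to the bucket is now encrypted at rest by default.
    s3.put_bucket_encryption(
        Bucket=bucket,
        ServerSideEncryptionConfiguration={
            "Rules": [
                {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms"}}
            ]
        },
    )

    # Pair encryption with access management: block all public access outright.
    s3.put_public_access_block(
        Bucket=bucket,
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": True,
            "IgnorePublicAcls": True,
            "BlockPublicPolicy": True,
            "RestrictPublicBuckets": True,
        },
    )
    ```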

    Appliance Database

    Traditionally, this is an on-premise solution – either managed by a vendor or in an enterprise’s own data center. There are many popular vendors that provide this solution, and using one vendor to control the complete solution can offer performance and support gains.

    However, this also can be a disadvantage, because it locks an enterprise into a single vendor, and appliance-based databases tend to be a niche, use-case-specific option. Vendor selection is an essential process to make certain that the partnership works both in the present and the future.

    Appliance databases, because of their specialized, task-specific nature, are expensive. They can, however, be cost-effective over time if they are deployed properly.

    Virtualized Database

    One advantage of virtualization is the ability to consolidate multiple applications onto a given piece of hardware, which leads to lower costs and more efficient use of resources.

    The ability to scale is built into a virtualized environment, and administration is simple, thanks to the many existing tools for managing such environments.

    With virtualization, patching can sometimes be an issue: each guest OS sits on top of the hypervisor, and IT may have to patch each VM separately on each piece of hardware.
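
    Where no management tooling is in place, that per-VM patching is often scripted. Below is a minimal sketch using Python’s paramiko library to run updates over SSH; the hostnames, the “ops” user, and the apt-based package manager are all assumptions for illustration.

    ```python
    # Minimal sketch of patching each VM separately over SSH with paramiko.
    # Hostnames, the "ops" user, and the apt-based package manager are
    # assumptions; adapt them to your own inventory and distro.
    import paramiko

    VMS = ["db-vm-01", "db-vm-02", "db-vm-03"]  # hypothetical guest hostnames

    def patch_vm(host: str) -> int:
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(host, username="ops")  # key-based auth assumed
        _, stdout, _ = client.exec_command(
            "sudo apt-get update && sudo apt-get -y upgrade"
        )
        status = stdout.channel.recv_exit_status()  # block until the run finishes
        client.close()
        return status

    for vm in VMS:
        print(vm, "ok" if patch_vm(vm) == 0 else "FAILED")
    ```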

    It’s best to plan for a higher initial Capex, because the cost of installing a database needs to be accounted for. An enterprise can opt for an open-source solution like KVM, but this solution often requires additional set-up expenses.

    A con is that the host itself becomes a single point of failure: if its hardware fails, every VM on it goes down. Fault-tolerant disaster recovery is a major concern and must be well architected.

    There can be network traffic issues, because multiple applications will be trying to use the same network card. The underlying server, and the environment around it, must be purpose-built for virtualization.

    Virtualization is also well suited to repurposing older hardware, because IT can consolidate many applications onto machines that might otherwise have been written off. And it lends itself to clustering; being able to cluster multiple VMs across multiple servers is a key benefit for disaster recovery.

    It comes with upfront Capex, but consolidation reduces Opex over time (many processes can be automated), so lower operational expenses lead to a quicker return and a lower total cost of ownership. However, licensing costs can get expensive.

    An enterprise can achieve better data center resource utilization because of the smaller footprint, which saves on the costs of running servers. Virtualization also allows an enterprise to host multiple virtual databases on the same physical machine while maintaining complete isolation at the operating system layer.
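
    That isolation is easy to see on a KVM host (the open-source option mentioned earlier). The sketch below uses the libvirt Python bindings to list each guest and its state; the VM names in the comment are hypothetical.

    ```python
    # Illustration of OS-layer isolation on a KVM host using the libvirt
    # Python bindings: each database VM is a separate guest with its own
    # kernel and OS image.
    import libvirt

    conn = libvirt.open("qemu:///system")  # local KVM/QEMU hypervisor
    for dom in conn.listAllDomains():
        state, _reason = dom.state()
        running = state == libvirt.VIR_DOMAIN_RUNNING
        # e.g. "mysql-vm  running" / "postgres-vm  not running" (hypothetical names)
        print(f"{dom.name():<20} {'running' if running else 'not running'}")
    conn.close()
    ```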

    Selecting the Right Database

    As you can see, selecting a deployment option is not a trivial matter. So how can a CIO or systems integrator mitigate the risk of choosing one option over another? Cost can’t be the only driver.

    Just as the mainframe eventually led to the cloud, enterprises may find success if they can enable a simple path from legacy on-prem databases to a private cloud with APIs into the public cloud. This lets the legacy architecture connect to mobile, IoT, and AI workloads, and it can serve as a launching pad for a hybrid cloud architecture built on best-of-breed public cloud services: storage, applications, and so on.

    Every enterprise has its own challenges, goals and needs, and there is no one-size-fits-all recommendation when selecting a database. Carefully examine your own infrastructure as well as ROI expectations, long-term business goals, sovereignty laws, IT capabilities and resource allocation to determine which of these databases is the right one for your enterprise – now and years down the line.

    Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    5:01p
    Iron Mountain Buys Denver Colocation Firm Fortrust for $128M

    Iron Mountain, the real estate investment trust that in addition to storing and managing digital data and physical documents for its clients also provides data center colocation services, has acquired Mag Datacenters, the operator of Fortrust, a Denver colocation provider.

    This is nearly-70-year-old Iron Mountain’s first acquisition in the data center market, as it continues to invest in expanding this part of its business. Known for its high-security underground data center and storage facility inside a repurposed limestone mine outside of Pittsburgh – which among other things stores historical documents and original tapes of classic films and music albums for Hollywood studios and record companies — the company also has data centers in Northern Virginia and in Boston, where it’s based.

    For its $128 million in stock and cash, Iron Mountain is getting Fortrust’s 210,000-square-foot Denver-area data center, which includes about 70,000 square feet of rentable space and 9MW of power capacity, three-quarters of which is leased to about 250 customers. There’s also expansion potential for another 40,000 square feet and 7MW of power.

    Together with its other facilities, the Denver site will bring Iron Mountain’s total data center capacity to 30MW, with 70MW of expansion potential, the company said in a statement.

    The acquisition gives the company its first presence in the western US, providing its existing customers with the option to lease a remote data center location with the same provider.

    5:54p
    Here’s What Wall Street Is Saying About Amazon’s Earnings

    (Bloomberg) — Amazon.com Inc. shares are lower after the company forecast a potential quarterly loss for the first time in two years, a reminder to investors that its reshaping of the retail and cloud-computing industries doesn’t come without a cost.

    The company on Thursday said it’s boosting spending on new warehouses to meet growing e-commerce demand, data centers for its Amazon Web Services division, video programming to keep customers engaged, and gadgets like the Echo line of voice-activated speakers to stay on the cutting edge of the emerging smart-home market. This comes after shares hit all-time highs Thursday, briefly making Jeff Bezos the richest man in the world.

    While analysts remain optimistic about the future of the e-commerce giant and growing revenue, the report underscored the high cost of its business model. Here’s a roundup of what Wall Street is saying.

    Goldman Sachs Group Inc., Heath Terry

    “We continue to believe that we are in the early stages of the shift of compute to the cloud and the transition of traditional retail online and that the market is underestimating the long-term financial impact of both to Amazon. As Amazon continues to generate high cash returns on cash invested despite the growing scale of its investments, with significant option value in early stage efforts in AI, Voice, and robotics, we believe growth acceleration like that we saw in second quarter is likely to continue, in contrast with consensus estimates.”

    Jefferies Group LLC, Brian Fitzgerald

    “Amazon delivered a strong print despite heavy investment to support growth. Revenue came in ahead of consensus and high end of guidance while operating income was below consensus. Margin guidance came in below expectations on unfavorable currency exchange and continued investment in India, fulfillment, digital content, and Amazon Web Services (AWS). Unit and revenue growth accelerated in the quarter on strong Prime membership growth while AWS delivered solid revenue growth of 42 percent year-over-year. We reiterate our Buy and raise our price target to $1,250.”

    Cantor Fitzgerald LP, Kip Paulson

    “We’re reiterating our Overweight rating, increasing our price target to $1,150 from $1,050, and raising our estimates to reflect another impressive quarter, with net sales growth accelerating to 26 percent and with solid guidance that assumes a continuation of 20-28 percent growth in the third quarter, partially offset by aggressive investments in fulfillment capacity, headcount, video content, AWS, and other areas.”

    Credit Suisse Group AG, Stephen Ju

    “Despite the operating profit shortfall in 2Q17, the better-than-expected revenue as well as the broad-based acceleration across all of its operating segments in our view validates the company’s rationale to continue investing…We maintain our Outperform rating and our updated investment thesis for Amazon shares is predicated on the following longer-term factors: 1) re-establishment of e-commerce segment operating margin expansion, 2) ongoing margin benefit due to shipping loss moderation, and 3) upward bias to AWS revenue forecasts.”

    Citigroup Inc., Mark May

    “We view the second quarter results as supportive of the long-term prospects for Amazon, given the resilience of AWS to pricing changes and accelerating growth in the retail business. Our earnings numbers are being revised lower in the near-term, however, to reflect the higher level of investment.”

    Amazon has 38 buy ratings, five holds and one sell with an average 12-month price target of $1,142, according to data compiled by Bloomberg. Shares closed Thursday at $1,046.

