Data Center Knowledge | News and analysis for the data center industry
 

Wednesday, June 14th, 2017

    12:00p
    DataBank Continues Buying Spree, Acquires Stream’s Dallas Data Center

    DataBank, a part of Digital Bridge since July of last year, has kept its buying spree alive, announcing the acquisition of a data center in Dallas’s Legacy Business Park from Stream Data Centers. The facility becomes DataBank’s largest site in the red-hot Dallas data center market.

    Located on a 16-acre site, the initial 145,000-square foot facility can be expanded to 265,000 square feet, the company said.

    The facility, DFW3, will leverage the area’s high concentration of nearby fiber carriers and will also be interconnected with DataBank’s two other data center locations near Dallas.

    “This new flagship data center represents our largest and most advanced facility in the Dallas area,” said Raul K. Martynek, CEO of DataBank. “The continued demand for quality data center space in North Texas by both our existing enterprise and content customers made the investment case quite compelling. We are excited to be bringing the facility on-line in October of this year.”

    Consistent with analyst forecasts earlier in the year, 2017 is shaping up to be a record year for data center acquisitions, and Digital Bridge has played a role in making it so. The latest deals were announced just this month: Digital Realty Trust has agreed to acquire DuPont Fabros Technology for $4.95 billion, and Peak 10 is buying ViaWest from the Canadian telco Shaw Communications in a $1.7 billion deal.

    In January, DataBank gobbled up Salt Lake City-based C7 Data Centers, as well as two data centers located in Cleveland and Pittsburgh, considered “key interconnection assets,” from 365 Data Centers.

    In keeping with Digital Bridge’s goal of becoming a major force behind the current wave of industry consolidation, the Boca Raton-based company in March acquired Vantage Data Centers, the largest wholesale data center landlord in Silicon Valley, along with its campuses in Northern California and Quincy, Washington.

    DFW3 is the second data center in Legacy Business Park sold by Stream. The other, a 150,000-square foot facility, is now owned by Research in Motion, the Canadian tech firm best known for the BlackBerry product line. Stream also sold its data center in nearby Richardson to TD Ameritrade back in 2015.

    The Dallas-Fort Worth market is the third-largest data center market in the US. Legacy Business Park and the Far North Dallas submarket make up one of the most active markets in the country for large corporate relocations and expansions, with recent announcements from Toyota, JP Morgan Chase, Liberty Mutual, FedEx, Fannie Mae, and others.

    1:00p
    Broadcom’s New Switches to Supercharge Virtual Data Center Networks

    Broadcom has made the first significant update in about two years to its StrataXGS Trident line of data center switch chips.

    The XGS line has been one of the leaders in network virtualization inside data centers, and the new line offers several benefits, including power and cost savings, better programmable support for new software-defined networking technologies, and higher switching throughput and densities. All of these will appeal to large-scale data center customers. The older chips are found in a wide assortment of networking equipment from Hewlett Packard Enterprise, Dell, IBM, and even Cisco.

    When the earlier Trident line was introduced, networks had several tiers to consolidate traffic. Those days are over, and now we have “fast, fat, and flat” networks to handle higher traffic loads between virtualized servers and hyperconverged racks.

    The new Trident 3 line will maintain backward compatibility with existing XGS installations, according to company officials. Switches with this new silicon can be programmed to handle software-defined network virtualization and service chaining protocols, including VXLAN, GPE, NSH, Geneve, MPLS, MPLS over GRE, MPLS over UDP, GUE, Identifier Locator Addressing, and PPPoE, among others. The architecture also supports programmable telemetry, including both in-band and out-of-band packet monitoring.
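
    To make the overlay idea concrete: VXLAN, for example, simply wraps each Ethernet frame in an eight-byte header (carried over UDP port 4789) that a chip like this parses and rewrites at line rate. Below is a minimal Python sketch of that header, following RFC 7348; it is purely illustrative and not Broadcom’s implementation.

        import struct

        def vxlan_header(vni: int) -> bytes:
            """Build the 8-byte VXLAN header from RFC 7348: one flags byte with
            the I bit (0x08) set, 24 reserved bits, a 24-bit VNI, 8 reserved bits."""
            if not 0 <= vni < 2 ** 24:
                raise ValueError("VNI must fit in 24 bits")
            return struct.pack("!II", 0x08 << 24, vni << 8)

        # In a real deployment the switch prepends this header (inside a UDP
        # datagram on port 4789) to the original frame; VNI 5000 is arbitrary.
        print(vxlan_header(5000).hex())  # -> 0800000000138800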

    The new switches scale from 200 Gbps to 3.2 Tbps of throughput and support Ethernet port speeds from 10 to 100 gigabits per second. This means they can be used in a wide range of applications, from smaller LANs to high-density top-of-rack environments.
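
    Some rough arithmetic shows what those figures mean at the rack level; the port counts below are illustrative, since actual configurations depend on how the chip’s lanes are split, which Broadcom has not detailed here.

        # Illustrative port-count math for a 3.2 Tbps switch chip.
        capacity_gbps = 3200
        for port_speed_gbps in (10, 25, 40, 100):
            print(f"{port_speed_gbps:>3} GbE -> up to {capacity_gbps // port_speed_gbps} ports")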

    The first and largest chip models (the BCM56870, which runs at 3.2 Tbps, and the BCM56873, which runs at 2.0 Tbps) will begin sampling now, with the smaller models available in the second half of 2017. (Image: Broadcom)

    The company claims the new chips deliver eight times the network burst absorption and congestion control of earlier XGS generations and a threefold improvement in access control list processing. All of the models are based on 16nm chip designs that deliver substantial power efficiency.

    “Trident 3-based platforms will form an important part of Extreme’s world class switching portfolio, enabling unprecedented 100GbE economics in the enterprise and end-to-end in-field upgrades of critical new switch functionality,” Eric Broockmann, VP and CTO of Extreme Networks, said in a statement.

    Pricing will be under $3,000 for volume purchases.

    3:30p
    Alibaba Cloud to Launch Data Centers in India, Indonesia

    Two years ago, Alibaba Cloud’s President Simon Hu declared, “Our goal is to overtake Amazon in four years, whether that’s in customers, technology, or worldwide scale.”

    The cloud computing arm of the China-based e-commerce giant Alibaba announced at its Computing Conference this week in Shanghai that it plans to launch data centers in Mumbai and Jakarta, a move squarely in that direction.

    The decision to expand in Indonesia goes hand-in-hand with the country’s 1,000 Start-ups Movement initiative launched last year, aimed at establishing 1,000 ventures by 2020, with a target cumulative valuation of $10 billion.

    Alibaba said it anticipates that both the India and Indonesia data centers will open during the current fiscal year, which ends on March 31, 2018.

    Together with the recently announced data center in Malaysia, Alibaba Cloud will significantly increase its computing resources in Asia. When the three new facilities open, the total number of locations will grow to 17, covering mainland China, Australia, Germany, Japan, Hong Kong, Singapore, the United Arab Emirates, and the US.

    “I believe Alibaba Cloud, as the only global cloud services provider originating from Asia, is uniquely positioned with cultural and contextual advantages to provide innovative data intelligence and computing capabilities to customers in this region. Establishing data centers in India and Indonesia will further strengthen our position in the region and across the globe,” Hu said in a statement.

    According to Synergy Research Group, Alibaba is sixth in the world behind AWS, Microsoft, Google, IBM and Salesforce in infrastructure, platform and hosted private cloud services (not including Salesforce’s more substantial SaaS business).

    Alibaba entered the cloud computing business in 2009, just three years after Amazon launched its cloud division, AWS — and Alibaba’s cloud computing effort is one of the most ambitious projects the Chinese e-commerce giant is pursuing.

    It comes at a time when China continues to place more and more restrictions on US-based cloud providers. China’s controversial Cyber Security Law went into effect on June 1 and is creating Excedrin-size headaches for some of Silicon Valley’s giants and others that want to remain in or expand into the country.

    This particular law is controversial on a few fronts: First, it requires that foreign companies store data only on servers in China. This condition could handcuff multinationals accustomed to a global internet computing environment.

    Additionally, only technology deemed “secure” can be employed; and if officials suspect any wrongdoing, foreign entities must cooperate with investigations. That includes giving the government full access to data. Many fear the law will grant Beijing an unprecedented and uncomfortable level of access to others’ data.

    A further requirement regarding certification could mean technology companies will be asked to provide source code, encryption or other critical intellectual property for review by security authorities.

    It’s yet another move by the Chinese government that, regardless of intent, gives local favorites like Alibaba Group an edge. The company’s cloud customers doubled in the final quarter of 2016, Bloomberg reported.

    Only time will tell whether the move by China adds customers to Alibaba and takes them away from US competitors.

    On a side note, Alibaba Cloud also announced that it has established a global partnership with Tata Communications, which will provide direct access to Alibaba Cloud Express Connect via Tata Communications’ IZO Private Connect service.

    4:00p
    How to Fix Your Data Growth Problems with Object Storage

    Clayton Weise is director of cloud services for Key Information Systems.

    Walk into almost any data center today – in any industry and in any region – and you’ll likely hear snippets of the same conversation. Enterprises are feeling the blowback of the data explosion. In a landscape where organizations see data stores growing by terabytes and petabytes every year, traditional block and file storage options can’t keep up. This scalability crisis is leading many companies to object storage.

    Data Growth Trends Make the Case for Object Storage

    We live in the pack rat age of data growth – no one deletes data. And there is a whole lot more data weighing down traditional storage options. With more regulations on the books demanding data retention, more Internet of Things (IoT) devices creating and capturing data, and more appetite for information and services, capacity challenges have become acute.

    Consider a few numbers that tell the story:

    1. 13 zettabytes: The estimated amount of data created, captured or replicated globally in 2016
    2. 13 million petabytes: A slightly more fathomable way to grasp that level of annual growth
    3. 415 terabytes: The average amount of new data created per second, implied by the figures above
    4. 3,000 of today’s hard disk drives: What you would need to fill every minute to keep up with 13 zettabytes of annual data growth
    5. 44 zettabytes: The data high-water mark IDC expects we’ll hit by 2020 (a quick arithmetic check of these figures follows below)
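
    Those figures hang together under simple back-of-the-envelope math and land within rounding distance of the numbers above. The 8 TB drive size in the sketch below is an assumption standing in for “today’s hard disk drives.”

        # Rough check of the growth figures above, using decimal units.
        ZETTABYTE_IN_TB = 1_000_000_000      # 1 ZB = 10^9 TB
        annual_tb = 13 * ZETTABYTE_IN_TB     # 13 ZB created in a year
        seconds_per_year = 365 * 24 * 3600
        minutes_per_year = 365 * 24 * 60
        drive_tb = 8                         # assumed capacity of a current drive

        print(f"{annual_tb / 1_000:,.0f} PB per year")                       # 13,000,000 PB
        print(f"{annual_tb / seconds_per_year:,.0f} TB per second")          # ~412 TB/s
        print(f"{annual_tb / minutes_per_year / drive_tb:,.0f} drives/min")  # ~3,100 drives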

    File and block storage (or NAS and SAN) can’t scale to the heights to which these numbers are propelling enterprises. File and block storage approaches made sense when data growth was slower. Today, though, enterprises are global, and distributed teams need rapid access to large files. This is a pain point object storage eases.

    File storage isn’t fast; its hierarchy tree slows down when it has to manage billions of files. And both file and block storage present scalability issues. By contrast, object storage is relatively flat, and it’s easy for IT teams to quickly access individually tagged files. And it’s an option with nearly unlimited capacity; it’s hard to imagine the enterprise that could ever exceed object storage’s scalability limits.

    Why the Time is Right for Object Storage

    From law enforcement agencies storing body cam video files to health organizations housing electronic medical records data to entertainment companies looking for better ways to store progressively more complex video files, enterprises across industries are adopting object storage. They do so because they need instant access to data (either to satisfy compliance regulations or customer demands), they want pay-as-you-go predictable cost structures and they must cut their management expenses. Object storage meets these needs, while also adding a layer of security thanks to its ability to track change history and roll back to previous versions of the data if, for example, ransomware infects the system.

    The rising popularity of object storage has a lot to do with other technological advances, as well. There are now widely available tools on the market for accessing data and tuning performance in object storage environments. Further, some key features we see in systems like IBM Cloud Object Storage make it an enticing choice, especially for unstructured and growing data.

    The first of these features is erasure coding. It used to be that object storage required users to write the information, get a unique code or identifier for each object and then replicate it two or three times to ensure reliability. It was an inefficient system that chewed up two or three times as much storage, depending on the number of copies you made. Worse, if you misplaced your code, your data was lost. Today, IBM addresses the replication problem with erasure coding, which is somewhat like RAID parity but more flexible. Enterprises define the erasure code algorithms based on the redundancy their data requires and their efficiency objectives. If enterprises make a small change – even 15 percent – in favor of efficiency, they can see a savings of hundreds of terabytes.
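
    The capacity math explains the appeal. As a simple illustration, compare the raw storage needed to protect the same data with three-way replication versus a k-data/m-parity erasure code; the 10+4 layout below is a common example chosen for illustration, not necessarily IBM’s configuration.

        def raw_needed_replication(usable_pb: float, copies: int) -> float:
            """Raw capacity required when every object is stored `copies` times."""
            return usable_pb * copies

        def raw_needed_erasure(usable_pb: float, k: int, m: int) -> float:
            """Raw capacity for a k-data / m-parity erasure code (survives m losses)."""
            return usable_pb * (k + m) / k

        usable_pb = 10  # protect 10 PB of objects (illustrative)
        print(f"3x replication : {raw_needed_replication(usable_pb, 3):.1f} PB raw")
        print(f"10+4 erasure   : {raw_needed_erasure(usable_pb, 10, 4):.1f} PB raw")
        # 30.0 PB vs 14.0 PB of raw capacity: the erasure-coded layout still
        # tolerates the loss of any four fragments while using less than half
        # the raw storage of three full copies.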

    The second advancement that makes object storage compelling for enterprises is in the realm of security. Out of the box, an offering like IBM Cloud Object Storage supports encryption for data on its object storage platform. Enterprises also get Active Directory or LDAP authentication, as well as object-level access control lists (ACLs) with S3 API compatibility. The result is easy integration into enterprise environments and granular access control.
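
    Because the platform speaks the S3 API, integration can be as simple as pointing standard tooling at a different endpoint. Here is a minimal sketch using boto3; the endpoint URL, credentials, bucket and key are placeholders, and the exact encryption options supported will vary by provider.

        import boto3

        # Placeholders: substitute the provider's S3-compatible endpoint and real credentials.
        s3 = boto3.client(
            "s3",
            endpoint_url="https://object-storage.example.com",
            aws_access_key_id="ACCESS_KEY",
            aws_secret_access_key="SECRET_KEY",
        )

        with open("unit42.mp4", "rb") as body:
            s3.put_object(
                Bucket="compliance-archive",          # hypothetical bucket
                Key="bodycam/2017/06/14/unit42.mp4",  # flat key, no directory tree to walk
                Body=body,
                ACL="private",                        # object-level access control
                ServerSideEncryption="AES256",        # encrypt the object at rest
            )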

    Taking the First Steps Toward Object Storage

    Cloud object storage has a lot to offer the enterprise IT leader with exabyte-level scalability demands and budgetary constraints. In terms of reliability, availability and total cost of ownership, object storage is compelling. We counsel clients to begin their object storage projects with defined scopes to ensure success from the start. Choose projects that aren’t reliant on older systems, for example, or start by creating an alternative place for backups, which won’t affect production environments while the team gets up to speed. There’s no need to go all in right from the start, but with data growth continuing at an accelerating pace, now is the time to start considering a transition to object storage.

    Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

     

    8:07p
    Alibaba to Use Own Immersion Cooling Tech in Cloud Data Centers

    The cloud computing arm of China’s e-commerce giant Alibaba Group is developing a data center cooling system that submerges server motherboards in liquid coolant to take advantage of liquid’s superior heat-transfer capabilities when compared to air.

    The company said this week it expects this approach to increase power density and save space inside the data centers it is building around the world to expand market reach, as it competes with the likes of Amazon Web Services and Microsoft Azure, which are continuing to build out their already massive global cloud data center networks. It expects the solution’s energy efficiency improvements to result in 20 percent lower data center operational costs.

    Alibaba said it plans to contribute the technology to the Open Compute Project, an open source hardware and data center design effort started about six years ago by Facebook. The Chinese company officially joined OCP this week.

    Its data center cooling technology has reached production stage, the company said, and will soon be ready for deployment in Alibaba’s cloud data centers.

    The concept of submerging servers in dielectric fluid to improve data center cooling efficiency isn’t new. Several companies already sell solutions that use it; the more prominent examples are Green Revolution Cooling and Iceotope.

    See also: How Practical is Dunking Servers in Mineral Oil Exactly?

    Alibaba hasn’t revealed much detail about its particular solution, saying only that it “involves an immersed, liquid-cooling server solution that uses insulating coolant instead of traditional air-cooling equipment. The coolant absorbs the heat of the components before turning into gas, which is then liquefied back into the main cabinet for reuse.”

    Because the technology doesn’t require massive air conditioning systems present in most of the world’s data centers, Alibaba’s immersion cooling technology “can be deployed anywhere, delivering space savings of up to 75 percent,” the company said.

    9:21p
    The Machine of Tomorrow Today: Quantum Computing on the Verge

    Jon Asmundsson (Bloomberg) — It’s a sunny Tuesday morning in late March at IBM’s Thomas J. Watson Research Center. The corridor from the reception area follows the long, curving glass curtain-wall that looks out over the visitors’ parking lot to leafless trees covering a distant hill in Yorktown Heights, N.Y., an hour north of Manhattan. Walk past the podium from the Jeopardy! episodes at which IBM’s Watson smote the human champion of the TV quiz show, turn right into a hallway, and you’ll enter a windowless lab where a quantum computer is chirping away.

    Actually, “chirp” isn’t quite the right word. It’s a somewhat metallic sound, chush … chush … chush, that’s made by the equipment that lowers the temperature inside a so-called dilution refrigerator to within hailing distance of absolute zero. Encapsulated in a white canister suspended from a frame, the dilution refrigerator cools a superconducting chip studded with a handful of quantum bits, or qubits.

    Quantum computing has been around, in theory if not in practice, for several decades. But these new types of machines, designed to harness quantum mechanics and potentially process unimaginable amounts of data, are certifiably a big deal. “I would argue that a working quantum computer is perhaps the most sophisticated technology that humans have ever built,” says Chad Rigetti, founder and chief executive officer of Rigetti Computing, a startup in Berkeley, Calif. Quantum computers, he says, harness nature at a level we became aware of only about 100 years ago—one that isn’t apparent to us in everyday life.

    IBM’s 16-qubit processor (Photo: IBM)

    What’s more, the potential of quantum computing is enormous. Tapping into the weird way nature works could speed up computing so that some problems now intractable for classical computers could finally yield solutions. And maybe not just for chemistry and materials science. With practical breakthroughs in speed on the horizon, Wall Street’s antennae are twitching.

    The second investment that CME Group Inc.’s venture arm ever made was in 1QB Information Technologies Inc., a quantum-computing software company in Vancouver. “From the start at CME Ventures, we’ve been looking further ahead at transformational innovations and technologies that we think could have an impact on the financial-services industry in the future,” says Rumi Morales, head of CME Ventures LLC.

    That 1QBit financing round, in 2015, was led by Royal Bank of Scotland. Kevin Hanley, RBS’s director of innovation, says quantum computing is likely to have the biggest impact on industries that are data-rich and time-sensitive. “We think financial services is kind of in the cross hairs of that profile,” he says.

    Goldman Sachs Group Inc. is an investor in D-Wave Systems Inc., another quantum player, as is In-Q-Tel, the CIA-backed venture capital company, says Vern Brownell, CEO of D-Wave. The Burnaby, B.C.-based company makes machines that do something called quantum annealing. “Quantum annealing is basically using the quantum computer to solve optimization problems at the lowest level,” Brownell says. “We’ve taken a slightly different approach where we’re actually trying to engage with customers, make our computers more and more powerful, and provide this advantage to them in the form of a programmable, usable computer.”
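
    The “optimization problems” Brownell refers to are typically posed in QUBO or Ising form: pick the binary vector x that minimizes x^T Q x. A toy classical brute-force version, with an arbitrary three-variable Q matrix chosen only for illustration, shows the shape of the problem an annealer is meant to attack at far larger scale.

        import itertools
        import numpy as np

        # Arbitrary 3-variable QUBO instance: minimize x^T Q x over x in {0,1}^3.
        Q = np.array([[-1.0,  2.0,  0.0],
                      [ 0.0, -1.0,  2.0],
                      [ 0.0,  0.0, -1.0]])

        best_energy, best_x = min(
            (float(np.array(x) @ Q @ np.array(x)), x)
            for x in itertools.product((0, 1), repeat=3)
        )
        print(best_x, best_energy)  # (1, 0, 1) -2.0 -- brute force works at 3 variables,
                                    # but the search space doubles with every variable added.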

    Marcos López de Prado, a senior managing director at Guggenheim Partners LLC who’s also a scientific adviser at 1QBit and a research fellow at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory, says it’s all about context. “The reason quantum computing is so exciting is its perfect marriage with machine learning,” he says. “I would go as far as to say that currently this is the main application for quantum computing.”

    Part of that simply derives from the idea of a quantum computer: harnessing a physical device to find an answer, López de Prado says. He sometimes explains it by pointing to the video game Angry Birds. When you play it on your iPad, the central processing units use some mathematical equations that have been programmed into a library to simulate the effects of gravity and the interaction of objects bouncing and colliding. “This is how digital computers work,” he says.

    By contrast, quantum computers turn that approach on its head, López de Prado says. The paradigm for quantum computers is this: Let’s throw some birds and see what happens. Encode into the quantum microchip this problem: These are your birds and where you throw them from, so what’s the optimal trajectory? “Then you let the computer check all possible solutions essentially—or a very large combination of them—and come back with an answer,” he says. In a quantum computer, there’s no mathematician cracking the problem, he says. “The laws of physics crack the problem for you.”

    The fundamental building blocks of our world are quantum mechanical. “If you look at a molecule,” says Dario Gil, vice president for science and solutions at IBM Research, “the reason molecules form and are stable is because of the interactions of these electron orbitals. Each calculation in there—each orbital—is a quantum mechanical calculation.” The number of those calculations, in turn, increases exponentially with the number of electrons you’re trying to model. By the time you have 50 electrons, you have 2 to the 50th power calculations, Gil says. “That’s a phenomenally large number, so we can’t compute it today,” he says. (For the record, it’s 1.125 quadrillion. So if you fired up your laptop and started cranking through several calculations a second, it would take a few million years to run through them all.) Connecting information theory to physics could provide a path to solving such problems, Gil says. A 50-qubit quantum computer might begin to be able to do it.
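
    Gil’s arithmetic is easy to verify; the rate of 10 calculations per second below is an assumption standing in for “several calculations a second.”

        states = 2 ** 50
        rate_per_second = 10                 # assumed pace for "several a second"
        seconds_per_year = 365 * 24 * 3600

        print(f"{states:,} states")          # 1,125,899,906,842,624 -- about 1.13 quadrillion
        print(f"{states / rate_per_second / seconds_per_year:,.0f} years to enumerate them")
        # Roughly 3.6 million years, the "few million years" mentioned in the text.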

    Landon Downs, president and co-founder of 1QBit, says it’s now becoming possible to unlock the computational power of the quantum world. “This has huge implications for producing new materials or creating new drugs, because we can actually move from a paradigm of discovery to a new era of quantum design,” he says in an email. Rigetti, whose company is building hybrid quantum-classical machines, says one moonshot use of quantum computing could be to model catalysts that remove carbon and nitrogen from the atmosphere—and thereby help fix global warming. (Bloomberg Beta LP, a venture capital unit of Bloomberg LP, is an investor in Rigetti Computing.)

    The quantum-computing community hums with activity and excitement these days. Teams around the world—at startups, corporations, universities, and government labs—are racing to build machines using a welter of different approaches to process quantum information. Superconducting qubit chips too elementary for you? How about trapped ions, which have brought together researchers from the University of Maryland and the National Institute of Standards and Technology? Or maybe the topological approach that Microsoft Corp. is developing through an international effort called Station Q? The aim is to harness a particle called a non-abelian anyon—which has not yet been definitively proven to exist.

    IBM Quantum Computing Scientists Hanhee Paik (left) and Sarah Sheldon (right) examine the hardware inside an open dilution fridge at the IBM Q Lab at IBM’s T. J. Watson Research Center in Yorktown, New York. (Photo: IBM)

    These are early days, to be sure. As of late May, the number of quantum computers in the world that clearly, unequivocally do something faster or better than a classical computer remains zero, according to Scott Aaronson, a professor of computer science and director of the Quantum Information Center at the University of Texas at Austin. Such a signal event would establish “quantum supremacy.” In Aaronson’s words: “That we don’t have yet.”

    Yet someone may accomplish the feat as soon as this year. Most insiders say one clear favorite is a group at Google Inc. led by John Martinis, a physics professor at the University of California at Santa Barbara. According to Martinis, the group’s goal is to achieve supremacy with a 49-qubit chip. As of late May, he says, the team was testing a 22-qubit processor as an intermediate step toward a showdown with a classical supercomputer. “We are optimistic about this, since prior chips have worked well,” he said in an email.

    The idea of using quantum mechanics to process information dates back decades. One key event happened in 1981, when International Business Machines Corp. and MIT co-sponsored a conference on the physics of computation at the university’s Endicott House in Dedham, Mass. At the conference, Richard Feynman, the famed physicist, proposed building a quantum computer. “Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical,” he said in his talk. “And by golly, it’s a wonderful problem, because it doesn’t look so easy.”

    He got that part right. The basic idea is to take advantage of a couple of the weird properties of the atomic realm: superposition and entanglement. Superposition is the mind-bending observation that a particle can be in two states at the same time. Bring out your ruler to get a measurement, however, and the particle will collapse into one state or the other. And you won’t know which until you try, except in terms of probabilities. This effect is what underlies Schrödinger’s cat, the thought-experiment animal that’s both alive and dead in a box until you sneak a peek.

    Quantum computer mixing chamber (Photo: IBM)

    Sure, bending your brain around that one doesn’t come especially easy; nothing in everyday life works that way, of course. Yet about 1 million experiments since the early 20th century show that superposition is a thing. And if superposition happens to be your thing, the next step is figuring out how to strap such a crazy concept into a harness.

    Enter qubits. Classical bits can be a 0 or a 1; run a string of them together through “logic gates” (AND, OR, NOT, etc.), and you’ll multiply numbers, draw an image, and whatnot. A qubit, by contrast, can be a 0, a 1, or both at the same time, says IBM’s Gil.

    Ready for entanglement? (You’re in good company if you balk; Albert Einstein famously rebelled against the idea, calling it “spooky action at a distance.”) Well, let’s say two qubits were to get entangled; Gil says that would make them perfectly correlated. A quantum computer could then utilize a menagerie of distinctive logic gates. The so-called Hadamard gate, for example, puts a qubit into a state of perfect superposition. (There may be something called a “square root of NOT” gate, but let’s take a pass on that one.) If you tap the superposition and entanglement in clever arrangements of the weird quantum gates, you start to get at the potential power of quantum computing.

    If you have two qubits, you can explore four states: 00, 01, 10, and 11. (Note that that’s 4: 2 raised to the power 2.) “When I perform a logical operation on my quantum computer, I can operate on all of this at once,” Gil says. And the number of states you can look at is 2 raised to the power of the number of qubits. So if you could make a 50-qubit universal quantum computer, you could in theory explore all of those 1.125 quadrillion states—at the same time.
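
    A few lines of numpy make that doubling visible; this is a classical simulation of the math for illustration only, not something running on quantum hardware.

        import numpy as np

        zero = np.array([1.0, 0.0])                    # the |0> state
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate: perfect superposition

        plus = H @ zero                                # one qubit, two amplitudes
        two = np.kron(plus, plus)                      # two qubits -> four amplitudes: 00, 01, 10, 11
        three = np.kron(two, plus)                     # three qubits -> eight amplitudes

        print(np.round(two, 3))                        # [0.5 0.5 0.5 0.5]
        print(len(two), len(three), 2 ** 50)           # 4, 8, and 1,125,899,906,842,624 at 50 qubits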

    Interior of a quantum computer (Photo: IBM)

    What gives quantum computing its special advantage, says Aaronson, of the University of Texas, is that quantum mechanics is based on things called amplitudes. “Amplitudes are sort of like probabilities, but they can also be negative—in fact, they can also be complex numbers,” he says. So if you want to know the probability that something will happen, you add up the amplitudes for all the different ways that it can happen, he says.

    “The idea with a quantum computation is that you try to choreograph a pattern of interference so that for each wrong answer to your problem, some paths leading there have positive amplitudes and some have negative amplitudes, so they cancel each other out,” Aaronson says. “Whereas the paths leading to the right answer all have amplitudes that are in phase with each other.” The tricky part is that you have to arrange everything not knowing in advance which answer is the right one. “So I would say it’s the exponentiality of quantum states combined with this potential for interference between positive and negative amplitudes—that’s really the source of the power of quantum computing,” he says.
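
    Aaronson’s cancellation can be seen in the smallest possible example: send |0> through a Hadamard gate twice. The two paths that end in |1> carry amplitudes of +1/2 and -1/2 and wipe each other out, while the paths to |0> reinforce. Again, numpy is used purely as a classical illustration.

        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
        zero = np.array([1.0, 0.0])                    # start in |0>

        print(np.round(H @ (H @ zero), 10))            # [1. 0.] -- measurement gives |0> every time

        # The same result, path by path: |0> -> |j> -> |1> for j in {0, 1}.
        amp_via_0 = H[0, 0] * H[1, 0]                  # +1/2
        amp_via_1 = H[1, 0] * H[1, 1]                  # -1/2
        print(amp_via_0 + amp_via_1)                   # 0.0 -- destructive interference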

    Did we mention that there are problems a classical computer can’t solve in any reasonable amount of time? You probably harness one such difficulty every day when you use encryption on the internet. The problem is that it’s not easy to find the prime factors of a large number. To review: The prime factors of 15 are 5 and 3. That’s easy. If the number you’re trying to factor has, say, 200 digits, it’s very hard. Even with your laptop running an excellent algorithm, you might have to wait years to find the prime factors.
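
    A naive factoring routine makes the gap concrete: trial division dispatches 15 instantly, but the same loop would need on the order of 10^100 iterations for a 200-digit number. The sketch below is for illustration, not a serious factoring algorithm.

        def trial_division(n: int) -> list[int]:
            """Factor n by testing every candidate divisor up to sqrt(n)."""
            factors, d = [], 2
            while d * d <= n:
                while n % d == 0:
                    factors.append(d)
                    n //= d
                d += 1
            if n > 1:
                factors.append(n)
            return factors

        print(trial_division(15))   # [3, 5] -- instant
        # For a 200-digit semiprime, d would have to climb toward 10**100 before
        # the loop could finish, which is why this approach is hopeless.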

    That brings us to another milestone in quantum computing: Shor’s algorithm. Published in 1994 by Peter Shor, now a math professor at MIT, the algorithm demonstrated an approach that you could use to find the factors of a big number—if you had a quantum computer, which didn’t exist at the time. Essentially, Shor’s algorithm would perform some operations that would point to the regions of numbers in which the answer was most likely to be found.
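
    The quantum speedup in Shor’s algorithm is confined to one step: finding the period r of a^x mod N. Once r is known, ordinary number theory finishes the job. The sketch below brute-forces the period classically (standing in for the part a quantum computer would do exponentially faster) and then applies the classical post-processing to factor 15, with the base a = 7 chosen for illustration.

        from math import gcd

        def period(a: int, n: int) -> int:
            """Smallest r > 0 with a**r % n == 1, found by brute force here --
            exactly the step Shor's algorithm accelerates on a quantum computer."""
            r, value = 1, a % n
            while value != 1:
                value = (value * a) % n
                r += 1
            return r

        def shor_factor(n: int, a: int) -> tuple[int, int]:
            r = period(a, n)
            if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
                raise ValueError("unlucky choice of a; pick another base")
            y = pow(a, r // 2, n)
            return gcd(y - 1, n), gcd(y + 1, n)

        print(shor_factor(15, 7))   # (3, 5): the period of 7**x mod 15 is 4, so
                                    # gcd(7**2 - 1, 15) and gcd(7**2 + 1, 15) give the factors.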

    The following year, Shor also discovered a way to perform quantum error correction. “Then people really got the idea that, wow, this is a different way of computing things and is more powerful in certain test cases,” says Robert Schoelkopf, director of the Yale Quantum Institute and Sterling Professor of Applied Physics and Physics. “Then there was a big upswelling of interest from the physics community to figure out how you could make quantum bits and logic gates between quantum bits and all of those things.”

    Two decades later, those things are here.

    Asmundsson is editor of Bloomberg Markets.

