Data Center Knowledge | News and analysis for the data center industry
Thursday, August 25th, 2016
12:00p
Data Center Connectivity: How to Use WAN for Competitive Advantage
A lot about data center connectivity, WAN, and how businesses use the cloud will change between now and 2020. Cisco’s recent Visual Networking Index report outlined some of the biggest changes that are coming your way:
- Global IP traffic will increase nearly threefold over the next five years.
- Smartphone traffic will exceed PC traffic by 2020.
- Traffic from wireless and mobile devices will account for two-thirds of total IP traffic by 2020.
- The number of devices connected to IP networks will be more than three times the global population by 2020.
- Globally, virtual reality traffic will increase 61-fold between 2015 and 2020, potentially serving up hundreds of petabytes of traffic per month.
Squarely in the middle of all these shifts sit businesses and the users they’re trying to support. This is why, over the next four years, organizations are going to use data center connectivity and WAN technologies to create real-world competitive advantages. With this in mind, let’s examine how companies can use WAN technologies to their benefit and where they can create even more optimization.
WAN Technologies Help Embrace Data Center Distribution
The way we interconnect data centers and their respective services drives organizational strategies and delivery methodologies. This means that companies are finding ways to make their infrastructures more resilient, adaptable to change, and capable of supporting highly mobile users.
By distributing data center resources, you enable greater levels of business agility and reduce risk for the organization. WAN technologies are smarter, contextually aware, and capable of supporting very specific data center and business services. This means companies can plan application delivery, and even data optimization, around users, workloads, and market demands.
Content is Getting Richer and Your WAN Must Support This
Images, videos, files, and now virtual reality traffic are all impacting WAN systems and how this type of information is actually delivered.
First of all, our ability to capture and distribute content will only continue to evolve. Furthermore, the type of content we’re delivering will grow in size, complexity, and value. This means your connectivity and WAN ecosystem must be able to support this.
It might mean optimizing very specific types of data streams, or supporting user sub-segments for precise business services. By being prepared for the evolving digital revolution and richer content, organizations can support quickly changing business strategies based on overall market trends.
WAN Must Be Used to Optimize Your Cloud
The concept of cloud now revolves around very specific kinds of services that support the overall business. This could be SaaS, IaaS, PaaS, DRaaS, CDNs, and much more.
The point is organizations are now creating much more detailed cloud consumption models specifically based on their use case and business requirements. How is your data center connectivity supporting these delivery strategies? What about your WAN and Ethernet services; are they keeping up with user and resource demands?
WANOP (WAN optimization) is the process of tuning WAN connectivity for distributed cloud and data center traffic, but optimization in general is a critical consideration. When optimization is applied to applications, users, and digital content by context, you create a powerful architecture capable of supporting a highly distributed ecosystem.
Most of all, WAN management becomes easier as new WAN and cloud management systems help aggregate controls.
Data Center Connectivity Supports a Decentralized Business
The future business might have a “primary” data center, but their overall business might be extremely distributed. A new digital workforce does not conform to the traditional 9 to 5 mentality. In turn, businesses are tasked with supporting a new type of strategy that is capable of vast levels of distribution and content delivery.
A decentralized business is one which knows how to distribute critical resources and bring information most effectively to the users. Most of all, these organizations can create and alter their own services when shifts in the market demand it.
WAN technologies are creating software-defined connections across data centers and across cloud platforms. Organizations that can leverage these new types of WAN control mechanisms will be able to create new kinds of services as well as powerful go-to-market strategies.
Modern organizations will need to look to the cloud to help their business stay truly competitive. Cloud, WAN, and data center connectivity help define the digital revolution we’re all experiencing. These connectivity platforms distribute vast amounts of traffic to all points globally. And, as we evolve into the next digital frontier, platforms like virtual reality will impact traffic patterns and the type of content we deliver.
To create next-generation competitive advantages, organizations will need to leverage vehicles of digital transport to help them through the journey. These vehicles will revolve around WAN optimization, cloud connectivity, and intelligent data center distribution architectures.
3:00p
Your Data Center’s Brand Is No Longer What You Say It Is
The way people research and make purchase decisions has changed drastically during the past few years.
Just look at what’s happening in the retail industry: online shopping is decimating iconic brands that have thrived for decades. It’s all about an empowered buyer getting exactly what they want, when they want it, on their terms.
And it’s not just the disruption of traditional retailing. iTunes transformed the music industry. Netflix has effectively made the video rental store industry irrelevant. SiriusXM Radio is redefining broadcast media. The Internet of Things (IoT) and artificial intelligence (AI) are almost certain to accelerate this kind of disruption.
Closer to the data center industry, in particular on the cloud side, there’s enormous pressure on many smaller providers coming from Amazon, Microsoft, and IBM.
Reclaiming Your Data Center’s Brand in the ZMOT Era
At the end of the day, your data center’s brand is no longer what you say it is. Your brand is now what the marketplace and more specifically your company’s core buyer personas believe it is.
Google’s Zero Moment of Truth (ZMOT) crystallizes this well: “The Internet has changed how we decide what to buy.”
The cruel irony in all of this: data centers should be at the forefront of understanding this trend and rallying their teams around the transformation. Instead, we often see data center leadership teams still sticking to the same marketing and sales playbooks that they’ve leaned on since the 1990s and early 2000s.
See also: 9 Questions Data Center CEOs Must Ask about Revenue Generation
With an estimated 70%+ of the decision-making process now over before someone from your company is even aware of a new client opportunity, this is a huge challenge for data center builders and operators.
But it’s also a big opportunity if your team can get found early, in the right context, as trusted advisors. The way to do so: create and distribute helpful, educational, thought leadership content.
Understanding What Marketing Can and Can’t Do on Its Own
The real issue, however, is that too many data center CEOs mistakenly believe that this is a marketing problem.
While marketing can be part of the solution, it takes a village to change the industry’s perception of your company.
Why’s that?
Different people throughout your company get asked different questions, interact with different stakeholders, and bring vastly different perspectives.
And it’s only when all of these voices get heard that your data center can indeed be perceived as a company of thought leaders.
Data centers that are world-class communicators are the ones that attract world-class clients and a world-class staff.
Conversely, data centers that are terrible at communicating, that are never found early on, and whose entire business model depends on obnoxious cold calling and bids with threadbare profit margins are the ones that settle for the bottom-of-the-barrel clients no one else wants.
So the choice is yours.
Spotlighting All the Smart People and Great Advice Within Your Data Center’s Core Team
Do you have brilliant professionals throughout your data center company? Do they have collective decades — even centuries — of institutional knowledge, advice, and war stories just itching to get out there as a magnet for attracting great new clients and staff?
But the harsh reality is very few in that position have the time or interest to bang out 25 or 50 blog posts every year.
However, is it possible that each person could find 30 minutes once a quarter — perhaps even once a month — to sit down for an interview over coffee or lunch?
From that 30-minute interview, a good writer can generate a piece of premium content that can become a lead generation asset. Perhaps it’s a downloadable planning checklist, a template, or a short report or eBook.
From that piece of content, the writer can then create a few blog posts that excerpt from and promote this newly created lead generation asset.
And each of these blog posts can be used to build several social media status updates.
To scale this editorial process, you’ll need a small team: a marketer, a writer, and perhaps a designer or multimedia specialist.
Who to Draft on Your Data Center Thought Leadership Team
But who should be on your data center’s thought leadership team?
- Sales – Because your sales team will be the ones that most directly and most immediately benefit from the great leads and opportunities that come from this effort, your sales director and sales reps should be the ones most excited by this kind of initiative. What’s more, sales teams tend to be much closer to prospects and clients than marketing teams. So sales teams likely have dozens of questions that they’re answering every month — on the phone, in emails, and during in-person meetings. In most cases, each can be repurposed into content — as long as there’s a strong process and a solid strategy.
- Operations – Think about how a sales rep spends a typical day. Then think about how someone on your operations team spends a typical day. Different questions. Different issues. Different stakeholders. All of this leads to very rich sources of content. Again, mine the ticketing system, the call logs, the one-off emails, and notes from meetings.
- Facilities – Would a sales rep get excited about answering questions all day about power and cooling issues? Probably not. But for those on your facilities team, it’s a critical part of running a data center. Just as with all the members of your data center’s thought leadership team, if a client, channel partner, vendor, or employee is asking a facilities-related question, chances are someone else just like them will go to Google, Bing, Yahoo!, Siri, LinkedIn, YouTube, SlideShare, or Twitter and ask the very same question. Do you want your facilities team’s answers and advice to act like a giant thought leadership magnet, attracting world-class client opportunities and job candidates? Or do you want those strangers to be attracted to one of your competitors?
- Provisioning – While topics like meet-me rooms and Open-IX might be an instant snoozer for many of your employees, your provisioning team likely loves to talk shop about these issues. Again, if they’re asked questions, we want their sage advice in your data center’s thought leadership content to attract like-minded clients and talent.
- Finance – Tax incentives. CapEx vs. OpEx. Return on investment on outsourcing. Many data centers would not immediately think of their CFO and finance staff as data center thought leaders. But there’s an excellent chance their answers to common questions can also be used to draw in very particular kinds of IT influencers and decision makers.
- Executive Management – While many might instinctively think that your data center’s leadership team are the thought leaders, it’s surprising how often there’s a massive disconnect in the implementation. The red flags? If your executive team only has mediocre LinkedIn profiles, or they’re completely absent from Twitter, or they have very little bylined educational content on the company blog, there are some significant gaps to be addressed. What’s even more shocking? We see a lot of data center CEOs and other C-level execs speaking at data center conferences — and very little of that content is being repurposed and promoted as a magnet to attract good-fit clients and staff before and after the live events have taken place.
Who currently sits on your data center’s thought leadership team? How often is each person interviewed? For how long has the initiative been in place? And what results have you seen? Let us know your take in the Comments section below.
If you’re attending the Data Center World conference next month in New Orleans, be sure to catch Joshua Feinberg’s related session on How Data Centers Use Thought Leadership to Attract World-Class Clients and Talent on Thursday, September 15, 2016 at 10:20 am in room R215 of the Ernest N. Morial Convention Center in New Orleans.
Joshua Feinberg is Vice President and Co-Founder of SP Home Run, which helps data center, managed service, hosting, and cloud providers grow their leads, client base, revenue, and profitability.
3:49p
Financial Networking Company Prepares for ‘Post-Quantum’ World
(Bloomberg) — When it comes to cybersecurity, no one can accuse IPC Systems, the New Jersey-based company that builds communications networks for trading firms and financial markets, of preparing to fight the last war.
IPC, which is owned by private-equity firm Centerbridge Partners, said Thursday that it is partnering with U.K. startup Post-Quantum to offer its clients encryption, biometric authentication and a distributed-ledger record-keeping system that the software company says is designed to resist hacking — even by a quantum computer. Never mind that quantum computers are still largely the stuff of science fiction.
“We want to provide our customers with whatever level of security and encryption they want,” Tim Carmody, IPC’s vice president of network services engineering, said. “We hear from our customers that some are concerned about the post-quantum world when quantum computers can decode existing encryption.”
IPC said it will be offering Post-Quantum’s security products to users of its Connexus Cloud, a financial markets network with 200,000 users across 6,000 network locations in 700 cities. Carmody said it would be up to each customer to decide which of Post-Quantum’s security offerings, if any, they wished to implement.
Quantum Explained
Quantum computing is still in its infancy — with some saying a true example of the technology is yet to be built.
A Canadian company called D-Wave sells a device that it says uses quantum computing to solve a certain kind of computation — what are known as optimization problems — but its claims are controversial. Plus, in order to work, D-Wave’s machine needs to be cooled to temperatures near absolute zero and kept free of electromagnetic interference. It takes up nearly half a room and has a price tag of $10 million to $15 million.
IBM researchers also have created a rudimentary quantum computer that scientists can access remotely via the internet. But it also has to be extensively cooled, and is less powerful than traditionally built supercomputers. A few financial firms, including Goldman Sachs, Royal Bank of Scotland, CME Group and Guggenheim Partners, are evaluating quantum computers and their potential impacts on algorithmic trading and portfolio management.
With a number of researchers, including those from Microsoft, predicting quantum computing may become a reality within a decade, IPC wants to be ready. “We have a lot of sophisticated customers who know the state of cybersecurity today and are looking to leap-frog and future-proof as much as possible and bring in defense-grade systems,” Carmody said.
Encryption Breaking
Traditional computers process information encoded in a binary format — represented by either 0 or 1. Quantum computers, by contrast, work on quantum mechanical principles, including the concept of “superposition” — the idea that a particle can be in two different states, representing both a 0 and 1, simultaneously. This is what potentially gives quantum computers their incredible processing power, theoretically carrying out trillions of calculations per second.
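To make the idea of superposition concrete: writing down the joint state of n qubits classically requires tracking 2^n amplitudes, which is where the technology’s theoretical power comes from. The short Python sketch below (our illustration, not part of the Bloomberg report) simulates that state vector and shows how quickly it grows:

```python
import numpy as np

# A single qubit in an equal superposition of |0> and |1>:
# two amplitudes of 1/sqrt(2) instead of a definite 0 or 1.
qubit = np.array([1.0, 1.0]) / np.sqrt(2)

def superposed_register(n):
    """Joint state of n superposed qubits (tensor product of single-qubit states)."""
    state = np.array([1.0])
    for _ in range(n):
        state = np.kron(state, qubit)
    return state  # length 2**n: one amplitude per basis state

for n in (1, 2, 10, 20):
    print(n, "qubits ->", superposed_register(n).size, "amplitudes")
```

Twenty qubits already take more than a million amplitudes to describe exactly, and the count doubles with every additional qubit.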
And that is what has cybersecurity experts worried. Most digital encryption systems rely on numerical keys that are tens or hundreds of digits long. To break one by trying every possible combination, or by searching for numerical patterns that would allow the encryption algorithm to be reverse-engineered, is beyond the capability of conventional computers — at least in reasonable timescales. But a quantum computer could theoretically break these codes, including the popular RSA public key encryption standard, in seconds. In August 2015, the U.S. National Security Agency (NSA) warned U.S. government agencies and private government contractors that they should be prepared to transition to “quantum-resistant algorithms” in the future.
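To see why factoring is the crux, consider a toy RSA example in Python (purely illustrative, with tiny primes; real keys use primes hundreds of digits long, and Python 3.8+ is assumed for the modular-inverse call). The private key can be reconstructed by anyone who can factor the public modulus, which is exactly the step Shor’s algorithm on a large quantum computer would make fast:

```python
# Toy RSA with tiny primes -- illustration only. Real keys use primes that are
# hundreds of digits long, which is what makes classical factoring infeasible.
# (Python 3.8+ for pow(x, -1, m), the modular inverse.)
p, q = 61, 53
n = p * q                      # public modulus (an attacker sees only n, not p and q)
phi = (p - 1) * (q - 1)
e = 17                         # public exponent, chosen coprime to phi
d = pow(e, -1, phi)            # private exponent, derived from the secret factors

message = 42
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
assert recovered == message

# Anyone who can factor n back into p and q can recompute d and read everything.
# Shor's algorithm on a sufficiently large quantum computer would do that
# factoring in polynomial time -- the threat described above.
```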
Patented Encryption
Post-Quantum, which was founded in London in 2009, uses an encryption system based on a type of cryptography first developed by researcher Robert McEliece in 1978. The McEliece encryption is believed to be far more resistant to the techniques a quantum computer could use to quickly break codes. Post-Quantum has also patented three modifications to McEliece’s original system that improve its functionality, Andersen Cheng, Post-Quantum’s chief executive officer, said.
The startup, which has conducted work for NATO and the U.K. government, sells a suite of security products. These include an authentication system that requires users to take a selfie video, in which they would read out a unique code. The recipient receives the video with a code generated by the encryption system overlaid on the image: if the code being read matches the code overlaid on the image, the message is authentic. (Plus, if the sender is known to the receiver, the receiver can recognize the sender’s image). This is designed to defeat so-called “man-in-the-middle” attacks where an intruder intercepts communications and impersonates a trusted party to gain access to a network, Cheng said.
The company also sells file encryption software that works by breaking a master decryption key into parts and distributing these — for instance, one might go to a financial firm’s customer, while another might go to a regulator. A certain minimum number of these key fragments must then be brought back together to unlock the file. This prevents insiders from abusing access privileges to steal or tamper with information, Cheng said. Post-Quantum has also developed a distributed-ledger record-keeping system, or blockchain, similar to the system that underpins the digital currency bitcoin. Financial firms are increasingly interested in using blockchains for clearing transactions and providing audit trails.
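The report does not name the exact splitting scheme Post-Quantum uses, but the behavior described, where any sufficiently large subset of fragments reconstructs the key, is the classic threshold property of Shamir’s secret sharing. A minimal sketch of that textbook construction (our illustration, assuming a Shamir-style scheme; Python 3.8+ for the modular inverse):

```python
import random

# Arithmetic is done in a finite field; any prime larger than the secret works.
PRIME = 2**127 - 1

def split_secret(secret, n_shares, threshold):
    """Split `secret` into n_shares points so that any `threshold` of them recover it."""
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def eval_poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, eval_poly(x)) for x in range(1, n_shares + 1)]

def recover_secret(shares):
    """Lagrange-interpolate the polynomial at x=0 to get back the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME  # modular inverse
    return secret

master_key = 123456789012345678901234567890
# e.g. five fragments: customer, regulator, the firm itself, two escrow agents (hypothetical holders)
shares = split_secret(master_key, n_shares=5, threshold=3)
assert recover_secret(shares[:3]) == master_key   # any three fragments suffice
assert recover_secret(shares[2:]) == master_key   # ...any three, not a specific three
```

Fewer than the threshold number of fragments reveal nothing about the key, which is what blocks a single insider from acting alone.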
Carmody said that it was Post-Quantum’s biometric and geo-location based authentication system and its blockchain technology, as much as its quantum-resistant encryption, that convinced IPC to partner with the U.K. software company.
Cheng said that when Post-Quantum first began pitching its encryption technology to potential clients, he was routinely laughed out of the room by those who thought quantum computing was a joke. “No one is laughing anymore,” he said.
Post-Quantum received 8 million pounds in funding in July from VMS Investment Group, a private equity firm based in Hong Kong, and AM Partners. It is not the only startup trying to create encryption techniques for a post-quantum world. Qubitekk, a startup in California, is also working on similar products.
4:20p
Special Report: The SaaS Effect
Brought to you by MSPmentor
Like many solution providers, Sierra-Cedar has been hard at work for years helping customers modernize their existing IT systems and applications while introducing them to emerging innovations that improve efficiencies, enhance customer experiences and generate new revenue opportunities.
Few things have improved Sierra-Cedar’s fortunes quite like the emergence of software-as-a-service (SaaS), which the Alpharetta, Ga., company says its customers have embraced in a big way. A customer survey completed two years ago reveals why. Customers, Sierra-Cedar discovered, said they believed SaaS apps translated into “improved user experiences,” “easier upgrades” and “best practice functionality,” among other things.
Hard to argue with happy customers.
Which brings us to you: To help you better understand the SaaS market and what opportunities it presents, the editors at The VAR Guy, MSPmentor and Talkin’ Cloud combined forces to create this Special Report. We call it, “The SaaS Effect: A Penton Technology Channel Group spotlight on ISVs and the opportunities SaaS presents for partners.”
In addition to this market overview, you can read Talkin’ Cloud Editor-in-Chief (EiC) Nicole Henderson’s “Fast SaaS: 7 SaaS Startups You Need to Know.” From there, be sure to check out MSPmentor EiC Aldrin Brown’s take on an up-and-coming SaaS player, “ISV Atera Rides Wave of SaaS MSP Tools.”
You’re also going to want to read The VAR Guy’s story on one SaaS matchmaker, SaaSMAX CEO Dina Moskowitz, and her quest to help connect developers and channel partners. Also check out The VAR Guy, aka Kris Blackmon, and her SaaS expert quiz. See for yourself just how “SaaSy” you are. (Sorry, I couldn’t resist.)
In the meantime, here’s an overview of what’s happening in the SaaS market.
A (much longer) version of this article first ran at http://mspmentor.net/software-service-and-hardware-service/special-report-saas-effect
6:11p
Google Explains What Went Wrong to Cause PaaS Outage
Brought to You by The WHIR
Google has released more details this week on what caused its Google App Engine outage earlier this month. The Aug. 11 outage affected 37 percent of applications hosted in its US-Central region, according to the incident report.
Google said that the outage lasted just under two hours in the afternoon (local time). Almost half of those affected, 18 percent of apps hosted in the region overall, experienced error rates between 10 and 50 percent, while 14 percent experienced error rates between one and 10 percent. Three percent had error rates higher than 50 percent.
Latency also increased for the apps impacted during the disruption. Other Google App Engine regions were not affected by the incident.
“We apologize for this incident,” Google said in the report. “We know that you choose to run your applications on Google App Engine to obtain flexible, reliable, high-performance service, and in this incident we have not delivered the level of reliability for which we strive.”
Google blamed the incident on a software update for its traffic routers, which triggered a rolling restart during standard periodic maintenance. The maintenance involved Google engineers shifting applications between data centers; as part of it, engineers “gracefully drain traffic from an equivalent proportion of servers in the downsized datacenter in order to reclaim resources,” according to the report.
At this point, the reduced router capacity led to rescheduled instance start-ups, slow start-ups and retried start requests, ultimately overloading even the extra system capacity. The company’s manual traffic redirection was still not enough to resolve the problem until a configuration error causing a traffic imbalance in the new data centers was identified and fixed.
Traffic routing capacity has been upgraded, and application rescheduling and system retry procedures will be changed to prevent a repeat of the incident.
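Retried start requests piling onto an already-degraded service is a classic retry-storm pattern, and Google says its retry procedures will change without spelling out how. As a general illustration only (not Google’s actual fix), the standard client-side defense is to cap retries and back off exponentially with random jitter so that retried requests spread out instead of arriving in lockstep:

```python
import random
import time

def call_with_backoff(request, max_attempts=5, base_delay=0.5, max_delay=30.0):
    """Call `request` (any zero-argument callable), retrying on failure.

    Each retry waits up to base_delay * 2**attempt seconds (capped at
    max_delay), with full jitter, so a burst of failing clients does not
    hammer an already-overloaded service in unison. Illustrative sketch,
    not Google's internal retry logic.
    """
    for attempt in range(max_attempts):
        try:
            return request()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # give up after the final attempt
            delay = min(max_delay, base_delay * 2 ** attempt)
            time.sleep(random.uniform(0, delay))
```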
A configuration error was also part of the cause of the brief Google Compute Engine outage in April.