Data Center Knowledge | News and analysis for the data center industry
 

Monday, December 22nd, 2014

    1:00p
    10 Ways Data Center Industry Will Change in 2015

    As we approach the end of 2014, those in IT who like to ponder industry trends send us their predictions for next year. Here are some of the more interesting predictions we have received from folks so far. Stay tuned for more 2015 predictions on Data Center Knowledge in the coming weeks.

    Here it is, our list of data center industry trends that will dominate the conversation in 2015:

    1. The Big Guys Build Their Own Networks

    Operators of massive data centers, the hyperscale cloud and Internet content providers, also known as web-scale operators, are increasingly buying their own switching gear and dark fiber to expand their networks instead of relying on commercial carriers, according to Cyan Networks, a U.S. networking technology vendor. In 2015, these networks will account for more than half of all network interconnect capacity for the first time ever, Cyan predicted.

    2. Network Operators Will Switch to Data Center Hosted Solutions

    A shift of attention from hardware toward software development and operations (DevOps), lower-cost packet switching in data centers, and the high cost and complexity of scaling the IPv6 Internet are driving network operators to data center hosted solutions, interconnected by agile optical networks, according to Cyan.

    3. Carriers Will Invest a Lot More in Data Centers

    New DevOps personnel will get more power over strategic direction, and carriers will double down on data center investment, according to Cyan. Telcos will be defining more products and services based on telco cloud.

    4. Get on Board With Web-Scale or Suffer

    Single-app Web 2.0 companies will continue to proliferate and so will commoditization of cloud infrastructure technology. Platforms like OpenStack have enabled an explosion of cloud-based services and apps, and the web-scale way of building out data center capacity will be a must. Operators and vendors that fail to evolve to support the web-scale paradigm will struggle in 2015, according to Cyan.

    5. Real Time Insight Will Replace Batch Processing in Data Lakes

    The data lakes and data hubs that have been popular ways to deploy Hadoop this year will turn into data processing platforms, predicts John Schroeder, CEO and co-founder of MapR, one of the biggest Hadoop distribution vendors. Data lakes and Hadoop are attractive because of their agility and their ability to use thousands of servers to store petabytes of data at less than $1,000 per terabyte per year. But companies will move from batch to real-time processing and integrate file-based, Hadoop, and database engines into large-scale processing platforms.
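    The batch-versus-real-time distinction Schroeder describes can be sketched in miniature: a batch job scans the whole dataset and answers once at the end, while a streaming job keeps a running answer that is current after every event. The names and data below are hypothetical and illustrate the general pattern, not MapR's or Hadoop's API.

```python
from collections import defaultdict

def batch_average(events):
    """Batch style: scan the complete dataset, then emit one result."""
    totals, counts = defaultdict(float), defaultdict(int)
    for key, value in events:
        totals[key] += value
        counts[key] += 1
    return {k: totals[k] / counts[k] for k in totals}

class StreamingAverage:
    """Streaming style: the answer is updated as each event arrives."""
    def __init__(self):
        self.totals = defaultdict(float)
        self.counts = defaultdict(int)

    def update(self, key, value):
        self.totals[key] += value
        self.counts[key] += 1
        return self.totals[key] / self.counts[key]  # current answer, per event

events = [("sensor-a", 10.0), ("sensor-a", 20.0), ("sensor-b", 5.0)]
print(batch_average(events))       # one answer, only at the end of the run

stream = StreamingAverage()
for key, value in events:
    stream.update(key, value)      # a fresh answer after every single event
```

    Both reach the same final numbers; the difference is that the streaming version has an answer available the moment each record lands, which is the shift from batch to real-time insight.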

    6. There Will Be Fewer Hadoop Vendors

    Ten years after Hadoop was conceived, its adoption at scale around the world is beyond that of any other data platform, Schroeder said. Hadoop is still in the innovation phase today, and vendors that made the mistake of adopting a “Red Hat for Hadoop” strategy are exiting the market. Intel is the most notable example, Schroeder said, forecasting that EMC’s Pivotal will follow. Hadoop vendors will consolidate as new business models emerge and others exit.

    7. Docker Will Disrupt IT Departments

    No list of 2015 IT trends should go without a mention of Docker. The modern, lightweight form of containers is the next-generation virtualization technology, according to 451 Research. Large enterprises will use Docker more and more, in some cases running it alongside traditional VMs and in others replacing them completely because of its management and efficiency advantages. Docker orchestration and security are not quite there yet, but plenty of vendors are working to address these issues, 451 analysts said.

    8. Non-x86 Silicon Will Begin Its Rise

    Multiple vendors, including Cavium and Applied Micro, came out with server-grade ARM chips this year. Google and Rackspace, among others, got involved with IBM’s OpenPOWER foundation, opening up the architecture and accelerating development on it. These two things will make 2015 the year alternative silicon really begins to rise, according to Rackspace CTO John Engates.

    9. OpenStack Will Get Boring

    OpenStack will celebrate its fifth birthday in 2015, and, according to Engates, that birthday will be boring. Rackspace sees the technology maturing and shifting from mostly test and development to mainstream production. With maturity will come simplification: OpenStack will be easier to use, manage, and scale.

    10. More Flash in the Server

    As customers in verticals like healthcare, finance, and retail demand faster and faster access to critical applications, more companies will deploy servers with flash storage embedded, according to Dell. These are companies where instant transactions define the experience of their customers.

    4:30p
    The Dynamic Data Center: 3 Trends Driving Change in 2015

    Kevin Leahy is group general manager for the data center business unit at Dimension Data, a $6 billion global ICT services and solutions provider.

    The data center is in a period of unprecedented transition – driven, in part, by greater cloud provisioning, increased automation, mobile reliance, rising costs and the constant need to do more with less. As the rapid pace of change continues full-speed ahead, important data center trends are rising to the fore that will help shape the industry in 2015 and beyond.

    Data Center: Both Shrinking and Expanding

    No, this isn’t a paradox or mind-bending riddle. Rather, while operators of smaller, enterprise-owned data centers are seeking guidance on how to reduce their footprint, large colocation providers see nothing but growth on the horizon.

    What’s impelling enterprise data centers to scale back? The cloud is playing a major role. A year or two ago, companies were debating if they were ready for the cloud. Now, it’s no longer a question of if – it’s about how much they can get out of the cloud. What portions of their environments can they move there? How quickly can they transition workloads? As organizations answer these questions, they’re re-evaluating the amount of physical space they need for their data centers, often reducing footprints by up to 50 percent.

    Meanwhile, large data center providers are seeing unabated growth. They’ve made substantial investments in the efficient use of power and cooling, so their energy consumption per unit of computing power is much lower than other companies could achieve by themselves, often making colocation a win-win.

    New Skill Sets Emerge

    Unlocking the business value within technology is becoming more important than the technology itself, as IT leaders face challenges including how to harness social media, whether to use the cloud and for which workloads, how to maximize big data and analytics, and the role of mobility. This shift is driving organizations to hire for an entirely new set of skills within the data center – moving away from individual domain experts for each layer of the infrastructure stack toward skill sets focused on automation, API integration between technologies, user experience outcomes, and integrating the new with the old.

    In many cases, organizations are also looking to partner with managed services providers that are well-versed in bundled technologies, in driving greater levels of automation, and in integrating with legacy infrastructure. This helps organizations look beyond the nitty-gritty of their individual technologies and instead focus on the business outcomes they want to achieve.

    Getting More Agile

    The cloud and other transformational technologies, along with the concept of consumption-based billing for IT capacity, are allowing all businesses – not just the big players – to be more flexible, creative and effective at solving problems. As these trends move companies away from yesteryear’s waterfall processes and idle-driven systems, they are finding that small, incremental changes empower IT leaders to make game-changing decisions, minimize risk and produce much more stable environments, not to mention more productive and collaborative workforces.

    As such, look for IT leaders to continue to embrace agile development in the year ahead, and thus be more responsive to change within the data center. Because developers can test new ideas quickly, the data center is better able to drive cost efficiencies and unleash innovation.

    Happy New Year!

    Many of the above trends are interrelated, and when applied in concert, their positive impact on business grows. As companies seek to keep growth and greater business value a constant – amidst the data center’s rapid evolution – we wish you good luck in the year ahead!

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    7:11p
    QTS Raises $100M to Fund Data Center Construction

    Data center service provider QTS Realty Trust has expanded its credit facility by $100 million, money the company said will help it fund data center construction around the U.S.

    QTS is expanding capacity at its already operating sites in Atlanta, Dallas, New Jersey, Northern Virginia, Silicon Valley, and Sacramento, California. It also recently bought a former newspaper printing plant in Chicago it plans to convert into a data center – its first in that market.

    Access to new capital brings the company’s total unsecured credit facility to $650 million. The new credit facility has an accordion feature that enables the company to expand it by up to $200 million with additional commitments.

    KeyBank National Association led the consortium of banks that participated in the deal to fund the company’s data center construction. The syndicate consists of 17 institutions, including Bank of America, Deutsche Bank, and Regions Bank.

    QTS went public as a real estate investment trust in October 2013.

    The company has been expanding its national presence by buying, redeveloping or expanding properties. The Chicago Sun-Times plant was only the most recent example.

    The Dallas-Fort Worth data center it launched in October was a former semiconductor plant. In July, QTS bought a McGraw Hill Financial data center in New Jersey, leasing a portion of the facility back to the firm and planning to expand capacity to add more tenants there.

    “The increased credit facility capacity, extended term and lower interest rates enhance the company’s financial flexibility and liquidity as we continue to expand our facilities,” QTS CFO Bill Schafer said in a statement.

    9:30p
    Former Bitcoin Foundation Exec Sentenced for Black Market Cash Exchange on Silk Road


    This article originally appeared at The WHIR

    Charlie Shrem, a former Bitcoin Foundation executive, was sentenced to two years in prison on Friday. He pleaded guilty earlier this year to aiding and abetting an unlicensed money transmitting business. In 2012, Shrem indirectly helped send one million dollars’ worth of the bitcoin digital currency through the Tor site Silk Road. Robert Faiella, the other party involved in the transaction, pleaded guilty to operating an unlicensed money transmitting business and will be sentenced in January.

    Over 400 additional Tor sites selling drugs, guns and a number of other illegal goods and services were taken offline by law enforcement in early November. The original Silk Road was shut down by the FBI in 2013. Ross Ulbricht, the alleged operator, is scheduled to face trial in January and has pleaded not guilty. Bitcoins seized during the Silk Road investigation were auctioned off in early December by US Marshals. Blake Benthall, the alleged operator of the follow-up site Silk Road 2.0, has also been charged by the FBI.

    “I screwed up,” Shrem told US District Judge Jed Rakoff. “The bitcoin community, they’re scared and there is no money laundering going on any more. They’re terrified. Bitcoin is my baby, it’s my whole world and my whole life, it’s what I was put on this earth to do,” said Shrem in a Bloomberg article. “I need to be out there. If your honor grants me that, I can be out there in the world, making sure that people don’t do the same stupid things that I did.”

    According to Bloomberg, US District Judge Jed Rakoff said Shrem had committed a serious crime and that the prison term was warranted. “There’s no question that Mr. Shrem, over a period of many months, was knowingly, wilfully, to some extent excitedly and even passionately involved in activities he knew were, in part, involved in serious violations of the law,” the judge said. Shrem is also required to forfeit $950,000 to the US government.

    On Friday Shrem tweeted, “I’ve been sentenced to two years, to self surrender in 90 days. Considering I was facing 30 years, justice has been served. #Bitcoin.” Since then, he has received a lot of support through Twitter and the hashtag #freeshrem is being used to express disagreement with the sentence.

    This article originally appeared at: http://www.thewhir.com/web-hosting-news/former-bitcoin-foundation-exec-sentenced-black-market-cash-exchange-silk-road

    10:00p
    Researchers Say Decentralized BitTorrent Client Prevents Shutdown A La Pirate Bay


    This article originally appeared at The WHIR

    A group of researchers from Delft University of Technology have launched the latest version of Tribler, a BitTorrent client that doesn’t rely on central servers and offers anonymous downloading.

    According to a report by TorrentFreak, because Tribler doesn’t rely on a central (directory) server, it is impossible to shut down.

    The release comes as authorities have shut down the largest torrent site on the Internet, The Pirate Bay, prompting many copies to come online. Since the Pirate Bay shutdown, Tribler has seen a 30 percent increase in users.

    According to the Tribler website, it does not use the normal Tor network, but instead created a dedicated Tor-like onion routing network exclusively for torrent downloading.

    “Bittorrent offers no privacy protection. It is easily traced that you are downloading certain (controversial) content,” Tribler explains on its website. “Content is offered openly and everybody can see who is doing what. Dissidents and unpopular opinions can be easily discovered and subsequently stopped.”

    Tribler’s lead researcher, Prof. Pouwelse, tells TorrentFreak that his team has been working for a decade “to prepare for the age of server-less solutions and aggressive suppressors.” The project has received more than 3.53 million euros in funding.

    The anonymity feature allows Tribler users to share and publish files without broadcasting their IP addresses, TorrentFreak said. The feature is built into this release with end-to-end encryption.
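    The layered, Tor-like design can be illustrated with a toy sketch: the sender wraps the payload in one encryption layer per relay, and each relay peels off exactly one layer, so no single proxy sees both the sender and the plaintext. Real onion routing uses authenticated public-key cryptography; the XOR keystream and relay key names below are illustrative assumptions, not Tribler's actual protocol.

```python
import hashlib
from itertools import cycle

def xor_layer(data: bytes, key: bytes) -> bytes:
    # XOR with a key-derived stream; applying the same key twice removes the layer.
    stream = hashlib.sha256(key).digest()
    return bytes(b ^ k for b, k in zip(data, cycle(stream)))

def wrap(payload: bytes, relay_keys) -> bytes:
    # The sender encrypts for the last relay first, so the first
    # relay's layer ends up outermost.
    for key in reversed(relay_keys):
        payload = xor_layer(payload, key)
    return payload

def route(onion: bytes, relay_keys) -> bytes:
    # Each relay, in path order, strips one layer; only the exit
    # node recovers the plaintext request.
    for key in relay_keys:
        onion = xor_layer(onion, key)
    return onion

keys = [b"relay-1", b"relay-2", b"relay-3"]      # hypothetical per-hop keys
onion = wrap(b"chunk request", keys)
assert onion != b"chunk request"                 # hidden while in transit
assert route(onion, keys) == b"chunk request"    # recovered at the exit
```

    The bandwidth cost mentioned below follows directly from this design: every payload traverses, and is re-sent by, each proxy in the path.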

    One issue with anonymous downloading is higher bandwidth usage. According to the report, with the anonymous feature users become proxies and have to rely on the transfers of others.

    Earlier this year, Tor investigated a five-month attack on its network that may have unmasked anonymous users.

    This article originally appeared at: http://www.thewhir.com/web-hosting-news/researchers-say-decentralized-bittorrent-client-prevents-shutdown-la-pirate-bay

    10:10p
    Israeli Enterprise Hadoop Startup Xplenty Says US Market Momentum Strong

    Xplenty has executed about 500,000 Hadoop-facilitated data jobs across more than a petabyte of data altogether since the company’s U.S. rollout a few months ago. The startup provides tools that make enterprise Hadoop easier through a simple web-based drag-and-drop interface.

    CEO Yaniv Mor suggests that Hadoop’s scale is creating personnel challenges, with some businesses struggling to find the talent they need. Xplenty’s pitch is making Hadoop easy, with no coding, addressing a market whose growth is outpacing the supply of data scientists.

    Its multi-platform functionality allows users to import data from wherever it may be stored, then process and prepare it for analysis with BigQuery, Redshift, and more. The software both simplifies and speeds up data processing, the company says.
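    The kind of pipeline such a tool generates can be approximated by a plain extract-transform-load script. Everything below, including the source data, field names, and output shape, is invented for illustration; the article does not describe Xplenty's actual job format.

```python
import csv
import io
import json

# A hypothetical source: raw purchase records as CSV.
RAW_CSV = """user_id,amount,currency
1,19.99,usd
2,5.00,USD
2,12.50,usd
"""

def extract(raw):
    # Pull rows out of the source (here, an in-memory CSV).
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    # Normalize types, then aggregate spend per user.
    totals = {}
    for row in rows:
        user = int(row["user_id"])
        totals[user] = totals.get(user, 0.0) + float(row["amount"])
    return totals

def load(totals):
    # Emit analysis-ready JSON records for a downstream warehouse.
    return [json.dumps({"user_id": u, "total": t})
            for u, t in sorted(totals.items())]

records = load(transform(extract(RAW_CSV)))
```

    A drag-and-drop interface lets a non-programmer assemble exactly these extract, transform, and load stages as visual blocks, which is the gap in data science talent Mor is pointing at.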

    Xplenty is fully supported across Google Cloud Platform, Amazon Web Services, Rackspace, and IBM’s SoftLayer cloud.

    The Tel Aviv-based enterprise Hadoop company raised a $3 million round in October, which it put toward establishing a U.S. presence, and, judging by this week’s announcement, it is happy with the early results.

    “The growth exceeded expectations (in such a short time frame), and I believe it was due to the maturity of the market accepting and using cloud services such as Xplenty to process and handle big data,” Mor said via email.

    The closest competitors would be the Elastic MapReduce service by AWS and big data integration software provider Talend, which raised $40 million last year.

    In terms of early lessons learned, Mor said customers want more and more connectivity to a wide range of data sources, so the company is working to provide additional connectors to enable as many data sources as possible.

    10:20p
    Report: Third of Q3 Hardware Spending Was on Cloud Hardware

    About a third of the money companies around the world spent on servers, disk storage systems, and Ethernet switches in the third quarter was for cloud infrastructure, according to a recent report by IDC.

    Companies spent nearly half of that on public cloud infrastructure, while the other half was spent to stand up private cloud environments. Total cloud infrastructure revenue for the quarter was $6.5 billion, up 16 percent year over year.
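    As a quick sanity check on those figures (this arithmetic is ours, not from the IDC report), 16 percent year-over-year growth to $6.5 billion implies a prior-year quarter of roughly $5.6 billion:

```python
q3_2014 = 6.5                  # billions of dollars, per IDC
growth = 0.16                  # 16 percent year over year
q3_2013 = q3_2014 / (1 + growth)
print(f"Implied Q3 2013 cloud infrastructure revenue: ${q3_2013:.1f}B")
# prints: Implied Q3 2013 cloud infrastructure revenue: $5.6B
```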

    Public cloud providers large and small have expanded their data center capacity substantially throughout this year, which has meant many big cloud hardware purchases.

    IBM, for example, in January committed $1.2 billion to global cloud data center capacity expansion, and has added new sites to its portfolio throughout the year. The company rolled out the most recent expansion just last week, announcing the addition of 12 data centers to its footprint.

    Software-as-a-Service giant Salesforce said it would add a data center in Paris in September, launched a U.K. data center in November, and announced an upcoming new site in Japan earlier this month.

    Amazon Web Services launched a data center in Germany in October.

    A lot of the private cloud build-out that occurred this year can be attributed to OpenStack, the open source cloud architecture and software that has grown tremendously popular. There is now a multitude of vendors offering enterprises their help in setting up private clouds.

    They include big vendors, such as HP, which aims to be able to set up OpenStack on cloud hardware of the customer’s choice, be it new servers from any vendor or servers they already have, and startups like Piston Cloud Computing, which pushes the idea that next-generation clouds should be built on cheap commodity boxes.

    These vendors also offer help with OpenStack cloud infrastructure setup to telcos, for many of whom OpenStack has become a way to quickly enter the public cloud services market as providers.

    IDC defines public cloud services as services shared by unrelated users and designed for a market, as opposed to a specific organization. A private cloud service, according to the market research firm, is designed for a single enterprise and can be deployed at the user’s own data center or at another company’s facility. Private cloud services can be managed by a third party or by the user.

    Richard Villars, vice president of data center and cloud research at IDC, said effective use of public and private services will give “first-mover advantage” to any company in a competitive market. “Whether internally owned or ‘rented’ from a service provider, cloud environments are strategic assets that organizations of all types must rely upon to quickly introduce new services of unprecedented scale, speed, and scope,” he said in a statement.

