Data Center Knowledge | News and analysis for the data center industry - Industr's Journal
 

Friday, January 2nd, 2015

    2:30p
    Year in Review: The 10 Most Popular Data Center Stories of 2014

    From Google abandoning MapReduce to Facebook switching off its data center, here’s a look back at the 10 most popular articles on Data Center Knowledge during 2014. Enjoy!

    Google Dumps MapReduce in Favor of New Hyper-Scale Analytics System – Google has abandoned MapReduce, the system for running data analytics jobs spread across many servers the company developed and later open sourced, in favor of a new cloud analytics system it has built called Cloud Dataflow.

    The Home Data Center: Man Cave for the Internet Age – In the ultimate manifestation of the “server hugger” who wants to be close to their equipment, a number of hobbyists and IT professionals have set up data centers in their homes, creating server rooms in garages, basements and home offices.

    Enterprise server and storage equipment fills a basement space in this home data center in Canada. (Photo: VE2CUY Project).


    GE Claims Breakthrough in Fuel-Cell Tech, Launches Fuel-Cell Subsidiary – General Electric unveiled a fuel-cell subsidiary called GE Fuel Cells and said it was building a new fuel-cell manufacturing plant in upstate New York. Fuel cells use natural gas or biogas to generate electricity and have seen some adoption in the data center space.

    Microsoft Joins Open Compute Project, Shares its Server Designs – In a dramatic move that illustrates how cloud computing has altered the data center landscape, Microsoft is opening up the server and rack designs that power its vast online platforms and sharing them with the world.

    Google Using Machine Learning to Boost Data Center Efficiency – Google is using machine learning and artificial intelligence to wring even more efficiency out of its mighty data centers.

    Mining Experiment: Running 600 Servers for a Year Yields 0.4 Bitcoin – Can data centers tap unused server capacity to mine for Bitcoins? The question occurred to the team at the online backup service iDrive, which performs most of its customer backup jobs overnight, leaving its 3,000 quad-core servers idle for much of the day. So the company ran a test with 600 servers to see whether Bitcoin mining could become a secondary revenue stream.

    Tokyo’s Tsubame-KFC Remains World’s Most Energy Efficient Supercomputer – After the most powerful supercomputers in the world were announced last week at the International Supercomputing Conference, the Green500 published its semi-annual list of the world’s most energy-efficient supercomputers, with Tokyo’s Tsubame-KFC immersion-cooled system remaining in the number-one spot.

    The Tsubame KFC system at Tokyo Institute of Technology immerses its servers and GPUs in cooling fluid. The system topped this year's Green 500 list of the most efficient supercomputers. (Photo: NVIDIA)


    Facebook Turned Off Entire Data Center to Test Resiliency – A few months ago, Facebook added a whole new dimension to the idea of an infrastructure stress test. The company shut down one of its data centers in its entirety to see how the safeguards it had put in place for such incidents performed in action.

    Bitcoin Miners Building 10 Megawatt Data Center in Sweden – Facebook has new neighbors in Sweden, and they’re building Bitcoin’s version of the Death Star – a 10 megawatt data center filled with high-powered computers mining for cryptocurrency.

    Bitcoin Hardware Player BitFury Enters Cloud Mining With 20MW Data Center – BitFury, a major Bitcoin hardware vendor, has entered the hosting and cloud mining business. Fueled by $20 million in recent venture funding, the company is rolling out a global data center network, anchored by a new 20 megawatt facility in the Republic of Georgia. The announcement is bound to fuel the company’s status as a contender to be the first major Bitcoin IPO.

    Stay current on data center news by subscribing to our daily email updates and RSS feed, or by following us on Twitter, Facebook, LinkedIn and Google+.

    4:30p
    Top 10 Data Center Predictions for 2015

    DBIA, LEED AP BD+C, leads the Mission Critical Market at JE Dunn Construction.

    With the arrival of a new year, organizations are making plans to address the explosion of enterprise data in 2015. A big question is how those plans will differ from your data center operations this past year.

    In 2014, mission critical innovation drove new standards in PUE reduction, water use and digital service efficiency.
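    For reference, PUE (Power Usage Effectiveness) is total facility power divided by the power delivered to IT equipment, so lower is better and 1.0 is the theoretical ideal. A minimal sketch of the calculation, with invented figures for illustration:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# Hypothetical figures for illustration only.
legacy = pue(total_facility_kw=2000, it_load_kw=1000)     # 2.0: half the power is overhead
efficient = pue(total_facility_kw=1150, it_load_kw=1000)  # 1.15: mostly IT load
print(f"Legacy PUE: {legacy:.2f}, efficient PUE: {efficient:.2f}")
```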

    Pressing data center operator concerns include security, operational expense management and Internet of Things (IoT) growth. Everyone seems to be balancing big data, cloud and SDN initiatives. And with shadow IT coming to light, you have more internal influencers with input on your data center operations.

    So what does this mean for 2015? I forecast a year of incremental adaptation vs. radical change as industry buzz gives way to genuine innovation and proven methodologies. As much as some things will change, others will frustratingly stay the same.

    Predictions for the Year Ahead

    Cloud Won’t Doom Data Centers

    The impending doom of the data center industry has been greatly exaggerated. A few bright minds have suggested that the cloud is quickly taking over the delivery of IT services and that the future of data center services is bleak.

    As much as cloud is gaining momentum, FUD (fear, uncertainty, doubt) alone will prevent a seismic shift away from traditional data center services. An optimized hybrid model of both is much more likely.

    Warmer Data Centers

    Temperatures in the data center will inch higher, but not substantially overall. Large and enlightened enterprises and Web giants will continue to push the allowable temperature and humidity ranges published by ASHRAE, but the average enterprise and colocation operator will largely operate the same. Unfortunately, the colocation market is competitive and must cater to those soliciting its services. Until enterprises change their operating parameters, energy will continue to be wasted by operating at temperatures far lower than necessary.

    TCO Doesn’t Drive Design

    Total Cost of Ownership (TCO) will continue to be ignored by most of the industry. TCO is talked about by many, but few really use it as a decision-making tool; many data center developers and operators still take a short-sighted approach driven by CapEx. Regardless of the projected lifespan of your data center, TCO is still the best way to optimize your decisions.
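    To make the CapEx-versus-TCO contrast concrete, here is a minimal sketch with invented figures (a simple undiscounted model; a real analysis would also discount future cash flows and include refresh cycles):

```python
def tco(capex: float, annual_opex: float, lifespan_years: int) -> float:
    """Simple undiscounted total cost of ownership over the facility lifespan."""
    return capex + annual_opex * lifespan_years

# Hypothetical designs: A is cheaper to build, B is cheaper to run.
design_a = tco(capex=10_000_000, annual_opex=2_500_000, lifespan_years=15)
design_b = tco(capex=14_000_000, annual_opex=1_800_000, lifespan_years=15)
print(f"Design A: ${design_a:,.0f}, Design B: ${design_b:,.0f}")
# A CapEx-only comparison picks A; over 15 years, B costs $6.5M less.
```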

    Data Center Services Commoditization

    Commoditization of data center services will continue and accelerate. The colocation market has become increasingly price-driven. Modular designs have made smarter use of CapEx allowing operators to offer a lower price point when placed in a competitive situation. This will continue and possibly accelerate due to the efforts of the largest in the industry.

    Flat Data Center Density

    Data center densities will remain flat for the most part. Several studies have shown that average power densities remain far below what is possible with current IT technology. Although pockets of higher density cabinets will exist, overall densities will remain low.

    Greenpeace Influences Data Center Decisions

    Greenpeace will continue its ‘name and shame’ campaign and will update its naughty-and-nice list of data center and cloud operators based upon their use of renewable energy. In 2014, it introduced a new wrinkle by naming a few of the largest colocation providers. Additional names will be added in 2015, but the question is whether it will have an impact on this very competitive branch of the industry.

    Server Huggers Drive Demand

    Server huggers will continue to drive business in the locations where they live in greatest numbers.

    This will ensure that the large markets remain that way, but also that smaller markets will remain relevant. Markets like Omaha, Kansas City, and others will continue to thrive.

    Renewable Energy: More Talk than Action

    Discussion of renewable energy use in data centers will continue with little action. The Web giants will continue to invest in renewable energy, but others will stay on the sidelines for the most part. Until the industry bands together to effect change, renewable energy will not reach grid parity, which will keep it from being used in a competitive market.

    Transactive Energy Management Gains Traction

    On-site power production, dispatchable power, and other forms of transactive energy management will become more accepted. The risks and rewards will be weighed, mitigated, and accepted.

    Green Fatigue Grows

    Green fatigue will reach record levels in 2015. Although I am a big proponent of sustainability, do we really need another conference panel focused on greening the data center? Let’s focus on the business case and let the numbers fall where they may.

    Your Thoughts on 2015 Data Center Trends

    What are your thoughts on these topics? And what other data center predictions do you have for the year ahead? Share your comments here to continue a healthy debate and industry discussion in shaping the future of data centers.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    4:30p
    Friday Funny Caption Contest: New Year’s List

    The holiday decorations are packed away but Kip and Gary’s celebration continues! Help us ring in the brand new year with this festive Friday Funny!

    Diane Alber, the Arizona artist who created Kip and Gary, has a new cartoon for Data Center Knowledge’s cartoon caption contest. We challenge you to submit a humorous and clever caption that fits the comedic situation. Please add your entry in the comments below. Then, next week, our readers will vote for the best submission.

    Here’s what Diane had to say about this week’s cartoon, “The new year might bring celebration, but it also brings new projects and work!”

    Congratulations to the last cartoon winner, Jo, who won with, “We really need to take our Green Data Center efforts through the roof!”

    For more cartoons on DCK, see our Humor Channel. For more of Diane’s work, visit Kip and Gary’s website.

    6:34p
    Expedient to Upgrade Memphis Data Center

    Pittsburgh, Pennsylvania-based Expedient Data Centers is undertaking a data center upgrade project in Memphis expected to cost about $5 million. The project is part of a flurry of expansions for the company, which added capacity at existing sites in several markets and broke ground on a new site in Columbus, Ohio, in September.

    Expedient has requested a permit for an initial $2.8 million in renovations inside a former Hilton Hotel data center, Memphis Business Journal reported. This is its first data center in the southeastern U.S.

    The company provides a range of cloud and colocation services. Most of the markets it operates in are secondary data center markets, such as Columbus, Cleveland, Indianapolis, Baltimore, and Pittsburgh.

    Expedient has invested $22 million in data center expansions so far, recently finishing an expansion in Pittsburgh. It is investing $5 million in renovations in Memphis as part of a nearly $9 million overall investment in that market, which will create 20 jobs.

    All of these investments bring the total size of Expedient’s footprint to 250,000 square feet.

    The provider is renting the property from North Carolina-based Five 9s Digital. Five 9s, a data center development and advisory firm, acquired the property and close to 4.5 acres from Hilton for $3.7 million. The two companies have a longstanding relationship.

    The 35,000 square foot Memphis facility is expected to open by July following the data center upgrade, with an initial 7,500 square feet of raised floor. It will provide data center services to local businesses as well as companies outside Memphis.

    Expedient’s Memphis data center. Legend: 1. 35,000 sq. ft. total space; 2. 7,500 sq. ft. initial raised floor; 3. operations support center; 4. (2) KVA power feeds from 2 substations; 5. (4) 338 KVA UPS; 6. 450 tons cooling. (Source: Expedient)


    “We believe the Memphis market will be a great complement to our existing network of interconnected data centers,” said Expedient Chief Operating Officer Shawn McGorry in the initial Memphis expansion announcement in October. “It will bring an additional geographically diverse option for our current client base, as well as a local market solution for the attractive and growing Memphis commercial sector.”

    There are not many multi-tenant data center providers in Memphis. One competitor is Zayo Group’s zColo.

    The market is threatened by at least one type of natural disaster: Tennessee is prone to tornadoes, though the vast majority occur outside of Memphis. Another potential threat is flooding; the 2010 Tennessee floods famously caused heavy damage to Nashville’s Grand Ole Opry.

    7:00p
    Five Reasons Hybrid Cloud Technologies Will Continue to Grow

    It’s 2015, and it’s safe to say that many of us have our heads in the cloud. We’re using more mobile devices, requesting even more data from a variety of data center points, and are demanding even more from the infrastructure that is designed to support the next-generation cloud platform. Data centers are becoming massive hubs for multi-tenant environments which are continuously being tasked for more resources and are experiencing even more utilization.

    The latest Cisco Cloud Index report indicates some pretty powerful trends:

    • Annual global data center IP traffic will reach 8.6 zettabytes (715 exabytes [EB] per month) by the end of 2018, up from 3.1 zettabytes (ZB) per year (255 EB per month) in 2013.
    • Global data center IP traffic will nearly triple (2.8-fold) over the next 5 years. Overall, data center IP traffic will grow at a compound annual growth rate (CAGR) of 23 percent from 2013 to 2018.
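    The two figures above are consistent with each other; a quick sanity check of the stated growth rates:

```python
start_zb, end_zb, years = 3.1, 8.6, 5  # annual traffic in ZB, 2013 vs. 2018
growth_factor = end_zb / start_zb                  # ~2.8-fold increase
cagr = (end_zb / start_zb) ** (1 / years) - 1      # ~23 percent per year
print(f"Growth factor: {growth_factor:.1f}x, CAGR: {cagr:.1%}")
```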

    Here’s the biggest takeaway: by 2018, more than three quarters (78 percent) of workloads will be processed by cloud data centers; 22 percent will be processed by traditional data centers.

    Furthermore, significant promoters of cloud traffic growth are the rapid adoption of and migration to cloud architectures, along with the ability of cloud data centers to handle significantly higher traffic loads.

    So why are we focusing our attention on the hybrid cloud model? Because this is the model that will begin to shape what other cloud platforms look like moving forward.

    • It’s the closest thing to an agnostic cloud. In the future, all cloud models will need to be interconnected. Many workloads already pull resources from external providers, making “traditional” private clouds technically hybrid cloud models. Now it’s become even easier to extend your data center platform into some type of cloud extension by using a hybrid cloud resource.
    • Greater interconnectivity support. It wasn’t easy at first, but traditional data center providers are now creating easier, global interconnectivity points for both private and hosted data center solutions. Bandwidth is improving, there are better optimization methods for connectivity, and overall resource utilization is becoming a lot more streamlined. APIs and open source cloud platforms are creating some pretty powerful ways to connect with some amazing technologies.
    • A lot of new use cases. As data centers jump on the hybrid cloud bandwagon, pricing, solutions, and offerings all get better and more competitive, which means smaller enterprises can get into the hybrid cloud game. Data center extension, disaster recovery and business continuity, building a “business-in-a-box,” developing a business segment that is completely cloud-based, and even creating new service offerings are just a few reasons many organizations are actively exploring a hybrid cloud model.
    • Government and regulation changes. Regulation is really beginning to make an impact on the cloud model. For example, the recent Omnibus rule (enacted as a change to HIPAA) now allows organizations to become business associates (BAs). A BA is any organization that has more than just transient access to data (unlike a mere conduit such as FedEx, UPS, or USPS). An organization can sign a business associate agreement (BAA), taking on additional liability in order to manage protected health information (PHI). Regulations are also changing how data center providers approach e-commerce. Take a look at Rackspace and what it has done with PCI DSS: at a high level, it intelligently controls data as it moves through the cloud, the organization’s servers, and the payment gateway. This type of design allows your organization to continuously control the flow of sensitive information.
    • An ever-evolving user (and enterprise). The dynamic nature of the hybrid cloud has allowed organizations to scale very quickly. In fact, cloud orchestration allows companies to set traffic and user thresholds to immediately scale their platform into a hybrid cloud model. Moving forward, enterprises will look for new ways to deliver their data closer to the end user and their own locations. This means that content distribution networks (CDNs) and edge computing will become even more powerful. Look for hybrid clouds to make a big impact in expanding the capabilities of the traditional cloud and data center model.

    What does the future of the cloud and data center model look like? It’s always hard to say, since forecasting technology trends has become harder than ever. However, we do know that cloud interconnectivity will expand to more endpoints within just a few years. Google is very actively expanding into the home with acquisitions like Nest. We’re creating smart cars, intelligent appliances, and even a constantly connected business presence. All of this will require more from the modern data center platform, and even more from the future cloud environment.

    7:15p
    Another Suit in Failed $1B Delaware Data Center Project Filed

    The failed data center and power plant project originally proposed for the University of Delaware’s STAR campus has become messier. In addition to several previously filed lawsuits from disgruntled parties, one partner in The Data Centers LLC is now suing another.

    The suit alleges that the partner mismanaged the business and hid information from the public. TDC has been searching for a new potential location, tossing Maryland in the mix as an option. The planned $1 billion project was an interesting one, since TDC was planning a cogeneration plant on site that would power the facility entirely.

    The original suit came from two firms for $1.3 million in unpaid invoices for several services performed. Engineering firm Duffield Associates and site project manager Constructure Management filed a joint complaint in Delaware Superior Court.

    Now former president of TDC Robert Krizman claims that Chief Executive Earl Eugene Kern froze him out of business affairs, then ran up millions in debt that the company cannot pay, Delaware Online News Journal reported. Krizman reportedly left an executive job at a major tech company (Jones Lang LaSalle) to form the limited liability company.

    Given the nature of lawsuits, the involved parties are tight-lipped.

    TDC had originally planned to build a large data center supported by a 279 megawatt energy generation facility featuring combined heat and power that would allow it to operate “off the grid” on a property owned by the university. While heralded as a forward-looking data center cogeneration project, it was met with sizable resistance from many members of the community.

    That resistance led to expenses racked up while trying to win approval.

    Krizman wants to be absolved of the more than $1 million in expenses and wants the company to pay for his legal defense, the News Journal reported, citing court documents. He resigned in January 2014.

    Krizman also alleges that Kern never secured financing for the $1 billion project.

    In April, the News Journal revealed details of a previous failed data center proposal by Kern at Rowan University in Glassboro, New Jersey.

    The University of Delaware terminated its lease with TDC on July 10, following intense debate. The property is now being redeveloped as a science, tech, and research campus. The Delaware Economic Development Office had originally approved a $7.5 million grant for TDC if it met certain conditions.

    Data centers have generally avoided NIMBY (Not In My Backyard) concerns, as they are seen as good for communities, helping a tech scene thrive. However, residents in Northern Virginia are currently contesting a proposed Dominion Power line and using the Delaware project as a template for their protests, an outcome some predicted during this fiasco.

    7:46p
    Tesora’s Trove DBaaS Certified for Mirantis OpenStack Distribution

    Tesora now has the first Mirantis-certified OpenStack Trove Database-as-a-Service. The certification ensures that the company’s DBaaS works with Mirantis OpenStack, one of the most widely used distributions of the open source cloud platform.

    Certifications generally mean easier integrations. As enterprises look to open source and hosted delivery models to evolve their IT, the issue becomes making sure all the pieces work together.

    Trove is an open source DBaaS component of OpenStack that provides functionality for both relational and non-relational database engines. It lets admins and those using the popular DevOps approach manage multiple instances of different database management systems. Its framework is extensible. It is designed to support a single-tenant database.

    Tesora has also added an enterprise distribution from Mirantis to the Tesora OpenStack Database Certification Program, ensuring compatibility with both NoSQL and SQL database management systems on Trove.

    Mirantis provides all the software, services, training, and support needed for running OpenStack in production at scale. While popular with enterprises, its free hosted tier is appealing for individual developers and small companies.

    Its OpenStack 6.0 was released last month in preview with new OpenStack Juno features.

    One of Mirantis’ biggest competitors is Red Hat. The two companies dissolved a partnership when Red Hat started expanding into Mirantis territory.

    Tesora is also certified with Red Hat and Ubuntu OpenStack flavors. It is certifying with popular distros to ensure that the Tesora DBaaS Platform installs, configures, and operates properly with these popular OpenStack distributions.

    Mountain View, California-based Mirantis is backed by several venture capital firms, as well as by Ericsson, Red Hat, and Intel Capital. It landed a huge financing round of $100 million a few months ago, a lot of which is going toward engineering efforts.

    The company also pledged to boost its partner ecosystem in a bid to create a “zero-lock-in” OpenStack distro, one example being Tesora.

    Tesora certification and support includes several databases including MongoDB, MySQL Community Edition, Percona Server, MariaDB, Redis, and Cassandra. Its aim is to make it easy to get database capacity on demand to make the most out of OpenStack investments.

    The company is a big contributor to the Trove project and is the first one out of the gate with a commercially-available Trove product, the DBaaS Platform Enterprise Edition. General Catalyst, Point Judith Capital, and a number of angel investors back the company.

    OpenStack Trove is also available as an option in HP’s IaaS and PaaS combo Helion.

    “We are seeing great interest in OpenStack as an enterprise private cloud platform, and database as a service with Trove is important to the project,” Boris Renski, chief marketing officer at Mirantis, said in a statement. “Our partnership with Tesora gives our customers greater confidence when provisioning and managing a wide range of enterprise databases in their OpenStack clouds.”

    The news speaks to another trend: DBaaS itself. Databases are notoriously hard to manage and have a history of staying on premises because of the heavy-duty data lifting they perform. The complexity of running databases on owned hardware, coupled with extensible, open source database technology, has created a big opportunity.

    It’s an opportunity that Rackspace wanted to capitalize on with its acquisition of ObjectRocket for MongoDB (and now other services like Redis) and IBM with its Cloudant acquisition.

