Data Center Knowledge | News and analysis for the data center industry
 

Wednesday, February 12th, 2014

    1:29p
    HP Rolls Out Vertica Marketplace for Big Data Analytics

    HP rolls out Vertica Marketplace as an online destination for big data analytics, and SGI and Cognilytics partner to bring advanced analytics to enterprise big data.

    HP Launches Big Data Analytics Marketplace

    HP (HPQ) announced Vertica Marketplace, an online destination where developers, HP Vertica users and technology partners can create and sell big data analytics solutions built for the HP Vertica Analytics Platform. The marketplace is meant to be a hub where partners and customers create and share extensions, enhancements and solutions that integrate with, and enhance the value of, the HP Vertica Analytics Platform. These add-ons include third-party extensions and connectors, business intelligence tools, Extract, Transform and Load (ETL) and data transformation products, connectors and tools for the HP HAVEn big data analytics platform, and industry and original equipment manufacturer (OEM) solutions. The marketplace will also carry the latest solutions from HP Vertica’s innovation incubation program, allowing users to create cutting-edge big data applications.

    “Our rapidly growing community of customers, partners and developers are building vertical and horizontal solutions on, and creating new add-on capabilities to, Vertica every day,” said Colin Mahony, vice president and general manager, Vertica, HP. New innovations from the Vertica incubation program include Vertica Distributed R, Vertica Pulse, an in-database sentiment analysis tool, and Vertica Place, which stores and analyzes geospatial data in real time.
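
    For readers who want a concrete sense of what building on Vertica looks like in practice, here is a minimal Java sketch that queries the platform over JDBC with plain SQL. The host, port, database, credentials, table and query are illustrative assumptions rather than details from the announcement; the Vertica JDBC driver is assumed to be on the classpath, and Vertica Pulse’s own sentiment functions are not shown.

        // Minimal sketch: query Vertica over JDBC using standard SQL.
        // Host, database, credentials and the "reviews" table are assumptions.
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        public class VerticaQueryExample {
            public static void main(String[] args) throws Exception {
                // 5433 is Vertica's default client port; adjust for your cluster.
                String url = "jdbc:vertica://vertica-host:5433/analytics";
                try (Connection conn = DriverManager.getConnection(url, "dbadmin", "password");
                     Statement stmt = conn.createStatement();
                     ResultSet rs = stmt.executeQuery(
                         "SELECT product_id, COUNT(*) AS review_count "
                         + "FROM reviews GROUP BY product_id "
                         + "ORDER BY review_count DESC LIMIT 10")) {
                    while (rs.next()) {
                        System.out.println(rs.getString(1) + " -> " + rs.getLong(2));
                    }
                }
            }
        }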

    SGI and Cognilytics Partner for Enterprise Big Data

    SGI and Cognilytics announced a partnership to help enterprises monetize their data as a strategic asset. The partnership combines SGI’s HPC technology for big data analytics and its recently announced plan to develop a SAP HANA in-memory computing appliance with Cognilytics’ expertise in implementing technologies such as Hadoop and SAP platform and analytics solutions, including SAP HANA and Predictive Analysis. The goal is to let enterprises closely align analytics initiatives with current business models so they can fully capitalize on results and achieve business objectives.

    “Cognilytics has extensive experience in providing advanced predictive solutions to a range of business needs across multiple industries,” said Jorge Titinger, president and CEO of SGI. “Coupling SGI’s breakthrough compute and storage solutions with Cognilytics’ application environment expertise gives lines of business a powerful advantage in leveraging Big Data to gain insight, reduce risk, and fuel economic growth.”

    Solutions delivered through the partnership include SGI InfiniteData Cluster with Hadoop and SGI UV with SAP HANA, which together form a turnkey solution to enable knowledge discovery through advanced graph analytics and microsecond time to results. “SGI’s expertise in high performance computing highly complements Cognilytics’ expertise in predictive modeling,” said Gary Gauba, Cognilytics chairman and CEO. “Delivering application-integrated solutions for Big Data powered by SGI infrastructure increases our ability to help customers convert Big Data to a strategic asset.”

    1:41p
    Red Hat and Hortonworks Deepen Strategic Alliance

    With the goal of enabling the next generation of data-driven applications, enterprise Apache Hadoop provider Hortonworks and Red Hat (RHT) announced an expanded  relationship through the formation of a strategic alliance. The two firms will integrate product lines, create joint go-to-market initiatives and provide collaborative customer support. The companies also announced the availability of the beta program for the Hortonworks Data Platform (HDP) plug-in for Red Hat Storage.

    A comprehensive open source approach to the Hadoop platform can address the growing requirements of key enterprise stakeholders and their big data initiatives. With an enterprise Apache Hadoop platform that is tightly integrated with open hybrid cloud technologies, the two companies plan to deliver infrastructure solutions that will enable enterprise stakeholders to more quickly analyze big data to realize business insights and value.

    “At the rapid rate that enterprises expand their Hadoop requirements – due to the business consistently identifying new use cases and more internal stakeholders – the Red Hat and Hortonworks strategic alliance provides a seamless approach to enabling the next generation of data-driven applications,” said Shaun Connolly, vice president, corporate strategy, Hortonworks. “Our mutual customers complement both their Hadoop strategy and commitment to community-driven open source innovation.”

    A new beta version of the Hortonworks Data Platform (HDP) joins with Red Hat Storage to provide a secure and resilient general-purpose storage pool with multiple interfaces, including Hadoop, POSIX and OpenStack Object Storage (Swift). HDP combined with Red Hat Enterprise Linux and OpenJDK will give enterprise operators and application developers the ability to scale Hadoop infrastructure and enable faster development of enterprise-strength analytic applications. For developers, HDP with Red Hat JBoss Data Virtualization will integrate Hadoop with existing information sources including data warehouses, SQL and NoSQL databases, enterprise and cloud applications, and flat and XML files.
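
    As a rough illustration of the developer-facing side of this stack, the sketch below reads a file through the standard Hadoop FileSystem API from Java. The filesystem URI, input path and cluster layout are illustrative assumptions; the point is that with the HDP plug-in for Red Hat Storage, the same application code can be pointed at a Red Hat Storage-backed pool by changing the filesystem configuration rather than the code.

        // Minimal sketch: read a file through the Hadoop FileSystem API.
        // The fs.defaultFS URI and the input path are assumptions; with the
        // Red Hat Storage plug-in, the URI would use whatever scheme that
        // plug-in registers instead of hdfs://.
        import java.io.BufferedReader;
        import java.io.InputStreamReader;
        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        public class ReadFromHadoopFs {
            public static void main(String[] args) throws Exception {
                Configuration conf = new Configuration();
                conf.set("fs.defaultFS", "hdfs://namenode:8020"); // assumed cluster
                try (FileSystem fs = FileSystem.get(conf);
                     BufferedReader reader = new BufferedReader(new InputStreamReader(
                         fs.open(new Path("/data/events/part-00000"))))) {
                    String line;
                    while ((line = reader.readLine()) != null) {
                        System.out.println(line);
                    }
                }
            }
        }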

    “Hortonworks and Red Hat are natural partners given the strong commitment to open source on both sides,” said Matthew Aslett, research director, data management and analytics, 451 Research. “The strategic alliance will benefit Hortonworks and Red Hat customers looking to develop and deploy Hadoop applications, as well as the wider Hadoop community as the results of joint-development work are contributed back into Apache Hadoop.”

    “Data – specifically data processed with Hadoop – is the killer application for the open hybrid cloud,” said Ranga Rangachari, vice president and general manager, Storage and Big Data, Red Hat. “Enterprises are looking to IT solution providers to help with a dramatic reduction in time-to-results for their big data projects. Red Hat’s strategic alliance with Hortonworks is focused on helping customers with efficiency and agility as they embark on big data projects.”

    2:22p
    The Ergonomic Data Center: An Ode to the Column

    Chris Crosby is CEO of Compass Datacenters.


    The Merriam-Webster dictionary defines the word “ergonomic” as follows: “The parts or qualities of something’s design that make it easy to use.”

    If this definition in no way describes your data center, you’re not alone. Although a number of factors go into the design of the average data center, the ease of its use for you and your personnel is not among them. Interestingly enough, no one ever talks about the “little things” that would enable a data center to actually promote more efficient operations. Let’s face it, if zoning ordinances didn’t require it, the average site would probably feature what could be euphemistically described as “detached modular” toilet facilities.

    This two-part series is designed to help you identify areas within your next data center that can, and should be, addressed to make sure that your new site accommodates your requirements rather than the other way around.

    Sometime in the distant past, college educators decided that no future sheepskin recipient could go out into the world without an appreciation for the virtues of ancient art and architecture. From this humble beginning the “art in the dark” class was born, in which many of us sat dutifully in our seats as countless pictures of ancient treasures passed before our eyes in an effort to ensure that we would never confuse a Goya with a Rembrandt – at least until after we had taken the final.

    As we all remember, our study of classical architecture included an in-depth analysis of columns and the temples and arenas they adorned. In the ancient world, columns were everywhere. I don’t know if the Greeks invented them, but they sure had a lot of them, and the Romans were pretty big fans of them as well. As a result, don’t we all think a little Greek Revival in the design mix lends an air of class to any building? Would the White House or the Capitol building look half as nice without their respective architectural nods to the ancients? Perhaps the one place where we can live without this Greco-Roman influence is in our data centers.

    What’s The Point of Columns, Anyway?

    Since most of you have spent more than a little time inside a data center, I think you’ll agree that their designers must be big fans of the ancients, particularly in the area of columns, since most facilities have more of them than the Parthenon. For example, columns are commonly found within most multi-tenant data centers (MTDCs). Because of the size of these sites, columns serve a very important role – they hold the roof up. This makes sense when you consider the fact that no one wants to see a few million dollars’ worth of servers and storage gear crushed beneath several tons of concrete and steel.

    Unfortunately, this architectural necessity can cause serious layout issues. In short, columns get in the way. At a time when on-floor layout options are at a premium, the possibility of stranding capacity because of a giant pole sitting amid your row of high-density servers is an issue you’d probably choose to avoid. As many of us have concluded through personal experience, while the Greeks may have loved them, the column is not your data center’s friend.

    Building Size, Room Size

    Along with their devotion to the column, our toga-clad ancestors were big fans of large buildings. Who doesn’t like to have a little elbow room when attending a gladiator match or bacchanalian festival? In terms of layout flexibility, doesn’t this desire for usable space make the limitations imposed by today’s pre-fabricated modular offerings about as useful to the average end user as sandals were to Achilles? Featuring average dimensions of approximately 12’ by 40’, due to the need to fit on a flatbed for transportation, these solutions offer less than 500 square feet of space (12 x 40 = 480 square feet). Fine, perhaps, for the emperor’s box at the Coliseum, but not so much if you are trying to configure applications with load groups measured in anything over a few hundred kW.

    As ancient architects understood, not all square footage is equal in terms of layout flexibility, so it is important to understand the configuration of your prospective data center in advance. Understanding the physical features that can limit your layout options is essential when selecting a data center provider.

    This is one of those things where it’s better to identify it upfront versus being unpleasantly surprised later on. Although the penalties for not understanding the impact of your data center’s design on your ability to support your applications aren’t as draconian as in the days of our Greek and Roman forefathers, they could make you wish you’d stayed awake more often during class.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    3:15p
    LinkedIn Will Add A Third Data Center Next Year

    LinkedIn plans to add another data center next year to support its continued growth. The company recently began operating its first data center in Ashburn, Virginia and is preparing its second facility in Dallas.

    In its quarterly earnings call last week, LinkedIn said it intends to open a third data center in 2015.  It didn’t indicate the location of the facility, but a West Coast site appears likely, given the company’s existing footprint. LinkedIn is likely to use wholesale data center space, as it has for its first two data centers, which have been leased from Digital Realty Trust. In a wholesale deal, a tenant leases a finished “turn-key” data hall from a third-party provider.

    “Colocating LinkedIn across these three locations will improve both the site’s speed and experience for our members,” said Steven Sordello, Chief Financial Officer of LinkedIn, the social network for business. “Our goal is to host LinkedIn on a self-managed infrastructure, ultimately enabling us to control capacity growth while more efficiently managing our cost structure, with approximately 50 percent in opex savings per data center once each facility is fully tested and operating at scale.”

    Big Data Drives Need For Big Data Centers

    The computing horsepower to store and analyze information for 277 million users requires data center space. Professionals are signing up to join LinkedIn at a rate of more than two new members per second. LinkedIn has built a success story by focusing on job seekers, recruiters and corporate networking. LinkedIn’s data scientists have access to a rich trove of data about trends in hiring and careers, and the company has developed extensive software and tools to process and analyze its data.

    Last year LinkedIn secured a new data center in northern Virginia, leasing 6 megawatts of space from Digital Realty. The $109 million, 11-year deal marked LinkedIn’s first use of wholesale data center space, in which it leases a finished “turn-key” data hall from a third-party provider.

    In November,  the company signed a $116 million, 11-year lease for a large chunk of data center space at a Digital Realty campus in Dallas. LinkedIn also rents space in data center facilities operated by Equinix, which offers colocation space, in which servers are housed in dedicated cages within a data hall that may also house other tenants.

    The need for additional data center space is one reason that LinkedIn announced plans last year to raise $1 billion in a secondary stock offering. Sordello said the new data center capacity would boost capital expenses for the company over the next three years.

    “In order to ensure each new data center’s security and performance, we must build new server capacity while growing legacy facilities in parallel,” said Sordello. “In 2013, we spent approximately $75 million on our first facility, and an additional $25 million in parallel capacity. As we exit this project over the next two to three years, we plan to resume a more normal capex level toward our eventual 10% of revenue goal.”

