Data Center Knowledge | News and analysis for the data center industry
 

Wednesday, February 27th, 2013

    12:30p
    Data Center Jobs: Alban Cat

    At the Data Center Jobs Board, we have a new job listing from Alban Cat, which is seeking an EPG Field Service Supervisor in Sterling, VA.

    The EPG Field Service Supervisor is responsible for directing and supervising field operations and repairs; traveling from site to site to provide guidance and support to the EPG field technicians; and addressing and resolving customer concerns and disputes in a timely manner. To view full details and apply, see the job listing details.

    Are you hiring for your data center? You can list your company’s job openings on the Data Center Jobs Board, and also track new openings via our jobs RSS feed.

    2:30p
    Data Center People: Cohesive FT, JE Dunn, 451 Research, DataBank

    Here’s an update on some of this week’s executive hiring announcements from the data center industry:

    JE Dunn Adds Ron Vokoun – JE Dunn Construction Company, the 10th largest domestic general building contractor in the United States, has welcomed Ron Vokoun to its Mission Critical team. Vokoun has more than 25 years of experience in the industry, as both an owner and a contractor, providing mission critical leadership across multiple industry sectors. He is based in JE Dunn’s Phoenix office and will help foster the continued growth of JE Dunn’s Mission Critical vertical market on a national level. Prior to joining JE Dunn, Vokoun spent the last 10 years providing thought leadership on mission critical projects. He is involved with the U.S. Green Building Council, 7×24 Exchange and the Green Chamber of Greater Phoenix, and serves as Vice President of the AFCOM Arizona chapter. Vokoun graduated with a Bachelor of Science in construction management from the University of Nebraska-Lincoln. He is also a regular contributor to the Industry Perspectives column at Data Center Knowledge.

    CohesiveFT Hires Chris Swan as CTO – CohesiveFT, which provides enterprise application-to-cloud migration and application software defined networking (SDN), has hired Chris Swan as its new Chief Technology Officer. Swan, formerly CTO for Client Experience at UBS, is an active enterprise user and thought leader. He is a member of the Open Data Center Alliance and frequently speaks at Cloud Camp London events. He will join the CohesiveFT London office, though his responsibilities will take him to events and customers globally. “I am very excited to be joining the talented team at CohesiveFT and working in the exciting cloud computing automation and SDN space,” said Swan. “I have been following CohesiveFT since their beginning, both as a potential customer and as an industry peer.” Before joining UBS, Swan was CTO at a London-based technology investment banking boutique, which operated a cloud-only IT platform. He previously held various senior R&D, architecture and engineering positions at Credit Suisse, covering networks, security, data centre automation and the introduction of new application platforms. Before moving to the world of financial services, Chris was a Combat Systems Engineering Officer in the Royal Navy. He has an MBA from OUBS and a BEng in Electronics Engineering from the University of York.

    Tony Bishop Joins 451 Research – The 451 Group, the corporate parent of 451 Research, Uptime Institute and Yankee Group, has acquired advisory and research firm Applied Velocity Labs, the developer of a proprietary digital infrastructure strategy framework adopted by multiple Fortune 500 companies and leading technology-industry providers. Bishop, who led Applied Velocity Labs, will join The 451 Group as Chief Strategy Officer and Co-Head of 451 Advisors, which provides advisory services to The 451 Group’s enterprise, IT vendor and service-provider clients. “We are delighted to welcome Tony Bishop and the Applied Velocity Labs team to join me and the 230+ professionals here at The 451 Group,” said The 451 Group’s Chairman and CEO, Martin McCarthy. “The convergence of systems, facilities and IT under the umbrella of digital infrastructure is an important dynamic in our industry. Tony Bishop has an excellent track record in helping business and technology executives navigate digital infrastructure challenges.”

    Aaron Alwell Joins DataBank Management Team – DataBank Holdings, a data center and colocation service provider based in Dallas, this week announced the hiring of Aaron Alwell as VP of Marketing. Alwell will oversee all aspects of marketing, lead generation and branding for the company, as well as spearhead its entry into new markets. Alwell joined DataBank from AnoGenex, where he had served DataBank in a consulting role for the past year.
    “I am very excited to officially welcome Aaron to the team,” said Tim Moore, CEO of DataBank. “Aaron has already had a big impact on the company’s visibility and lead-generation, and we plan to leverage his industry experience to foster additional share in our current market as well as engaging new ones as we execute our growth-plan.” Prior to DataBank, Alwell served as Director of Marketing for Denver-based ViaWest and Dallas-based Dataside.

    2:32p
    Brokering IT Services Internally: Building the Order Process

    Dick Benton, a principal consultant for GlassHouse Technologies, has worked with numerous Fortune 1000 clients in a wide range of industries to develop and execute business-aligned strategies for technology governance, cloud computing and disaster recovery.

    DICK BENTON
    Glasshouse

    In my last post, I outlined the fourth of seven key tips IT departments should follow if they want to begin building a better service strategy for their internal users: advertise the Ts and Cs. That means developing a simple and easy-to-read list of the terms and conditions under which IT services are supplied. Now we will address actually building the order process, so that services can be provisioned in an automated way that can satisfy today’s on-demand consumers. One of the major cloud differentiators is its ability to support self-service selection followed by automated provisioning; in other words, being able to offer services on a Web page which will trigger scripts to automatically provision the selected resource.
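
    To make the idea concrete, here is a minimal sketch (hypothetical service names and a stubbed provisioning back end, not GlassHouse’s or any vendor’s implementation) of how a self-service catalog entry can map directly to an automated provisioning call:

        # Minimal sketch of self-service ordering that triggers automated provisioning.
        # The catalog entries and provision_vm() back end are hypothetical placeholders.
        CATALOG = {
            "small-vm":  {"vcpus": 2, "ram_gb": 4, "disk_gb": 50},
            "medium-vm": {"vcpus": 4, "ram_gb": 8, "disk_gb": 100},
        }

        def provision_vm(spec: dict, requester: str) -> str:
            """Stand-in for the script that would call the virtualization or cloud API."""
            vm_id = f"vm-{abs(hash((requester, tuple(spec.items())))) % 10000}"
            print(f"Provisioning {spec} for {requester} -> {vm_id}")
            return vm_id

        def order(service: str, requester: str) -> str:
            """Called when the user clicks 'Order' on the self-service page."""
            spec = CATALOG[service]               # selection from the published catalog
            return provision_vm(spec, requester)  # no manual approval chain in the happy path

        if __name__ == "__main__":
            order("small-vm", "jane.doe")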

    Historically, the largest roadblock to an expedited provisioning process has been the “legacy” approach to service fulfillment. Typically, when the busy consumer seeks access to a resource, he or she is faced with a daunting obstacle course of approvals. Usually, this will involve a lengthy form to complete (sometimes even an online form) with an explicit rationale for the request, along with minute detail on the amount of resources to be consumed and information like peak loadings. Some organizations include an additional section for risk management, covering the impact or association the resource may have with various compliance legislation and internal compliance regulations. Other sections will provide space for approval of Active Directory additions, network access, storage utilization, backup protection and even disaster recovery. Often there will be a section covering the data to be used with the resource and a questionnaire on this data covering its corporate sensitivity and the need for encryption.

    CYA Approach

    Although the process seems to place every conceivable obstacle between the consumer and the resource sought, it is typically one that has been built up over time. Each checkpoint probably developed in response to some painful, embarrassing or uptime-impacting event in the past, and the process is designed specifically to eliminate the possibility of those events recurring. This “cover-your-backside” approach is not uncommon in IT procedures.

    Just-in-case provisioning is designed to ensure that the risk of any inappropriate or unauthorized allocation or usage is reduced to near zero. In the past, this has been the foundation of provisioning time frames lasting weeks and even months, and it is why IT is often seen as an impediment to the business rather than an enabler. These forces have also driven an evolution in the defense mechanisms of the resource requestor, who learned what the “right” answers were in order to get a request through the process with only a three- or four-week delay. Typically, there is neither a detection mechanism nor any procedure for withdrawal of a resource found to have been used in a manner different from its “planned” purpose.

    3:15p
    Can IPS Devices and Firewalls Stop DDoS Threats?

    Cloud computing and the growing usage of the Internet have placed even greater demands on the corporate data center. Organizations are relying more and more on their IT infrastructure to be the mechanism that drives growth and enables agility. Because of this focus on the data center, concerns around security have continued to grow as well. At the same time, the growing scale and frequency of distributed denial of service (DDoS) attacks are taking a toll on these businesses.

    The creativity of attacks has evolved along with the growth in data center utilization. Where “volumetric” attacks were once the norm, organizations now also have to deal with advanced application-layer attacks, and they are seeing ever greater volumes of attack traffic thrown at them. The challenge now becomes understanding how modern security systems interact with DDoS attacks.

    IPS devices, firewalls and other security products are essential elements of a layered-defense strategy, but they are designed to solve security problems that are fundamentally different from dedicated DDoS detection and mitigation products. When analyzing the structure and impact of a DDoS attack, administrators must understand that their current security infrastructure may not necessarily protect them against a denial of service attack. This is where working with Intelligent DDoS Mitigation Systems is a must. IDMS solutions are placed within a data center to help prevent both volumetric and application-layer attacks. Arbor Networks outlines the key features of IDMS and how they can benefit an organization. These features include:

    • Stateless
    • Inline and Out-of-Band Deployment Options
    • Scalable DDoS Mitigation
    • Ability to Stop “Distributed” DoS Attacks
    • Multiple Attack Countermeasures
    • Comprehensive Reporting
    • Industry Track Record and Enterprise

    Download this white paper from Arbor Networks to see where current security devices fall short and how a DDoS attack can actually maneuver around modern firewalls and IPS solutions. By securing both internal and external data center components, security administrators create a logical layered defense strategy, enabling them to be proactive against attacks, help prevent data loss and unwanted intrusions, and increase uptime.
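
    As a toy illustration of the “stateless” item in the feature list above (not Arbor’s implementation, just the general idea), a mitigation device can flag abusive sources from packet rates alone, without tracking per-connection state the way a firewall does:

        # Toy stateless rate-based detection: count packets per source over a one-second
        # window and flag sources above a threshold. Threshold and sample data are illustrative.
        from collections import Counter

        WINDOW_SECONDS = 1
        PPS_THRESHOLD = 10_000   # packets per second considered abusive (illustrative)

        def flag_sources(packets):
            """packets: iterable of (timestamp, src_ip) tuples observed on the wire."""
            buckets = {}  # window index -> Counter of src_ip
            for ts, src in packets:
                window = int(ts) // WINDOW_SECONDS
                buckets.setdefault(window, Counter())[src] += 1
            flagged = set()
            for counter in buckets.values():
                for src, count in counter.items():
                    if count / WINDOW_SECONDS > PPS_THRESHOLD:
                        flagged.add(src)  # candidate for rate-limiting or blackholing
            return flagged

        # One source sends 20,000 packets inside the first second and gets flagged.
        sample = [(0.00001 * i, "203.0.113.7") for i in range(20_000)] + [(0.5, "198.51.100.9")]
        print(flag_sources(sample))  # {'203.0.113.7'}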

    3:30p
    Cloudera Updates Enterprise Hadoop Platform

    The O’Reilly Strata Conference is underway this week in Santa Clara, and Cloudera, DDN and MapR all have big data announcements. The Strata Conference online conversation can be followed on the Twitter hashtag #Strataconf.

    Cloudera Enterprise evolves. Cloudera announced the next evolution of its platform for big data, Cloudera Enterprise, designed to meet and exceed enterprise business continuity and compliance requirements and simplify integration with existing data management systems. New advancements in the Enterprise platform include Cloudera Navigator, Cloudera Enterprise BDR (Backup and Disaster Recovery), along with version 4.5 of the company’s market-leading management interface, Cloudera Manager. “The market has reached an important milestone,” said Mike Olson, CEO of Cloudera. “There is now a wave of mainstream enterprise interest in the integration of Apache Hadoop into existing IT environments. At Cloudera, our experience — our large installed base includes more than half the Fortune 100 — gives us a clear understanding of enterprise requirements. With the new enhancements announced today, we offer customers the most advanced, most mature business-ready solution capable of supporting critical enterprise data and systems management needs.”

    DDN launches hScaler appliance. DataDirect Networks (DDN) announced the hScaler appliance, an Apache Hadoop platform for big data with integration and flexibility optimized specifically for the enterprise. As an enterprise appliance, the hScaler features a plug-and-play experience with a single-pane-of-glass management interface to simplify monitoring of the entire Hadoop infrastructure. It combines high throughput shared storage, powered by DDN’s Storage Fusion Architecture (SFA) capabilities, and a high performance server framework. It leverages the SFA12K platform, a 40GB/s InfiniBand storage appliance that crunches big data with 1.4M sustained SSD IOPS in real time. “DDN’s hScaler appliance represents the next step forward in the democratization of big data. It takes an advanced analytics solution that was economical for only the richest and most information-driven organizations in the world and puts it well within the grasp of enterprise CIOs,” said Jean-Luc Chatelain, Executive Vice President of Strategy and Technology, DDN. “For enterprises seeking to maximize the value of their information, hScaler technology presents an opportunity to do so at a lower cost — in terms of time, money, and resources — than ever before.”

    MapR sets MinuteSort record. MapR Technologies announced a new world record for MinuteSort, sorting 15 billion 100-byte records (a total of 1.5 trillion bytes) in 60 seconds using Google Compute Engine and the MapR Distribution for Apache Hadoop. The benchmark, often referred to as the World Cup of data sorting, demonstrates how quickly data can be sorted starting and ending on disks. The benchmark was completed on 2,103 virtual instances on Google Compute Engine. Each virtual instance consisted of four virtual cores and one virtual disk, for a total of 8,412 virtual cores and 2,103 virtual disks. “The record is significant because it represents a total efficiency improvement executed in a cloud environment,” said Jack Norris, VP of marketing, MapR Technologies. “In an era where information is increasing by tremendous leaps, being able to quickly scale to meet data growth with high performance makes business analytics a reality in situations previously impossible.”
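
    The per-node numbers implied by the record are easy to back out from the figures quoted above:

        # Back-of-the-envelope check of the MinuteSort figures quoted above.
        records = 15_000_000_000      # 15 billion records
        record_bytes = 100            # 100 bytes each
        seconds = 60
        instances = 2_103

        total_bytes = records * record_bytes       # 1.5 trillion bytes (1.5 TB)
        aggregate_rate = total_bytes / seconds     # ~25 GB/s across the cluster
        per_instance = aggregate_rate / instances  # ~11.9 MB/s sorted per virtual instance

        print(f"{total_bytes / 1e12:.1f} TB total, "
              f"{aggregate_rate / 1e9:.0f} GB/s aggregate, "
              f"{per_instance / 1e6:.1f} MB/s per instance")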

    4:00p
    HostingCon 2013

    Hosting industry conference HostingCon 2013 will be held on June 17-19 at the Austin Convention Center in Austin, Texas. This annual event includes 52 educational sessions across 3 days. More than 100 speakers and panelists, including numerous industry thought leaders and trend setters, share their knowledge of the ever-evolving Web hosting industry. More than 1,900 people from all areas and niches of the hosted services industry are expected to attend the conference.

    More information is available on HostingCon’s website. For DCK readers, use the coupon code DCK2013 when registering to receive a discount on the registration fee.

    For hotel information and to book online, visit the hotel page.

    Venue
    Hilton Austin
    500 East 4th Street
    Austin, TX 78701
    1-512-482-8000
    1-800-HILTONS

    For more events, please return to the Data Center Knowledge Events Calendar.

    4:09p
    Minkels, Stulz Unveil New Cooling Systems

    An overhead view of Minkels’ Next Generation Cold Corridor, a containment system for data center cooling management. (Photo: Minkels)

    Several cooling vendors have announced new products this week. Here’s an overview:

    Minkels Updates Cold Corridor Containment – Data center maker Minkels, part of the listed company Legrand, has launched its Next Generation Cold Corridor, a modular and highly flexible aisle containment solution that separates hot and cold airflows in an energy-efficient manner. Minkels launched the first version of the Cold Corridor in 2006, at a time when attention to energy efficiency was still very much a new trend. Minkels is scheduled to exhibit the Next Generation Cold Corridor to users for the first time at Data Centre World 2013, which is being held in London today and tomorrow. “Virtualisation and cloud computing have given data centre dynamics a considerable boost,” says Jeroen Hol, Chief Executive Officer (CEO) at Minkels. “As an extension of this development, users are expressing a growing need for highly scalable and therefore flexible data centre solutions. They want to be able to conveniently upscale or downscale a data centre whenever necessary. Cost considerations also play a role in this call for flexibility. This highly modular design offers extensive opportunities to implement such a Cold Corridor solution in stages, and therefore more cost effectively, too.” Thanks to its modular structure, the Next Generation Cold Corridor can be flexibly adapted to the specific building environment.

    STULZ Introduces CyberCon Cooling for Containers – STULZ introduced the STULZ CyberCon modular, outdoor cooling system, a highly energy-efficient, self-contained, external cooling solution designed for rapid deployment with containerized computer rooms (PODs). “STULZ CyberCon is a true all-in-one cooling solution,” said Joerg Desler, Vice President of Production and Engineering for STULZ. “With STULZ CyberCon, STULZ is providing a precision cooling solution that can be tailored to meet all IT manufacturer and data center needs worldwide.” The modular design of the new STULZ CyberCon allows capacity to scale quickly to align with IT demand and rapidly changing environmental conditions. Since STULZ CyberCon is constructed in advance of installation, it reduces upfront capital costs associated with the construction of a brick-and-mortar data center. To permit ease of shipping, doors, fans and louvers have been designed so that they do not exceed the external dimensions of the STULZ CyberCon system. David Joy, Vice President of Sales and Marketing for STULZ, noted, “Given the rapid growth of modular data centers over the past two years, the STULZ CyberCon’s modular design makes it ideally suited to the precision cooling needs of containerized data centers.”

    4:30p
    Revelytix Launches Loom Dataset Management for Hadoop

    Revelytix, a big data software provider, has a background in working with government agencies on big data sets.

    Big data software and tools provider Revelytix announced early access availability of Loom Dataset Management for Hadoop, which aims to make it easier for data scientists to work with Hadoop and easier for their organizations to manage the huge data files created with Hadoop. Loom tracks the lineage and provenance of all registered HDFS data and offers query execution using SQL, SPARQL or HiveQL, as well as integration with R.

    Dataset Management for Hadoop

    “Loom makes it easy for data scientists and IT to build more analytics faster with easy-to-use interfaces that simplify getting the right data for the job quickly and managing datasets efficiently over time with proper tracking and data auditing,” said Revelytix CEO Mike Lang. Loom includes dataset lineage, so you know where a dataset came from; Active Scan, to dynamically profile datasets; Lab Bench, for finding, transforming and analyzing data in Hadoop and Hive; data suitability; and open APIs.

    Based on nearly a decade of designing and building big data fabrics and solutions for the U.S. Department of Defense, the leading intelligence organizations in the United States, and major pharmaceutical, financial services and life sciences companies, Loom is the product of deep big data experience. Because Hadoop makes so many new analytics and datasets practical, Loom’s tracking and management capabilities are fundamental to managing datasets in Hadoop.

    Relationship with U.S. DoD Expanded

    Revelytix also announced that it will provide big data software and services for the U.S. Department of Defense (DOD) during 2013, deepening the multi-year big data relationship already in place. The DOD has relied on Revelytix for three years to create its data architecture and provide support to allow it to establish common architecture and semantics across all military service branches.

    “The management team at Revelytix has been working on complex data processing and data management problems for the federal government for the past 12 years,” Lang said. “Our first company, Metamatrix, now part of Red Hat, produced data processing software used in the intelligence community and the DOD. Revelytix has been working for the DOD for the past four years specifically on the problem of processing and managing highly distributed sets of data. The resulting Revelytix technology is now in full production.”

    4:52p
    GM Plans $258 Million Data Center in Michigan

    An illustration of the design for a new General Motors data center in Warren, Michigan. The company has announced plans to build a similar facility in Milford, Mich.

    General Motors is hoping to build a $258 million data center at a research facility it owns in Milford, Michigan. The company is seeking tax abatements for the project at the Milford Proving Ground, which would feature a 100,000 square foot data center and employ about 20 workers.

    “GM is developing a business case regarding a possible future investment to construct and equip a consolidated GM information technology data center facility in Milford, Mich., on the GM Proving Ground campus,” the company told the Detroit Free Press.

    The Milford site appears to be the second data center to be built as part of a huge data center consolidation at GM that would centralize its IT infrastructure, consolidating from 23 sizable data centers worldwide to just two facilities in Michigan. Last June, GM announced that the first of the new data hubs would be a $130 million facility located in Warren, Mich.  As part of that process, the company will refresh its server and storage gear to bring higher levels of automation and efficiency to its infrastructure.

    The consolidation is part of a GM initiative to drastically reduce its reliance upon third-party outsourcing firms. The automaker currently outsources about 90 percent of its IT services to systems integrators including HP/EDS, IBM, Capgemini, and Wipro.

    GM is requesting a 50 percent tax abatement on real property and personal property for 15 years in Milford, township officials said. The GM Milford Proving Ground was the industry’s first dedicated automobile testing facility when it opened in 1924, and covers 4,000 acres.

    8:08p
    Rackspace Acquires ObjectRocket for MongoDB Service

    Rackspace Hosting is acquiring ObjectRocket, a provider of database as a service (DBaaS) offerings using the MongoDB database. The acquisition expands Rackspace’s big data play, allowing it to offer a NoSQL DBaaS to Open Cloud customers with demanding applications.

    The acquisition is expected to close today, and the ObjectRocket offering will be available to Rackspace customers in early March. The offering will roll out first for customers in the company’s Chicago data center, but will soon be integrated across Rackspace’s Open Cloud portfolio. Financial terms of the deal were not disclosed.

    Rackspace is establishing itself in the high-growth NoSQL database market. NoSQL offerings forgo traditional approaches to databases, including the use of the relational database model or Structured Query Language (SQL), in favor of an approach that provides better scalability across a distributed architecture. The NoSQL approach has become popular for use in large cloud applications. A recent report from The 451 Group projects that NoSQL software revenue will grow at an annual rate of 82 percent to reach $215 million by 2015.

    “Databases are the core of any application and expertise in the most popular database technologies will be critical to us delivering Fanatical Support in the open cloud,” said Pat Matthews, senior vice president of corporate development at Rackspace. “As we look to expand our open cloud database offering into the MongoDB world, we are really excited to work with the entrepreneurs and engineers at ObjectRocket.”

    MongoDB has already been adopted by organizations such as Disney, The New York Times and Craigslist, among others. Built on a NoSQL, open source engine, MongoDB stores structured data in JSON-like documents. By offering MongoDB as a service, ObjectRocket makes it easier for customers to use by eliminating much of the setup and configuration.
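
    For readers new to the document model, here is a small generic sketch using the pymongo driver (the connection URI is a placeholder; this illustrates MongoDB itself, not ObjectRocket’s service API):

        # Minimal MongoDB usage sketch with the pymongo driver. The connection URI is a
        # placeholder; a hosted DBaaS such as ObjectRocket would supply its own endpoint.
        from pymongo import MongoClient

        client = MongoClient("mongodb://localhost:27017/")  # placeholder endpoint
        db = client["demo"]

        # Documents are stored as JSON-like structures; no fixed schema is required.
        db.articles.insert_one({
            "title": "Rackspace Acquires ObjectRocket",
            "tags": ["mongodb", "dbaas", "nosql"],
            "published": "2013-02-27",
        })

        # Query by field value, including values nested inside arrays.
        for doc in db.articles.find({"tags": "mongodb"}):
            print(doc["title"])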

    ObjectRocket will continue to be sold as a standalone service, so it’s still usable in conjunction with other clouds; it also leverages AWS Direct Connect to provide low latency and free bandwidth to AWS customers.

    The ObjectRocket founding team collectively brings more than 50 years of experience in scaling large data systems, including MongoDB. They have also designed and managed systems that power some of the busiest sites on the web, and played key founding development roles at companies like Shutterfly, PayPal, eBay and AOL. 

    “With Rackspace’s open cloud philosophy and our shared emphasis on providing the highest level of customer support, we feel this union is an ideal fit,” said Chris Lalonde, co-founder and CEO of ObjectRocket. “Since the beginning, our focus has been on creating a DBaaS platform that would perform, scale and support critical workloads in a superior manner. Joining forces with Rackspace will enable us to achieve this goal, while delivering one of the most advanced MongoDB DBaaS solutions on the market.”

    At the beginning of the year, Rackspace CTO John Engates stated in his cloud predictions that “this is the year when Big Data makes its way into enterprise conversations.” This acquisition reinforces that belief.

    9:10p
    Datacentres Europe 2013

    The Datacentres Europe 2013 event will take place in Nice, France on May 29-30. The event, run by BroadGroup, is focused on end users in vertical markets and the data centre and IT infrastructure they use.

    BroadGroup research forecasts that strong vertical market demand will drive growth in outsourcing to third-party data centres in Europe, but that approximately 70 percent of the market will still not have outsourced by 2016. However, the consulting firm also believes that this linear perspective of the market will largely disappear as the computing environment changes. BroadGroup’s research suggests that IT departments will emerge as cloud brokers, hosting applications across a range of datacentres and distributed architectures. Businesses will be offered a flexible menu of options, automated and on demand. For data centres and users, this dénouement brings a completely new set of challenges and opportunities.

    Sponsors of the event include Schneider Electric, Hewlett Packard, Bird & Bird, eco Verband der deutschen Internetwirtschaft e.V., APL, MigSolv, Cofely GDF Suez, Invest in Iceland, Scholzegruppe, Future Facilities, ABB, Bouygues, EBRC, DEF, Scottish Development International, Conteg and Smacs. Partner organizations are the European Data Centre Association (EUDCA) and CESIT. Industry partners include Colo-X, Colo Research, Globeron and EPI.

    The event includes an industry exhibition with more than 70 companies represented.

    For more information, visit the Datacentres Europe 2013 website. Advance discount offers for the event close on Feb. 28.

    Venue
    Palais des Congrès Acropolis
    Nice, Côte d’Azur, France

    Hotel information can be found on the conference website.

    For more events, please return to the Data Center Knowledge Events Calendar.

    9:15p
    Intel Enters the Hadoop Software Market

    Supermicro has introduced its 4U FatTwin SuperServer system, which supports four nodes, each with twelve 3.5-inch hard drives plus two 2.5-inch drives. The system supports Intel’s Hadoop distribution, which was announced yesterday. (Photo: Supermicro)

    The market for Hadoop software continues to attract new players. Intel (INTC) announced the availability of its Distribution for Apache Hadoop, including new management tools. More than 20 partners announced support for Intel’s Hadoop offering, including Cisco, Red Hat, Cray and Supermicro.

    Intel’s move comes a day after EMC rolled out its own Pivotal HD Hadoop distribution, which integrates Greenplum’s massively parallel processing (MPP) database. Intel and EMC will compete with newer players including MapR, Hortonworks and Cloudera in the growing market for Hadoop solutions.

    Apache Hadoop is a software framework that supports data-intensive distributed applications. It has become one of the most important technologies for managing large volumes of data, and has given rise to a growing ecosystem of tools and applications that can store and analyze large datasets on commodity hardware.
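
    For readers unfamiliar with the programming model, the canonical example is a word count expressed as map and reduce steps. The sketch below mimics Hadoop Streaming, where the mapper and reducer read lines and emit tab-separated key/value pairs (a generic illustration, not tied to any particular distribution):

        # Word count in the MapReduce style used by Hadoop Streaming. A generic sketch of
        # the programming model; on a real cluster the two functions would run as separate
        # mapper and reducer scripts across many nodes.
        from itertools import groupby

        def mapper(lines):
            """Emit 'word<TAB>1' for every word in the input."""
            for line in lines:
                for word in line.split():
                    yield f"{word.lower()}\t1"

        def reducer(lines):
            """Input arrives sorted by key; sum the counts for each word."""
            parsed = (line.split("\t") for line in lines)
            for word, group in groupby(parsed, key=lambda kv: kv[0]):
                yield f"{word}\t{sum(int(count) for _, count in group)}"

        if __name__ == "__main__":
            # Local simulation of the shuffle phase: map, sort by key, then reduce.
            text = ["big data big cluster", "data center data"]
            for line in reducer(sorted(mapper(text))):
                print(line)  # big 2, center 1, cluster 1, data 3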

    “Transformational Opportunity” of Big Data

    “People and machines are producing valuable information that could enrich our lives in so many ways, from pinpoint accuracy in predicting severe weather to developing customized treatments for terminal diseases,” said Boyd Davis, vice president and general manager of Intel’s Datacenter Software Division. “Intel is committed to contributing its enhancements made to use all of the computing horsepower available to the open source community to provide the industry with a better foundation from which it can push the limits of innovation and realize the transformational opportunity of big data.”

    The Intel open platform was built from the ground up on Apache Hadoop, and will keep pace with the rapid evolution of big data analytics. Intel says its distribution is the first to provide complete encryption with support for Intel AES New Instructions (Intel AES-NI) in the Intel Xeon processor. Silicon-based encryption allows organizations to more securely analyze their data sets without compromising performance.
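
    As a rough application-level illustration of what hardware-assisted AES buys you (a generic sketch using the Python cryptography package, whose OpenSSL back end uses AES-NI automatically when the CPU supports it; this is not Intel’s Hadoop-layer integration itself):

        # Generic AES-GCM encryption sketch. OpenSSL-backed libraries use the CPU's AES-NI
        # instructions automatically when available, so bulk encryption of data blocks adds
        # little overhead. Application-level illustration only.
        import os
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        key = AESGCM.generate_key(bit_length=256)
        aesgcm = AESGCM(key)

        block = b"sensitive analytics record " * 100   # stand-in for a block of data
        nonce = os.urandom(12)                         # must be unique per encryption

        ciphertext = aesgcm.encrypt(nonce, block, None)
        assert aesgcm.decrypt(nonce, ciphertext, None) == block
        print(f"encrypted {len(block)} bytes -> {len(ciphertext)} bytes")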

    Intel Manager for Apache Hadoop simplifies the deployment, configuration and monitoring of the cluster for system administrators as they look to deploy new applications. The new Intel Active Tuner automatically configures and optimizes performance for Hadoop.

    Intel has worked with strategic partners to integrate this software into a number of next-generation platforms and solutions, including:

    • Red Hat plans to build solutions using the Intel Distribution integrated with Red Hat solutions, such as Red Hat Storage Server 2.0 and Red Hat Enterprise Linux. Big data solutions resulting from the expanded Red Hat and Intel alliance will be designed to meet enterprise expectations for availability, performance, and compatibility. This builds upon the big data strategy plans Red Hat announced last week.
    • Cray announced that it will introduce a new solution combining the Intel Distribution for Apache Hadoop software with the Cray Xtreme line of supercomputers. The new offering will add to Cray’s portfolio of “Big Data” solutions and give customers the ability to leverage the fusion of supercomputing and Big Data. “Cray has enabled customers to achieve modeling and simulation at the highest scale possible, and we are excited to work with Intel to provide dramatic new levels of data assimilation combined with modeling and simulation – driving the analytics needed for knowledge discovery and optimal decision making,” said Bill Blake, senior vice president and CTO of Cray. “The new features added to the Intel Distribution, such as greater security, improved real-time handling of sensor data, and improved performance throughout the storage hierarchy will benefit Cray’s traditional customers in big science and supercomputing, in addition to new commercial customers who need the combination of supercomputing capabilities and an enterprise-strength approach to Big Data.”
    • Supermicro announced it is expanding its Hadoop Big Data initiatives with support for the new Intel Distribution for Apache Hadoop. Supermicro’s Hadoop-optimized server and storage systems have undergone rigorous testing and validation.
    • RainStor announced  that its Big Data Analytics on Hadoop product has been validated with the Intel Distribution for Apache Hadoop software. The solution provides faster, more flexible analytics using Standard SQL, BI Tools and MapReduce without the need to move data out of the Hadoop environment.
    • Zettaset announced that it supports the launch of the Intel Distribution for Apache Hadoop. Joint customers of Zettaset and Intel can now benefit from the ease of use of a Hadoop installation and management solution that works with the only distribution of Apache Hadoop built from the silicon up. “Intel has worked diligently with their partners to ensure compatibility and deliver a robust, high performance Big Data solution for the enterprise,” said Zettaset President and CEO Jim Vogt. “We are excited to be included in Intel’s growing Big Data ecosystem and look forward to helping our joint customers to easily install, manage and secure their Intel-powered Hadoop deployments.”

    In this video, Davis and Paul Perez, Vice President and CTO of Cisco’s Data Center Group, discuss the companies’ expanded relationship around big data.

