Data Center Knowledge | News and analysis for the data center industry
 

Friday, September 20th, 2013

    Time Event
    11:30a
    Teradata Sharpens Focus on In-database Analytics

    Striving to take in-database analytics beyond an emerging trend, Teradata (TDC) announced its next-generation, in-database R Analytics that are fully parallel and scalable. Customers can integrate the analytic libraries into Teradata’s flexible analytic ecosystem to achieve faster analysis of large datasets. In addition, Teradata now provides the largest, most complete analytic library of over 1,000 in-database analytics.

    “In-database analytics is no longer an emerging trend, but now an absolute requirement to meet the processing speed and data volume demands of the business,” said Scott Gnau, president, Teradata Labs. “Through integrating multiple analytic techniques in the database, Teradata empowers its customers to push beyond the traditional limits of analytics by bringing the analytics to the data.”

    Integrated Revolution R Analytics

    To respond to customers’ challenges, Teradata opened its database to Revolution Analytics and Fuzzy Logix, and added enhanced database capabilities for XML data analysis, accelerated performance with geospatial data, and extended temporal analytics. These innovations support the Teradata Unified Data Architecture, which enables customers to deploy, support, manage, and seamlessly access all their data in an integrated and dynamic environment.

    R is an open source statistical language and software environment that is quickly becoming a tool of choice for data scientists. Teradata and Revolution Analytics are the first to bring parallel R into the database, tackling the complexity of running R analytics in parallel and making it accessible to even more users.

    “We are proud to partner with Teradata to offer joint customers the first in-database parallel R platform,” said David Rich, chief executive officer, Revolution Analytics. “With over two million users worldwide, the R language is today’s standard for powerful predictive analytics. Combined with the Teradata Database, Revolution R Enterprise’s scalable, supported R technology enables users to innovate to meet the demands of data-driven businesses.”

    Fuzzy Logix and Geospatial Analysis

    Teradata now offers customers the largest library of more than 600 Fuzzy Logix in-database analytics to complement the existing in-database Teradata capabilities, creating more than 1,000 analytics in total. These analytics can easily be accessed with the programming language of business – structured query language (SQL). Fuzzy Logix in-database capabilities, combined with Teradata, provide a scalable solution that makes it possible to accurately predict many types of future outcomes.
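
    The push-down pattern the release describes – run the analytics where the data lives and move only the result, not the raw rows – can be illustrated with a minimal sketch. The example below uses SQLite from Python's standard library purely as a stand-in for an analytic database; the table, columns, and data are invented, and Teradata and Fuzzy Logix expose far richer statistical functions through the same SQL interface.

```python
import sqlite3

# Stand-in database: in a real deployment this would be a connection to the
# analytic warehouse, not an in-memory SQLite file.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 120.0), ("east", 95.5), ("west", 210.25), ("west", 80.0)],
)

# In-database approach: the aggregation runs inside the database engine and
# only the small result set crosses the wire.
in_db = conn.execute(
    "SELECT region, COUNT(*) AS n, AVG(amount) AS avg_amount "
    "FROM sales GROUP BY region"
).fetchall()

# Client-side approach (what in-database analytics avoids): pull every row
# out and compute in the application, which does not scale to large tables.
rows = conn.execute("SELECT region, amount FROM sales").fetchall()
client_side = {}
for region, amount in rows:
    n, total = client_side.get(region, (0, 0.0))
    client_side[region] = (n + 1, total + amount)

print(in_db)
print({r: (n, total / n) for r, (n, total) in client_side.items()})
```

    Both paths return the same answer on a toy table; the difference only matters once the table no longer fits comfortably on the client, which is exactly the case in-database analytics targets.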

    Teradata transformed a manual, computationally intense location analysis process and automated it with a new spatial indexing technique. The geospatial capability adds yet another data dimension to enhance organizations' understanding of their business. Geospatial data is already widely used – for example, by utility companies to quickly identify and respond to outages.

    Teradata pioneered the use of temporal analytics, which allows customers to create a full picture of an organization’s business at any point in time. Teradata has added three new built-in capabilities that streamline and simplify its use. Temporal analytics enables organizations to capture and track changes as the business evolves over time.
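
    Temporal analytics boils down to being able to ask what the data looked like on any given date. The sketch below illustrates that idea generically in plain Python, using explicit valid_from/valid_to columns and invented sample records; Teradata's built-in temporal support handles this bookkeeping inside the database with its own SQL syntax rather than in application code.

```python
from datetime import date

# Each row carries the period during which it was the valid version of the
# record; a valid_to of None means "still current".
policy_history = [
    {"policy": "P-1", "premium": 100, "valid_from": date(2012, 1, 1), "valid_to": date(2013, 1, 1)},
    {"policy": "P-1", "premium": 120, "valid_from": date(2013, 1, 1), "valid_to": None},
    {"policy": "P-2", "premium": 80,  "valid_from": date(2012, 6, 1), "valid_to": None},
]

def as_of(rows, when):
    """Return the version of each record that was valid on the given date."""
    return [
        r for r in rows
        if r["valid_from"] <= when and (r["valid_to"] is None or when < r["valid_to"])
    ]

# Reconstruct the book of business as it stood on July 1, 2012.
print(as_of(policy_history, date(2012, 7, 1)))
```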

    “We continue to push the envelope of what is possible in a database environment. The tools we are offering the analytics professionals within our customer base provide them with the ability to create more analytics against more data, with a faster turnaround time. That is a competitive advantage that our customers want,” said Bill Franks, chief analytics officer, Teradata.

    2:04p
    Cutting Big Data Problems Down to Size

    Jeff Rauscher, Director of Solutions Design for Redwood Software, has more than 31 years of diversified MIS/IT experience working with a wide variety of technologies including SAP, HP, IBM, and many others.

    Jeff Rauscher, Redwood Software

    Recent International Data Corporation (IDC) research indicated that our digital universe will grow from 130 exabytes to 40,000 exabytes by 2020.* (An exabyte is one quintillion (10^18) bytes, or one billion gigabytes.)

    This seems almost unbelievable when you consider that Internet traffic only passed the one-exabyte-per-month mark in 2004, but it’s quite possible. What it shows is the remarkable rate at which our digital world is expanding. According to one of our partners, IBM, 2.5 quintillion bytes of data are created every day. In fact, the rate of data creation is growing so quickly that 90 percent of the data in the world today was created in the past two years.**

    This is big news for business, too. Some analysts estimate that the worldwide volume of business data alone is doubling every 1.2 years. It fuels a whole range of new developments across every industry – whether that is generating personalized recommendations, informing supply chain adjustments, processing customer transactions or simply analyzing customer sentiment and feedback. While all of this data offers the promise of valuable insight, managing its sheer volume, as well as the speed with which it is being created, is a tremendous challenge. How can companies turn piles of data into useful information?
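
    As a rough sanity check on those figures, the sketch below works out how many doublings separate 130 exabytes from 40,000 exabytes, and how long that growth would take at the 1.2-year doubling rate cited for business data. Mapping the two figures onto one timeline is an illustrative assumption, not something the IDC study states.

```python
import math

start_eb = 130        # digital universe size cited by IDC, in exabytes
end_eb = 40_000       # projected size for 2020, in exabytes
doubling_years = 1.2  # doubling period cited for business data (assumed here)

doublings = math.log2(end_eb / start_eb)
print(f"growth factor: {end_eb / start_eb:.0f}x")
print(f"doublings needed: {doublings:.1f}")
print(f"at one doubling every {doubling_years} years: {doublings * doubling_years:.1f} years")
```

    The script reports roughly a 308-fold increase, or a little over eight doublings – about a decade of growth at that pace.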

    Big Data Only Useful When Analyzed

    “Big Data” is an industry buzzword – heralded as the game-changer for businesses by media and analysts alike. It’s been described as the cure-all for everything from profitability and efficiency woes to insurance against revenue drops and a fast way to find new business opportunities. Nevertheless, Big Data can only provide value when an enterprise can efficiently process it.

    To do this, previously inconceivable volumes of data have to be analyzed and understood—very quickly. Most IT enterprises today are not equipped to do this. In fact, the phrase “Big Data” itself came from a reference to data that is too vast, too unstructured, or gathered too quickly for existing IT systems to manage. But enterprises still try. Today, businesses turn to a variety of big-name CRM or ERP solutions to sort through the volume and attempt to identify hidden business insights. With the ever-growing volumes of data, this can be as difficult as finding the proverbial “needle in a haystack.”

    Data center professionals in search of these information “needles” may need to look no further than to the example of other parts of the enterprise, such as manufacturing or logistics. In these areas, a series of many complex and interconnected steps are executed automatically to achieve the desired result.

    Automation Needed

    Cutting Big Data down to size and transforming it into “Big Information” is similar to running other business and IT processes. Automation can help support the constant, rapid, accurate and reliable processing of any series of related, dependent tasks in a logical sequence. If you automate the way you process Big Data, you can quickly bring it down to size.
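
    A minimal sketch of what “a series of related, dependent tasks in a logical sequence” looks like in code is shown below. The task names and dependency graph are invented for illustration; products such as Redwood’s orchestrate far richer workflows across systems, but the core idea is the same – declare the dependencies once and let the automation derive and enforce the execution order.

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Invented example workflow: extract raw data, transform it, refresh the
# warehouse, then publish reports.  Each key depends on the tasks in its set.
workflow = {
    "extract": set(),
    "transform": {"extract"},
    "load_warehouse": {"transform"},
    "refresh_reports": {"load_warehouse"},
    "notify_team": {"refresh_reports"},
}

def run(task_name):
    # Placeholder for the real work (a SQL job, a file transfer, an API call).
    print(f"running {task_name}")

# The sorter yields tasks only after everything they depend on has finished,
# so the sequence is always logically valid no matter how the dict is ordered.
for task in TopologicalSorter(workflow).static_order():
    run(task)
```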

    Many of our customers successfully use automated processes that are connected and coordinated across departments, applications and locations to handle some of their most daunting Big Data tasks – such as keeping a data warehouse up-to-date, updating retail customer transaction information, or coordinating entire supply chains—from ordering to delivery. Connecting automation throughout the enterprise brings some of the most complex activities, like dealing with Big Data, back to human scale. It builds on the processes that have already been automated – including individual applications or tasks – pulling everything together without requiring micromanagement.

    Smart businesses use automation everywhere – including in how they deal with data. As with the original Industrial Revolution, when people realized how much automation could support manufacturing, corporate leaders are now spearheading a new Informational Revolution. They’re using connected, automated IT processes to manage huge volumes of data. Connected IT and business process automation provides the flexibility to expand and grow markets, even as data complexity blossoms. The smart businesses of tomorrow will rely on it to transform high-volume and highly complex activities from a Big Data burden into valuable, actionable information.

    Endnotes
    *IDC Marks Big Data Analytics for Explosive Growth, Newsfactor Business Report, 8/5/2013
    **Improving Decision Making in the World of Big Data, Forbes, 3/25/2012

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    2:32p
    Yahoo Gets the OK For Buffalo Data Center Expansion
    An aerial view of the first phase of the Yahoo data center in Lockport, N.Y.

    Yahoo’s slow-moving expansion in Buffalo has received the OK to move forward from the town of Lockport. The company announced the expansion back in March of this year, following its purchase of additional land. On Tuesday the Lockport planning board voted unanimously to approve Yahoo’s plans.

    Lockport is home to 275,000 square feet of Yahoo data center space, which is part of the company’s global initiative to make its data center footprint more efficient and sustainable. The first site includes many innovative design refinements and is famously known for adopting principles borrowed from chicken coops.

    The second Yahoo project in Lockport will be built in two phases, with the total project consisting of 242,000 square feet. The first phase will be 165,000 square feet, and will include data center pods and half of the central core operations building, which houses a 24-hour customer care call center. Construction is anticipated to start in December.  The first phase will include all of the utilities and generators needed for both halves of the construction.

    Part two is more of the same, consisting of the second half of operations and the second computer pod, with construction to begin at an undisclosed future date.

    In April, Yahoo landed a 20-year package of property tax and sales tax breaks, approved by the Town of Lockport Industrial Development Agency Board of Directors. The tax breaks are expected to save Yahoo more than $30 million. The tax break package includes a 20-year sales tax abatement for building materials and equipment, as well as 100 percent abatement of property tax for 10 years. The property tax exemption would then be reduced by 20 percent every two years for the next eight.
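
    Read literally, that works out to the schedule sketched below: full property-tax abatement for ten years, then a 20-point reduction every two years over the following eight. (Whether the final two years of the 20-year package carry any abatement isn't spelled out, so this illustrative sketch stops at year 18.)

```python
# Illustrative property-tax abatement schedule as described in the article.
def abatement_pct(year):
    """Abatement percentage for a given year of the deal (1-based)."""
    if year <= 10:
        return 100
    steps_down = (year - 11) // 2 + 1   # one 20-point step every two years
    return max(100 - 20 * steps_down, 0)

for year in range(1, 19):
    print(f"year {year:2d}: {abatement_pct(year):3d}% of property tax abated")
```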

    According to the Lockport Journal, Town Attorney Michael J. Norris said some blasting may be needed to excavate the bedrock on the property. Officials noted, however, that blasting was also considered possible during construction of the existing Yahoo buildings and ultimately wasn’t needed.

    The first phases of the Lockport project, built in 2010-11, featured 275,000 square feet of data center space housed in five 120-by-60 foot prefabricated metal structures using the Yahoo Computing Coop data center design, saving millions of dollars in power costs along the way.

    2:54p
    Sentinel Completes Initial Phase of North Carolina Data Center

    An artist’s illustration of the Sentinel Data Centers project in Durham, North Carolina. (Photo: Sentinel)

    Sentinel Data Centers has completed construction and commissioning of the initial 50,000 square foot, 8 megawatt phase of its NC-1 Data Center in Durham, North Carolina. This is the sixth facility Sentinel has developed since its founding in 2001, and it marks the company’s entry into the active North Carolina data center market.

    The 420,000 square foot facility will ultimately yield approximately 200,000 square feet of net usable space, divided into hard-walled, turnkey suites of multiple sizes and configurations. Sentinel can accommodate footprints ranging from 10 cabinets to 50,000 square feet in the facility. It offers a seamless ability to expand capacity and/or footprint “on demand,” as well as “Multi-Tiering” solutions that enable each user to deploy different resiliency zones within its space, allowing for granular optimization of cost structure by application.

    “We are gratified that our skillful design and construction teams have executed on our vision to bring a wholly new, higher quality and lower cost data center model to the region,” said Todd Aaron, Sentinel co-president. “North Carolina is a truly exceptional state in which to do business, and we are grateful for the reception our model and the NC-1 facility have received, both from local companies and from out-of-region enterprises seeking to leverage the state’s attractive occupancy economics.”

    Earlier this year, Sentinel also announced the development of a new 131,000 square foot, single-tenant data center for Bloomberg LP in Orangetown, N.Y. Sentinel is collaborating with Russo Development on the project.

    Wholesale Comes to Research Triangle

    Sentinel Data Centers began development of the 420,000 square foot facility roughly a year ago. Targeting both in-region and out-of-region users, the facility is one of the first wholesale data centers in the North Carolina market, along with a smaller project by Compass Datacenters in Raleigh. Sentinel offers a turn-key solution for large-footprint users in a region short of these options. When the facility was initially announced, co-president Josh Rabina noted that highly reliable, low-cost power, significant regional talent, easy access via multiple airports and an exceptional quality of life all attract out-of-region users from multiple industries to Durham.

    Sentinel is a veteran player and early pioneer in the wholesale data center business. The company builds and leases finished data center space to enterprise customers. Since its inception, Sentinel has developed more than 1.5 million square feet and 120 megawatts of best-in-class data center solutions for a diversity of Fortune 500 enterprises across multiple industries, including financial services, healthcare, technology and biotechnology.

    North Carolina has been a hot market for data center development in recent years, attracting major projects from Apple, Facebook and Google. The state also has a cluster of data centers in the Research Triangle area. Sentinel’s arrival continues the trend of wholesale data center players building in the region.

    3:00p
    Hyve Contributes Open Compute Server Design

    Hyve Solutions, a division of SYNNEX Corporation (SNX), announced that it has contributed to the Open Compute Project its OCP-ready server design engineered to fit into a standard 19-inch data center rack. Hyve debuted the Hyve 1500 Series platform last January at the Open Compute Summit. The Hyve Solutions 1500 Series is based on the OCP V2 specification and packs 28 two-node, 1.5U servers into a 44U rack.
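
    For context, a quick back-of-the-envelope tally of that configuration is sketched below: 28 sleds of two nodes each is 56 compute nodes occupying 42U of the 44U rack. How the remaining 2U is used (switching, power shelves, or left empty) is an assumption, not something Hyve specifies here.

```python
# Back-of-the-envelope tally of the Hyve 1500 Series rack as described.
sleds = 28
nodes_per_sled = 2
sled_height_u = 1.5
rack_height_u = 44

total_nodes = sleds * nodes_per_sled
server_space_u = sleds * sled_height_u

print(f"compute nodes per rack: {total_nodes}")
print(f"rack units used by servers: {server_space_u} of {rack_height_u}U")
print(f"remaining rack units: {rack_height_u - server_space_u}U")
```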

    “Hyve Solutions has been dedicated to Open Compute Project from its inception and our contribution of the Hyve Solutions 1500 Series platform furthers our commitment to the OCP community and core values of innovation, collaboration and transparency,” said Steve Ichinaga, Senior Vice President and General Manager, Hyve Solutions.

    Saying that vendor contributions are crucial to the Open Compute project, Cole Crawford, COO of the Open Compute Foundation, said, “We commend Hyve Solutions for contributing the Hyve 1500 Series platform to the OCP community.”

    Data Center Knowledge spoke with Hyve’s Ichinaga at the January 2013 Open Compute Project meeting, where Hyve was showing off its initial 1500 series machines. Here’s the video of the interview.

    Next week, Hyve will demonstrate the new rack-level server deployment options at SYNNEX headquarters. The Hyve Integration Center will showcase where Open Compute Project servers for Facebook data centers are born, and how OCP-inspired designs, such as the Hyve Ambient series servers, are developed.

    The next Open Compute Summit will be held in January 2014 in San Jose, Calif.

    5:00p
    MemSQL Offers View Across Structured and Semi-structured Data

    MemSQL adds JSON analytics support for a single view across structured and semi-structured data, Violin Memory launches Maestro Memory Services to transform legacy storage into memory-based infrastructure, and Splunk acquires mobile analytics company BugSense.

    MemSQL Adds JSON Analytics

    Real-time analytics company MemSQL announced that its distributed in-memory database now provides JSON (JavaScript Object Notation) analytics for a consolidated view across structured and semi-structured data, including standard enterprise and social-media data. This empowers the combination of two disparate data sources for operational analytics, network security, real-time recommendations, and risk management. JSON is used for storing and exchanging semi-structured data from social-media networks such as Facebook, Twitter, and Instagram, but on its own it does little for real-time analytics.
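
    The “single view across structured and semi-structured data” idea can be illustrated with a small, generic sketch: structured customer records joined with semi-structured JSON events on a shared key. The data and field names below are invented, and MemSQL performs the equivalent combination in SQL inside the database rather than in application code as shown here.

```python
import json

# Structured side: rows as they might come from a relational table.
customers = [
    {"customer_id": 1, "name": "Acme Corp", "segment": "enterprise"},
    {"customer_id": 2, "name": "Globex", "segment": "mid-market"},
]

# Semi-structured side: raw JSON events such as social-media interactions.
raw_events = """
[
  {"customer_id": 1, "channel": "twitter", "sentiment": "positive"},
  {"customer_id": 1, "channel": "facebook", "sentiment": "neutral"},
  {"customer_id": 2, "channel": "twitter", "sentiment": "negative"}
]
"""

events = json.loads(raw_events)
by_customer = {c["customer_id"]: c for c in customers}

# Combine the two sources into one consolidated view per interaction.
combined = [
    {**by_customer[e["customer_id"]], **e}
    for e in events
    if e["customer_id"] in by_customer
]

for row in combined:
    print(row)
```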

    “With support for JSON, our distributed in-memory database is poised to have the same effect on databases that VMware had on servers,” said Eric Frenkiel, CEO, MemSQL. “By setting the foundation for database consolidation, organizations will soon reap the benefit of lower total cost of ownership and achieve significant efficiency gains by eliminating the difficulty of moving data around. In fact, our ability to combine structured and semi-structured data together could dramatically impact the need for NoSQL in the future.”

    Violin Launches Maestro Memory Services

    Violin Memory announced the availability of Violin Maestro, a comprehensive suite of Memory Services software for acceleration, tiering, migration and protection of application data in the enterprise data center. The services run on memory-optimized, hardware-accelerated Violin Memory Appliances to transparently integrate into customers’ existing data center environments. Maestro software on the Memory Appliance provides non-disruptive migration of data off legacy storage to high performance Violin Memory Arrays.

    “Companies need every competitive advantage they can get, and that includes being able to make real-time decisions.  Today, we are expanding our enterprise memory software portfolio to bring the benefits of memory-based computing to traditional enterprise storage at significantly better economics than what is offered by legacy storage vendors,” said Don Basile, CEO of Violin Memory.

    Splunk Acquires BugSense

    Splunk (SPLK) announced that it has entered into an agreement to acquire BugSense, an analytics solution for machine data generated by mobile devices. The addition of BugSense will enhance the ability of Splunk customers to analyze machine data directly from mobile devices and correlate it with other machine-generated data to gain operational intelligence.

    “BugSense was founded by a team of experienced developers who share a passion for improving visibility and gaining valuable insights into data from mobile devices,” said Jon Vlachoyiannis, co-founder and CTO, BugSense. “We have admired Splunk over the years and are pleased to join a growing organization that is a leader in big data analytics.”

    5:55p
    Friday Funny: Kip and Gary Visit Data Center World

    It’s Friday! We are crossing the finish line on another work week. And at Data Center Knowledge, that means we publish our cartoon contest.

    Diane Alber, our data center cartoonist, writes, “I’m sure you have been to a trade show sometime in your career, so I thought it would be a good opportunity for Kip and Gary to go to one too! Who will be at the Data Center World Exhibit Hall to greet their friends Kip and Gary? Why the Data Center Knowledge team, of course. ” You can visit Diane’s website Kip and Gary for more of her data center humor.

    Submit your caption suggestions for the cartoon below.

    Also, big congratulations to our reader who submitted the winning caption, “Just 5 or 6 more cords and we can reach the neighbor’s building,” for our previous cartoon, Extending More Power.

    New to the caption contest? Here’s how it works: We provide the cartoon and you, our readers, submit the captions. We then choose finalists, and readers vote for the suggestion they find funniest. The winner receives a hard copy print with his or her caption included in the cartoon!

     

    [Cartoon: Kip and Gary visit the Data Center World exhibit hall]

    For the previous cartoons on DCK, see our Humor Channel.

    6:02p
    DC Corp Opens Data Center In West Virginia, Going After Fed Business

    The exterior of the new DC Corp. facility in Martinsburg, West Virginia. (Photo: DC Corp)

    Colocation is beginning to move into smaller markets. DC Corp is opening a data center in West Virginia, a state that hasn’t traditionally been known as a colocation market. The company is retrofitting a 180,000 square foot building that includes 11,000 square feet of office space. DC Corp is a build-to-suit data center provider, where the customer can choose its build-out options.

    “Our main focus is the federal government, as we are strategically located outside the DC blast zone,” said Chuck Asbury II, CEO of DC Corp. “We will also target higher education institutes with our access to dark fiber and Internet2.”

    DC Corp’s facility is located in Martinsburg, West Virginia, within the John D. Rockefeller Science and Technology Center, which is 90 minutes from Washington, D.C., in the eastern panhandle of West Virginia. The data center is expected to have a positive economic impact on the local area through increased IT employment opportunities.

    DC Corp designs custom solutions for each customer. The company says it takes into account every detail, from mechanical and electrical engineering, data facility construction and commissioning to a variety of staffing and support options. DC Corp offers complete flexibility: customers can select vendors to do any or all portions of the work, from infrastructure and installations all the way to configuration and services.

    Why West Virginia? The company outlines some of the state’s selling points:

    • Low risk of natural disasters
    • Convenient access from New York, Pennsylvania, Maryland and Washington, D.C.; within 500 miles of 50 percent of the U.S. population
    • Low-cost energy, averaging 40 percent savings over surrounding states
    • A cost of living 5.4 percent below the national average
    • A workforce with turnover rates among the lowest in the country

