Data Center Knowledge | News and analysis for the data center industry
 

Tuesday, June 11th, 2013

    1:05p
    Data Center Jobs: CBRE

    At the Data Center Jobs Board, we have two new job listings from CBRE, which is seeking a Maintenance Technician and a Building Engineer in Birmingham, Alabama.

    The Maintenance Technician is responsible for fully understanding critical infrastructure systems, including uninterruptible power supply, electrical switchgear, emergency generators, chilled water cooling systems, computerized building controls, and fire detection and suppression systems; monitoring and operating those systems as directed by the Chief Engineer and Senior Operations Manager; performing daily inspections of mechanical and electrical rooms and providing written reports on the performance and condition of the equipment to the Chief Engineer; and immediately reporting abnormal conditions per operational procedures. To view full details and apply, see the job listing details.

    The Building Engineer is responsible for complying with all applicable codes, regulations, governmental agency and Company directives related to building operations and work safety; inspecting building systems, including fire alarms, HVAC, and plumbing, to ensure equipment operates within design capabilities and achieves the environmental conditions prescribed by the client; overseeing and inspecting the work performed by outside contractors; performing assigned repairs and emergency and preventive maintenance; completing maintenance and repair records as required; reviewing assigned work orders; and estimating the time and materials needed to complete repairs. To view full details and apply, see the job listing details.

    Are you hiring for your data center? You can list your company’s job openings on the Data Center Jobs Board, and also track new openings via our jobs RSS feed.

    1:34p
    New Big Blue Products Translate Big Data into Big Business

    At its Edge2013 partner conference in Las Vegas, IBM announced new products designed to support cloud adoption as companies begin building Software Defined Environments (SDEs). The event conversation can be followed on Twitter via the hashtag #IBMEdge.

    “Cloud computing and Big Data analytics are playing key roles in helping organizations lower operating expenses, improve efficiencies, and increase productivity,” said Tom Rosamilia, senior vice president, IBM Systems & Technology Group. “But they’re also enabling greater and faster access to business insights, which is fundamentally transforming the ways in which organizations, public and private, are interacting with, learning from, and supporting their customers.”

    Power Systems

    IBM introduced nine new Power Systems offerings, each providing advanced capabilities in big data analytics and cloud computing. Five of the packaged solutions are industry-themed, combining IBM Power Systems with specialized software for big data analytics aimed at the healthcare and retail industries. Three additional solutions provide organizations in any industry sector with advanced computing capabilities, including predictive analytics, scoring and optimization techniques, as well as turnkey cloud offerings that support “pay-per-use” business models or fit-for-purpose private cloud infrastructures. IBM also announced new entry-level technical computing solutions that integrate the company’s Platform Computing software directly with IBM Power Systems, Flex Systems, System x, iDataPlex, and IBM Storage Systems.

    Storage

    IBM released its FlashSystem line Monday – a series of all-flash appliances enhanced to deliver less than one-tenth the cost per transaction while using 4 percent of the energy and 2 percent of the space of hybrid disk-and-flash systems. Other storage portfolio enhancements include support for 4TB drives in the Storwize V7000 and XIV advanced storage systems, providing 33 percent more capacity in the same space, and new XIV capabilities that let clients send large volumes of data between systems through the cloud without performance degradation.

    PureFlex, Cloud and Big Data

    In support of cloud, mobile, social and analytics workloads, IBM added solutions that build on the strong global momentum of PureSystems. The IBM PureFlex Solution for Cloud Backup and Recovery incorporates specialized patterns based on Tivoli Storage Manager, Microsoft Hyper-V and the SAP Business Suite to help clients reduce the security risks they often face when deploying cloud solutions. The Mobile Application Platform Pattern, based on the IBM Worklight server on PureSystems for both Power and x86, can accelerate client access to millions of mobile users by allowing new mobile solutions to be deployed in as little as 30-40 minutes. Finally, the IBM Connections Pattern will allow clients to deploy their social business platform on IBM PureSystems.

    2:06p
    Data Protection Across the Distributed Enterprise

    Andres Rodriguez is the CEO and Founder of Nasuni, a unified storage company that serves the needs of distributed enterprises. Previously he was a CTO at Hitachi Data Systems and CTO of the New York Times.

    Andres Rodriguez
    Nasuni

    When it comes to storage, IT professionals’ biggest challenge is figuring out how to serve an increasingly distributed enterprise. The problem is that, as New York Times columnist Tom Friedman has memorably argued, the world has gone flat. This trend is forcing IT to deploy infrastructure to a hodgepodge of remote locations with a broad set of requirements. Some locations may be as large as headquarters; some may be half a dozen people with flaky network access.

    Challenges of Distributed Locations

    But everyone at these locations needs one thing: access to data. Being able to provide global access to data while delivering a uniform level of data protection is a formidable challenge in itself, and it’s compounded by the suddenness with which this need arises. In most organizations that are growing quickly, whether organically or through acquisitions, IT has a decentralized infrastructure. When the first disaster happens, however, and IT at HQ cannot get a handle on the situation or bring everyone back up quickly, the CIO is instructed to figure out how to make sure it never happens again.

    Are Security and Backup At Odds?

    Data protection really has two meanings that stand at odds to one another. The way IT protects data against loss is to make additional copies. But when it comes to protecting data against theft or an accidental leak, IT reduces the number of places where data can be accessed. There is an intrinsic tension between our need to secure and our need to make sure data is sufficiently backed up. Standards, like ISO 27002, capture this tension by requiring data security policies as well as business continuity policies to protect against data loss.

    The way around this tension is to encrypt all data before it goes to backup and then make copies of the encrypted data. As long as the keys are kept under tight controls – and copied – it becomes possible to make as many copies as needed without compromising security. Modern backup software supports encryption to ensure that no copies of data ever leave the data center unencrypted. This can work well if your organization only has one or two locations that require IT infrastructure; however, the distributed enterprise creates an additional layer of complexity because data is being created in many locations and often it needs to be accessed from every location.
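    As a minimal sketch of the encrypt-before-backup pattern described above (not any particular vendor's implementation), the following Python example uses the third-party cryptography library to encrypt a file before its ciphertext is copied to a backup target. The file names, paths, and key handling are hypothetical; in practice the key would live in a key management system with its own tightly controlled copies.

        # Illustrative encrypt-then-backup sketch; paths and key handling are hypothetical.
        # Requires the third-party "cryptography" package: pip install cryptography
        import shutil
        from pathlib import Path
        from cryptography.fernet import Fernet

        KEY_FILE = Path("backup.key")          # in practice: held in a key management system, with copies
        SOURCE = Path("finance_report.xlsx")   # hypothetical file to protect
        BACKUP_DIR = Path("/mnt/backup")       # hypothetical backup target

        def load_or_create_key() -> bytes:
            """Load the backup encryption key, generating one on first use."""
            if KEY_FILE.exists():
                return KEY_FILE.read_bytes()
            key = Fernet.generate_key()
            KEY_FILE.write_bytes(key)
            return key

        def backup_encrypted(source: Path, backup_dir: Path) -> Path:
            """Encrypt the file, then copy only the ciphertext to the backup location."""
            ciphertext = Fernet(load_or_create_key()).encrypt(source.read_bytes())
            encrypted_path = source.parent / (source.name + ".enc")
            encrypted_path.write_bytes(ciphertext)
            target = backup_dir / encrypted_path.name
            shutil.copy2(encrypted_path, target)  # further copies can now be made without exposing plaintext
            return target

        if __name__ == "__main__":
            print("backed up to", backup_encrypted(SOURCE, BACKUP_DIR))

    Because only ciphertext leaves the source system, backup copies can be replicated as widely as availability requires without widening the attack surface, so long as the key itself stays under tight control.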

    What About Cloud?

    CIOs have started to look at cloud services as a way to deliver uniform infrastructure globally, because the cloud enables IT to deploy distributed resources while retaining complete central control. Cloud providers offer an ever expanding menu of choices for IT, giving departments new ways to consume their global toolkit. Cloud projects range from transitioning complete application stacks, such as email, to Software as a Service offerings, to connecting layers of existing infrastructure, be it storage components or the network, directly to the cloud (Infrastructure as a Service).

    Rather than having systems that are independently managed at each remote location, the cloud gives IT the ability to manage all enterprise data from HQ while simultaneously allowing resources to be deployed globally from its global backbone.

    These benefits are not unlike those that arise from data center consolidation, but there’s a big difference: the best cloud providers’ massive infrastructure is far superior even to that of the largest enterprise data centers. No enterprise data center can approach the kind of availability, redundancy and replication across multiple geographies that a storage cloud like Amazon S3 can provide. At the same time, its usage-based cost model makes it far less expensive to deploy than data center consolidation. The cloud is data center consolidation without the data center.

    Cloud is Consistent

    Cloud infrastructure allows IT to deliver a uniform level of service anywhere in the world. Data is consistently encrypted and replicated to a common back-end that in turn is available to every other location. By allowing IT to tightly regulate access to what is in essence a global private distribution network, data can be both available and fully protected across the distributed enterprise.

    However, it’s important to note that, when it comes to storage, the largest storage clouds are essentially raw components, much like the commodity disk drives that enterprise storage vendors like EMC and NetApp use in their storage arrays. On their own, cloud storage services like Amazon S3 and Microsoft Windows Azure Blob Storage can no more serve as enterprise storage solutions than could a large shipment of Seagate disk drives. Cloud storage needs to be integrated into a larger, smarter service by a new generation of systems vendors who can tap into the revolutionary potential of cloud storage to provide true Infrastructure as a Service.
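    To make the “raw component” point concrete, here is a minimal, hypothetical sketch of what talking directly to an object store such as Amazon S3 looks like (using the boto3 library, with made-up bucket and key names): opaque puts and gets of bytes, with none of the file semantics, snapshots, caching, or policy that an enterprise storage service layers on top.

        # Illustrative only: raw object storage is just put/get of opaque bytes.
        # Bucket and key names are hypothetical; requires AWS credentials and "pip install boto3".
        import boto3

        s3 = boto3.client("s3")
        BUCKET = "example-enterprise-archive"  # hypothetical bucket

        # PUT: the service stores the bytes; it knows nothing about files, users, or versions.
        s3.put_object(Bucket=BUCKET, Key="dept/finance/report.bin", Body=b"\x00" * 1024)

        # GET: the caller must supply everything else: naming, encryption, caching, access policy.
        obj = s3.get_object(Bucket=BUCKET, Key="dept/finance/report.bin")
        print(len(obj["Body"].read()), "bytes retrieved")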

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    2:23p
    DCK Webinar: Redefining Security for Global Business Operations

    With data volumes and user counts growing in today’s data center, and requirements around data center infrastructure expanding, administrators are continuously tasked with strengthening security. The challenge, however, is that aging data center technologies are failing to meet new security needs. Our next Data Center Knowledge webinar, presented with IO on Thursday, June 27, will bring you a comprehensive look at delivering optimal security strategies that meet the challenges of today’s environments.

    The Data Center Knowledge webinar series continues, providing free, educational content focused on mission-critical issues facing today’s data center managers, owners and investors. Each webinar is an opportunity to learn from top experts in the field.

    Robert Butler
    IO

    Robert Butler, Chief Security Officer at IO, will present on security and “Data Center 2.0,” and Data Center Knowledge contributor Bill Kleyman will moderate the conversation.

    Title: Data Center 2.0: Redefining Security for Global Business Operations
    Date: Thursday, June 27, 2013
    Time: 2 pm Eastern/ 11 am Pacific (Duration 60 minutes, including time for Q&A)
    Register: Sign up for the webinar.

    Beyond physical security, older data centers are still ill-equipped to handle a cyber-attack on critical power and cooling infrastructure. In order to deliver an always-on, on-demand service level, enterprises, governments, and service providers need to take a closer look at the security profile of their data centers.

    This webinar will cover several important security topics which revolve around the modern data center. As new workloads are deployed, administrators will have to think “outside of the data center” to ensure maximum physical and logical security. Topics of discussion include:

    • The security challenges facing the “Data Center 1.0” infrastructure
    • The impact of technology trends on business and government, and how it affects you
    • The cyber-threat evolution in recent years
    • How an APT (Advanced Persistent Threat) steals secrets and disrupts operations
    • An integrated approach to physical and logical security
    • Deploying intelligent security solutions from the bottom up

    This one-hour webinar will examine these topics in more detail, including advanced persistent threats (APTs), how hacking has become a commodity, current trends and growth within the data center, and how to create security at both the logical and physical layers.

    Following the presentation, there will be a Q&A session with your peers, moderator Bill Kleyman, and Robert Butler of IO. Sign up today and you will receive further instructions via e-mail about the webinar. We invite you to join the conversation.

    3:17p
    Open Compute Plans Hardware Hackathon at Structure

    Hardware innovation is not often viewed as sexy, yet it can lead to massive changes in the data center. The Open Compute Project continues to support and encourage breaking down barriers of traditional thinking about IT equipment. To that end, the group is sponsoring its second Hardware Hackathon on June 18 at the GigaOM Structure conference.

    The winning hack that is best positioned to become a venture-backed startup will earn a $10,000 prize. (If the winner would rather not receive the funding, the foundation will file a patent for the winning hack in the winner’s name.) In addition to the prize money, the Open Compute Foundation is partnering with a team of angel investors and venture capitalists who will work with the winner to turn the initial idea into a business plan.

    Winners will present their hacks and receive their prizes on stage at GigaOM Structure on June 19.

    Upverter and GrabCAD will be there to assist you in designing your project. Sign up for the hackathon at Upverter’s site. Registration is open through the end of this week.

    For more information, visit the OCP site. To join the discussion on OCP hackathons, join the OCP Hack 2013 group on Facebook.

    4:21p
    New Cold Storage Player SageCloud Raises $10 Million

    More venture bets continue to be placed in the cloud storage space. SageCloud, a new player focused on cold storage, has raised a $10 million Series B round. SageCloud comes from a co-founder of Carbonite, a sizeable cloud storage player that went public in 2011. The round was led by Braemar Energy Ventures and joined by existing investor Matrix Partners.

    SageCloud is focused on archival and backup storage needs and the increasing importance of Big Data analytics. The company’s storage systems are engineered to address the growth of “cold data” – data at the petabyte scale that is infrequently accessed. It aims to deliver what it calls “breakthrough price points” for cold storage tiering that align the cost of storage on disk with the price point of tape. To achieve these economics, SageCloud says it has integrated open source and proprietary software to resolve the data durability problems of commodity drives and leverage the better power management designs of open-standard enclosures.
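    The announcement does not describe how SageCloud’s software identifies cold data, but the general idea of cold storage tiering can be sketched generically: find data that has not been accessed for a long time and migrate it to a cheaper, power-managed tier. The Python sketch below is a hypothetical access-time-based policy for illustration only, with made-up paths and thresholds; it is not SageCloud’s implementation.

        # Generic cold-data tiering sketch; paths and threshold are hypothetical assumptions.
        import shutil
        import time
        from pathlib import Path

        HOT_TIER = Path("/data/hot")    # hypothetical fast, expensive storage
        COLD_TIER = Path("/data/cold")  # hypothetical cheap, power-managed storage
        COLD_AFTER_DAYS = 90            # assumption: "cold" means not read in 90 days

        def migrate_cold_files(hot: Path, cold: Path, cold_after_days: int) -> int:
            """Move files whose last access time is older than the threshold to the cold tier."""
            cutoff = time.time() - cold_after_days * 86400
            moved = 0
            for path in hot.rglob("*"):
                if path.is_file() and path.stat().st_atime < cutoff:
                    target = cold / path.relative_to(hot)
                    target.parent.mkdir(parents=True, exist_ok=True)
                    shutil.move(str(path), str(target))
                    moved += 1
            return moved

        if __name__ == "__main__":
            print(migrate_cold_files(HOT_TIER, COLD_TIER, COLD_AFTER_DAYS), "files moved to cold tier")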

    “SageCloud has developed a differentiated solution for extremely low-cost cold data storage that does not compromise on long-term data preservation and improves substantially upon existing solutions from an efficiency standpoint,” said Dr. Jiong Ma, Partner at Braemar, who will join the SageCloud Board of Directors.

    “We’re seeing the potential for a big disruption to the storage industry coming from the disaggregation of software and hardware,” said David Skok, General Partner, Matrix Partners and SageCloud board member. “Hardware prices are being driven down by open standard hardware (referred to as Infrastructure 2.0) and commoditized components. Jeff and his team have spotted this opportunity, and created the software to allow enterprises and service providers to use this hardware for storage of their Cold Data at a fraction of the cost of other alternatives. Given the unrelenting growth in data that we’re seeing, we believe this has the potential to be a huge category.”

    A recently published report from IDC identifies power management as a critical data center need and forecasts that the market for energy-efficient data storage solutions will exceed $25 billion by 2016.

    In layman’s terms, SageCloud is for the type of data that is written once and infrequently accessed. The press release claims industry analysts believe as much as 80 percent of data qualifies as cold data, and that volume is growing. SageCloud Founder and CEO Jeff Flowers drew from his experience at Carbonite, a company he co-founded and helped lead as CTO, to identify the need for next-generation cloud storage solutions.

    “SageCloud has developed powerful system software which leverages open-standards enclosures and economical commodity drives to optimize energy efficiency, drive sustainability and long-term data preservation for customers’ data centers,” said Flowers. “In this way, we are revolutionizing the economics of data storage.”

    “Jeff and his team bring a demonstrable track record of success in previous ventures and a wealth of expertise in data storage,” Ma added. “We are excited to partner with this world-class firm, which Braemar believes has a chance to disrupt the enterprise market for intelligent data management and storage.”

    8:19p
    TIBCO Turbocharges Big Data Analytics

    During its Transform 2013 events in Paris and London, infrastructure software company TIBCO Software (TIBX) launched Iris, a new troubleshooting and forensic application, and updated its FTL low-latency messaging platform.

    TIBCO announced Iris, a new software product designed to deliver enhanced application troubleshooting and forensic capabilities for application developers, support engineers, security analysts, and QA testers. Iris is a standalone solution that ingests, processes and visualizes application and machine log data to help troubleshoot and analyze enterprise applications.
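    Iris itself is proprietary, but the kind of ingest-and-process step it automates can be shown with a tiny, generic sketch: parse raw application log lines into structured records and summarize them for troubleshooting. The log format, field names, and file name below are hypothetical, chosen only for illustration.

        # Generic log ingestion sketch; the log format and file name are hypothetical.
        import re
        from collections import Counter

        # Assumed line format: "2013-06-11 14:02:11 ERROR payment-service timeout calling gateway"
        LINE_RE = re.compile(r"^(?P<date>\S+) (?P<time>\S+) (?P<level>\w+) (?P<service>\S+) (?P<message>.*)$")

        def summarize(log_path: str) -> Counter:
            """Count log events per (service, level) pair to spot troubled components."""
            counts: Counter = Counter()
            with open(log_path, encoding="utf-8") as handle:
                for line in handle:
                    match = LINE_RE.match(line.strip())
                    if match:
                        counts[(match["service"], match["level"])] += 1
            return counts

        if __name__ == "__main__":
            for (service, level), count in summarize("app.log").most_common(10):
                print(f"{service:20s} {level:7s} {count}")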

    “There are threats and opportunities hidden in customers’ big data,” said Rock Gnatovich, senior vice president, TIBCO. “With TIBCO Iris, we are complementing our ability to collect and manage massive amounts of log data with a self-service solution that will become indispensable to application developers and security experts who are charged with keeping systems running and companies safe.”

    TIBCO FTL Leverages In-Memory

    TIBCO announced the latest version of its flagship extreme low-latency messaging platform, TIBCO FTL 3.1, which introduces in-memory technology to bring new levels of reliability to persistent messaging. The powerful messaging platform can be leveraged by any industry facing ever-increasing data volumes, including logistics and transportation companies, government, power and utilities companies and high-tech manufacturing industries.

    FTL 3.1 takes advantage of the latest advancements in hardware and networking to handle higher message throughput, with lower latency and a greater number of concurrent connections, than traditional messaging approaches. FTL 3.1 also introduces a new distributed in-memory persistence engine, with guaranteed message delivery and messaging throughput of over 850,000 messages per second.
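    TIBCO has not published FTL’s internals here, but the core idea of an in-memory persistence engine with guaranteed delivery can be illustrated generically: each message is appended to an in-memory log and acknowledged to the publisher only once it is stored, so a subscriber that reconnects can replay anything it missed. The Python sketch below is a simplified, generic illustration of that pattern; it is not TIBCO FTL’s API and ignores the distribution and replication a real engine needs.

        # Generic sketch of persistent, acknowledged messaging; not TIBCO FTL's API.
        import itertools
        import threading

        class InMemoryPersistentTopic:
            """Append-only in-memory log: publishers get an ack, subscribers replay from any offset."""

            def __init__(self) -> None:
                self._log: list[bytes] = []
                self._lock = threading.Lock()
                self._seq = itertools.count()

            def publish(self, message: bytes) -> int:
                """Store the message and return its sequence number as the delivery acknowledgement."""
                with self._lock:
                    seq = next(self._seq)
                    self._log.append(message)
                return seq

            def replay(self, from_seq: int = 0) -> list[bytes]:
                """Return every stored message at or after from_seq, e.g. for a reconnecting subscriber."""
                with self._lock:
                    return self._log[from_seq:]

        if __name__ == "__main__":
            topic = InMemoryPersistentTopic()
            for i in range(3):
                print("acknowledged seq", topic.publish(f"tick {i}".encode()))
            print([m.decode() for m in topic.replay(from_seq=1)])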

    “TIBCO FTL was designed to address the needs of extremely low-latency environments such as financial markets, where trades are increasingly performed by high-performance algorithms and where microseconds make a vast difference in delivering results,” said Denny Page, Chief Engineer, TIBCO. “With the introduction of the in-memory persistence engine for guaranteed delivery, TIBCO FTL 3.1 is now also ideally suited to address high performance use cases in any industry where extremely fast environments are required to track billions of data points a day.”

