Data Center Knowledge | News and analysis for the data center industry - Industr's Journal
 

Friday, November 8th, 2013

    Time Event
    1:00p
    How New Technology Can Boost DR and Business Continuity

    The evolution of the data center has brought many new enhancements that can impact your disaster recovery and business continuity planning.

    Let’s talk a little shop today. One of the hottest conversations many business managers are having is how their organization can use the data center as a key element of their disaster recovery and business continuity strategies. Cloud computing, data replication and virtualization all play a major role in the disaster recovery and business continuity (DRBC) discussion. Still, the evolution of the data center and new types of resources require administrators to revisit their strategies and see where they can improve further.

    Although business continuity and disaster recovery can overlap, they are really different IT objectives. With that in mind, the conversation about data center DR strategies has really evolved over the past few years. Where it was once reserved for only the big shops or ones with a lot of dollars to spend, modern IT infrastructure allows a broader range of companies to do a lot more, for a lot less. Smaller organizations are now leveraging private and public cloud environments for their DR needs. In fact, this influx of new business is part of the reason that many data center providers are seeing a boom in services requests.

    What are the real driving factors here?

    • Global traffic management (GTM) and global server load balancing (GSLB). The truth is simple: Without these types of modern global traffic controllers, it would be a lot more difficult to replicate and distribute data. Just take a look at what technologies like F5, NetScaler and Silver Peak are doing. They are creating a new logical layer for globally distributed traffic management. Not only are these technologies optimizing the flow of traffic, they are controlling where users go and what resources they can utilize. The digitization of global business now requires administrators to have intelligent devices helping optimize and load-balance traffic all over the world. With virtualization, both physical and virtual appliances are capable of spanning global resources and communicating with each other in real time. From there, they can route users to the appropriate data center as a regular function of policy – or even route entire groups to other live data centers in case of an emergency.
    • Software-defined technologies. Working with software-based and virtual technologies certainly makes life easier. What we’re able to do now with a logical network controller in terms of creating thousands of virtual connections is pretty amazing. Furthermore, the ability to control traffic, QoS, and even deploy virtual security services makes these new types of technologies very valuable. Remember, the conversation isn’t just around SDN. Software-defined technologies also incorporate security, storage and other key data center components. We are creating logical layers which allow for improved communication between hardware components and global resources. This is the concept of virtualizing servers and services. Inter-linking nodes without having to deploy additional hardware is a big reason cloud computing and data center resources have become so much more prevalent.
    • High-density computing. Shared environments and multi-tenancy are becoming regular platforms within the modern data center. After all, why not? The ability to consolidate and logically place numerous users on one shared system is pretty efficient. Plus, administrators are able to use highly intelligent blade systems to quickly provision new workloads and rapidly repurpose entire chassis in case of an emergency. Furthermore, converged infrastructure is seeing even more advancement as more SSD and flash technologies become incorporated directly into the chassis. Imagine now having the capability to deliver millions of IOPS and hundreds of terabytes of flash storage to key workloads distributed over your corporate cloud. Furthermore, replicating this type of system requires less resource utilization, focusing on server profiles rather than physical hardware. This isn’t only a highly effective use of server technology – it’s the use of advanced service profiles capable of virtualization at the hardware layer.
    • More bandwidth. More fiber, better local and external connectivity, and greatly improved network capabilities are allowing the data center to deliver massive amounts of data at lightning speeds. Already, we are seeing Google Fiber delivering unprecedented speeds to homes for very low prices. I, for one, can’t wait for that service to come to my city. The point is that bandwidth is becoming more available. Edge systems are capable of handling more traffic and very rich content. This increase in WAN-based resources is the direct reason that so many organizations are moving a part of their infrastructure into the cloud. Hybrid systems allow for fast data replication and the ability to stay up if your primary data center goes down. Furthermore, when you couple the above three drivers together, you get an environment which can replicate quickly and stay extremely agile.
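    The GTM/GSLB driver above boils down to simple routing logic: health-check every site and send each user to the nearest live data center, failing over automatically when one goes dark. Here is a hypothetical sketch in Python; the data center names, latencies and health flags are invented for illustration and do not reflect any vendor's API.

```python
# Simulated health status and per-region latency (ms) for each data center.
# All values are hypothetical.
DATA_CENTERS = {
    "us-east": {"healthy": True,  "latency": {"na": 20,  "eu": 90,  "apac": 180}},
    "eu-west": {"healthy": True,  "latency": {"na": 90,  "eu": 15,  "apac": 160}},
    "apac":    {"healthy": False, "latency": {"na": 180, "eu": 160, "apac": 25}},
}

def route(region: str) -> str:
    """Return the healthy data center with the lowest latency for a region."""
    live = {name: dc for name, dc in DATA_CENTERS.items() if dc["healthy"]}
    if not live:
        raise RuntimeError("no live data centers -- total outage")
    return min(live, key=lambda name: live[name]["latency"][region])

# APAC users are transparently rerouted because their local site is down;
# North American users still land on their nearest site.
print(route("apac"))
print(route("na"))
```

    A real GSLB appliance layers health probes, persistence and policy on top of this, but the core decision each one makes per request is essentially this lookup.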

    Are there other advancements that have helped organizations achieve greater levels of redundancy? Of course there are. Everything from the software layer to new types of hardware advancements all help organizations better utilize resources. The bottom line is this – if you haven’t explored a cloud option for DRBC, maybe it’s time. The modern data center has become the home to a lot of really advanced technologies and service delivery models. All of these systems are working together to deliver more services, resources and a lot more agility for your organization.

    1:30p
    IT Process Automation Poised to Grow

    Gabby Nizri is the Founder & CEO of Ayehu Software Technologies Ltd., publishers of eyeShare, an enterprise-class, lightweight IT process automation tool.


    According to a recent study by trusted IT industry researcher Gartner, IT process automation tools are poised to make a significant impact on businesses across the globe over the next several years. Specifically, the study indicates that IT Process Automation (ITPA) will become an invaluable addition to IT departments and organizations as a whole by providing “consistent, measurable and better-quality repeatable services at optimal cost.”


    Here’s how I believe this projection will play out over the coming years.

    The Current Climate and Impending Changes

    Different organizations will be using IT process automation in varying ways, depending on their existing setup and specific business needs. The increased need for and usage of ITPA will be driven by the growing adoption of cloud technology. Automation tools are very versatile and can be used in collaboration with public, private and hybrid cloud setups. Certain ITPA tools can be easily and seamlessly integrated with existing cloud management platform (CMP) tools to extend their capabilities and add more depth.

    For the projected influx of ITPA adoption to occur, IT organizations will need to clearly identify, understand and document exactly which activities and workflows they want to automate. This is important because not every IT process automation tool is created equal. With different features and functionality, it’s important to be able to determine ahead of time which product specifications would be the best fit.
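    Documenting candidate workflows as structured data is one way to do the identification step described above before any tool is selected. The sketch below is a hypothetical illustration in Python; the workflow names and steps are invented examples, not any vendor's format.

```python
# Candidate workflows for automation, documented as ordered step lists.
# These names and steps are hypothetical examples of the kind of runbook
# an IT organization would capture before evaluating ITPA tools.
WORKFLOWS = {
    "restart_hung_service": [
        "check service status",
        "capture diagnostics",
        "restart service",
        "verify service responds",
        "notify on-call if verification fails",
    ],
    "disk_space_cleanup": [
        "identify volumes over 90% full",
        "rotate and compress old logs",
        "re-check utilization",
    ],
}

def describe(name: str) -> str:
    """Render one documented workflow as a numbered runbook."""
    steps = WORKFLOWS[name]
    lines = [f"Workflow: {name} ({len(steps)} steps)"]
    lines += [f"  {i}. {step}" for i, step in enumerate(steps, 1)]
    return "\n".join(lines)

print(describe("restart_hung_service"))
```

    An inventory like this makes it much easier to compare tools feature-by-feature: each documented step either maps to a built-in capability of a candidate product or flags a gap.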

    How Will ITPA Benefit Future Business?

    The main benefits of adopting ITPA, either as a standalone automation tool, integrated with existing IT management products or as the “glue” that ties the entire infrastructure together, are marked improvement in operational efficiency and minimization of risks associated with human error. When businesses are able to automate as many processes as possible, across multiple IT domains, they are able to consistently achieve a higher level of productivity. Human error is reduced or eliminated altogether, and operations improve overall because personnel are freed up to focus on more important, business-critical tasks.

    What to Keep in Mind

    As the shift toward a more automated business culture begins to take place, users must be aware of what their specific needs are and how their ITPA tool(s) will allow them to achieve their specific goals. This involves understanding how each automation product works. For instance, some products offer a more defined orientation and process framework. These tools are fine, provided that they meet the specific needs and address the pain points that are unique to the business; otherwise, the value will be limited.

    Other ITPA products will offer a broader functionality. While these tools may be more valuable in that they offer more features and provide a more robust list of functions, the business that chooses such a tool must first be sure that they have a mature team in place to be able to develop, build and maintain the various automation processes and workflows. It’s important to know the capabilities and limitations of your business prior to selecting an ITPA tool.

    For those businesses that choose an ITPA product that is designed to support more functions for multiple processes across different domains, or to be integrated with several other vendor tools, it’s equally important to identify and document all of the ITOM (IT operations management) processes prior to implementing the new automation tool.

    When properly researched and carefully selected, ITPA tools can vastly improve overall operations, both internally in the IT department and across the entire organization. Individual time-consuming, repetitive tasks can be automated, as can complex workflows, to create a more streamlined work environment and open up more opportunities for IT professionals to further develop their skills, making them more valuable to the businesses for which they work.

    Be Ready for The Future

    I’ve said it before, and I’ll say it again – IT process automation is the future of business across just about every field and industry. Now there’s a solid research study, conducted by a trusted and highly qualified firm, to bolster that prediction. Are you ready to roll with these changes and embrace ITPA as a part of the future success of your organization? The time is now. Don’t get left behind.

    Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.

    2:00p
    Schneider Electric Announces Prefab Data Centers Up To 2 Megawatts

    Schneider Electric has expanded its range of modular data center products, including both IT modules and power skids. (Photo: Colleen Miller)

    Schneider Electric has announced the availability of prefabricated data centers offering up to 2 megawatts of IT capacity. The company sees modular expansion as a major opportunity, and has been increasing its offerings to target this market. The company introduced 15 new prefabricated data center modules as well as 14 data center reference designs.

    The new prefab modules deliver IT, power, and/or cooling integrated with best-in-class components as well as the company’s StruxureWare Data Center Infrastructure Management (DCIM). The modules scale from 250kW to 2MW, with individual module capacities ranging from 90kW to 1.2 megawatts. They are Uptime Tier II and Tier III compliant, with the goal of being customizable, predictable and easy to deploy for facilities managers looking to optimize and expand.
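    One appeal of fixed module sizes is that sizing a build-out becomes simple arithmetic. Here is a back-of-envelope sketch: the 500kW module size used below is an arbitrary illustration, not a specific Schneider Electric SKU.

```python
import math

def modules_needed(target_kw: float, module_kw: float) -> int:
    """Modules required to cover a target IT load.

    Rounds up, since a partial module must still be deployed whole.
    """
    return math.ceil(target_kw / module_kw)

# Reaching the 2 MW (2,000 kW) top of the announced range with
# hypothetical 500 kW modules:
print(modules_needed(2000, 500))  # 4 modules
```

    The same arithmetic works in reverse for phased expansion: deploy only the modules today's load requires and add more as demand grows, which is the cash-management benefit Schneider highlights.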

    “Today’s business environment demands data centers that are increasingly more flexible and scalable with an emphasis on deployment speed,” said Kevin Brown, vice president, Data Center Global Offer and Strategy, Schneider Electric. “Prefabricated data centers enable data center managers to maximize the speed of business through rapid installation, easy expansion and improved cash management. This prefabricated approach inherently increases the predictability of the build process, since most of the construction occurs in a factory instead of in the field.”

    These prefabricated modules are delivered on-site preconfigured and pre-tested for easy installation with a lead time of 12-16 weeks, depending on project complexity. Schneider said the solution includes a wide range of prefabricated data center reference designs and module configurations, detailed technical documentation, and regional support teams.

    2:30p
    Swimming Storage: Helium Hard Drives Use Immersion Cooling

    Several 6TB helium-filled hard disk drives from HGST are lowered into a liquid cooling solution from Green Revolution Cooling at the Cloud Expo 2013 conference. (Photo: HGST)

    On Monday, Western Digital’s HGST unit announced a new helium-filled hard drive, which boosts storage capacity by reducing drag, since helium has one-seventh the density of air. This allows HGST to place spinning disks closer together and pack seven disks into a drive that would normally hold five. Since the HelioSeal drive is hermetically sealed, it is also ideal for immersion cooling.

    At this week’s Cloud Expo 2013 conference, HGST (previously Hitachi Global Storage Technologies) showcased this capability, submerging its Ultrastar He6 hard drives in a liquid cooling solution from Green Revolution Cooling. The combination could prove attractive to organizations that must crunch and store massive volumes of data. Green Revolution’s technology has already been used to create immersion data centers that can pack extraordinary compute density into a small room. In this video, Brendan Collins, Vice President of Product Marketing at HGST, discusses the features of the He6 helium drives, which are running behind him, fully immersed in the Green Revolution liquid cooling tank. This video runs about 1 minute.
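    The seven-platters-versus-five claim implies a straightforward capacity uplift, which this rough arithmetic sketch works through. Per-platter capacity is derived from the announced 6TB He6 drive; the air-filled comparison drive is an estimate for illustration only.

```python
# Rough arithmetic behind the helium capacity claim: seven platters fit in
# an enclosure that would hold five air-filled platters of the same density.
HE6_CAPACITY_TB = 6.0   # announced Ultrastar He6 capacity
HE6_PLATTERS = 7        # platters in the helium-filled drive
AIR_PLATTERS = 5        # platters a comparable air-filled drive would hold

per_platter_tb = HE6_CAPACITY_TB / HE6_PLATTERS    # capacity per platter
air_equivalent_tb = per_platter_tb * AIR_PLATTERS  # same platters, air-filled
uplift = HE6_CAPACITY_TB / air_equivalent_tb       # 7/5 = 1.4x

print(f"air-filled equivalent: {air_equivalent_tb:.2f} TB, uplift: {uplift:.1f}x")
```

    In other words, all else being equal, sealing helium into the enclosure buys roughly a 40 percent capacity increase in the same physical footprint.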

    5:00p
    Friday Funny: Raised Floor Hazard

    So it’s Friday! You know what that means–time for our fabulous data center cartoon caption contest.

    Our data center cartoonist, Diane Alber, has a great new cartoon, which shows the comic possibilities of an open tile in the raised floor in the data hall. Diane writes, “I’ve seen people working under the floor, and they accidentally didn’t put the floor tile back in place! It just looks like an accident waiting to happen!”

    Here’s How It Works:

    We provide the cartoon and you, our readers, submit the captions. We then choose finalists and the readers vote for their favorite funniest suggestion. The winner receives a hard copy print, with his or her caption included in the cartoon!

    Recent Contest Winners

    We voted on two cartoons last week. Here’s what DCK readers thought was the best caption for Is It Hot Enough for You?: “Don’t run, it’s just a sales guy here for a tour.” Hearty congrats to DDay!

    And the best caption for Holy Batman! was submitted by Joe W, who sent in, “Kip, there is no way I am going to the company party as Cat 6 Woman.”

    And now, this week’s cartoon. Scroll down and submit your best captions!

    Please visit Diane’s website Kip and Gary for more of her data center humor. For the previous cartoons on DCK, see our Humor Channel.
     

