How Does Data Fabric Fit With Software-Defined Storage, and What Can It Do For You?

Marc Fleischmann is CEO and Founder of Datera.

Infrastructure matters. The power of big data hinges on its accessibility, so when information is siloed, the value of analytics is curtailed. Unfortunately, this is an all too common occurrence for companies that rely on traditional monolithic storage solutions, which have fixed boundaries. Infrastructure shapes applications’ performance, cost and scalability, yet few people grasp the importance of selecting the right type of platform.

At one time, storage was rigid, with pre-defined capacity, performance and cost, but it was also relatively straightforward: integrated hardware and software delivered a clearly defined service. With the advent of the cloud, however, data proliferated at unprecedented rates and created exciting new possibilities.

Like software-defined networking, software-defined storage has taken off over the past few years. If networking entails the movement of data from place to place, storage entails its preservation: its quality, reliability and endurance. Software-defined storage brings stored information to life, sorting it, organizing it and automating its retrieval.

There are many implementations of software-defined storage. Most recently, however, hyper-converged solutions and scale-out distributed systems (or data fabrics) have driven most of the use cases. Hyper-converged solutions have the benefit of being simple and turnkey, are focused on supporting virtual machines, and are targeted at small to medium-size deployments. Data fabrics, on the other hand, provide a wide spectrum of capabilities, add scale, support adaptive policies and morph as storage requirements evolve (a sketch of what such a policy can look like follows below). Data fabrics are more efficient, because they allow compute and storage to scale independently; hyper-converged solutions are simpler, because they package compute and storage scaling together.

Unlike traditional monolithic storage systems, a data fabric is agile, conforming to evolving application needs. As a result, companies can access data more readily, spend resources more sustainably and deploy their applications faster. According to Forrester, data fabric can help enterprise architects “accelerate their big data initiatives, monetize big data sources, and respond more quickly to business needs and competitive threats.” Here are some of the major ways that data fabric adds to software-defined storage, differs from traditional data storage, and impacts IT.
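To make the "adaptive policies" idea concrete, here is a minimal sketch of what intent-based provisioning can look like from an application team's point of view. The `DataFabric` and `StoragePolicy` names, methods and parameters below are hypothetical illustrations, not Datera's actual API: the application declares what it needs, and the fabric's control plane decides placement and can re-tier the volume later as requirements change.

```python
# Hypothetical sketch of policy-driven provisioning on a data fabric.
# The class, method and parameter names are illustrative only.

from dataclasses import dataclass


@dataclass
class StoragePolicy:
    """Application intent: what the app needs, not where the data lives."""
    replicas: int      # copies kept across the fabric for durability
    iops_target: int   # performance floor the fabric should honor
    media: str         # e.g. "flash" or "hybrid"


class DataFabric:
    """Stand-in for a fabric control plane that maps intent to placement."""

    def __init__(self):
        self.volumes = {}

    def provision(self, name: str, size_gb: int, policy: StoragePolicy) -> str:
        # A real fabric would choose nodes and media to satisfy the policy;
        # this sketch simply records the declared intent.
        self.volumes[name] = {"size_gb": size_gb, "policy": policy}
        return name

    def retier(self, name: str, policy: StoragePolicy) -> None:
        # The adaptive part: the same volume is re-tiered in place as
        # requirements evolve, with no application-visible migration.
        self.volumes[name]["policy"] = policy


fabric = DataFabric()

# Day 1: a modest database volume on mixed media.
fabric.provision("orders-db", size_gb=500,
                 policy=StoragePolicy(replicas=2, iops_target=5_000, media="hybrid"))

# Later: the workload has grown, so the policy is tightened, not the hardware.
fabric.retier("orders-db",
              StoragePolicy(replicas=3, iops_target=20_000, media="flash"))
```

Contrast this with a monolithic array, where the same change would typically mean sizing a new LUN, migrating data and reconfiguring hosts by hand.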
Overall, a data fabric platform automates storage provisioning for applications, significantly simplifying consumption compared to legacy systems, where provisioning must be done manually. It is faster and more adaptive, which lets enterprise IT teams focus on building and improving the applications themselves.

Data fabric technology is likely to become the data center solution of choice for the majority of enterprises within the next few years, a competitive advantage that will set IT-savvy companies miles ahead.