Disruption in Data Storage – Nuances & Opportunities
Blog: NASSCOM Official Blog
In a world where most of our technology needs are delivered as cloud-based services and AI systems know more about us than our friends do, it is only natural to expect that the underlying infrastructure supporting this digital immersion is ripe for disruptive change.
We are talking about the storage solutions that help store, organize, process, and safeguard the nearly 2.45 quintillion bytes of data created worldwide in a single day. The data storage segment has seen a slew of disruptions over the past few years.
One of the most significant shifts in the way storage solutions are viewed was the rise of Software-Defined Storage (SDS). SDS opened a whole new dimension by allowing enterprises to manage their explosive data storage needs with software running on local or cloud data centers.
So, what fueled the need for further disruptive innovations in the data storage space?
With roughly 4.57 billion people, or 59% of the global population, being active Internet users, the amount of data generated in the coming years may well outgrow the numerical labels we use today.
The growth of AI and machine learning has contributed significantly to the need for enterprises to store and analyze ever-larger volumes of data generated across their business. It is becoming clear that the more data enterprises can crunch, the greater the efficiency of the insights developed by AI systems.
The connected digital economy, in which consumers have more of their devices connected to the Internet than ever, is only proving to be the icing on the cake. With smartphones, wearables, smart televisions, intelligent home speakers, and even connected cars becoming more mainstream, several experimental and conceptual innovations in data storage have finally made it into real-world use cases.
Let’s have a look at the top three disruptions in data storage that bring about new opportunities for enterprises to actualize their digital aspirations.
Composable Architecture
Composable architecture refers to an approach of managing a digital infrastructure through software, typically via a web-based interface. By digital infrastructure in this context, we mean the computing hardware, storage, and network fabric. Each of these can be abstracted from its physical location and managed through an API that orchestrates applications over the decoupled infrastructure components.
Each device or component in a composable architecture has its own independent existence, so a technical issue in one or more components does not bring down the entire system. Storage is a vital part of composable architecture, and enterprises gain the freedom to extend any single component of the architecture, storage alone, for example, to support the management of increased data generation. Western Digital, for one, claims that composable architecture lowers Total Cost of Ownership (TCO) by over 40% compared with the hyperscale solutions enterprises have been relying on for a while now.
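To make the idea concrete, here is a minimal sketch of how decoupled resource pools might be composed into a logical system through software. All class and function names are hypothetical, for illustration only; real composable platforms expose this functionality through vendor REST APIs.

```python
# Sketch of the composable-infrastructure idea: independent pools of
# compute, storage, and network resources are assembled on demand via
# a software API, and each pool can be extended in isolation.

class ResourcePool:
    """A pool of interchangeable resources of one kind."""
    def __init__(self, kind, units):
        self.kind = kind
        self.free = list(units)

    def allocate(self, count):
        if count > len(self.free):
            raise RuntimeError(f"not enough free {self.kind} units")
        taken, self.free = self.free[:count], self.free[count:]
        return taken

    def release(self, units):
        # Failed or retired components return to the pool without
        # affecting anything else in the system.
        self.free.extend(units)

class ComposedSystem:
    """A logical server assembled from independently managed pools."""
    def __init__(self, compute, storage, network):
        self.compute, self.storage, self.network = compute, storage, network

def compose(compute_pool, storage_pool, network_pool, cpus, disks, nics):
    # Because each pool is independent, storage alone can later be
    # extended without touching compute or network.
    return ComposedSystem(
        compute_pool.allocate(cpus),
        storage_pool.allocate(disks),
        network_pool.allocate(nics),
    )

compute = ResourcePool("compute", ["cpu0", "cpu1", "cpu2", "cpu3"])
storage = ResourcePool("storage", ["ssd0", "ssd1", "ssd2"])
network = ResourcePool("network", ["nic0", "nic1"])

server = compose(compute, storage, network, cpus=2, disks=2, nics=1)
```

The point of the sketch is the decoupling: the pools outlive any one composed system, so growing the storage pool does not require re-provisioning compute or network.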
Computational Storage Services
Moving large amounts of data from a storage mechanism to the processing host is often a performance bottleneck in the enterprise world. The concept of Computational Storage came into existence to resolve this problem. It focuses on bringing data and computational processing together as elements of the storage unit itself. By integrating computing resources directly with storage, it improves application efficiency, facilitates parallel computation, and lowers the latency that data transmission between memory and compute incurs in traditional settings. Such gains in processing power and speed could prove highly influential in today's digital-first enterprise sector.
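A toy simulation can show why pushing computation down to the drive matters. The `ComputationalDrive` class below is purely illustrative, not a real device API; it simply tallies how many bytes cross the "bus" in each approach.

```python
# Toy illustration of the computational-storage idea: instead of
# moving every record across the bus to the host, the host pushes a
# small computation down to the drive and receives only the result.
# The class below is a simulation, not a real device API.

class ComputationalDrive:
    def __init__(self, records):
        self._records = records      # data at rest on the device
        self.bytes_transferred = 0   # crude stand-in for bus traffic

    def read_all(self):
        """Traditional path: ship every record to the host."""
        self.bytes_transferred += sum(len(r) for r in self._records)
        return list(self._records)

    def compute(self, predicate):
        """Computational-storage path: run the filter on the device
        and transfer only the matching records."""
        matches = [r for r in self._records if predicate(r)]
        self.bytes_transferred += sum(len(r) for r in matches)
        return matches

drive = ComputationalDrive([b"error: disk", b"ok", b"error: net", b"ok"])

# Host-side filtering moves everything across the bus first...
host_side = [r for r in drive.read_all() if r.startswith(b"error")]
moved_host = drive.bytes_transferred

drive.bytes_transferred = 0
# ...while in-storage filtering moves only the matches.
in_storage = drive.compute(lambda r: r.startswith(b"error"))
moved_device = drive.bytes_transferred
```

Both paths produce the same answer, but the in-storage path moves fewer bytes; on real datasets with selective queries, that gap is what computational storage exploits.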
NVMe over Fabrics
Non-Volatile Memory Express (NVMe) marked a major shift in how storage devices and technologies are standardized. A consortium of industry leaders in the storage sector came together to redefine and standardize the interface for non-volatile memory (NVM) devices attached to a PCIe bus. The main aim was to reduce the storage stack's demands on the CPU, leaving the CPU free for the core application. NVMe over Fabrics (NVMe-oF) is the protocol that extends this model, allowing storage commands to be carried seamlessly to remote NVM devices. It gives enterprises access to powerful remote storage while placing less load on the host CPU for storage management. As more storage device manufacturers come on board, NVMe-oF may soon become a dominant force in distributed storage that enterprises cannot ignore.
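The core idea of NVMe-oF, wrapping ordinary storage commands in capsules that a fabric carries to a remote target, can be sketched schematically. Every name below is hypothetical and the "fabric" is an in-memory stand-in; real NVMe-oF runs over RDMA, TCP, or Fibre Channel as defined by the NVMe-oF specification.

```python
# Schematic model of NVMe-oF: the host issues commands as if the
# device were local, a transport layer serializes them into
# "capsules", and a remote target executes them against its NVM.

import json

class RemoteNvmTarget:
    """Remote storage node exposing NVM through the fabric."""
    def __init__(self):
        self.blocks = {}

    def handle(self, capsule):
        cmd = json.loads(capsule)
        if cmd["op"] == "write":
            self.blocks[cmd["lba"]] = cmd["data"]
            return json.dumps({"status": "ok"})
        if cmd["op"] == "read":
            return json.dumps({"status": "ok",
                               "data": self.blocks.get(cmd["lba"])})
        return json.dumps({"status": "invalid"})

class FabricTransport:
    """Stand-in for the fabric: carries command/response capsules."""
    def __init__(self, target):
        self.target = target

    def send(self, capsule):
        # In a real deployment this crosses the network; here we
        # simply hand the serialized capsule to the target.
        return self.target.handle(capsule)

class NvmeOfHost:
    """Host side: the application sees a local-looking block device."""
    def __init__(self, transport):
        self.transport = transport

    def write(self, lba, data):
        return json.loads(self.transport.send(
            json.dumps({"op": "write", "lba": lba, "data": data})))

    def read(self, lba):
        return json.loads(self.transport.send(
            json.dumps({"op": "read", "lba": lba})))

host = NvmeOfHost(FabricTransport(RemoteNvmTarget()))
host.write(0, "hello")
result = host.read(0)
```

The design point the sketch captures is the separation of concerns: the host's command set does not change when the device moves from a local PCIe slot to the far side of a fabric.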
The world is witnessing major changes. With the dawn of 5G, the Internet will see more traffic than ever before from a barrage of devices ranging from smartphones to refrigerators and automobiles that go online. The pandemic has driven enterprises into a fully remote working mode, with thousands of employees accessing enterprise data, tools, and technology stacks from homes and remote locations. Devices, systems, and tools are generating massive volumes of data. Handling data of this magnitude requires a very different storage approach from traditional practice. Enterprises will need to incorporate newer storage innovations into their data management strategies to sustain the growth of their business. The three disruptors covered here could well form part of that select set.