Integration Captures Big Data’s Hidden Value
Blog: Software AG Blog - Reality Check
More and more companies are adopting the Apache Hadoop framework as data volumes grow from gigabytes to terabytes; they need a cost-effective way to store and process large data sets. But most firms adopt these platforms without realizing their full value.
With the explosion in big data, a cost-effective framework such as open-source Hadoop was needed. Storing vast amounts of data in traditional databases is expensive and slow. The Apache Hadoop framework allows for the distributed processing of large data sets across clusters of computers using simple programming models.
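To make the "simple programming models" concrete: Hadoop's classic model is MapReduce, where a map step emits key-value pairs and a reduce step aggregates them by key. The sketch below is a single-process illustration in plain Python (not Hadoop's actual Java API) of counting words across records — in a real cluster, the same two phases run distributed across many machines.

```python
# A minimal, single-process sketch of the map/reduce programming model
# that Hadoop popularized. Real Hadoop distributes these phases across
# a cluster; this only shows the shape of the computation.
from collections import defaultdict

def map_phase(records):
    # Map: emit a (key, value) pair for every word in every record.
    for record in records:
        for word in record.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle + reduce: group the pairs by key, then sum each group.
    groups = defaultdict(int)
    for key, value in pairs:
        groups[key] += value
    return dict(groups)

# Hypothetical interaction logs standing in for a large data set.
logs = ["customer viewed item", "customer bought item"]
counts = reduce_phase(map_phase(logs))
print(counts)  # {'customer': 2, 'viewed': 1, 'item': 2, 'bought': 1}
```

Because the map step treats each record independently and the reduce step only needs pairs grouped by key, both phases parallelize naturally — which is exactly what makes the model cheap to scale.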
The low cost of adoption means that, unlike with traditional databases, companies can store anything they want in Hadoop without worrying about the size of the data store or the processors it will need. As a result, the types of data sets being stored have grown. Everything about customers' behavior, preferences, history, geo-location – and much, much more – is being stored with little plan to use it.
Many companies are simply moving their data into Hadoop and forgetting about it. They are not using it to drive analytics or getting any real business value out of it.
Imagine that big data stored in Hadoop is like a giant freezer containing tubs of ice cream in many different flavors. But you do not have a spoon, so you cannot scoop out the icy goodness. Nor is it only ice cream; the freezer also holds main courses, soups and desserts, but you have no way to get them out. You want one tool to take them all out – you don't want to find a different tool for each flavor or food type.
Now imagine you have a spoon that can take out not just the ice cream of your choice, but also a steak and some frozen peas for dinner. This is the kind of tool needed to realize the value of your data in Hadoop: a kind of middleware spoon that takes real-time data out and mixes it with static or historical company data to use for analytics.
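The middleware-spoon pattern can be sketched as an enrichment step: a live event arrives, and the integration layer looks up the customer's stored history to produce a single record ready for analytics. The names below (`historical_profiles`, `enrich_event`) are illustrative assumptions, not any real product's API; in practice the lookup would be backed by the Hadoop store rather than an in-memory dictionary.

```python
# A hedged sketch of the "middleware spoon": enriching a real-time event
# with historical data. All names here are hypothetical.

# Imagine this lookup is backed by customer data stored in Hadoop.
historical_profiles = {
    "cust-42": {"purchases": 17, "favorite_category": "ice cream"},
}

def enrich_event(event, profiles):
    """Mix a live interaction event with the customer's stored history."""
    profile = profiles.get(event["customer_id"], {})
    # Combine the real-time fields with the static/historical ones.
    return {**event, "history": profile}

live_event = {"customer_id": "cust-42", "action": "hover", "item": "spoon"}
enriched = enrich_event(live_event, historical_profiles)
print(enriched["history"]["favorite_category"])  # ice cream
```

The design point is that neither side changes: the event stream stays fast and the historical store stays big, and the integration layer joins them only at the moment an analytic decision is needed.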
Amazon, for example, uses this kind of middleware spoon to scoop out relevant data. If you are on the Amazon site and you hover over an item, it captures that interaction in order to improve the website. It can see whether you read the product reviews and whether its suggestions ("Customers who bought X also bought Y") were heeded. In other words, Amazon uses all of the ice cream in the freezer to influence customer behavior.
There are other stores that have a freezer full of information too, but do nothing with it. Some retailers, for example, have reams of data on their customers and their purchases yet just leave it sitting in Hadoop. They are customer centric, yes, but they are not using the customer data to improve and refine their offerings.
By using the spoon – that is, by integrating existing systems with newly acquired Hadoop platforms – you will unlock the hidden value in the freezer. Only then can big data be used to make smart decisions and improve customer satisfaction.