We need to prevent insights from dying in the big data avalanche

Most enterprises think they know how promising their data is. The truth is, they don’t realize just how much value is hidden in the massive amounts of data they sit on — even as more data rolls in. And because of this, the best insights — the ones that can be harnessed for transformative change — are at risk of getting buried in today’s data avalanche.

Analyze your data in the now.

Back when I was working on my PhD, I spent my days in the lab of a major telecommunications operator. My job was to run algorithms on sensor-generated data to identify valuable trends and clues about network performance.
The best part of the day was when the FedEx truck arrived and I could get my hands on boxes of network data storage drives with mountains of months-old data generated by those sensors just waiting to be analyzed. Talk about timely insights being dead on arrival!

My employer had no idea what insights lay buried inside those drives. And yet, collecting, storing and sending the data out for analysis was the only option they had at the time. At that point I realized the model for data analysis had to change on a fundamental level, especially if data was going to continue its exponential growth curve. Businesses needed to analyze data as the avalanche roared in, and it was going to take some sturdy tools to do it.

The smartphones and tablets we rely on today contain a wealth of information on us — our preferences, our habits, our behavior. And this is just one kind of machine interface. There are also cars, for example, which now come equipped with an array of sensors to gauge everything from driving styles to road conditions and wear-and-tear, all in the interest of making driving safer and more enjoyable. Meanwhile, cities are deploying wireless sensors in stoplights for improved traffic surveillance. In disaster-prone regions, bridges and buildings can even evaluate their own stress points.

This phenomenon gives us an extraordinary opportunity — one that no civilization has had before — to know the now. If businesses act fast enough, they can distill that knowledge into timely, intelligent, data-driven insights for more agile operational and business processes. For example, an auto collision warning that pops up three weeks after the crash itself is useless. It’s the immediacy of insight, translated straight into action, that safeguards us against disaster.

We need tools for real-time analysis

Now that we can gather such immediate data from a variety of devices and places, it’s imperative we put it to work to our advantage. How can businesses parse data in a timely manner to identify trends, glean new insights into customer behavior, and respond immediately to changing market dynamics or customer habits? How can we best take divergent sources of data and dynamically fuse them together so people, machines and processes make optimal responses at any given moment in time?
In order to save this data from a premature death, and catapult it into a driving force for a data-driven global economy, both the enterprise and the analytics architecture must rise to the occasion. Enterprises need a new approach to analytics where contextually aware applications are based on specific use cases, built on a new data processing stack and backed by a new economic model.

As it stands today, big data analytics is a patchwork of disparate toolsets and technologies. What’s missing is a foundational architecture to support all these individual tools — a complete, holistic stack that can help organizations get from data ingestion to data decisions in one fell swoop. This new architecture must recognize that a sensor-rich world creates data continuously, and that to take immediate action, the analysis must also happen continuously, rather than after the fact, once the data has been stored away. It must also combine a variety of data sources instead of keeping them in silos. And it must elastically scale to the petabytes of structured and unstructured data that are now generated on a nonstop basis.
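To make the contrast concrete, here is a minimal sketch in Python of what continuous analysis looks like. It is purely illustrative — not any particular vendor’s stack — and the source names, threshold and statistics are assumptions: readings from two hypothetical feeds are fused and evaluated the instant they arrive, instead of being boxed up on drives for an offline batch job.

```python
# Illustrative sketch: act on sensor readings as they stream in, rather than
# storing them for after-the-fact batch analysis. Names and thresholds are
# hypothetical.
import random
import time
from collections import defaultdict

def sensor_stream(n_events=20):
    """Simulate a continuous feed fusing two sources instead of keeping them in silos."""
    sources = ["network_probe", "traffic_sensor"]
    for _ in range(n_events):
        yield {
            "source": random.choice(sources),
            "value": random.gauss(100, 15),
            "ts": time.time(),
        }

def analyze_continuously(stream, threshold=130):
    """Maintain running statistics per source and flag anomalies the moment they occur."""
    counts = defaultdict(int)
    totals = defaultdict(float)
    for event in stream:
        src = event["source"]
        counts[src] += 1
        totals[src] += event["value"]
        running_avg = totals[src] / counts[src]
        # Act in the now: raise the alert as the reading arrives,
        # not months later when the drives are finally unpacked.
        if event["value"] > threshold:
            print(f"ALERT {src}: {event['value']:.1f} (running avg {running_avg:.1f})")

if __name__ == "__main__":
    analyze_continuously(sensor_stream())
```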

Equally important is the need for a new economic model for data processing. Today, enterprise customers spend tens of millions of dollars up front on data projects, the majority of which goes toward capturing and storing the data. They must then wait a year or more to start seeing value from their data assets.

Our data-rich world therefore needs a new paradigm where enterprises first spend on analytics — not storage — with an agile, iterative approach that proves out the value of a particular idea in the first days and weeks of deployment. Once proven, this use case is swiftly rolled out as an application that any business manager can use to make decisions. This business value-led approach to big data can then be scaled across other functional areas of the business and power data-driven decision-making across the enterprise.

Once enterprises embrace this new approach, big data’s vast potential will no longer be crushed by its own weight. If data is at risk of being lost in the avalanche, our analytics platforms should serve as first responders to the emergency.

Anukool Lakhina is the CEO of Guavus.

