Big data has reached a critical stage. The market is poised to grow to more than $50 billion by 2017, but more than 55 percent of big data projects fail.
With so much opportunity coupled with hype and misinformation, we are in the midst of the big data Wild West. There is a standoff coming between those who understand what big data is — the ones making investments to collect, store and harvest it — and those who are buying snake oil and don’t understand how big data can impact their business.
The good
Big data allows you to fail. This may sound counterintuitive, but when it comes to big data, there’s good failure and bad failure.
Confused? Here’s an explanation: Big data, in its raw form, allows for a “test and learn” approach. Companies have to create many small “failures” by developing hypotheses and examining them against the data. This allows enterprises to develop a truly coherent strategic approach grounded in data.
These “failures,” part of the process of producing good, unbiased analysis, create tremendous opportunities for companies in a number of areas: customer recommendations, risk measurement, device-failure prediction and streamlined logistics, to name a few.
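To make that loop concrete, here is a minimal sketch of a single test-and-learn cycle in Python. The experiment, visitor counts and conversion numbers are invented for illustration; the point is only that a hypothesis gets checked against data, and a null result counts as a cheap, useful failure.

```python
# A hypothetical test-and-learn cycle: the experiment and all numbers below are
# illustrative assumptions, not figures from any real company.
from scipy import stats

# Hypothesis: a new recommendation widget lifts the site's conversion rate.
control = {"visitors": 20000, "conversions": 520}   # existing experience
variant = {"visitors": 20000, "conversions": 610}   # experience with the widget

# Compare the two conversion rates with a chi-squared test on the 2x2 table
# of (converted, did not convert) counts per group.
table = [
    [control["conversions"], control["visitors"] - control["conversions"]],
    [variant["conversions"], variant["visitors"] - variant["conversions"]],
]
chi2, p_value, _, _ = stats.chi2_contingency(table)

lift = variant["conversions"] / variant["visitors"] - control["conversions"] / control["visitors"]
print(f"observed lift: {lift:.4f}, p-value: {p_value:.4f}")

# A null result here is a "good failure": the hypothesis is discarded cheaply and
# the team moves on to the next one using the same data pipeline.
```

Run enough of these inexpensive experiments and the hypotheses that survive become the coherent, data-grounded strategy described above.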
In some of the best cases, businesses are developing new products from their insights, creating new revenue streams and even transforming their culture to be data-driven. To reach this stage of evolution, businesses must attack scalability and cost containment, develop agile analytics and insights, and optimize their business with automated predictive analytics at scale. Only then can they move to the final stage of the process, where big data capabilities transform the business and help create new products based on the data.
Speed is the key to success in the early stages of big data implementation. The faster you can complete projects and build organizational expertise in using data in this new way, the sooner you can create value and move to a more sophisticated stage of adoption.
Quantcast is a perfect example of this four-stage transformation. It began as a free audience-measurement service that directly measures website traffic. As it gained traction and grew from thousands to billions of events per day, Quantcast quickly needed to scale beyond traditional database technologies. As the business expanded, it became essential to provide higher-quality insights and analytics to add value and deepen customer relationships beyond basic audience measurement.
Quantcast quickly saw the need to invest in data science to address complex challenges in counting unique people and identifying demographics and interests among a sea of anonymous activity. Subsequently, Quantcast used this environment to test a number of products and offerings, including a highly successful, scalable advertising solution, called Lookalikes, that allows advertisers to target new consumers who are highly similar to their already-engaged ones. Quantcast now earns more than $100 million in annual revenue, and its agility allowed the company to react as big data opportunities developed.
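The idea behind lookalike targeting can be sketched in a few lines. The behavioral features, the centroid-plus-cosine-similarity scoring and the 1 percent cutoff below are illustrative assumptions, not a description of Quantcast's actual models.

```python
# A hedged sketch of lookalike audience selection: score anonymous users by how
# similar their behavior vectors are to an advertiser's already-engaged users.
import numpy as np

rng = np.random.default_rng(0)

# Rows are users, columns are behavioral features (e.g., visit counts by site
# category); random data stands in for real logs here.
engaged = rng.random((500, 20))         # users already engaged with the advertiser
candidates = rng.random((100_000, 20))  # anonymous users to score

# Describe the engaged audience by its centroid, then rank candidates by cosine
# similarity to that centroid.
centroid = engaged.mean(axis=0)
scores = candidates @ centroid / (np.linalg.norm(candidates, axis=1) * np.linalg.norm(centroid))

# Take the top 1 percent most similar candidates as the "lookalike" audience.
top_k = int(0.01 * len(candidates))
lookalikes = np.argsort(scores)[-top_k:]
print(f"selected {len(lookalikes)} lookalike users out of {len(candidates)}")
```

In production, this kind of scoring would run over billions of daily events on distributed infrastructure, which is why the scalability work had to come first.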
Quantcast, Google, Facebook and LinkedIn are pioneers in big data and have gone through these stages of evolution. As big data moves beyond the webscale pioneers, other enterprises are beginning to embrace it to create value and build competency. For example, large IT suppliers are using detailed technical product data, blended with business data, to optimize their businesses with predictive analytics that enable capabilities such as proactive servicing of customer devices.
The bad
Unfortunately, many enterprises are still in kick-the-tires mode and are exploring big data only from a cost-containment or storage-scalability point of view. They may be looking at “agile analytics” — the ability to work flexibly with data, unconstrained by the traditional limits of developer resources or database capacity.
But this means that many, if not most, companies exploring big data are missing opportunities to improve their business and provide better service to customers. They are also potentially missing out on the chance to develop new products based on data rather than just intuition. They are reaching big data plateaus — achieving the ability to store data, but not extracting additional value.
Big data requires an investment of people and resources — the human layer to make sense of all the technology. Traditional methods of cost savings often require staff reductions. With big data, the opposite is true. In order to achieve big data breakthroughs, companies need to invest in technology and people. Larger enterprises that are unable to move quickly and make these investments will allow smaller, nimbler competitors to gain a competitive edge.
The ugly
It doesn’t get any uglier than a majority failure rate — that aforementioned 55 percent. Why the failure? One thought is that in the Wild West of data, there are hustlers and charlatans who promise the world but don’t produce results. These vendors realize there’s a lot of hype around big data and behave accordingly. Many legacy consultants and systems integrators have positioned themselves as experts despite their lack of qualifications.
Likewise, many established product vendors are marketing last-generation environments as “big data.” Many of these vendors run their workloads on a single computer with SAS models — this is not big data! Others focus on data munging and reporting, or on extracting, transforming and loading against small relational databases. These vendors often sell pre-built proprietary software that doesn’t leverage open-source standards such as Apache Hadoop.
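For contrast, the open-source, scale-out pattern the article points to looks roughly like the following Hadoop Streaming sketch. The tab-separated log format, the position of the event-type field and the counting task itself are hypothetical; what matters is that the same two small scripts run unchanged whether the input is one gigabyte on a laptop or hundreds of terabytes spread across a cluster.

```python
# mapper.py -- Hadoop Streaming mapper: emit (event_type, 1) for each log line.
# Assumes tab-separated logs with the event type in the third column (hypothetical).
import sys

for line in sys.stdin:
    fields = line.rstrip("\n").split("\t")
    if len(fields) >= 3:
        print(f"{fields[2]}\t1")
```

```python
# reducer.py -- Hadoop Streaming reducer: sum the counts for each key.
# Hadoop delivers the mapper output grouped and sorted by key on stdin.
import sys

current_key, count = None, 0
for line in sys.stdin:
    key, value = line.rstrip("\n").split("\t")
    if key != current_key and current_key is not None:
        print(f"{current_key}\t{count}")
        count = 0
    current_key = key
    count += int(value)
if current_key is not None:
    print(f"{current_key}\t{count}")
```

A job like this is submitted with the standard streaming jar, for example `hadoop jar hadoop-streaming.jar -input events/ -output event_counts/ -mapper mapper.py -reducer reducer.py -file mapper.py -file reducer.py`, and Hadoop handles the partitioning, sorting and fault tolerance across the cluster.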
We are at a crucial point in the trajectory of big data — one that demands consistent results in order to ensure continued growth. When companies are fooled into thinking inadequate technologies or techniques are big data, their businesses will suffer as their analytics fall short of what is needed. If their projects fail or produce misinformation, these enterprises will lose ground to competitors that do understand how to use big data technologies properly.
Ron Bodkin is co-founder and CEO of Think Big Analytics. He will be speaking Thursday, Sept. 19, at our Structure: Europe conference in London.