For big data analytics, recall the tried-and-true old-school rules

Data analysis didn’t start with Hadoop. Companies have been mining data for insights for decades, and while the technology has changed, some of the old rules still apply, or ought to, as data gets bigger and bigger.

Jack Rivkin, an occasional blogger with deep investment experience, recently wrote about the best practices he was exposed to early in his career working on economic forecasts, and offered some sage suggestions for enterprises to bear in mind as they consider and implement big data strategies. Among his insights:

  • Forecasting models can only be as good as the data inputs.
  • Be skeptical and hedge when sharing the models by noting factors that could lead to different results.
  • The less time it takes to process data, the more valuable it is.
  • Constantly improve models and inputs.

Of course, big data isn’t wholly evolutionary; it also brings new opportunities and risks of its own. Some of the world’s leading data scientists, IT executives and business users will address them at GigaOM’s Structure:Data conference in New York on March 20-21.

Feature image courtesy of Flickr user luckey_sun.
