Nasdaq on the virtues of the public cloud

The financial services industry needs to store a lot of data, and the folks at Nasdaq want to use it to help build new products. But they also have to keep an eye on costs. Ann Neidenbach, an SVP at the Nasdaq OMX Group, explained Thursday at the Structure:Data 2013 conference in New York how the exchange decided to use Amazon's public cloud for storage.

“The economics of the clouds are just phenomenal,” said Neidenbach. She and Ron Bodkin, the founder and CEO of Think Big Analytics, noted that exchanges have regulatory requirements to keep petabytes of old trading data, and that adds up. Neidenbach said the exchange holds 10 petabytes of trade data and another 3.5 petabytes that is just backups. “We’re not even talking about email or even instant messaging,” she said. “It’s a tremendous amount and that’s why we went into this partnership with AWS.

“We have to leverage the public cloud where storage is so much cheaper per gigabyte,” she added.

The exchange did have to spend time making sure the public cloud was secure enough, not just for its own data but also for its customers’ data, and that journey is just beginning. But Neidenbach implied that, given the cost of public cloud storage and the amount of data financial institutions must keep, the journey and its eventual public cloud destination were inevitable.

And the end result of having all this data stored in one public place might be new products. Neidenbach expects that combining trade data with social media could open new opportunities for trading and research. You had the feeling the sky might be the limit.

Check out the rest of our Structure:Data 2013 live coverage here; a video embed of the session follows below.



GigaOM