Feds prep big data confab

All right, all you big data nerds: time to suit up for NIST's Big Data Workshop, slated for next week.

The goal, according to the workshop's website, is to explore:

  • State-of-the-art core technologies needed to collect, store, preserve, manage, analyze, and share big data that could benefit from standardization
  • Potential measurements to ensure the accuracy and robustness of methods that harness these technologies

The agenda for the event, hosted by the National Institute of Standards and Technology's Information Technology Laboratory, includes sessions on healthcare analytics and big data programs. Speakers in the latter session will include folks from the National Science Foundation, the Department of Energy, the Intelligence Advanced Research Projects Agency and the Air Force Office of Scientific Research. Experts from industry heavyweights Google, IBM, Oracle, and Microsoft, all of which have launched massive efforts in this area, will speak on big data platforms.

The federal government has been thinking, and talking, about big data a lot of late. In late March, the Obama administration hosted a webcast outlining its "Big Data Research and Development Initiative," which allocates more than $200 million annually in research grants and other investments to promote the use of big data to solve problems. That's a pretty big number in a time of stressed and stretched federal budgets, so the administration is clearly betting that big data techniques can pay off in increased productivity and cost savings over the long run.

For more on the event, to be held in Gaithersburg, Md., June 13-14, check out the Computing Community Consortium's blog.

Feature photo courtesy of Flickr user Kevin Krejci


Related research and analysis from GigaOM Pro:

  • 2012: The Hadoop infrastructure market booms
  • Infrastructure Q1: Cloud and big data woo enterprises
  • Why service providers matter for the future of big data
