Google I/O sensors will detect motion and generate data for real-time visualization

While there will be no shortage of smartphone-equipped developers and media recording the goings-on at the Google I/O developer conference later this week, Google plans on conducting its own experiments. To get the most out of the event at the Moscone Center in San Francisco, it will deploy a fleet of Arduino boards throughout the venue to detect humidity, motion, sound and temperature.

According to a Monday blog post from Michael Manoochehri, a Google developer program engineer, Google will take the data coming in from the Arduino boards and visualize it all in real time with Google Cloud Platform services such as Google Compute Engine and BigQuery. And it’s no teensy-weensy data set:

“Altogether, the sensors network will provide over 4,000 continuous data streams over a ZigBee mesh network managed by Device Cloud by Etherios.”
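Google hasn't published that code yet (the post promises it will be open-sourced), but the shape of the problem is clear from the description: thousands of sensor streams landing in BigQuery, then boiled down with aggregate queries for the on-screen visualizations. Here is a minimal, purely illustrative Python sketch of that aggregation step; the stream counts, sensor values and function names are all hypothetical, and a grouped average stands in for whatever queries Google actually runs:

```python
import random
from collections import defaultdict

# Hypothetical stand-ins for the four sensor types named in the post.
SENSOR_TYPES = ["humidity", "motion", "sound", "temperature"]

def simulate_readings(num_streams=4000, readings_per_stream=10, seed=42):
    """Generate fake (stream_id, sensor_type, value) rows, mimicking
    the ~4,000 continuous data streams described in the blog post."""
    rng = random.Random(seed)
    rows = []
    for stream_id in range(num_streams):
        sensor_type = SENSOR_TYPES[stream_id % len(SENSOR_TYPES)]
        for _ in range(readings_per_stream):
            rows.append((stream_id, sensor_type, rng.uniform(0.0, 100.0)))
    return rows

def average_by_type(rows):
    """Group readings by sensor type and average them -- roughly what
    a query like 'SELECT type, AVG(value) FROM readings GROUP BY type'
    would compute over the collected table."""
    totals = defaultdict(lambda: [0.0, 0])
    for _, sensor_type, value in rows:
        totals[sensor_type][0] += value
        totals[sensor_type][1] += 1
    return {t: s / n for t, (s, n) in totals.items()}

rows = simulate_readings()
averages = average_by_type(rows)
for sensor_type, avg in sorted(averages.items()):
    print(f"{sensor_type}: {avg:.1f}")
```

In the real deployment the readings would be streamed into BigQuery rather than held in memory, and the dashboards would re-run queries like this continuously to keep the visualizations live.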

The visualizations will be on display on screens during the conference. And Google said it will open-source the Cloud Platform code and release the resulting data.

O’Reilly Media has used Arduinos at events for similar purposes before, as I reported in February. How are the deployments different? For one thing, Google uses the Google cloud — surprise, surprise — while O’Reilly has used Amazon Web Services. The question is whether the project will persuade non-Google developers to try using the Google Cloud Platform for their own programs to crunch data generated by sensors.
