

During the Google I/O conference last week, visitors had their movements tracked and rendered as live visualizations. It’s part of a project out of the Data Sensing Lab, run by a team of six including faculty member Rob Faludi.
Reporting from the event, the Economist writes:
> [T]he Data Sensing Lab showed live visualisations of people flowing out of seminars and forming an eager cluster around a stand showcasing Google Glass wearable computers. It also highlighted the noisiest area (the keynote by Larry Page, Google’s co-founder) and the quietest (a pop-up shop selling Google-branded products). All the data will be made freely available online after the conference wraps up.

The Lab, a project of O’Reilly Media:

> [H]as deployed over 500 sensor motes at key locations around the Moscone West centre. Each phone-sized mote is a self-contained computer based on a cheap Arduino micro-controller and linked with low power ZigBee digital radios. Some measure temperature, pressure, noise, humidity and light levels. Others are tracking air quality, the motion of crowds or how many mobile phones are being used nearby. Together, they form a network producing over 4,000 streams of data that are uploaded to Google’s Cloud Platform software for analysis.
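The architecture described above, with each Arduino-class mote sampling several sensors and streaming readings over radio to a cloud backend, can be sketched in miniature. The mote ID, sensor names, and JSON message format below are illustrative assumptions for this sketch, not the project's actual code:

```python
import json
import random
import time

# Hypothetical sensor set; the real motes measured temperature,
# pressure, noise, humidity, light, and more.
SENSORS = ["temperature", "pressure", "noise", "humidity", "light"]

def sample_sensor(name):
    """Placeholder for a real analog read on the Arduino hardware."""
    return round(random.uniform(0.0, 1.0), 3)

def make_reading(mote_id):
    """Package one round of samples as a single data-stream message,
    standing in for the ZigBee uplink to the cloud backend."""
    return {
        "mote": mote_id,
        "timestamp": time.time(),
        "readings": {name: sample_sensor(name) for name in SENSORS},
    }

# Each mote contributes several streams; with 500+ motes and multiple
# sensors per mote, the network reaches thousands of streams overall.
message = json.dumps(make_reading("moscone-west-042"))
print(message)
```

Each message here is one tick of one mote's output; in the deployed network, aggregating these per-sensor streams across all motes is what produced the 4,000-plus streams analyzed on Google's Cloud Platform.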
Nice work by Rob and team!
Some well-deserved press:
- CNet: Sensor motes sniff out Google I/O data trends
- Gigaom: Google I/O sensors will detect motion and generate data for real-time visualization
- Venture Beat: At its conference, Google will be tracking your every step
- Economist: Mote Learning
- Tom’s Hardware: How Google Tracks Everyone and Everything at I/O 2013
- PC World: Tech innovation not limited to Google’s big showcase
- Engadget: Google I/O attendees will see their every move recorded in real time
- Information Week: Google I/O Features Sensor Network
- MAKE: The Road to the I/O Sensor Network
- Google Developer Blog: Data Sensing Lab at Google I/O 2013: Google Cloud Platform meets the Internet of Things