Google’s latest innovations debuted at Google I/O 2013, including a new 3D version of Google Maps.
In his opening remarks at the sixth annual Google I/O, held this past May 15–17, Google’s senior vice president of engineering, Vic Gundotra, welcomed the 6,000 developers who were physically present at San Francisco’s Moscone Center for the international web developers conference — as well as the 40,000-plus participants at 440 viewing parties across 90 countries, and the more than 1 million people watching live on YouTube. While those virtual attendance numbers are staggering, as with most events, it was the in-person attendees who benefited most from the experience. The same was true for Google I/O organizers, who mined data — on a whole other level — from those face-to-face attendees.
Along with showcasing breakthrough technology for developers at Google I/O, organizers used the event itself as an incubator for revolutionary sensory-data-collecting technology. They installed 500 sensory panels throughout the center to record and monitor every loud noise, every spike in humidity — and pretty much every step taken by the thousands of attendees.
The idea for Google’s Data Sensing Lab came from a Google employee who had attended Strata 2013, O’Reilly Media’s annual computer-technology conference, held in Santa Clara, Calif., on Feb. 26–28. The O’Reilly Data Sensing Lab team gathered environmental data at Strata, then displayed visualizations of the findings. There were only 50 sensory boards at Strata; Google magnified that number tenfold for its own event, with each board featuring several sensors that detected noise, light, motion, humidity, air quality, temperature, and pressure.
Sensory mats recording delegates’ steps were placed throughout high-traffic areas of the conference, including the Developer Sandbox, a lounge with couches, plasma TVs, and presentation space, where more than 100 developers shared their innovations based on Google I/O-featured technologies through live demonstrations and Q&A sessions. “We put [the sensor boards] up all over Moscone,” Kim Cameron, a technical writer at Google, said during a 40-minute session held at Google I/O about the Data Sensing Lab. The boards were placed on all the floors, she said, as well as in “some interesting places.” For example, sensors were placed inside one of the blimps circling the second floor of the center to film a live stream of the conference.
Data collected by the sensor boards was stored and analyzed on the Google Cloud Platform, then turned into colorful, easy-to-read graphs displayed on the screens at the Developer Sandbox. Real-time temperature maps let Google see which rooms were the most heavily populated. “We wanted to be able to do things like find correlations between data,” Cameron said, “or at least facilitate that for the future.” For instance, they found that more steps taken by attendees in one room corresponded to poorer air quality in that area. Audio graphs displayed information like the 40 noisiest moments at Google I/O, including, not surprisingly, when Billy Idol performed the first night. “We’ve set up a framework where we can specify the kinds of things we want to monitor,” said Amy Unruh, a developer programs engineer for the Google Cloud Platform, during Google I/O’s Data Sensing Lab session, “and how frequently we want to look for them. We need to start mining this data, looking for interesting patterns.”
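Unruh’s description of the framework — naming the things to monitor and how frequently to look for them — can be sketched in outline. The snippet below is a hypothetical illustration, not Google’s actual pipeline; the metric names, sampling intervals, and thresholds are invented for the example.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# Hypothetical sketch: each Monitor names a metric, how often to
# sample it (in seconds), and a predicate that flags a reading as
# "interesting" enough to surface on a dashboard.

@dataclass
class Monitor:
    metric: str
    interval_s: int
    is_interesting: Callable[[float], bool]

def scan(monitors: List[Monitor], readings: Dict[str, float]) -> List[str]:
    """Return the metrics whose latest reading is flagged as interesting."""
    flagged = []
    for m in monitors:
        value = readings.get(m.metric)
        if value is not None and m.is_interesting(value):
            flagged.append(m.metric)
    return flagged

# Example configuration (invented thresholds, for illustration only).
monitors = [
    Monitor("noise_db", interval_s=1, is_interesting=lambda v: v > 85.0),
    Monitor("humidity_pct", interval_s=60, is_interesting=lambda v: v > 70.0),
    Monitor("co2_ppm", interval_s=30, is_interesting=lambda v: v > 1000.0),
]

# Simulated sensor readings, e.g. during a loud keynote demo.
readings = {"noise_db": 92.3, "humidity_pct": 48.0, "co2_ppm": 1150.0}
print(scan(monitors, readings))  # ['noise_db', 'co2_ppm']
```

In a real deployment the readings would stream in from the boards and the flagged events would feed the kind of graphs shown at the Developer Sandbox; the point here is only the configurable what-and-how-often structure Unruh described.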
Close-up of a sensory panel
Google is laying the groundwork to gain valuable insights into the attendee experience. Both Cameron and Unruh stressed that they are still experimenting, and that every event would use this technology differently, depending on the venue. “I think that is one of the ongoing challenges,” Cameron said, “like how do you know how many [sensors] to put, and where do you put them and how dense do you pack them in?”