Smartphone Sensing Group

We began working on mobile phone sensing in 2006, when we first developed BikeNet using the Nokia N80 and a large number of on-bike sensors. We built four of these sensor bikes back then. Following that, we developed CenceMe for the Nokia N95 in 2007. CenceMe runs human behavioral models on the phone and pushes what the user is doing and information about their surrounding context (collectively called sensing presence) to Facebook, MySpace and Twitter. CenceMe was the first mobile sensing app to be released when the App Store opened in 2008. We learnt a terrific amount from the development and deployment of the CenceMe app. The App Store delivery system and supporting a large, globally distributed user base has really changed how we do research - it's challenging but exciting.

Since then we have gone on to build more sophisticated mobile sensing apps and systems, including the Neural Phone (aka NeuroPhone) (2010), EyePhone (2010), SoundSense (2009) and Darwin Phones (2009). We have just released a new app on Android Market and the App Store called VibN - catch the vibe of the city. We've also developed two new sensing engines for mobile phones called Jigsaw (2010) and Visage (2010) - Visage uses the front-facing camera to infer head poses and facial expressions. Check out the demos or download the apps - some of the more experimental apps are not available on the app stores.




Neural Phone (2010)

Neural Phone (aka NeuroPhone) was developed with Tanzeem Choudhury and Rajeev Raizada.



EyePhone (2010)



Darwin Phones (2009)




SoundSense (2009)

SoundSense was developed with Tanzeem Choudhury.



CenceMe (2007) 

Built using 30 Nokia N95 phones in 2007 and available on the App Store since 2008.





BikeNet (2006) 

Nokia N80 phone + a large number of on-bike sensors (CO2 map shown below, from BikeView, the BikeNet user portal)

 

Three of the bikes are rusting, chained up outside (New England weather), and one is in the lab. I joke that the business model was a stretch: each bike cost $45 from Walmart and $2,000 for the sensors from Moteiv.



How did we do ground truth sensing? Check out Emiliano modeling our quad-video helmet (again, no business model), but the local Hanover cops were quite perplexed by four people riding the sensor bikes around a small New England town.





Finally, imagine December 2006. It is snowing. We need data for our paper. If only we worked in computational science, we'd run our algorithms on a billion-node cluster and be done. But we chose bikes! Many hundreds of kilometers were cycled in the freezing cold in pursuit of data. It was fun. We learnt a lot - like how you do "debugging on the go". It is akin to the Tour de France, with the backup car tailing the cyclist. It was quite exciting to see the sensor bikes on the roads around Hanover being followed by our intrepid "debug bike". Emiliano is going to dig out some quad-video footage from the ground truth archive... stay tuned.