We began working on mobile phone
sensing in 2006, when we first developed BikeNet using the Nokia
N80 and a large number of on-bike sensors. We built four of these
sensor bikes back then. Following that, we developed CenceMe for
the Nokia N95 in 2007. CenceMe runs human behavioral models on the
phone and pushes what the user is
doing and information about their surrounding context (collectively called
the user's sensing presence) to social networks such as
MySpace and Twitter. CenceMe was the first mobile sensing app to
be released when the App Store opened in 2008. We learnt a terrific
amount from the development and deployment of the CenceMe app.
The App Store delivery system and the need to support a large, globally
distributed user base have really changed how we do research - it's
challenging but exciting.
Since then we have gone on to build more sophisticated mobile sensing apps and systems, including the Neural Phone (aka NeuroPhone) (2010), EyePhone (2010), SoundSense (2009), and Darwin Phones (2009). We have just released a new app on the Android Market and the App Store called VibN - catch the vibe of the city. We have also developed two new sensing engines for mobile phones called Jigsaw (2010) and Visage (2010) - Visage uses the front-facing camera to infer head poses and facial expressions. Check out the demos or download the apps - some of the more experimental apps are not available on the app stores.