Ph.D. Defense
Enabling Context-Awareness in Mobile Systems via Multi-Modal Sensing
Speaker: Xuan Bao (xb6 at cs.duke.edu)
Date: Friday, June 7, 2013
Time: 9:00am - 11:00am
Location: D344 LSRC, Duke
Abstract
The inclusion of rich sensors on modern smartphones has transformed mobile phones from simple communication devices into powerful human-centric sensing platforms. Similar trends are influencing other personal gadgets such as tablets, cameras, and wearable devices like Google Glass. Together, these sensors can provide a high-resolution view of the user's context, ranging from simple information such as location and activity to high-level inferences about the user's intentions, behavior, and social interactions. Understanding such context can help solve existing system-side challenges and eventually enable a new world of real-life applications.
In this thesis, we propose to learn users' context via multi-modal sensing. The intuition is that human behaviors leave footprints across different sensing dimensions - visual, acoustic, motion, and even cyberspace. By collaboratively analyzing these footprints, the system can obtain valuable insights about the user. The results of this analysis can drive a range of applications, including capturing life-logging videos, tagging user-generated photos, and enabling new forms of human-object interaction. Through these applications, we show that a wide spectrum of previously "invisible" behaviors can potentially be captured and revealed by tapping into the rich set of sensors embedded in modern commercial mobile devices.
Advisor(s): Romit Roy Choudhury
Committee: Landon Cox, Bruce Maggs, Alexander Varshavsky