TouchSense

08/2013 – 12/2013
Collaboration: Da-Yuan Huang, Ming-Chang Tsai, Ying-Chao Tung, Min-Lun Tsai, Li-Wei Chan

TouchSense expands the touchscreen input vocabulary by distinguishing which area of the user's finger pad contacts the screen. Because it requires minimal touch area and minimal finger movement, it is especially well suited to wearable devices such as smart watches and smart glasses. Two human-factor studies showed that users could tap a touchscreen with five or more distinct areas of their finger pads, and that they could distinguish more areas closer to their fingertips. We built a TouchSense smart watch prototype using inertial measurement sensors, developed two example applications (a calculator and a text editor), and gathered user feedback in an explorative study. This work was presented at CHI’14.
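The core idea, classifying a tap by the finger's orientation as measured by inertial sensors, can be illustrated with a minimal sketch. The region names and centroid angles below are purely hypothetical placeholders, not values from our studies; the sketch assumes a nearest-centroid classifier over (pitch, roll) readings at touch-down, which is only one plausible way to realize this kind of mapping.

```python
import math

# Hypothetical centroid orientations (pitch, roll in degrees) for five
# finger-pad regions, ordered from the fingertip toward the base of the
# pad. These numbers are illustrative, not measured data.
REGION_CENTROIDS = {
    "tip":       (60.0,  0.0),
    "upper_pad": (45.0,  0.0),
    "mid_pad":   (30.0,  0.0),
    "lower_pad": (15.0,  0.0),
    "side":      (30.0, 25.0),
}

def classify_touch(pitch: float, roll: float) -> str:
    """Return the finger-pad region whose centroid orientation is
    nearest (Euclidean distance in angle space) to the observed
    (pitch, roll) at the moment of touch."""
    return min(
        REGION_CENTROIDS,
        key=lambda r: math.hypot(pitch - REGION_CENTROIDS[r][0],
                                 roll - REGION_CENTROIDS[r][1]),
    )

print(classify_touch(58.0, 2.0))  # an orientation close to the "tip" centroid
```

In practice such a classifier would be trained per user, since comfortable finger orientations vary; the study finding that more regions are distinguishable near the fingertip suggests centroids there can be spaced more densely.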

Video
