08/2013 – 12/2013
Collaboration: Da-Yuan Huang, Ming-Chang Tsai, Ying-Chao Tung, Min-Lun Tsai, Li-Wei Chan
This project, TouchSense, expands the touchscreen input vocabulary by distinguishing which areas of a user's finger pad contact the screen. Because it requires minimal touch area and minimal movement, it is especially well suited to wearable devices such as smartwatches and smart glasses. Two human-factor studies showed that users could tap a touchscreen with five or more distinct areas of their finger pads, and that they could distinguish more areas closer to their fingertips. We built a TouchSense smartwatch prototype using inertial measurement sensors, developed two example applications (a calculator and a text editor), and collected user feedback in an exploratory study. This work was presented at CHI '14.
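As a rough illustration of how tap-time inertial signals could be mapped to distinct finger-pad areas, the sketch below trains a minimal nearest-centroid classifier over per-tap IMU feature vectors. The feature layout, the `PadAreaClassifier` name, and the synthetic data are illustrative assumptions, not the actual TouchSense pipeline.

```python
import numpy as np

class PadAreaClassifier:
    """Nearest-centroid classifier over per-tap IMU feature vectors.

    Hypothetical sketch: each tap is summarized as a small feature vector
    (e.g., device-tilt readings at touch-down), and each finger-pad area
    is assumed to produce a distinct motion signature.
    """

    def fit(self, features, labels):
        # features: (n_taps, n_features) array; labels: pad-area ids
        self.classes_ = np.unique(labels)
        self.centroids_ = np.array(
            [features[labels == c].mean(axis=0) for c in self.classes_]
        )
        return self

    def predict(self, features):
        # Assign each tap to the pad area with the closest centroid.
        d = np.linalg.norm(
            features[:, None, :] - self.centroids_[None, :, :], axis=2
        )
        return self.classes_[d.argmin(axis=1)]

# Synthetic demo: three pad areas with well-separated IMU signatures.
rng = np.random.default_rng(0)
train_x = np.concatenate(
    [rng.normal(loc=m, scale=0.1, size=(20, 3))
     for m in ([0, 0, 0], [1, 0, 0], [0, 1, 0])]
)
train_y = np.repeat([0, 1, 2], 20)
clf = PadAreaClassifier().fit(train_x, train_y)
print(clf.predict(np.array([[0.95, 0.02, -0.01]])))  # → [1]
```

In practice a production classifier would use richer features and a stronger model, but the nearest-centroid view captures the core idea: distinct pad areas leave distinct, separable traces in the motion data.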