Often, we operate mobile devices using only one hand. The hand thereby serves two purposes: holding the device and operating the touch screen with the thumb. The current trend toward larger screens, however, makes it close to impossible for users with average hand sizes to reach all parts of the screen, especially the top area. One solution is to offer adaptive user interfaces for such one-handed interaction. These modes have to be triggered manually and thus induce a critical overhead. Furthermore, they are designed to bring all content closer, regardless of whether the phone is operated with the left or the right hand. In this paper, we present an algorithm that determines the users' interacting hand from their unlocking behavior. Our algorithm correctly distinguishes one- and two-handed usage as well as left- and right-handed unlocking in 98.51% of all cases. It achieves this through a k-nearest-neighbor comparison of the smartphone's internal sensor readings during the unlocking process.
- Paper
Detecting Users Handedness for Ergonomic Adaptation of Mobile User Interfaces
In: Proceedings of the 14th International Conference on Mobile and Ubiquitous Multimedia (MUM '15), November 30 - December 2, 2015, Linz, Austria. ACM, 2015.
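The abstract mentions that handedness is inferred via a k-nearest-neighbor comparison of the phone's internal sensor readings during the unlock gesture. The sketch below is not the authors' implementation; it only illustrates the general k-NN idea under assumed features (mean roll, mean pitch, swipe curvature per unlock) and assumed class labels.

```python
# Minimal k-NN sketch for classifying the unlocking hand from per-unlock
# feature vectors. Feature names, labels, and toy data are illustrative
# assumptions, not the features used in the paper.

from collections import Counter
import math


def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def knn_predict(train, query, k=3):
    """Return the majority label among the k training samples closest to `query`.

    `train` is a list of (feature_vector, label) pairs.
    """
    neighbors = sorted(train, key=lambda sample: euclidean(sample[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]


if __name__ == "__main__":
    # Toy training data: [mean_roll, mean_pitch, swipe_curvature] per unlock.
    train = [
        ([-0.31, 0.12, -0.8], "left_one_handed"),
        ([-0.28, 0.10, -0.7], "left_one_handed"),
        ([0.33, 0.11, 0.9], "right_one_handed"),
        ([0.29, 0.14, 0.8], "right_one_handed"),
        ([0.02, 0.05, 0.1], "two_handed"),
        ([-0.01, 0.04, 0.0], "two_handed"),
    ]
    print(knn_predict(train, [0.30, 0.12, 0.85], k=3))  # -> right_one_handed
```

In practice, one would record accelerometer and gyroscope readings while the user performs the unlock gesture, condense them into a fixed-length feature vector, and compare that vector against previously labeled unlocks as shown above.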