Inferring Smartphone Users' Handwritten Patterns by Using Motion Sensors

Abstract Mobile devices, including smartphones and wearable devices, are increasingly gaining popularity as platforms for collecting and sharing sensor data, such as readings from the accelerometer, gyroscope, and rotation sensor. These sensors improve the convenience of smartphone users, e.g., by supporting motion-based UI commands. Although these motion sensors do not require users' permissions, they still bring a potential risk of leaking users' private information, which is reflected in the changes of the sensor readings. In this paper, we investigate the feasibility of inferring a user's handwritten pattern on a smartphone touchscreen by using the embedded motion sensors. Specifically, our inference attack is composed of two key steps: 1) we exploit the dynamic time warping (DTW) technique to differentiate any pair of time-series sensor recordings corresponding to different handwritten patterns; and 2) we develop a novel sensor fusion mechanism that integrates the information contained in multiple motion sensors through a majority voting strategy. Through extensive experiments on real-world data sets, we demonstrate the effectiveness of the proposed attack, which achieves 91.4% accuracy in inferring smartphone users' handwritten patterns.
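
To make the two steps of the abstract concrete, the sketch below gives a minimal, hypothetical illustration: a plain DTW distance between two sensor traces, a nearest-neighbour match against per-pattern templates, and a majority vote across the accelerometer, gyroscope, and rotation sensor. The function names (`dtw_distance`, `classify_pattern`, `fuse_by_majority_vote`) and the single-axis representation of each sensor trace are simplifying assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the two attack steps described in the abstract:
# (1) compare time-series sensor recordings with dynamic time warping (DTW),
# (2) fuse per-sensor decisions with majority voting.
import numpy as np
from collections import Counter


def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Classic O(len(a) * len(b)) DTW distance between two 1-D sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return float(cost[n, m])


def classify_pattern(test: np.ndarray, templates: dict) -> str:
    """Nearest-neighbour label under DTW for a single sensor recording.

    `templates` maps a pattern label to a list of reference recordings."""
    best_label, best_dist = None, np.inf
    for label, refs in templates.items():
        for ref in refs:
            dist = dtw_distance(test, ref)
            if dist < best_dist:
                best_label, best_dist = label, dist
    return best_label


def fuse_by_majority_vote(test_by_sensor: dict, templates_by_sensor: dict) -> str:
    """Combine per-sensor predictions (e.g., accelerometer, gyroscope,
    rotation sensor) into a single decision via majority voting."""
    votes = [classify_pattern(test_by_sensor[s], templates_by_sensor[s])
             for s in test_by_sensor]
    return Counter(votes).most_common(1)[0][0]


if __name__ == "__main__":
    # Tiny synthetic example: one noisy test trace per sensor, two patterns.
    rng = np.random.default_rng(0)
    templates = {s: {"L-shape": [rng.standard_normal(50)],
                     "Z-shape": [rng.standard_normal(50)]}
                 for s in ("accelerometer", "gyroscope", "rotation")}
    test = {s: templates[s]["L-shape"][0] + 0.05 * rng.standard_normal(50)
            for s in templates}
    print(fuse_by_majority_vote(test, templates))  # expected: L-shape
```

In this toy setup each sensor votes independently via its DTW nearest neighbour, and the final label is the majority vote, mirroring the fusion strategy the abstract describes at a high level.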
Authors
  • Wei-Han Lee
  • Jorge Ortiz (IBM US)
  • Bongjun Ko (IBM US)
  • Ruby Lee
Date Jan-2018
Venue International Conference on Information Systems Security and Privacy 2018