Secure and Usable Authentication Using Avatar Expression Blendshapes in Virtual Reality

Natsuki Nagai, Tetsuro Takahashi, Takuya Kataiwa, Masakatsu Nishigaki, Tetsushi Ohki
CHI EA '26: Proceedings of the Extended Abstracts of the 2026 CHI Conference on Human Factors in Computing Systems
[ Paper ] [ Web ]

Abstract

As interest in Virtual Reality (VR) continues to rise, head-mounted displays (HMDs) are being actively developed. Current user authentication methods in HMDs require a virtual keyboard, which has low usability and is prone to shoulder-surfing attacks. To address this challenge, facial-expression-based authentication that tracks a user's smile using the face-tracking capabilities integrated into certain HMDs has been proposed. In this paper, we extend this approach to a wider range of facial expressions and evaluate both its effectiveness and usability. Facial-expression-based authentication has the potential to be a secure and usable biometric system, achieving an equal error rate (EER) as low as 0.00167 and an AUC of up to 0.999 in our experiments. Furthermore, we conducted a user study demonstrating that facial-expression-based authentication offers high usability, with a System Usability Scale (SUS) score of 71.75 and an average NASA-TLX subscale score of 39.6.
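For readers unfamiliar with the reported metrics, the following is a minimal sketch of how EER and AUC are computed from genuine and impostor match scores. It is not the paper's evaluation code; the score values below are made up for illustration, and higher scores are assumed to indicate a genuine user.

```python
# Sketch of biometric evaluation metrics (illustrative, not the paper's code).
# Assumption: higher score = more likely the genuine user.

def auc(genuine, impostor):
    """AUC via the Mann-Whitney statistic: the probability that a random
    genuine score outranks a random impostor score (ties count half)."""
    wins = sum(1.0 if g > i else 0.5 if g == i else 0.0
               for g in genuine for i in impostor)
    return wins / (len(genuine) * len(impostor))

def eer(genuine, impostor):
    """EER: sweep decision thresholds and return the error rate at the
    point where false accept rate (FAR) and false reject rate (FRR)
    are closest to equal."""
    best_gap, best_rate = 1.0, 1.0
    for t in sorted(set(genuine) | set(impostor)):
        far = sum(i >= t for i in impostor) / len(impostor)  # accepted impostors
        frr = sum(g < t for g in genuine) / len(genuine)     # rejected genuines
        if abs(far - frr) < best_gap:
            best_gap, best_rate = abs(far - frr), (far + frr) / 2
    return best_rate

genuine = [0.92, 0.88, 0.95, 0.80, 0.90]   # same-user comparisons (made up)
impostor = [0.30, 0.45, 0.20, 0.55, 0.35]  # different-user comparisons (made up)
print(auc(genuine, impostor))  # 1.0 for perfectly separated scores
print(eer(genuine, impostor))  # 0.0 for perfectly separated scores
```

With real, overlapping score distributions the EER is nonzero; values like the paper's EER of 0.00167 and AUC of 0.999 indicate near-perfect separation between genuine and impostor attempts.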
