
Y. D. Kwon et al., "MyoKey: Surface Electromyography and Inertial Motion Sensing-based Text Entry in AR," 2020 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), Austin, TX, USA, 2020, pp. 1-4, doi: 10.1109/PerComWorkshops48775.2020.9156084

MyoKey: Surface Electromyography and Inertial Motion Sensing-based Text Entry in AR

Author: Kwon, Young D.1; Shatilov, Kirill A.1; Lee, Lik-Hang2;
Organizations: 1Department of Computer Science and Engineering, The Hong Kong University of Science and Technology, Hong Kong SAR
2Center for Ubiquitous Computing, The University of Oulu, Finland
3Department of Computer Science, The University of Helsinki, Finland
Format: article
Version: accepted version
Access: open
Online Access: PDF Full Text (PDF, 0.3 MB)
Language: English
Published: Institute of Electrical and Electronics Engineers, 2020
Publish Date: 2020-08-11


Seamless text input in Augmented Reality (AR) is highly challenging yet essential for enabling user-friendly AR applications. Existing approaches such as speech input and vision-based gesture recognition suffer from environmental obstacles and from large default keyboards that sacrifice the majority of the screen's real estate in AR. In this paper, we propose MyoKey, a system that enables users to effectively and unobtrusively input text in the constrained environment of AR by jointly leveraging surface Electromyography (sEMG) and Inertial Motion Unit (IMU) signals transmitted by wearable sensors on a user's forearm. MyoKey adopts a deep learning-based classifier to infer hand gestures from sEMG. To show the feasibility of our approach, we implement a mobile AR application using the Unity application building framework. We present novel interaction and system designs that combine hand-gesture information from sEMG with arm-motion information from the IMU to provide a seamless text entry solution. We demonstrate the applicability of MyoKey through a series of experiments, achieving an accuracy of 0.91 on identifying five gestures in real time (inference time: 97.43 ms).
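The abstract describes classifying five hand gestures from forearm sEMG signals. The paper's actual deep-learning model is not detailed here, so the following is only an illustrative sketch of the general pipeline: windowed multi-channel sEMG, classic hand-crafted features (mean absolute value and waveform length), and a linear softmax classifier over five gesture classes. Channel count, window length, and all weights are assumptions, not values from the paper.

```python
import numpy as np

# Assumed shapes (hypothetical, not from the paper):
# 8 sEMG channels, 50-sample windows, 5 gesture classes.
N_CHANNELS, WINDOW, N_GESTURES = 8, 50, 5

def extract_features(window):
    """Per-channel sEMG features: mean absolute value (MAV)
    and waveform length (WL). Returns a (2 * N_CHANNELS,) vector."""
    mav = np.mean(np.abs(window), axis=1)
    wl = np.sum(np.abs(np.diff(window, axis=1)), axis=1)
    return np.concatenate([mav, wl])

def classify(features, weights, bias):
    """Single linear layer followed by a numerically stable softmax
    over the five gesture classes."""
    logits = weights @ features + bias
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

rng = np.random.default_rng(0)
window = rng.standard_normal((N_CHANNELS, WINDOW))   # fake sEMG window
weights = rng.standard_normal((N_GESTURES, 2 * N_CHANNELS))
bias = np.zeros(N_GESTURES)

probs = classify(extract_features(window), weights, bias)
predicted = int(np.argmax(probs))  # index of the most likely gesture
```

In a real system the linear layer would be replaced by the trained deep network, and the predicted gesture would be fused with IMU-derived arm motion to drive keyboard selection.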


ISBN (electronic): 978-1-7281-4716-1
ISBN (print): 978-1-7281-4717-8
Pages: 1 - 4
DOI: 10.1109/PerComWorkshops48775.2020.9156084
Host publication: 2020 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)
Conference: IEEE International Conference on Pervasive Computing and Communications
Type of Publication: A4 Article in conference proceedings
Field of Science: 113 Computer and information sciences
Funding: This research has been supported in part by project 16214817 from the Research Grants Council of Hong Kong, and the 5GEAR project and the FIT project from the Academy of Finland.
Copyright information: © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.