University of Oulu

Lik Hang Lee, Tristan Braud, Farshid Hassani Bijarbooneh, and Pan Hui. 2020. UbiPoint: towards non-intrusive mid-air interaction for hardware constrained smart glasses. In Proceedings of the 11th ACM Multimedia Systems Conference (MMSys ’20). Association for Computing Machinery, New York, NY, USA, 190–201. DOI:https://doi.org/10.1145/3339825.3391870

UbiPoint: towards non-intrusive mid-air interaction for hardware constrained smart glasses

Author: Lee, Lik Hang1; Braud, Tristan2; Bijarbooneh, Farshid Hassani2; Hui, Pan2
Organizations: 1Center for Ubiquitous Computing, The University of Oulu, Oulu, Finland
2Hong Kong University of Science and Technology, Hong Kong SAR, China
Format: article
Version: accepted version
Access: open
Online Access: PDF Full Text (PDF, 11 MB)
Persistent link: http://urn.fi/urn:nbn:fi-fe2020061744658
Language: English
Published: Association for Computing Machinery, 2020
Publish Date: 2020-06-17
Description:

Abstract

Throughout the past decade, numerous interaction techniques have been designed for mobile and wearable devices. Among these devices, smartglasses mostly rely on hardware interfaces such as touchpads and buttons, which are often cumbersome and counterintuitive to use. Furthermore, smartglasses feature cheap, low-power hardware that prevents the use of advanced pointing techniques. To overcome these issues, we introduce UbiPoint, a freehand mid-air interaction technique. UbiPoint uses the monocular camera embedded in smartglasses to detect the user's hand without relying on gloves, markers, or sensors, enabling intuitive and non-intrusive interaction. We introduce a computationally fast and lightweight fingertip-detection algorithm that is especially suited to the limited hardware specifications and short battery life of smartglasses. UbiPoint processes pictures at a rate of 20 frames per second with high detection accuracy (no more than 6 pixels of deviation). Our evaluation shows that UbiPoint, as a non-intrusive mid-air interface, delivers a better experience for smartglasses interaction, with users completing typical tasks 1.82 times faster than when using the original hardware.
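For readers unfamiliar with camera-based fingertip detection, the sketch below illustrates the general idea of locating a fingertip in a single monocular frame. It is not the UbiPoint algorithm described in the paper: it assumes a simple HSV skin-colour segmentation followed by a convex-hull heuristic, and the function name detect_fingertip, the colour thresholds, and the contour-area cut-off are illustrative choices only.

    # Illustrative sketch only -- NOT the UbiPoint algorithm from the paper.
    # Minimal fingertip estimate from one RGB frame, assuming HSV skin-colour
    # segmentation and that the finger points upward in the image.
    import cv2
    import numpy as np

    def detect_fingertip(frame_bgr):
        """Return (x, y) of an estimated fingertip, or None if no hand is found."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)

        # Rough skin-colour range in HSV; would need per-user/lighting tuning.
        lower = np.array([0, 40, 60], dtype=np.uint8)
        upper = np.array([25, 255, 255], dtype=np.uint8)
        mask = cv2.inRange(hsv, lower, upper)

        # Clean up the mask with morphological opening and closing.
        kernel = np.ones((5, 5), np.uint8)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

        # Assume the largest skin-coloured contour is the hand.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        hand = max(contours, key=cv2.contourArea)
        if cv2.contourArea(hand) < 1000:  # ignore small skin-coloured blobs
            return None

        # Topmost point of the convex hull serves as the fingertip estimate.
        hull = cv2.convexHull(hand)
        tip = tuple(hull[hull[:, :, 1].argmin()][0])
        return tip

    # Example: run on a webcam frame to mimic a smartglasses camera feed.
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if ok:
        print(detect_fingertip(frame))
    cap.release()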


ISBN Print: 978-1-4503-6845-2
Pages: 190–201
DOI: 10.1145/3339825.3391870
OADOI: https://oadoi.org/10.1145/3339825.3391870
Host publication: MMSys '20: Proceedings of the 11th ACM Multimedia Systems Conference
Conference: ACM Multimedia Systems Conference 2020 (MMSys 2020)
Type of Publication: A4 Article in conference proceedings
Field of Science: 113 Computer and information sciences
Funding: The research was supported by the Academy of Finland 6Genesis Flagship [grant number 318927] and 5GEAR [grant number 319669] projects, and by Project 16214817 from the Research Grants Council of Hong Kong.
Academy of Finland Grant Number: 318927, 319669
Detailed Information: 318927 (Academy of Finland funding decision); 319669 (Academy of Finland funding decision)
Copyright information: © 2020 Copyright held by the owner/author(s). Publication rights licensed to ACM. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in MMSys '20: Proceedings of the 11th ACM Multimedia Systems Conference, https://doi.org/10.1145/3339825.3391870.