|Title:||Holoscopic 3D Micro-Gesture Database for Wearable Device Interaction|
|Authors:||Gaus, YFA|
|Citation:||13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018), 2018, pp. 802 - 807 (6)|
|Abstract:||With the rapid development of augmented reality (AR) and virtual reality (VR) technology, human-computer interaction (HCI) has been greatly improved for gaming and AR/VR control. The finger micro-gesture is a hot research focus due to the growth of the Internet of Things (IoT) and wearable technologies; recently, Google developed a radar-based micro-gesture sensor, Google Soli. In addition, a number of finger micro-gesture techniques have been developed using Time-of-Flight (ToF) imaging sensors for wearable 3D glasses such as Atheer mobile glasses. The principle of holoscopic 3D (H3D) imaging mimics the fly's-eye technique, capturing a true 3D optical model of the scene using a microlens array; however, progress on holoscopic 3D systems has been limited by the lack of a high-quality, publicly available database. In this paper, a holoscopic 3D camera is used to capture high-quality holoscopic 3D micro-gesture video, and a new, unique holoscopic 3D micro-gesture (HoMG) database is produced. The HoMG database records image sequences of three conventional gestures from 40 participants under different settings and conditions. For the purpose of H3D micro-gesture recognition, HoMG comprises a video subset of 960 videos and a still-image subset of 30635 images. Initial micro-gesture recognition on both subsets has been conducted using traditional 2D image and video features with popular classifiers, and encouraging performance has been achieved. The database will be made available to the research community to accelerate research on holoscopic 3D micro-gestures.|
|Appears in Collections:||Publications|
Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.