Friday, 14 December 2018

Gestural interface for navigation in virtual worlds




Xavier BAELE - Nadine WARZEE

Project description

This project, carried out in collaboration with the Belgian company 72dpi, aims to develop an intuitive and non-intrusive interface to virtual environments. Thanks to this system, the user will be completely immersed in a virtual world and will be able to interact with it without being physically "linked" to the computer through devices (mouse, joystick, keyboard, glove, ...).

Figure 1: real-time tracking of the user's body gestures

The system is based on real-time capture and interpretation of the user's position and movements (figure 1), through analysis of the images provided by a stereoscopic camera. The gathered data makes it possible to control an avatar in the virtual environment according to the visitor's gestures.

Figure 2 presents a schematic view of the complete system.

Figure 2: Gestural interface: technical point of view

The camera delivers two images of the same scene, seen from two slightly different points of view. Correlating the two images makes it possible to compute, for each pixel, a value called "disparity", which is inversely proportional to that pixel's depth. An existing library is used for this task: Small Vision System.
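The idea behind disparity can be sketched with a naive block-matching search (the actual project uses the Small Vision System library; this toy version, the image sizes, and the shift value are illustrative assumptions only). For each pixel of the left image, we look along the same row of the right image for the best-matching block; the horizontal shift of that match is the disparity.

```python
import numpy as np

def disparity_map(left, right, max_disp=16, block=5):
    """Naive block-matching disparity: for each pixel in the left image,
    search along the same row of the right image for the block with the
    lowest absolute difference, and record its horizontal shift."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(int)
            best_cost, best_d = None, 0
            # only search shifts that keep the candidate block inside the image
            for d in range(min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1].astype(int)
                cost = np.sum(np.abs(patch - cand))
                if best_cost is None or cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

# Demo on a synthetic pair: the right image is the left image shifted
# horizontally by 4 pixels, so the recovered disparity should be 4.
rng = np.random.default_rng(0)
left = rng.integers(0, 255, size=(20, 40)).astype(np.uint8)
right = np.zeros_like(left)
right[:, :-4] = left[:, 4:]
disp = disparity_map(left, right, max_disp=8, block=3)
```

Real stereo libraries add calibration, rectification, and sub-pixel refinement on top of this basic matching step.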

The goal of the first part (1 and 1' in figure 2) is to locate the user's body from the color and disparity images.
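One simple way this localization could work (a sketch under the assumption that the user is the closest large object to the camera, not the project's actual algorithm, which also uses color): threshold the disparity map, since nearby pixels have high disparity, and take the centroid of the foreground region.

```python
import numpy as np

def locate_user(disparity, min_disp=8.0):
    """Hypothetical localization step: treat pixels whose disparity exceeds
    a threshold (i.e. pixels close to the camera) as belonging to the user,
    and return the centroid of that region in image coordinates (x, y)."""
    mask = disparity >= min_disp
    if not mask.any():
        return None  # no foreground object detected
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

# Demo: a flat background with one close (high-disparity) blob.
dmap = np.zeros((10, 10))
dmap[2:5, 5:8] = 12.0
center = locate_user(dmap)
```

A production system would combine this with skin-color cues and connected-component analysis to separate the head and hands from the torso.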

The second part (2 in figure 2) then interprets the body movements and transmits commands, expressed in a predefined language, to the virtual world.

Finally, the virtual environment executes the command (moving the avatar, grasping or manipulating an object, ...), which provides feedback to the user.
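On the virtual-world side, command execution might look like the following minimal stand-in (the `Avatar` class and command names are assumptions carried over from the previous sketch, not the project's actual engine):

```python
import math

class Avatar:
    """Minimal avatar state: a position and a heading that symbolic
    commands update, so the user sees the effect of each gesture."""
    def __init__(self):
        self.x, self.y, self.heading = 0.0, 0.0, 0.0  # heading in degrees

    def execute(self, command):
        if command == "MOVE_FORWARD":
            self.x += math.cos(math.radians(self.heading))
            self.y += math.sin(math.radians(self.heading))
        elif command == "TURN_LEFT":
            self.heading = (self.heading + 90) % 360
        elif command == "TURN_RIGHT":
            self.heading = (self.heading - 90) % 360
        # "STOP" and "IDLE" leave the avatar in place
```

Rendering the updated avatar closes the loop: the user's gesture changes the scene, and the changed scene is the feedback.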