

Test Specifications to assess Gaze/Gesture input devices

Posted by Malcolm

For my university final year project I am looking to assess the viability of a gaze/gesture control system in a computer game context. In order to develop an intuitive (and fun!) control system I will be designing and implementing a couple of tests and asking volunteers to try them out and give feedback. The measurements I will be taking from the tests are largely based on Usability Benchmarks for Motion Tracking Systems (Lugrin et al., 2013).

The Tests:
 
Leap Motion (gesture input):
In this test the user will control a free-flying spaceship in an open world environment, flying through checkpoints. Each checkpoint takes the form of a circle with a hollow centre, through which the user is meant to fly. Colliding with a checkpoint will have no effect on the user (but the collision will be recorded by the test). As the test goes on, the speed of the spaceship increases. The checkpoints are generated in real time from a set of pre-defined track segments (e.g. 3-checkpoint corner, 2-checkpoint U-turn, etc.), as sketched below.
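
To make the segment idea concrete, here is a minimal sketch in Python of how checkpoints could be instantiated from pre-defined segments. The segment names and offsets are made-up placeholders rather than the actual track data, and a real implementation would also need to account for the ship's current heading.

```python
import random

# Each pre-defined segment is a list of checkpoint offsets (x, y, z)
# relative to the previous checkpoint. Values are placeholders.
SEGMENTS = {
    "straight": [(0, 0, 50)],
    "corner_3": [(10, 0, 40), (25, 0, 30), (45, 0, 15)],
    "u_turn_2": [(30, 0, 30), (30, 0, -30)],
}

def next_segment(last_checkpoint):
    """Pick a random pre-defined segment and place its checkpoints
    relative to the last checkpoint generated so far."""
    shape = SEGMENTS[random.choice(list(SEGMENTS))]
    checkpoints = []
    x, y, z = last_checkpoint
    for dx, dy, dz in shape:
        x, y, z = x + dx, y + dy, z + dz
        checkpoints.append((x, y, z))
    return checkpoints
```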
 
GazePoint GP3 (gaze input):
This test will be a timed shooting gallery with a number of levels, where the time to complete each level decreases from the previous one. The scene will contain a large number of targets, of which a subset is selected at the start of each level. The selection method will be a pseudo-random algorithm with a bias to ensure an even distribution of targets across the game screen, along the lines of the sketch below.
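
As a rough illustration of that bias, the Python sketch below buckets the targets into a coarse screen grid and draws from the buckets in turn, so each level's selection covers the screen evenly. The grid size and the target fields are assumptions, not the actual algorithm.

```python
import random
from collections import defaultdict

GRID_COLS, GRID_ROWS = 4, 3  # coarse screen grid; size is an assumption

def select_targets(targets, count, screen_w, screen_h):
    """targets: objects with .x and .y screen positions (assumed fields).
    Returns `count` targets spread evenly across the screen."""
    cells = defaultdict(list)
    for t in targets:
        col = min(int(t.x / screen_w * GRID_COLS), GRID_COLS - 1)
        row = min(int(t.y / screen_h * GRID_ROWS), GRID_ROWS - 1)
        cells[(col, row)].append(t)

    # Shuffle within each grid cell, then draw one target per cell in
    # turn until we have enough -- an even spread rather than pure chance.
    buckets = [random.sample(v, len(v)) for v in cells.values()]
    selected = []
    while len(selected) < count and any(buckets):
        for b in buckets:
            if b and len(selected) < count:
                selected.append(b.pop())
    return selected
```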
 
The Measurements:

In accordance with Lugrin et al’s benchmarking methods I will be collecting both subjective measurements from the test subject and objective measurements from events in the game.
 
Subjective Measurements:
 
1) Presence – the difference between perception of the game world and of the real world.
Leap: Did the spaceship move as expected? Did it feel natural?
Gaze: Did the game register your fixations as you would expect? Did it feel natural?
 
2) Simulator sickness – this usually happens when visual perception does not match the perceived movement (e.g. due to poor-quality tracking).
Test subjects will be asked to complete a Simulator Sickness Questionnaire (Kennedy et al., 1993), which was developed by analysing a large data set and identifying symptoms commonly associated with simulator sickness. The questionnaire will be filled out both pre-test and post-test.
 
3) Preference – rate the experience from 1 to 10.
Did the control system meet your expectations as a gamer?
Compare with mouse and keyboard. Which human-computer interface (HCI) did you prefer?
 
Objective Measurements (task-specific):
 
Leap test: (for each checkpoint)

Gaze test:
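
The exact per-checkpoint and per-target metrics are still to be pinned down, but as a rough idea of the recording side, here is a minimal sketch of an event logger. The event names and fields are assumptions drawn from the test descriptions above (such as the recorded checkpoint collisions), not a finalised metric list.

```python
import time
import json

class TestLogger:
    """Collects timestamped game events for later analysis."""

    def __init__(self, test_name):
        self.test_name = test_name
        self.events = []

    def log(self, event, **data):
        self.events.append({"t": time.time(), "event": event, **data})

    def save(self, path):
        with open(path, "w") as f:
            json.dump({"test": self.test_name, "events": self.events}, f)

# Illustrative usage (event names are assumptions):
#   logger.log("checkpoint_collision", checkpoint_id=7, speed=12.5)
#   logger.log("target_hit", target_id=3, fixation_ms=240)
```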

Now it’s time to start making the tests. In my previous final year project post I wrote that I would first look at what data we need from the tests in order to work out how complex my fixation algorithm needs to be. I can now conclude that we are not interested in missing data (such as blinks or the user looking away from the GazePoint GP3 device); we only want to register actual gaze fixations. I will therefore continue with my implementation of a velocity-based fixation algorithm, so that the test ignores saccades (since we don’t want a target in the shooting range hit unintentionally).
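
To make the saccade filtering concrete, here is a minimal sketch of a velocity-threshold (I-VT) style fixation filter, the general family my velocity-based algorithm belongs to. The threshold value, the 60 Hz sample rate, and the input format are illustrative assumptions, not GP3 specifics.

```python
VELOCITY_THRESHOLD = 30.0  # deg/s; saccades are typically much faster
SAMPLE_RATE = 60.0         # Hz; assumed device sampling rate

def classify_fixations(gaze_angles):
    """gaze_angles: list of (x_deg, y_deg) gaze samples, or None for
    missing data (blinks, looking away). Returns a per-sample list of
    True (fixation) / False (saccade or missing)."""
    labels = [False] * len(gaze_angles)
    for i in range(1, len(gaze_angles)):
        a, b = gaze_angles[i - 1], gaze_angles[i]
        if a is None or b is None:
            continue  # skip missing samples entirely
        # Angular velocity between consecutive samples, in deg/s.
        velocity = ((b[0] - a[0]) ** 2 + (b[1] - a[1]) ** 2) ** 0.5 * SAMPLE_RATE
        labels[i] = velocity < VELOCITY_THRESHOLD
    return labels
```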
 

