

Gesture-based spaceship control with the Leap Motion and more!

Posted on by Malcolm

I’ve finished the first iteration of the gesture component of my multi-modal user interface. It was surprisingly easy given the data the Leap SDK provides (such as palm direction and palm normal). Below is a video showing my progress (the Leap Motion diagnostic visualizer is in the bottom right if you want to track my hand movements). Sorry for the poor quality; I’m using new screen capture software.
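For anyone wondering what the mapping looks like in code, here is a rough sketch of the idea, assuming the Leap SDK’s Python bindings. The dead zone and the way the angles are turned into ship controls are placeholder choices for illustration, not my exact implementation.

```python
# Rough sketch: read palm orientation from the Leap SDK (v2-era Python bindings)
# and turn it into pitch/yaw/roll commands for the ship. The dead zone value is a
# placeholder, not my actual tuning.
import Leap

DEAD_ZONE = 0.1  # radians; ignore tiny hand wobbles


def ship_controls(controller):
    frame = controller.frame()
    if frame.hands.is_empty:
        return None  # no hand over the device, leave the ship alone

    hand = frame.hands[0]
    # direction points from the palm towards the fingers; palm_normal points out of the palm
    pitch = hand.direction.pitch   # tilt fingers up/down  -> nose up/down
    yaw = hand.direction.yaw       # swing hand left/right -> nose left/right
    roll = hand.palm_normal.roll   # tilt palm sideways    -> bank

    def clip(angle):
        return 0.0 if abs(angle) < DEAD_ZONE else angle

    return clip(pitch), clip(yaw), clip(roll)
```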


 
The gaze component of my final year project is proving to be much trickier; something I didn’t realise when I set the parameters of this project is that the GazePoint GP3 unit provides raw data at a low sample rate of 60Hz. At that rate, low-pass noise filtering can distort the data rather than clean it up, and I will have to be very careful when dealing with missing data (even in an ideal environment, people still need to blink) and when interpolating between frames.
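To make that concrete, the kind of preprocessing I have in mind looks roughly like the sketch below. The window size, the validity flag, and the use of NumPy are my own assumptions for illustration, not anything dictated by the GP3.

```python
# Sketch of the gaze preprocessing I'm considering: fill short gaps (blinks or dropped
# samples) by linear interpolation, then apply a small moving-average low-pass filter.
# At 60Hz a wide window smears real eye movements, so the window here is deliberately tiny.
import numpy as np


def preprocess_gaze(x, valid, window=3):
    """x: 1-D array of gaze coordinates; valid: boolean array marking usable samples."""
    x = np.asarray(x, dtype=float)
    valid = np.asarray(valid, dtype=bool)

    # Interpolate across invalid samples (e.g. blinks) using the surrounding good data.
    idx = np.arange(len(x))
    x_filled = np.interp(idx, idx[valid], x[valid])

    # Simple moving average; anything heavier risks distorting the signal at 60Hz.
    kernel = np.ones(window) / window
    return np.convolve(x_filled, kernel, mode="same")
```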
 
It feels like I’ve done a lot of research only to conclude how difficult this could be, but I’ve at least identified that I will be implementing a velocity-based fixation algorithm, if only to discard saccades and focus on the eye fixations. It might be a good idea to finalize the design for my user tests so I can define what I need from the gaze data. This would give me a clearer picture of the algorithm’s requirements (e.g. I might not care about blinking, just the events in the test such as “target hit” or “time/distance of hit”).
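As a starting point, a velocity-threshold pass could be as simple as the sketch below. The 30 deg/s threshold, the assumption that coordinates are already in degrees of visual angle, and the hard-coded sample rate are placeholders I would need to tune against real GP3 data.

```python
# Minimal velocity-threshold sketch: label each sample as fixation or saccade by
# comparing point-to-point velocity against a threshold, then keep the fixation samples.
# Assumes x, y are already in degrees of visual angle; the 30 deg/s threshold is a
# common textbook default, not a value validated for the GP3.
import numpy as np

SAMPLE_RATE = 60.0         # GP3 output rate in Hz
VELOCITY_THRESHOLD = 30.0  # deg/s; above this a sample is treated as part of a saccade


def fixation_mask(x, y):
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)

    # Angular distance moved between consecutive samples, converted to deg/s.
    dist = np.hypot(np.diff(x), np.diff(y))
    velocity = dist * SAMPLE_RATE

    # The first sample has no predecessor; treat it as a fixation by default.
    velocity = np.concatenate(([0.0], velocity))
    return velocity < VELOCITY_THRESHOLD
```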

