Gestures Engine

What is this?

This project aims at developing a gesture/stroke recognition engine.

What does it do, right now?

It allows you to record sample strokes and compare a new stroke against them.
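
To give an idea of what "comparing" can mean here, below is a minimal Perl sketch of one possible approach (plain template matching; not necessarily what this engine does): scale each stroke into the unit square and pick the recorded sample whose points are, on average, closest to the new stroke's. The helper names and the assumption that all strokes have already been resampled to the same number of points are mine, purely for illustration.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use List::Util qw(min max);

    # Scale a stroke (list of [x, y] array refs) into the unit square.
    sub normalize {
        my @points = @_;
        my @xs = map { $_->[0] } @points;
        my @ys = map { $_->[1] } @points;
        my ($xmin, $ymin) = (min(@xs), min(@ys));
        my $w = (max(@xs) - $xmin) || 1;   # avoid division by zero
        my $h = (max(@ys) - $ymin) || 1;
        return map { [ ($_->[0] - $xmin) / $w, ($_->[1] - $ymin) / $h ] } @points;
    }

    # Average point-to-point distance between two strokes, assumed to be
    # resampled to the same number of points beforehand.
    sub distance {
        my @p = normalize(@{ $_[0] });
        my @q = normalize(@{ $_[1] });
        my $d = 0;
        $d += sqrt(($p[$_][0] - $q[$_][0])**2 + ($p[$_][1] - $q[$_][1])**2) for 0 .. $#p;
        return $d / @p;
    }

    # Return the name of the recorded sample closest to the new stroke.
    sub best_match {
        my ($new, %samples) = @_;
        my ($best_name, $best_d);
        for my $name (keys %samples) {
            my $d = distance($new, $samples{$name});
            ($best_name, $best_d) = ($name, $d) if !defined $best_d or $d < $best_d;
        }
        return $best_name;
    }

    # e.g.: my $guess = best_match(\@new_stroke, circle => \@circle, zigzag => \@zigzag);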

Look at the documentation for hints.

Implementation details

The code is written in Perl, using Tk for X11 presentation.

There is now a C++ implementation by Ben Sunshine-Hill <bsunshin@usc.edu>; look at the project page.

Details on the algorithms can be read in the POD-generated documentation.

Why?

Because both libstroke and wayV use more or less the same algorithm, i.e. dividing the stroke area into a grid and converting the stroke into a string of bin identifiers. I think there are better ways.
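
For comparison, here is a rough Perl sketch (written for this page, not taken from either library) of that grid-based encoding: the stroke's bounding box is split into a 3x3 grid and the stroke becomes the sequence of cells it passes through, with consecutive repeats collapsed.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use List::Util qw(min max);

    # Encode a stroke as a string of 3x3 grid cell identifiers (1-9),
    # collapsing consecutive repeats.
    sub stroke_to_string {
        my @points = @_;                       # list of [x, y] array refs
        my @xs = map { $_->[0] } @points;
        my @ys = map { $_->[1] } @points;
        my ($xmin, $ymin) = (min(@xs), min(@ys));
        my $w = (max(@xs) - $xmin) || 1;       # avoid division by zero
        my $h = (max(@ys) - $ymin) || 1;
        my $string = '';
        for my $p (@points) {
            my $col = int(3 * ($p->[0] - $xmin) / $w); $col = 2 if $col > 2;
            my $row = int(3 * ($p->[1] - $ymin) / $h); $row = 2 if $row > 2;
            my $cell = 3 * $row + $col + 1;    # cells numbered 1..9, row-major
            $string .= $cell if $string eq '' or substr($string, -1) ne $cell;
        }
        return $string;
    }

    # An "L" shape drawn top-left -> bottom-left -> bottom-right:
    print stroke_to_string([0, 0], [0, 50], [0, 100], [50, 100], [100, 100]), "\n";   # prints 14789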

I might well be wrong (it wouldn't be the first time ;-)), but I like to experiment.

What's needed?

Testing, ideas, criticism...

Look at the source code, then write me.

How do I use the source?

Just check out the gestures module, for example:

    cvs -d:pserver:anonymous@cvs.gestures.sourceforge.net:/cvsroot/gestures login
    cvs -z3 -d:pserver:anonymous@cvs.gestures.sourceforge.net:/cvsroot/gestures co gestures

then run the matching.pl script (with $DISPLAY pointing to an X11 display).

If it doesn't work, check that Perl and Tk are installed and that $DISPLAY points to a working X11 display.

If it still doesn't work, write me.

Dakkar