This project aims to develop a gesture/stroke recognition engine. It allows you to record sample strokes and compare a new stroke against them.
Look at the documentation for hints.
The code is written in Perl, using Tk for the X11 interface.
There is now a C++ implementation, by Ben Sunshine-Hill <bsunshin@usc.edu>. See the project page.
Details on the algorithms can be read in the POD-generated documentation.
Both libstroke and wayV use more or less the same algorithm: they divide the stroke area into a grid, and convert the stroke into a string of bin identifiers. I think there are better ways.
I might well be wrong (it wouldn't be the first time ;-)), but I like to experiment.
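For reference, the grid-binning approach mentioned above can be sketched in a few lines of Perl. This is an illustration of the general technique, not code from this project or from libstroke/wayV; the function name and the 3x3 grid size are my own choices.

```perl
use strict;
use warnings;

# A minimal sketch of grid binning: divide the stroke's bounding box
# into an N x N grid, then turn the stroke into a string of cell
# identifiers (consecutive repeats collapsed).
sub stroke_to_bins {
    my ($n, @points) = @_;            # each point is an [x, y] array ref
    my ($minx, $maxx) = ($points[0][0]) x 2;
    my ($miny, $maxy) = ($points[0][1]) x 2;
    for my $p (@points) {
        $minx = $p->[0] if $p->[0] < $minx;
        $maxx = $p->[0] if $p->[0] > $maxx;
        $miny = $p->[1] if $p->[1] < $miny;
        $maxy = $p->[1] if $p->[1] > $maxy;
    }
    my $w = ($maxx - $minx) || 1;     # avoid division by zero
    my $h = ($maxy - $miny) || 1;
    my @bins;
    for my $p (@points) {
        my $col = int( ($p->[0] - $minx) / $w * $n );
        my $row = int( ($p->[1] - $miny) / $h * $n );
        $col = $n - 1 if $col >= $n;  # clamp points on the far edge
        $row = $n - 1 if $row >= $n;
        my $id = $row * $n + $col + 1;                     # cells 1 .. n*n
        push @bins, $id unless @bins && $bins[-1] == $id;  # collapse repeats
    }
    return join '', @bins;
}

# A rough top-left to bottom-right diagonal maps to cells 1, 5, 9:
print stroke_to_bins(3, [0, 0], [60, 60], [120, 120]), "\n";
```

Matching then reduces to comparing these strings, which is exactly the coarseness this project tries to improve on.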
Testing, ideas, criticism... all are welcome.
Look at the source code, then write me.
Check out the gestures module, for example with:

  cvs -d:pserver:anonymous@cvs.gestures.sourceforge.net:/cvsroot/gestures login
  cvs -z3 -d:pserver:anonymous@cvs.gestures.sourceforge.net:/cvsroot/gestures co gestures

Then, with $DISPLAY pointing to an X11 display, run the matching.pl script.
If it doesn't work, check that:

- perl is in /usr/bin
- Perl/Tk is installed (otherwise, go get it)
- the $DISPLAY environment variable is correctly set, and you have an X11 server up and running

If it still doesn't work, write me.
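The checks above can be run from a shell; a quick sketch using generic commands (nothing here is part of the gestures distribution):

```shell
#!/bin/sh
# Sanity checks for the troubleshooting points above.
command -v perl || echo "perl not found on PATH"
perl -MTk -e 1 2>/dev/null || echo "Perl/Tk not installed"
[ -n "$DISPLAY" ] || echo "DISPLAY is not set"
```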
Dakkar