This project aims at developing a gesture/stroke recognition engine. It allows you to record sample strokes and compare a new one against them.
Look at the documentation for hints.
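To make the record-and-compare idea concrete, here is a minimal sketch (in Python for illustration; the engine itself is written in Perl, and all names here are hypothetical, not the project's actual API): resample each stroke to a fixed number of points, normalize it into a unit box, and score a new stroke against the recorded samples by mean point-to-point distance.

```python
import math

def normalize(points, n=16):
    # Scale the stroke into a unit box, then resample to n points
    # by linear interpolation over the point index (crude, but
    # adequate for a sketch).
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    w = (max(xs) - min(xs)) or 1.0
    h = (max(ys) - min(ys)) or 1.0
    pts = [((x - min(xs)) / w, (y - min(ys)) / h) for x, y in points]
    out = []
    for i in range(n):
        t = i * (len(pts) - 1) / (n - 1)
        j = int(t)
        frac = t - j
        x0, y0 = pts[j]
        x1, y1 = pts[min(j + 1, len(pts) - 1)]
        out.append((x0 + (x1 - x0) * frac, y0 + (y1 - y0) * frac))
    return out

def distance(a, b):
    # Mean point-to-point distance between two normalized strokes.
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

# Recorded samples (hypothetical), and a new stroke to classify.
samples = {
    "line":   normalize([(0, 0), (10, 10)]),
    "corner": normalize([(0, 0), (0, 10), (10, 10)]),
}
new = normalize([(1, 1), (9, 11)])
best = min(samples, key=lambda name: distance(samples[name], new))
print(best)  # → "line"
```

Normalizing first makes the comparison invariant to where on the screen the stroke was drawn and how large it was, which is usually what you want from a gesture recognizer.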
The code is written in Perl, using Tk for X11 presentation.
There is now a C++ implementation, by Ben Sunshine-Hill <email@example.com>; see the project page.
Details on the algorithms can be read in the POD-generated documentation.
Both libstroke and wayV use more-or-less the same algorithm: dividing the stroke area into a grid, and converting the stroke into a string of bin identifiers. I think there are better ways.
I might well be wrong (it wouldn't be the first time ;-)), but I like to experiment.
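For reference, the grid-binning approach mentioned above can be sketched as follows (a Python illustration of the general technique, not the actual libstroke or wayV code): divide the stroke's bounding box into a 3x3 grid, map each point to its cell number, and collapse consecutive repeats into a string of bin identifiers.

```python
def stroke_to_bins(points, grid=3):
    # Bounding box of the stroke; avoid division by zero for
    # degenerate (horizontal/vertical) strokes.
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    min_x, min_y = min(xs), min(ys)
    w = (max(xs) - min_x) or 1
    h = (max(ys) - min_y) or 1
    out = []
    for x, y in points:
        # Map the point to a grid cell, numbered 1..grid*grid
        # row-by-row, like a phone keypad.
        col = min(int((x - min_x) / w * grid), grid - 1)
        row = min(int((y - min_y) / h * grid), grid - 1)
        cell = str(row * grid + col + 1)
        if not out or out[-1] != cell:  # collapse consecutive repeats
            out.append(cell)
    return "".join(out)

# A diagonal stroke crosses cells 1, 5 and 9:
print(stroke_to_bins([(0, 0), (5, 5), (10, 10)]))  # → "159"
```

The resulting strings can then be matched against stored templates, typically with some tolerance for near-miss cells; this string representation is exactly the part I suspect can be improved upon.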
Testing, ideas, criticism...
Look at the source code, then write me.
Check out the gestures module, for example with:

    cvs -d:pserver:firstname.lastname@example.org:/cvsroot/gestures login

followed by:

    cvs -z3 -d:pserver:email@example.com:/cvsroot/gestures co gestures

then run it (with $DISPLAY pointing to an X11 display).
If it doesn't work, check that:
- you have Perl/Tk installed (otherwise, go get it)
- the $DISPLAY environment variable is correctly set, and you have an X11 server up and running
If it still doesn't work, write me.

Dakkar