Gestures Engine

What is this?

This project aims at developing a gesture/stroke recognition engine.

What does it do, right now?

It allows you to record sample strokes and compare a new stroke against them.

Look at the documentation for hints.

Implementation details

The code is written in Perl, using Tk for X11 presentation.

There is now a C++ implementation, by Ben Sunshine-Hill <>. Look at the project page.

Details on the algorithms can be read in the POD-generated documentation.


Both libstroke and wayV use more or less the same algorithm: divide the stroke area into a grid, then convert the stroke into a string of bin identifiers. I think there are better ways.

I might well be wrong (it wouldn't be the first time ;-)), but I like to experiment.

What's needed?

Testing, ideas, criticism...

Look at the source code, then write me.

How do I use the source?

Just check out the gestures module (for example, with cvs login followed by cvs -z3 co gestures), then run the script (with $DISPLAY pointing to an X11 display).

If it doesn't work, check that:

If it still doesn't work, write me.
