$N Multistroke Recognizer in JavaScript
Lisa Anthony¹ and Jacob O. Wobbrock²
¹University of Maryland, Baltimore County   and   ²University of Washington

This page implements the "$N Multistroke Recognizer," which builds upon the $1 Unistroke Recognizer. When this page loads, only one multistroke is defined for each gesture below, but $N automatically generalizes each example to cover all possible stroke orderings and directions. This means you can make and define multistrokes using any stroke order and direction you wish, provided you begin at either endpoint of each component stroke. By default, multistrokes are fully rotation, scale, and position invariant, but a checkbox option lets you limit rotation invariance (see below). You can also define your own multistrokes using the buttons beneath the canvas. See our Graphics Interface 2010 paper (PDF), the limitations of this recognizer, or a detailed pseudocode listing.
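As an illustration, the generalization step can be sketched as follows: enumerate every ordering of the component strokes (Heap's algorithm) and, for each ordering, every combination of per-stroke directions, concatenating the points into candidate unistrokes. The function names here are illustrative and not taken verbatim from the $N source.

```javascript
// Heap's algorithm: collect all orderings of the stroke indices in `order`.
function heapPermute(n, order, out) {
  if (n === 1) {
    out.push(order.slice());
  } else {
    for (let i = 0; i < n; i++) {
      heapPermute(n - 1, order, out);
      if (n % 2 === 1) { // odd n: swap first and last
        [order[0], order[n - 1]] = [order[n - 1], order[0]];
      } else {           // even n: swap i-th and last
        [order[i], order[n - 1]] = [order[n - 1], order[i]];
      }
    }
  }
}

// Combine each stroke ordering with each choice of stroke directions
// (2^n combinations), concatenating the strokes' points into unistrokes.
function makeUnistrokes(strokes) {
  const orders = [];
  heapPermute(strokes.length, strokes.map((_, i) => i), orders);
  const unistrokes = [];
  for (const order of orders) {
    for (let bits = 0; bits < (1 << order.length); bits++) {
      const points = [];
      order.forEach((strokeIndex, i) => {
        const s = strokes[strokeIndex];
        // bit i decides whether this stroke is traversed in reverse
        const pts = (bits >> i) & 1 ? s.slice().reverse() : s;
        points.push(...pts);
      });
      unistrokes.push(points);
    }
  }
  return unistrokes; // n! * 2^n candidate unistrokes for n strokes
}
```

A 3-stroke template such as "N" thus yields 3! × 2³ = 48 unistroke representations, which is why any stroke order and direction matches at recognition time.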

Use Golden Section Search. The original $N algorithm uses Golden Section Search, a fast iterative search, to find the best angular alignment between the input multistroke and each template multistroke.
Use Protractor. Yang Li published an improvement to $1 (on which $N is based) called Protractor, which avoids the iterative Golden Section Search and instead uses a closed-form formula based on cosine distances, making it considerably faster.
Use bounded rotation invariance. Instead of full rotation invariance, require gestures to be drawn within ±45° of the template's orientation. This helps disambiguate, e.g., the "H" and "I" gestures, or the line and exclamation-point gestures, which differ mainly in orientation.
Require same # of strokes. Require the candidate and template to have the same number of component strokes. This option speeds recognition but reduces articulation flexibility. For example, the "N" template above was made with 3 strokes; if this option is checked, a 1- or 2-stroke "N" will not be allowed to match it, whereas if it is unchecked, such gestures will match. To keep this option checked but retain that flexibility, simply define separate templates named "N" made with 1 and/or 2 strokes.
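For illustration, the Golden Section Search behind the first option above can be sketched like this, applied to a stand-in one-dimensional distance function rather than the real gesture path-distance metric:

```javascript
const PHI = 0.5 * (Math.sqrt(5) - 1); // 1/φ ≈ 0.618

// Minimize a unimodal function f over [a, b] to within `threshold`,
// as $1/$N do to find the rotation that best aligns candidate and template.
function goldenSectionSearch(f, a, b, threshold) {
  let x1 = PHI * a + (1 - PHI) * b; // interior probe nearer a
  let f1 = f(x1);
  let x2 = (1 - PHI) * a + PHI * b; // interior probe nearer b
  let f2 = f(x2);
  while (Math.abs(b - a) > threshold) {
    if (f1 < f2) { // minimum lies in [a, x2]: shrink from the right
      b = x2;
      x2 = x1; f2 = f1;           // reuse the surviving probe
      x1 = PHI * a + (1 - PHI) * b;
      f1 = f(x1);
    } else {       // minimum lies in [x1, b]: shrink from the left
      a = x1;
      x1 = x2; f1 = f2;
      x2 = (1 - PHI) * a + PHI * b;
      f2 = f(x2);
    }
  }
  return (a + b) / 2; // best angle found
}
```

Because the golden ratio satisfies r² = 1 − r, one interior probe can be reused at every step, so each iteration costs only one new evaluation of the distance function.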
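Protractor's closed-form alignment (the second option above) can be sketched as follows for two equal-length, unit-normalized gesture vectors laid out as [x0, y0, x1, y1, ...]; see Li (2010) for the derivation. The function name here is illustrative:

```javascript
// Closed-form angular alignment: the best rotation and the resulting
// cosine distance come from two sums, with no iterative search.
function optimalCosineDistance(v1, v2) {
  let a = 0, b = 0;
  for (let i = 0; i < v1.length; i += 2) {
    a += v1[i] * v2[i] + v1[i + 1] * v2[i + 1];     // dot product
    b += v1[i] * v2[i + 1] - v1[i + 1] * v2[i];     // cross term
  }
  const angle = Math.atan(b / a); // best rotation in one step
  // clamp guards against floating-point overshoot past 1 before acos
  return Math.acos(Math.min(1, a * Math.cos(angle) + b * Math.sin(angle)));
}
```

Lower distances mean better matches; the template with the smallest optimal cosine distance wins.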
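The bounded rotation invariance option amounts to a simple angular test; a minimal sketch, assuming gesture orientations are already measured in degrees (the helper name is hypothetical):

```javascript
// Accept a candidate only if its orientation lies within ±boundDeg
// of the template's orientation, handling wrap-around at 360°.
function withinBoundedRotation(candidateAngleDeg, templateAngleDeg, boundDeg = 45) {
  let diff = candidateAngleDeg - templateAngleDeg;
  while (diff > 180) diff -= 360;  // normalize to (-180, 180]
  while (diff <= -180) diff += 360;
  return Math.abs(diff) <= boundDeg;
}
```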

Make strokes on this canvas. Recognition happens when you right-click the canvas. If a misrecognition occurs, simply add the misrecognized multistroke as an example of the intended type, or try different checkbox options.
The <canvas> element is not supported by this browser.
Add last multistroke as example of existing type:
Add last multistroke as example of custom type:
Delete all user-defined examples:  

$N Links and Downloads

References

Anthony, L. and Wobbrock, J.O. (2010). A lightweight multistroke recognizer for user interface prototypes. Proceedings of Graphics Interface (GI '10). Ottawa, Ontario, Canada (May 31-June 2, 2010). Toronto, Ontario: Canadian Information Processing Society, pp. 245-252.

Li, Y. (2010). Protractor: A fast and accurate gesture recognizer. Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI '10). Atlanta, Georgia (April 10-15, 2010). New York: ACM Press, pp. 2169-2172.

Wobbrock, J.O., Wilson, A.D. and Li, Y. (2007). Gestures without libraries, toolkits or training: A $1 recognizer for user interface prototypes. Proceedings of the ACM Symposium on User Interface Software and Technology (UIST '07). Newport, Rhode Island (October 7-10, 2007). New York: ACM Press, pp. 159-168.

$N Implementations by Others