gesture

gesture.py, a component of openallure.py

Derive a signal from a processed webcam image

Copyright (c) 2010 John Graves

MIT License: see LICENSE.txt

class gesture.Gesture

Return the uppermost choice based on checking boxes on either side of a webcam image.

Background

A webcam image processed with the green screen approach (background subtraction) may have white pixels which can be used to extract a signal from a gesture (hand movement).

This implementation looks along rows of boxes, counting the non-black (white) pixels within each box. Any count over 5 (about finger width) is considered a valid selection.
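The counting step can be sketched as follows. This is a minimal illustration of counting non-black pixels inside one box, assuming a NumPy image array; the helper name and box height are hypothetical, not the actual gesture.py code.

```python
import numpy as np

def count_nonblack(image_array, x, y, box_width=35, box_height=20):
    """Count non-black pixels in a box with upper-left corner (x, y).

    Hypothetical helper illustrating the counting step described above.
    """
    box = image_array[y:y + box_height, x:x + box_width]
    # A pixel is "non-black" if any of its channels is non-zero.
    return int(np.count_nonzero(box.any(axis=-1)))

# After background subtraction, only gesture pixels remain white.
frame = np.zeros((240, 320, 3), dtype=np.uint8)
frame[100:110, 50:56] = 255          # simulated white "finger" pixels
selected = count_nonblack(frame, 48, 95) > 5   # over ~finger width
```

Any box whose count exceeds 5 would be treated as a valid selection.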

choiceSelected(image, textRegions, margins, boxWidth=35, boxPlacementList=(0, 1, 2, 3, 4, 13, 14, 15, 16, 17))

Find which choice is selected in an image where textRegions contains a list of coordinates (upper left x y, lower right x y) of non-overlapping regions within the image.

A row of boxes is checked along the lower edge of each region.

Each box has width boxWidth and they can be placed using boxPlacementList in a manner that leaves a gap mid-image so head motion does not interfere with hand gestures.

Function returns a (choice, boxCount) tuple. This allows for auto-recalibration when choice 0 is selected with too many boxes.

If no choice is selected, a choice of -1 is returned.

TODO: Use a better system which allows hand gestures to be recognized even with a moving face as the background.
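The behaviour described above can be sketched roughly as below. The parameter names follow the documentation, but the body is an assumption: in particular, how margins positions the boxes and the box height along the region's lower edge are guesses, not the real implementation.

```python
import numpy as np

def choice_selected(image, textRegions, margins, boxWidth=35,
                    boxPlacementList=(0, 1, 2, 3, 4, 13, 14, 15, 16, 17)):
    """Sketch of choiceSelected: return (choice, boxCount) for the
    uppermost region with selected boxes, or (-1, 0) if none.

    Assumed layout: boxes start at margins[0] and the gap in
    boxPlacementList (slots 5-12) spares the head region mid-image.
    """
    for choice, (x1, y1, x2, y2) in enumerate(textRegions):
        boxCount = 0
        for slot in boxPlacementList:
            x = margins[0] + slot * boxWidth
            # Count non-black pixels in a box along the lower edge.
            box = image[y2 - 10:y2, x:x + boxWidth]
            if np.count_nonzero(box) > 5:    # over ~finger width
                boxCount += 1
        if boxCount:
            return choice, boxCount          # uppermost selection wins
    return -1, 0                             # no choice selected
```

A caller could then recalibrate when choice 0 comes back with an implausibly high boxCount.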

isBoxSelected(imageArray, imageCopyArray, xoffset, yoffset, threshold=10, n=11, spacing=2)

Determine whether the box whose lower-right corner is at (xoffset, yoffset) in imageArray is selected. The box counts as selected when more than threshold pixels, sampled from an n x n matrix of pixels with the given spacing (covering more area while processing fewer pixels), have non-black values.
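The spaced-sampling idea can be sketched as below. The parameter names follow the documentation; the body is an assumption, and imageCopyArray (presumably used for drawing visual feedback) is omitted for brevity.

```python
import numpy as np

def is_box_selected(imageArray, xoffset, yoffset,
                    threshold=10, n=11, spacing=2):
    """Sketch of isBoxSelected: sample an n x n grid of pixels, spaced
    `spacing` apart, ending at lower-right corner (xoffset, yoffset).

    Strided slicing touches only n*n pixels while the grid spans a
    (n*spacing) x (n*spacing) area of the image.
    """
    span = n * spacing
    sample = imageArray[yoffset - span:yoffset:spacing,
                        xoffset - span:xoffset:spacing]
    # Selected when more than `threshold` sampled pixels are non-black.
    return int(np.count_nonzero(sample)) > threshold
```

With the defaults, 121 sampled pixels stand in for a 22 x 22 pixel area.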
