gestures
========

Gestures is a lightweight tool for tracking gestures with your webcam and
executing global commands (like volume up). It detects gestures
from a colored object you hold in your hand.

It works by using the OpenCV image-processing library for object detection
and hidden Markov models for calculating the most probable gesture.


Requirements
------------
Python 2 is required to run gestures.

The following non-standard Python libraries are required:

- opencv
- ghmm
- numpy
- simplejson

Configuration
-------------
You can configure your own gestures in the file `models.py`.
At the top, there is a section `gestures`, defined as a list of
tuples, each pairing a gesture with the command to be executed. The gesture
is given as a list of movements, e.g. `[UP, DOWN, UP]`, and the command as a
list valid for `subprocess`: split the command at spaces, so
`firefox -safe-mode` becomes `['firefox', '-safe-mode']`.

A complete entry then looks like this:

```python
gestures = [
    ([UP, DOWN, UP], ['firefox', '-safe-mode'])
]
```

Note that there are no quotation marks around directions
like `UP` and `DOWN`; these are special keywords defined by gestures.

Training of Models
------------------
You can train your gestures by pressing the number keys on your keyboard
(currently limited to 0 to 3). This triggers training mode, in which you
perform the gesture you want to be detected later. Depending on the number
you pressed, the corresponding gesture from the gestures list is trained.

After a gesture has been performed in training mode, the program
returns to detection mode.

License
-------
This software is licensed under the Simplified BSD License.
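As a rough illustration of how a configured entry could be dispatched, the sketch below pairs the gesture/command tuples described in the Configuration section with the standard `subprocess` module. The direction constants and helper functions here are stand-ins, not the project's actual API:

```python
import subprocess

# Stand-in direction constants; in gestures these are special keywords
# defined in models.py, not strings.
UP, DOWN = "UP", "DOWN"

# Each entry pairs a gesture (a list of movements) with a command list,
# the command split at spaces as subprocess expects.
gestures = [
    ([UP, DOWN, UP], ['firefox', '-safe-mode']),
]

def find_command(detected):
    """Return the command paired with the detected movement list, if any."""
    for gesture, command in gestures:
        if gesture == detected:
            return command
    return None

def dispatch(detected):
    """Run the command for a detected gesture; return its exit code."""
    command = find_command(detected)
    if command is not None:
        return subprocess.call(command)
    return None
```

This is only a sketch of the lookup-and-execute idea under those assumptions; the real tool resolves the most probable gesture via its hidden Markov models before anything is executed.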
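The training-mode key handling described above can be sketched as a mapping from the pressed number key to an index into the gestures list. This is a hypothetical illustration (the function name and constant are assumptions); the real tool reads key codes through OpenCV:

```python
# Number keys 0-3 select which gesture from the gestures list to train.
TRAINABLE_INDICES = range(4)  # currently limited to 0 to 3

def gesture_index_for_key(key_code):
    """Map the key code for '0'..'3' to an index into the gestures list,
    or return None for any other key."""
    digit = key_code - ord('0')
    if digit in TRAINABLE_INDICES:
        return digit
    return None
```

Pressing `2`, for example, would select the third entry of the gestures list for training.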