# Gesture Recognition Magic Wand Training Scripts

## Introduction

The scripts in this directory can be used to train a TensorFlow model that
classifies gestures based on accelerometer data. The code uses Python 3.7 and
TensorFlow 2.0. The resulting model is less than 20KB in size.

This document contains instructions for using the scripts to train a model and
for capturing your own training data.

This project was inspired by the [Gesture Recognition Magic Wand](https://github.com/jewang/gesture-demo)
project by Jennifer Wang.

## Training

### Dataset

Three magic gestures were chosen, and data were collected from 7 different
people. In addition, long sequences of random movement were recorded and
divided into shorter pieces; together with some automatically generated random
data, these form the "negative" examples.

The dataset can be downloaded from the following URL:

[download.tensorflow.org/models/tflite/magic_wand/data.tar.gz](http://download.tensorflow.org/models/tflite/magic_wand/data.tar.gz)
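
If you are training locally, one way to fetch and unpack the archive is
sketched below. The exact commands and the resulting directory layout are
assumptions that depend on your environment, so inspect the extracted files
before running the scripts.

```shell
# Download the gesture dataset and unpack it next to the training scripts.
# Assumes curl and tar are available on your system.
$ curl -O http://download.tensorflow.org/models/tflite/magic_wand/data.tar.gz
$ tar xvzf data.tar.gz
```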

### Training in Colab

The following [Google Colaboratory](https://colab.research.google.com)
notebook demonstrates how to train the model. It's the easiest way to get
started:

<table class="tfo-notebook-buttons" align="left">
  <td>
    <a target="_blank" href="https://colab.research.google.com/github/tensorflow/tensorflow/blob/master/tensorflow/lite/micro/examples/magic_wand/train/train_magic_wand_model.ipynb"><img src="https://www.tensorflow.org/images/colab_logo_32px.png" />Run in Google Colab</a>
  </td>
  <td>
    <a target="_blank" href="https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/micro/examples/magic_wand/train/train_magic_wand_model.ipynb"><img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png" />View source on GitHub</a>
  </td>
</table>

If you'd prefer to run the scripts locally, use the following instructions.

### Running the scripts

Use the following command to install the required dependencies:

```shell
pip install numpy==1.16.2 tensorflow==2.0.0-beta1
```
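
To confirm that the pinned versions were picked up, you can optionally print
them; this check is just a convenience and not part of the original workflow:

```shell
# Print the installed TensorFlow and NumPy versions.
$ python -c "import tensorflow as tf, numpy as np; print(tf.__version__, np.__version__)"
```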

There are two ways to train the model:

- Random data split, which mixes different people's data together and randomly
  splits them into training, validation, and test sets
- Person data split, which splits the data by person

#### Random data split

Using a random split results in higher training accuracy than a person split,
but inferior performance on new data.

```shell
$ python data_prepare.py

$ python data_split.py

$ python train.py --model CNN --person false
```

#### Person data split

Using a person data split results in lower training accuracy but better
performance on new data.

```shell
$ python data_prepare.py

$ python data_split_person.py

$ python train.py --model CNN --person true
```

#### Model type

In the `--model` argument, you can provide `CNN` or `LSTM`. The CNN model has a
smaller size and lower latency.
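
For example, after preparing and splitting the data as described above, the
LSTM variant can be trained by changing the `--model` flag:

```shell
# Train the LSTM model on the person data split prepared earlier.
$ python train.py --model LSTM --person true
```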

## Collecting new data

To obtain new training data using the
[SparkFun Edge development board](https://sparkfun.com/products/15170), you can
modify one of the examples in the [SparkFun Edge BSP](https://github.com/sparkfun/SparkFun_Edge_BSP)
and deploy it using the Ambiq SDK.

### Install the Ambiq SDK and SparkFun Edge BSP

Follow SparkFun's
[Using SparkFun Edge Board with Ambiq Apollo3 SDK](https://learn.sparkfun.com/tutorials/using-sparkfun-edge-board-with-ambiq-apollo3-sdk/all)
guide to set up the Ambiq SDK and SparkFun Edge BSP.

#### Modify the example code

First, `cd` into
`AmbiqSuite-Rel2.2.0/boards/SparkFun_Edge_BSP/examples/example1_edge_test`.

##### Modify `src/tf_adc/tf_adc.c`

On line 62, add `true` as the second parameter of the
`am_hal_adc_samples_read` function call.

##### Modify `src/main.c`

Add the line below inside `int main(void)`, just before the `while(1)` loop:

```cc
am_util_stdio_printf("-,-,-\r\n");
```

Change the following line in `while(1){...}`:

```cc
am_util_stdio_printf("Acc [mg] %04.2f x, %04.2f y, %04.2f z, Temp [deg C] %04.2f, MIC0 [counts / 2^14] %d\r\n", acceleration_mg[0], acceleration_mg[1], acceleration_mg[2], temperature_degC, (audioSample) );
```

to:

```cc
am_util_stdio_printf("%04.2f,%04.2f,%04.2f\r\n", acceleration_mg[0], acceleration_mg[1], acceleration_mg[2]);
```

#### Flash the binary

Follow the instructions in
[SparkFun's guide](https://learn.sparkfun.com/tutorials/using-sparkfun-edge-board-with-ambiq-apollo3-sdk/all#example-applications)
to flash the binary to the device.

#### Collect accelerometer data

First, in a new terminal window, run the following command to begin logging
output to `output.txt`:

```shell
$ script output.txt
```

Next, in the same window, use `screen` to connect to the device:

```shell
$ screen ${DEVICENAME} 115200
```
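
Here `${DEVICENAME}` should hold the device file of the board's USB-serial
connection. The exact path varies by operating system and adapter, so the
values below are only illustrative; list the `/dev` entries that appear when
the board is plugged in and export the matching one:

```shell
# Look for the board's serial device (names differ per machine).
$ ls /dev/cu.*       # macOS
$ ls /dev/ttyUSB*    # Linux
# Export the path so the screen command above can use it.
$ export DEVICENAME=/dev/ttyUSB0   # placeholder path; substitute your own
```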

The accelerometer readings will be shown on the screen and saved to
`output.txt`, one "x,y,z" sample per line.

Press the `RST` button to start capturing a new gesture, then press Button 14
when the gesture ends. Each new recording begins with a "-,-,-" line.

To exit `screen`, press `Ctrl+A`, immediately followed by the `K` key, then the
`Y` key. Then run

```shell
$ exit
```

to stop logging data. The data will be saved in `output.txt`. For compatibility
with the training scripts, rename the file so that it includes the gesture name
and the person's name, in the following format:

```
output_{gesture_name}_{person_name}.txt
```
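
For example, if a person named "jane" recorded the "wing" gesture, the captured
log could be renamed like this ("jane" is a placeholder; use the actual names):

```shell
# Rename the captured log to the naming scheme expected by the training scripts.
$ mv output.txt output_wing_jane.txt
```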

#### Edit and run the scripts

Edit the following files to include your new gesture names (replacing "wing",
"ring", and "slope"):

- `data_load.py`
- `data_prepare.py`
- `data_split.py`

Edit the following files to include your new person names (replacing "hyw",
"shiyun", "tangsy", "dengyl", "jiangyh", "xunkai", "lsj", "pengxl", "liucx",
and "zhangxy"):

- `data_prepare.py`
- `data_split_person.py`
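
A quick way to locate every occurrence of the old gesture and person names in
these scripts is a simple search; the command below is only a convenience and
not part of the original workflow:

```shell
# Show line numbers of the hard-coded names that need updating.
$ grep -nE "wing|ring|slope|hyw" data_load.py data_prepare.py data_split.py data_split_person.py
```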

Finally, run the commands described earlier to train a new model.
192