Assignment 2 - Gesture Recognition on an Android Phone
You will build an Android app that uses the accelerometer and/or gyroscope to distinguish between 3 different hand gestures performed while the phone is held facing up in your hand. You get to define the gesture set; examples of possible gestures include flicking the phone, rotating the phone, or drawing letters in the air. Your three gestures cannot be the same type of gesture performed in different directions. You could probably distinguish between these gestures using signal processing and heuristics as in Assignment 1; however, we want you to use machine learning this time around. Here is the step-by-step of how the app will work:
- The app will collect 3-5 examples of each gesture
- You will compute various features that summarize the gestures in a manner that makes them easy to distinguish (hint: the first two gestures involve rotating the phone while the third gesture does not, so you will want at least one feature that captures the notion of rotation)
- The app will train a classification model offline on the data you collected
- You should then be able to perform one of the gestures again and have the app correctly identify which one you performed. If your app is not predicting correctly, you may need to consider new features, more training examples, or a different model.
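To make the feature and classification steps above concrete, here is a minimal plain-Java sketch. The names (`rotationFeatures`, `GestureSketch`) and the choice of a 1-nearest-neighbor classifier are illustrative assumptions, not part of the starter code; note how the mean and peak gyroscope magnitude capture the notion of rotation mentioned in the hint.

```java
import java.util.List;

// Illustrative sketch (not from the starter code): summarize a recorded
// gesture into a fixed-length feature vector, then classify a new gesture
// by comparing its features against the training examples.
class GestureSketch {
    // Each sample is a float[3] gyroscope reading (x, y, z) in rad/s.
    // The gyroscope magnitude measures "how much rotation" is happening,
    // which separates rotation gestures from non-rotation ones.
    static float[] rotationFeatures(List<float[]> gyro) {
        float sum = 0f, max = 0f;
        for (float[] s : gyro) {
            float mag = (float) Math.sqrt(s[0] * s[0] + s[1] * s[1] + s[2] * s[2]);
            sum += mag;
            if (mag > max) max = mag;
        }
        // Feature vector: mean rotation rate and peak rotation rate.
        return new float[] { sum / gyro.size(), max };
    }

    // 1-nearest-neighbor classification: return the label of the training
    // example whose feature vector is closest (squared Euclidean distance).
    static String classify(List<float[]> trainFeatures, List<String> labels, float[] query) {
        int best = 0;
        double bestDist = Double.MAX_VALUE;
        for (int i = 0; i < trainFeatures.size(); i++) {
            double d = 0;
            for (int j = 0; j < query.length; j++) {
                double diff = trainFeatures.get(i)[j] - query[j];
                d += diff * diff;
            }
            if (d < bestDist) { bestDist = d; best = i; }
        }
        return labels.get(best);
    }
}
```

A real app would add more features (e.g. per-axis statistics, duration, accelerometer energy) and likely a classifier from a library rather than hand-rolled nearest-neighbor, but the overall pipeline (record samples, extract features, train, predict) is the same.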
Grading:
- (50 points) We will briefly examine your code to ensure that you are using some form of machine learning to identify which gesture is being performed.
- (50 points) You will perform each gesture 3 times with the phone in your hand. -5 points for each mistake.
Starting Materials:
Starter code: link
Deliverables:
Source code (.zip, .tar, or GitHub link): link
In-class demo