Gesture Logging Garment

This is a project done with Laura Herrera Cisneros and Jess Peter for Kate Hartman’s wearable technology class.

Case:

Jim has a long string of interviews coming up with various companies. He has always been worried about his body language under stress, but wonders if he is overreacting. To check whether he really comes off as closed off and unfriendly, he decides to try out this gesture logging garment.

Jim activates the garment before the first interview and goes in. He is a little stressed but feels confident about his performance. Once home, he takes the SD card out of the garment and checks the graphs of his gestures over time. Everything seems reasonable, but he notices that he may have appeared nervous, since he scratched his collarbone more and more as the interview progressed. By self-monitoring his body language, Jim is able to limit these nervous gestures and appear more confident in his following interviews.

Overview:

This garment logs a few basic gestures over any period of time. The wearer can then visualize them using a computer program. The following gestures are currently supported:

  1. Hold the left and right sides
  2. Grip the left and right arms above the elbow
  3. Grip or scratch the collarbones on either side of the neck

The code:

There are two distinct areas of code: Arduino and Processing. The Arduino code can be seen in this gist. The goal was to keep the code small and scalable, so that supporting a new sensor only requires adding a new set of values to the five major arrays. The following excerpt demonstrates the basic logic of the program:

for(int i=0; i<6; i++){                       // one pass per sensor (pins A0-A5)
    difference = data[1][i] - data[2][i];     // current reading minus the last logged value
    if (abs(difference) >= data[3][i]) {      // compare against this sensor's threshold
        data[2][i] = data[1][i];              // store the current reading as the new old value
        logOut(i,currentMillis);              // write this gesture to the logger
    }
}

Each sensor has its own activation threshold. When the change in a sensor's reading meets its threshold, the program writes an entry to the logger and updates that sensor's stored old value to the current reading.
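
The logOut() helper itself is not shown in the excerpt; the full version lives in the gist. Since the logger simply records whatever arrives on the serial line, a minimal version might look like the sketch below. The one-line "timestamp,sensorIndex" format is an assumption for illustration, not necessarily what the gist writes:

// Hypothetical minimal logOut(): one comma-separated line per detected gesture.
// Serial stands for whichever port the logger is wired to (possibly Serial1 on this board).
void logOut(int sensorIndex, unsigned long timestamp) {
    Serial.print(timestamp);      // milliseconds since the sketch started
    Serial.print(",");
    Serial.println(sensorIndex);  // 0-5, which sensor crossed its threshold
}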

The Processing visualization code can be seen in this gist, and the result of running it can be seen below:

Grabbing the data is as easy as inserting the SD card into your computer and copying the log .txt file you want to analyze.

Visualization:

The Processing sketch breaks the data down in three distinct ways:

  • The bar chart provides an easy way to see which gesture is performed most often (see the sketch after this list).
  • The body nodes show exactly where and how often each body part is pressed.
  • The line graph plots gestures over time, allowing a quick overview of what was done when.
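
The actual graphing is done in the Processing sketch linked above. Purely to illustrate how the copied log file can be reduced to the bar-chart counts, here is a small stand-alone C++ sketch; the one-line "timestamp,sensorIndex" format and the GESTURES.TXT file name are assumptions for illustration, not the project's actual format:

// Stand-alone tally sketch (not the Processing code from the gist): counts how often
// each sensor fired, assuming one "timestamp,sensorIndex" line per gesture.
#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::ifstream logFile("GESTURES.TXT");  // hypothetical name of the copied log .txt
    int counts[6] = {0};                    // one bucket per sensor (A0-A5)
    std::string line;
    while (std::getline(logFile, line)) {
        std::string::size_type comma = line.find(',');
        if (comma == std::string::npos) continue;       // skip malformed or empty lines
        int sensor = std::stoi(line.substr(comma + 1)); // the part after the timestamp
        if (sensor >= 0 && sensor < 6) counts[sensor]++;
    }
    for (int i = 0; i < 6; i++)
        std::cout << "sensor " << i << ": " << counts[i] << " gestures\n";
    return 0;
}

Running it against a copied log prints one total per sensor, which is the same data the bar chart displays.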

Schematics:

Each sensor follows this basic schematic:

[Schematic: basic touch sensor]

As there are six sensors, this circuit is repeated six times; all of them share a single power line and are attached to pins A0 – A5. The sensors were created using conductive fabric and Velostat, which varies in conductivity with the pressure applied to it. All but the two sensors on the neck are built with two contact points, reducing the chance of accidental presses. The two contact points act in series, so if only one is pressed the value passing through the sensor stays very low, as the unpressed point remains highly resistive.

The data logger is wired straight to the serial output of the FIO V.3.
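
Putting the sensor pins and the serial-attached logger together, a minimal outline of the firmware's setup might look like the sketch below. The pin order, baud rate, threshold values, and data array layout are assumptions for illustration; the real firmware is in the gist:

// Minimal outline of the sensing side (assumed pin order, thresholds, and array layout).
const int sensorPins[6] = {A0, A1, A2, A3, A4, A5};
int data[4][6];   // assumed rows: 1 = current readings, 2 = previous readings, 3 = thresholds

void setup() {
    Serial.begin(9600);                // the logger records whatever arrives here (assumed baud rate)
    for (int i = 0; i < 6; i++) {
        data[2][i] = 0;                // previous readings start at rest
        data[3][i] = 100;              // placeholder per-sensor thresholds
    }
}

void loop() {
    for (int i = 0; i < 6; i++) {
        data[1][i] = analogRead(sensorPins[i]);   // 0-1023, rising as the pad is pressed
    }
    // ...the threshold/logging loop from "The code" section runs here...
    delay(50);                         // sample a few times per second
}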

Process Imagery:

Final worn piece: