Tuesday, August 13, 2013

Capturing Arm Motion in Real-time using Two IMUs


Created by Cheni Chadowitz, Rui Wang, Abhinav Parate, and Deepak Ganesan


Introduction

This project was one step toward using a single, wrist-worn sensor to recognize the motions we make throughout our day in the midst of various activities. Activities like eating, drinking, running, and smoking all have unique motions associated with them, and by using an IMU (inertial measurement unit) worn on the wrist, we hope to be able to recognize when known activities begin and end. For our purposes, we used the MPU-9150, produced by InvenSense Inc. The MPU-9150 is a 9-axis sensor that includes a 3-axis gyroscope, a 3-axis accelerometer, and a 3-axis compass. Together, these sensors can produce an error-corrected quaternion that indicates the 3D orientation of the sensor with respect to magnetic north and gravity.
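As a concrete sketch of how such an orientation quaternion is used, the snippet below rotates a 3D vector by a unit quaternion q = (w, x, y, z). This is generic quaternion math, not code from the project itself:

```java
public class QuatDemo {
    // Rotates vector v by unit quaternion q = {w, x, y, z}, i.e. v' = q v q^-1,
    // expanded into the standard cross-product form to avoid a full
    // quaternion multiply.
    static double[] rotate(double[] q, double[] v) {
        double w = q[0], x = q[1], y = q[2], z = q[3];
        // t = 2 * cross(q.xyz, v)
        double tx = 2 * (y * v[2] - z * v[1]);
        double ty = 2 * (z * v[0] - x * v[2]);
        double tz = 2 * (x * v[1] - y * v[0]);
        // v' = v + w * t + cross(q.xyz, t)
        return new double[] {
            v[0] + w * tx + (y * tz - z * ty),
            v[1] + w * ty + (z * tx - x * tz),
            v[2] + w * tz + (x * ty - y * tx)
        };
    }

    public static void main(String[] args) {
        // A 90-degree rotation about the z-axis: w = cos(45 deg), z = sin(45 deg).
        double s = Math.sqrt(0.5);
        double[] r = rotate(new double[] { s, 0, 0, s }, new double[] { 1, 0, 0 });
        System.out.printf("%.3f %.3f %.3f%n", r[0], r[1], r[2]); // prints 0.000 1.000 0.000
    }
}
```

Feeding the sensor's quaternion through a rotation like this is what lets the software point each limb segment in the right direction.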

MPU-9150 by InvenSense Inc.
Source: http://invensense.com/mems/gyro/documents/AN-MPU-9150IMF%20InvenSense%20MotionFit%20SDK%20Quick%20Start%20Guide.pdf
For this project in particular, we used two MPU-9150 sensors to produce a realistic, real-time graphical representation of an arm moving in 3D space. By mounting one sensor on the bicep just below the shoulder, and a second sensor on the wrist, we can reconstruct the orientation of the entire arm in 3D space. We found this representation to be fairly intuitive when used to identify the activity the subject is partaking in at the time. Using this graphical representation, we will gather and label a large amount of training data so that we are able to recognize a variety of activities when subjects are wearing a single, wrist-mounted sensor.
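The two-segment reconstruction can be sketched as simple forward kinematics: each sensor's quaternion orients one segment, and positions follow by chaining the segments from the shoulder. The segment lengths and the resting "arm hangs down" direction below are assumptions for illustration, not values from the project:

```java
public class ArmPose {
    // Rotates vector v by unit quaternion q = {w, x, y, z}.
    static double[] rotate(double[] q, double[] v) {
        double w = q[0], x = q[1], y = q[2], z = q[3];
        double tx = 2 * (y * v[2] - z * v[1]);
        double ty = 2 * (z * v[0] - x * v[2]);
        double tz = 2 * (x * v[1] - y * v[0]);
        return new double[] {
            v[0] + w * tx + (y * tz - z * ty),
            v[1] + w * ty + (z * tx - x * tz),
            v[2] + w * tz + (x * ty - y * tx)
        };
    }

    // Returns {elbow, wrist} positions (shoulder at the origin) given each
    // sensor's orientation quaternion and assumed segment lengths in meters.
    static double[][] armPositions(double[] qUpper, double[] qFore,
                                   double upperLen, double foreLen) {
        // With identity quaternions the arm hangs straight down (-y).
        double[] upper = rotate(qUpper, new double[] { 0, -upperLen, 0 });
        double[] fore = rotate(qFore, new double[] { 0, -foreLen, 0 });
        double[] elbow = upper;
        double[] wrist = { elbow[0] + fore[0], elbow[1] + fore[1], elbow[2] + fore[2] };
        return new double[][] { elbow, wrist };
    }

    public static void main(String[] args) {
        double[] identity = { 1, 0, 0, 0 };
        double[][] p = armPositions(identity, identity, 0.30, 0.25);
        System.out.printf("elbow y=%.2f, wrist y=%.2f%n", p[0][1], p[1][1]);
    }
}
```

Note that the forearm segment uses the wrist sensor's absolute orientation directly, so no elbow-angle estimation is needed: both segments are posed independently and simply joined at the elbow.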

We show a brief demo in the video above. Both sensors were mounted in the correct alignment with elastic medical bandages, and then the subject went through a variety of short motions. The video on the left is a real-time representation of the orientations the sensors are producing, shown by manipulating a 3D model of a mannequin. The video on the right is of the subject himself. The software used is described and provided below.

Software Setup:

The software provided was developed using Processing due to its ease of use and how quickly it lets you get started. Since Processing is Java-based, you can view and edit the source code in any Java environment; however, it may be easier to download and install the Processing IDE and edit the source code there.

Currently, the latest version of the software is available on GitHub, but a .zip archive download is available here, too.

Bluetooth setup:

In Windows and Mac OS X, you must first pair the two sensors with your computer. It is normally easier if you pair one at a time and wait for each device to complete the setup before pairing the next.

The software has not been tested under Linux; however, if you are able to pair both sensors (as with Windows and Mac) and create a serial port for each, it may work successfully. The serial library included with Processing needs to be able to access the serial port. Check here for possible tips.

Once you have paired your sensors, you need to edit the config.json file located in the data folder to indicate the correct serial ports for each sensor. If you start the sketch (or the included executables) without changing the configuration, it will display the available serial ports so that you can use the correct values in the configuration file. Once you have saved the configuration, it will attempt to connect to the sensors the next time you start it.
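As a hypothetical example of what the configuration might look like (the field names here are illustrative only, not the project's actual schema; check the config.json shipped with the software for the real keys), a setup on Windows could resemble:

```json
{
  "sensor1_port": "COM5",
  "sensor2_port": "COM7"
}
```

On Mac OS X or Linux the port values would instead be /dev/tty.* device paths as reported by the sketch's port listing.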

Make sure to check below for your particular operating system to see any additional tips and details.

Windows: 

At the time this was written, the version of the serial library included with Processing (RXTXcomm.jar) has a bug that prevents it from working correctly. This will only affect you if you try to edit the source code in the Processing IDE after adding it as a sketch; the included standalone executables were already built with the fixed serial library. To fix the bug, follow these steps (after you have installed Processing):

  1. Download and extract the zip archive containing the fixed library here.
  2. Open [Processing folder]\modes\java\libraries\serial\library\
  3. Place the included RXTXcomm.jar library in the location you opened in the previous step.
  4. Place the included rxtxSerial.dll file in the windows32 folder.

Mac OS X:

If you are using the provided executable, you must edit the config.json file first. However, it is located inside the app itself, so to edit the configuration file you must first right-click on the app and choose Show Package Contents. Then browse to Contents/Resources/Java/data/, where you should find the config.json file.

To run the executable, follow the instructions in the provided readme.txt file.


Running the sketch:

Once you have Processing installed, copy the source code into your Sketchbook folder. To find out where this is located, launch the Processing IDE and go to File > Preferences.
If you would like to use a different model, save your .obj files in the data folder within the sketch.

The source code has been commented so that it is easy to understand and edit.


Features:

There are a few simple features available. 

  • If you run the sketch without specifying the serial port numbers in the config.json file, it will list the available ports so that you can easily specify the correct ones in the config file. Once the sketch has successfully connected to the sensors, you should see the arm moving in conjunction with the sensors. If one or both segments of the arm do not appear to be moving, you may have selected the sensor's input port rather than its output port (each sensor exposes two serial ports because it can also receive commands). 
  • If you would like to see (below right) or hide (below left) the full human model while controlling the arm, press t while the window has focus to toggle that view.
Hiding the full body (left). Showing the full body (right).




  • By clicking and dragging the mouse horizontally across the window, you can rotate the scene.
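The sketch turns each sensor's serial stream into orientation values before driving the model. Assuming, hypothetically, that each line on the wire is a comma-separated quaternion "w,x,y,z" (the project's actual wire format may differ), a small parser for that text would look like:

```java
public class QuatLine {
    // Parses a "w,x,y,z" line into a normalized quaternion,
    // or returns null if the line is malformed.
    static double[] parse(String line) {
        String[] parts = line.trim().split(",");
        if (parts.length != 4) return null;
        double[] q = new double[4];
        double normSq = 0;
        try {
            for (int i = 0; i < 4; i++) {
                q[i] = Double.parseDouble(parts[i]);
                normSq += q[i] * q[i];
            }
        } catch (NumberFormatException e) {
            return null;
        }
        if (normSq == 0) return null;
        double norm = Math.sqrt(normSq);
        // Re-normalize to guard against rounding in the transmitted values.
        for (int i = 0; i < 4; i++) q[i] /= norm;
        return q;
    }

    public static void main(String[] args) {
        double[] q = parse("2,0,0,0");
        System.out.printf("%.2f %.2f %.2f %.2f%n", q[0], q[1], q[2], q[3]);
    }
}
```

Returning null on malformed input lets the sketch simply skip a corrupted line and wait for the next one, which matters over a Bluetooth serial link where occasional garbage bytes are expected.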

Hardware Setup:

The software has been designed with the expectation that the sensors will be worn in a particular orientation (as shown below). 

The first sensor should be worn on the outside of the right bicep, just below the shoulder. The micro-USB port should be pointing up, the on/off switch should be pointing forward, and the battery should be resting against your arm.

The second sensor should be worn on the top of your right wrist, where you typically have the face of a watch. Again, with your arm hanging straight down by your side, the micro-USB port should be pointing up with the on/off switch pointing forward, and the battery resting against your wrist.

The two sensors should be in line with each other when your arm is hanging down by your side.




In the video shown above, we used a 4" wide elastic bandage (normally used for wrapping sprains, etc.) cut lengthwise into two 2" wide strips, each about 2 ft long. We wrapped them around the arm a couple of times first, then positioned the sensors and continued wrapping with the rest of the bandage. A more convenient solution may be an arm or wrist band with a slot for a music player.



7 comments:

  1. This comment has been removed by the author.

  2. Great job fellows, next step is a fighting simulator ;)
    I'm working on a project with IMU's and this product seems useful (or I'm considering razor IMU clones and hacks). There's a breakout board for the MPU-9150 at Sparkfun, and people complain that the actual chip does not output processed data (Sensor Fusion - MLDL Layer is hidden), but only raw values. You have to use their API for this purpose etc. I'm guessing the wireless evaluation board you use (which is $200) lets the onboard MCU deal with this part (I didn't see no such API in the code, or did I miss it)? I wonder if the regular evaluation board outputs processed data as well, since I don't need wireless access that bad...

    1. That's right, the board we are using calculates processed data in the firmware. I believe their firmware code is open-source, so it should be possible to port it from MCU code to standard C code.

    2. trying to add a physics engine, http://www.flickr.com/photos/mrisney/10548440094/

      https://github.com/mrisney/accelerometer-visualization/blob/master/src/com/risney/motion/Squash.java

      any tips ?

    3. (Note: I am the author of the blogpost)

      I haven't truly worked with a physics engine, but from what I know (assuming you already have a physics engine available to use), there are two options.
      1) Use the accelerometer (and gyroscopic) data to compute the force applied to the object in the scene that you are controlling, and let the physics engine take care of the rest of the interactions.
      2) Use the same process described in this post to get an orientation for the object in the scene that you are controlling, and let the physics engine take care of any collisions, etc, that may occur during the process of making the orientation change.

  3. Is it possible to use an ArduIMU sensor to replace that particular sensor that you used?

    1. (Note: I am the author of the blogpost)

      I don't see why not, as long as it has the same types of sensors built-in, or can provide similar data (i.e. an orientation). If it doesn't do the error-correction onboard, however, you will have to implement that yourself. One reason for using this particular sensor is that it uses the raw data from the various sensors to self-correct on the fly and prevent large drifts.
