Modeling an object with internal IMUs

[Joseph Malloch] sent in a really cool video of him modeling a piece of foam twisting and turning in 3D space.

To capture the twists, bends, and turns of his piece of foam, [Joseph] used several inertial measurement units (IMUs) to track the shape of a deformable object. Each IMU combines a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis magnetometer to track its orientation and movement in 3D space. When these IMUs are placed along a deformable object, the data can be sent to a computer and the object can be reconstructed in virtual space.
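
If you're wondering how the shape reconstruction might work, here's a minimal sketch in Python/NumPy (our own rough guess, not the IDMIL code): assume each IMU's sensor fusion already spits out an orientation quaternion, interpolate between the orientations measured at the two ends of the object, and integrate a unit "forward" vector along its length to recover the curve. The function names, the single flexible segment, and the slerp-based blending are all illustrative assumptions.

```python
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    dot = np.dot(q0, q1)
    if dot < 0.0:                      # take the shorter path around the sphere
        q1, dot = -q1, -dot
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    if theta < 1e-6:                   # orientations nearly identical
        return q0
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def reconstruct(q_start, q_end, length=1.0, segments=20):
    """Trace the object's centerline by blending smoothly between the
    orientations measured at its two ends."""
    points = [np.zeros(3)]
    for i in range(segments):
        q = slerp(q_start, q_end, (i + 0.5) / segments)
        q = q / np.linalg.norm(q)
        forward = rotate(q, np.array([1.0, 0.0, 0.0]))   # object's local axis
        points.append(points[-1] + (length / segments) * forward)
    return np.array(points)

# Example: one end level, the other bent 90 degrees about the vertical (Z) axis
q_a = np.array([1.0, 0.0, 0.0, 0.0])
q_b = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
print(np.round(reconstruct(q_a, q_b), 3))
```

Slerp keeps the bend evenly spread between the two measured orientations; with more IMUs along the object you'd simply chain interpolations segment by segment.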

This project comes from the fruitful minds at the Input Devices and Music Interaction Lab at McGill University in Montreal. While we’re not quite sure how modeled deformable objects could be used in a user interface, what use is a newborn baby? If you’ve got an idea of what this could be used for, drop a note in the comments. Maybe the Power Glove needs an update – an IMU-enabled jumpsuit that would put the Kinect to shame.

Video: http://www.youtube.com/watch?v=-Dqvf1CXPWg

Comments

  1. japamalaillo says:

    First thing that comes to my mind is a robotic tentacle.

    Also iron man.

  2. anomdebus says:

    How about letting tactile-oriented people model things in 3D using their preferred method?

  3. Zee says:

    A complex robotic arm like a tentacle would be perfect for this

  4. adr says:

    Cybersextoys.. of course

  5. umi says:

    Teledildonics; Don’t tell me you weren’t thinking it.

  6. matthew katzenstein says:

    If you are thinking about an IMU suit, you should check out the work they did for Skrillex on his Mothership tour. He had an IMU on every major joint as well as gloves, and the data points were rendered into a skeleton, then rendered with different screens and projected onto the stage. It was VERY cool, but there was some very noticeable lag.
    http://www.youtube.com/watch?v=e2ISfFmN0UE

  7. rasz says:

    delay kills the effect

    • Techartisan says:

      Have to agree… would have been a bit better if:
      1. he wasn't visible on stage
      2. the music had a delay sync'd with the sensor/render lag.

  8. jez says:

    There are a lot of applications in biomechanics. I designed a similar (less visually impressive) radio-based system for my university thesis.

  9. basroil says:

    I’m actually developing a robotic snake that happens to produce similar results using just a single IMU. And it can move itself.

    I think they should have just stuck to one IMU and three pull sensors per segment; that would require a lot less data and allow higher time resolution than IMU chips (even with SPI, a dozen chips wouldn't be able to do more than 100 fps).

    • There are only two IMUs used in the video – one at each end of the object.

      There are a few advantages to using IMUs instead of pull sensors. Pull sensors are mechanically weak and will break after a while, and they depend on the object to be sensed being stiff enough – the IMU approach will work even on very flexible objects. Also, using the IMUs we can easily sense twisting of the object, which is much more complex with pull sensors.

  10. Dr T says:

    Cool, can anyone see the part number or manufacturer for this all-in-one IMU device?

  11. brassomat says:

    It might do a hell of a job when used for motion capture in movie animation (Pixar…). Why program every little movement of a virtual human when you can just put on the motion-capture suit and simply play the movement?
    Or, one step further: first 3D-scan a person or animal into your animation studio, then record the motion and deformations with these sensors!
    Car crash tests would be another application. Recording the material deformation in real time would make dozens of high-speed cameras obsolete.

  12. MusashiAharon says:

    I think this could be used for robots with different limbs, as a secondary positioning system. Instead of having to calculate the acceleration felt in the limbs based on rotary encoders and servo data (which might not account for bending and oscillation), you could just measure it.

    Also, I think this is the way most people sense where their limbs are. It could lead to robots that walk with more natural gaits than ever before.

  13. R3L1C says:

    I find this too complicated, costly, and resource-intensive… how about one or two cheap webcams and color-coding the freakin' snake? As with the 3D glove, using one webcam?

    • This is a test for objects that will be used by dancers for a project we're working on at the IDMIL. Webcams would not work for us since we need very nuanced control, the area to be covered is huge, the lighting will be unpredictable, and the dancers' movements will affect visibility. Also, the visual appearance of the objects is important to the piece! (What you see in the video is only for testing the sensors.)

      Of course, this method of sensing is not necessarily the best for every project, but for this one it is working very well. IMUs are quickly becoming much cheaper, so it’s also interesting to try out some new approaches to using them…

      • R3L1C says:

        I see… I'm not sure about efficiency, but I've got one more idea for you then, though I'm not sure about the conditions; a dancing environment is the worst-case scenario, I guess. How about IR paint and fiducials instead, and webcams without the IR filter? Just an idea.

      • R3L1C says:

        Sorry for the double post, but considering it in more detail, I guess the complicated hardware really is essential in your setup. Pardon me.
