From 2003 to 2005 I worked as a programmer on a startup project that was creating a motion capture system to speed up the production of 3D animation.
The idea was to build a mechanical suit that measured the angles of the wearer's body joints. This data was sent to a PC, analyzed in a module I developed, and transformed into commands for a 3D renderer, which then produced a live image for preview.
Next, the recorded motion tracking session was exported to a text file, which a 3ds Max script then read to create animation keyframes. In this way, we could produce 3ds Max animations using our technology (Video 1).
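The export step above can be sketched in a few lines. The actual Polikarp file format and the 3ds Max importer script are not documented here, so the "frame joint angle" line layout below is purely an assumption for illustration:

```python
# Sketch of exporting a recorded tracking session to a plain-text file
# that a 3ds Max (MaxScript) importer could turn into animation keyframes.
# NOTE: the line format "frame joint angle" is a hypothetical layout,
# not the real Polikarp format.

def export_session(frames):
    """frames: list of (frame_number, {joint_name: angle_in_degrees})."""
    lines = []
    for frame_no, joints in frames:
        for joint, angle in sorted(joints.items()):
            lines.append(f"{frame_no} {joint} {angle:.2f}")
    return "\n".join(lines)

# Two recorded frames for two joints (made-up joint names).
session = [
    (0, {"elbow_r": 10.0, "knee_l": 0.0}),
    (1, {"elbow_r": 12.5, "knee_l": 3.0}),
]
print(export_session(session))
```

On the 3ds Max side, a MaxScript would parse each line and set a keyframe for the named joint at the given frame.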
There were several versions of the sensors. The first was purely mechanical with "computer vision": joint rotations were converted into the motion of pear-shaped markers ("pears"), which were detected by a webcam (Fig. 1; the webcam is in the box). Later versions used custom sensors whose values were measured and passed to the PC through the sound card input (as amplitude pulses).
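Reading sensor values through a sound-card input means recovering each pulse from the sampled waveform and mapping its amplitude back to a joint angle. The real Polikarp decoding is not documented here; the threshold-based pulse detection and the linear amplitude-to-angle mapping below are assumptions sketching one way it could work:

```python
# Hypothetical sketch: decode amplitude pulses from a sampled audio
# signal (values normalized to -1..1) and map them to joint angles.
# The threshold and the linear mapping are assumptions, not the real
# Polikarp parameters.

def decode_pulses(samples, threshold=0.1):
    """Return the peak amplitude of each pulse, where a pulse is a
    run of consecutive samples whose magnitude exceeds the threshold."""
    peaks = []
    current_peak = None
    for s in samples:
        if abs(s) > threshold:
            if current_peak is None or abs(s) > current_peak:
                current_peak = abs(s)
        elif current_peak is not None:
            peaks.append(current_peak)  # pulse ended; record its peak
            current_peak = None
    if current_peak is not None:
        peaks.append(current_peak)  # signal ended mid-pulse
    return peaks

def amplitude_to_angle(amplitude, max_angle=180.0):
    """Map a normalized pulse amplitude (0..1) linearly to an angle."""
    return amplitude * max_angle

# Example: a signal containing two pulses with peaks 0.5 and 1.0.
signal = [0.0, 0.2, 0.5, 0.3, 0.0, 0.0, 0.4, 1.0, 0.6, 0.0]
angles = [amplitude_to_angle(p) for p in decode_pulses(signal)]
print(angles)  # [90.0, 180.0]
```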
Today we have Kinect, Leap Motion, Arduino, and BlackTrax technologies for motion capture, as well as Unreal Engine for real-time rendering of 3D objects. But in 2003-2005 such things were out of reach for us in Russia, and we wondered whether we could build one ourselves.
Unfortunately, motion tracking based on a person's internal joint angles worked worse than expected; only the glove worked reasonably well. So, from our experience, I can say that external positioning is much better for human motion detection.
The Polikarp motion capture system was made by Alexey Moiseev (idea, head of development), Michail Chameev (capture system and electronics), Pavel Vasev, Denis Perevalov, Michail Bachterev, Vyacheslav Kusnetsov (software), and Grigory Malyschev (3D design).
Fig. 1. Mechanical motion tracking suit: the first version, using pears and a webcam.
Fig. 2. Mechanical motion tracking glove. You can see the "black box" with pears, whose motion is captured by a webcam.
Video 1. A video made with the Polikarp motion capture system (the latest version of motion tracking, using electromechanical sensors). Thanks to Grigory Malyschev for the video.