This project is dedicated to Eadweard Muybridge, who captured the first photographic recordings of humans and animals in motion, published in his famous "Animal Locomotion" in 1887. Click here to learn more about Muybridge's history.
Digital Muybridge is about the analysis and synthesis of human motion from video streams (or photo-plate sequences, as in Muybridge's motion studies). We focus on articulated full-body motions; examples are videos of people walking, running, dancing, and performing other body gestures. Earlier projects focused on facial motion, such as "talking lips and heads".
This work is funded in part by Interval Research Corp. and the California State MICRO program.
|Here are some example MPEG movies of articulated body tracking on lab recordings. Note that this is very challenging footage: watch the moving folds and changing appearance of the upper leg. Any standard region-based tracker will fail on this.||3D!|
|We are also able to recover Eadweard Muybridge's motion-study plates. Here is an example movie (play it at a slow frame rate!). We will put more plates online soon, but if you can't wait, ask for our video tape.|
|First Level: Motion Coherence Blobs: Given two consecutive images, we group pixel regions with coherent motion and spatial proximity using an Expectation-Maximization (EM) search. Blob models are initialized by clustering the optical-flow field vectors and then iteratively refined with alternating E- and M-steps.||
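The grouping step above can be sketched as a small EM loop over per-pixel features. This is an illustrative toy, not our implementation: each pixel contributes a feature (x, y, u, v) combining position and flow, the flow field and the isotropic-Gaussian blob model are invented for the example, and the real system works on dense flow from image pairs.

```python
# Toy EM sketch: group flow vectors by motion coherence + spatial proximity.
# Feature per pixel: (x, y, u, v). All data and model choices are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic flow field: two spatial regions with different translations.
xy_a = rng.uniform(0, 10, size=(100, 2))
uv_a = np.array([1.0, 0.0]) + 0.1 * rng.standard_normal((100, 2))
xy_b = rng.uniform(20, 30, size=(100, 2))
uv_b = np.array([0.0, -1.0]) + 0.1 * rng.standard_normal((100, 2))
X = np.vstack([np.hstack([xy_a, uv_a]), np.hstack([xy_b, uv_b])])

K, (n, d) = 2, X.shape
mu = X[[0, -1]].copy()            # init blob means from two sample pixels
var = np.full(K, 10.0)            # isotropic variance per blob
pi = np.full(K, 1.0 / K)          # mixing weights

for _ in range(20):
    # E-step: responsibility of each blob for each pixel feature.
    sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
    logp = -0.5 * sq / var - 0.5 * d * np.log(var) + np.log(pi)
    logp -= logp.max(axis=1, keepdims=True)
    r = np.exp(logp)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate blob means, variances, and mixing weights.
    Nk = r.sum(axis=0)
    mu = (r.T @ X) / Nk[:, None]
    sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
    var = (r * sq).sum(axis=0) / (d * Nk)
    pi = Nk / n

labels = r.argmax(axis=1)         # hard assignment of pixels to blobs
```

With the two regions well separated in both position and motion, the two blobs lock onto them within a few iterations.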
|The full architecture can be seen in the following MPEG movie. Only one motion blob is illustrated. The center display shows the translational and angular velocities, and the top display shows the state probabilities for the different moveme categories. Note that state one is learned to roughly correspond to "ground support".|
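The state probabilities in the top display can be thought of as recursive filtering over moveme categories given the velocity observations. The sketch below is a toy two-state forward filter; the transition matrix, emission means, and observation stream are all invented for illustration, whereas the real moveme model is learned from data.

```python
# Toy forward filter: probability of each "moveme" state given observations.
# All numbers (transitions, emissions, observations) are made up.
import numpy as np

A = np.array([[0.9, 0.1],        # hypothetical state-transition matrix
              [0.2, 0.8]])
means = np.array([0.0, 1.0])     # hypothetical emission means (e.g. velocity)

def emission(o):
    # Gaussian likelihood of observation o under each state, unit variance.
    return np.exp(-0.5 * (o - means) ** 2)

belief = np.array([0.5, 0.5])    # uniform prior over the two states
obs = [0.1, 0.0, 0.9, 1.1, 1.0]
for o in obs:
    belief = emission(o) * (A.T @ belief)   # predict, then weight by likelihood
    belief /= belief.sum()                  # renormalize to probabilities
```

After the later observations near 1.0, the belief concentrates on state two, mirroring how the display's bars shift as the gait phase changes.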
Also check out the book chapter that describes the early stages of this research:
|Muybridge recorded each time step with several camera views. This enables us to recover detailed 3D information. We were able to re-animate several people's walk cycles with a computer model and inspect them from any desired viewing angle. This was done in OpenGL by Charles Ying. Movies will come soon to this web page.|
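Recovering 3D from several views boils down to triangulation once the cameras are calibrated. A minimal sketch using linear (DLT) triangulation from two views is below; the two projection matrices and the 3D point are made up for the example, and Muybridge's plates would of course first require camera calibration.

```python
# Sketch: recover a 3D point from two views by linear (DLT) triangulation.
# Camera matrices and the test point are hypothetical.
import numpy as np

# Two 3x4 projection matrices; camera 2 is translated along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.3, -0.2, 4.0, 1.0])   # homogeneous 3D point

def project(P, X):
    x = P @ X
    return x[:2] / x[2]                    # pixel coordinates

x1, x2 = project(P1, X_true), project(P2, X_true)

# DLT system: each view contributes two linear constraints on X.
A = np.vstack([
    x1[0] * P1[2] - P1[0],
    x1[1] * P1[2] - P1[1],
    x2[0] * P2[2] - P2[0],
    x2[1] * P2[2] - P2[1],
])
_, _, Vt = np.linalg.svd(A)
X_rec = Vt[-1]
X_rec /= X_rec[3]                          # de-homogenize
```

With noise-free projections the SVD null vector recovers the point exactly; with real image measurements one would use more views and a least-squares or robust fit.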