Projects

Responsive Animated Characters

The MRL's Responsive Animated Character research began when Prof. Ken Perlin brought his groundbreaking noise techniques from image rendering to character movement. Characters created in this manner can smoothly blend and layer animations, compositing them in time much as programs like Photoshop composite images in space. This type of smooth blending is often referred to as procedural animation.
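
The Improv source itself is not reproduced here, but the idea can be illustrated with a short sketch. In the Java code below, the class and action names (LayeredJoint, walkElbow, waveElbow) are invented for the example, and noise() is only a rough stand-in for Perlin's coherent noise; the sketch cross-fades two actions on a single joint channel and then layers a small noise wobble on top - the "compositing in time" described above.

    // Minimal sketch (not the Improv source): layering and time-blending of a
    // joint-angle channel. noise() approximates coherent noise by smoothly
    // interpolating per-integer hashed values.
    public class LayeredJoint {
        // Pseudo Perlin noise in 1D.
        static double noise(double t) {
            int i = (int) Math.floor(t);
            double f = t - i, s = f * f * (3 - 2 * f);      // smoothstep
            return (1 - s) * hash(i) + s * hash(i + 1);
        }
        static double hash(int i) {                          // value in (-1, 1]
            i = (i << 13) ^ i;
            return 1.0 - ((i * (i * i * 15731 + 789221) + 1376312589) & 0x7fffffff) / 1073741824.0;
        }

        // Two hypothetical actions, each defined as a function of time.
        static double walkElbow(double t) { return 20 * Math.sin(t * 2); }
        static double waveElbow(double t) { return 90 + 30 * Math.sin(t * 6); }

        // Composite in time: cross-fade between the actions, then layer a
        // low-amplitude noise wobble so the result never repeats mechanically.
        static double elbowAngle(double t, double blend /* 0 = walk, 1 = wave */) {
            double base = (1 - blend) * walkElbow(t) + blend * waveElbow(t);
            double wobble = 4 * noise(t * 0.7);
            return base + wobble;
        }

        public static void main(String[] args) {
            for (double t = 0; t < 5; t += 0.5)
                System.out.printf("t=%.1f  elbow=%.1f%n", t, elbowAngle(t, Math.min(1, t / 5)));
        }
    }

Because the wobble is coherent rather than white noise, the joint drifts smoothly instead of jittering, which is what makes the layered result read as lifelike rather than random.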

These characters can also use noise-influenced models for decision making at many levels, ranging from low-level animation triggering (e.g., eye blinking), to mid-level behaviors (e.g., approach/avoid), to high-level attitudes that develop over time. These characters aren't attempting to be intelligent in their behavior, but rather to use carefully crafted statistical models to engage their audience of users.
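
As a rough illustration of this layering of decisions - not the lab's actual models - the sketch below drifts a high-level attitude as a slow random walk, derives a mid-level approach/avoid choice from it, and triggers a low-level blink with a probability that rises when the character is agitated. All names, rates, and constants are invented for the example.

    import java.util.Random;

    // Assumed structure, for illustration only: stochastic decisions at three levels.
    public class DecisionLayers {
        static final Random rng = new Random(42);

        double attitude = 0.0;        // high level: warms up or cools down slowly
        double mood;                  // mid level: current approach(+)/avoid(-) bias

        // One update per animation frame, dt in seconds.
        void update(double t, double dt) {
            // High level: attitude is a slow random walk, clamped to [-1, 1].
            attitude = Math.max(-1, Math.min(1, attitude + 0.05 * dt * rng.nextGaussian()));

            // Mid level: mood follows attitude plus a faster stochastic wobble.
            mood = attitude + 0.3 * rng.nextGaussian();
            String behavior = mood > 0 ? "approach" : "avoid";

            // Low level: blink every few seconds, more often when agitated.
            double blinksPerSecond = 0.3 + 0.4 * Math.abs(mood);
            boolean blink = rng.nextDouble() < blinksPerSecond * dt;

            System.out.printf("t=%.1f attitude=%+.2f %s%s%n",
                    t, attitude, behavior, blink ? " [blink]" : "");
        }

        public static void main(String[] args) {
            DecisionLayers c = new DecisionLayers();
            for (double t = 0; t < 10; t += 0.5) c.update(t, 0.5);
        }
    }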

This combination of techniques for creating improvisational virtual characters was first demonstrated in 1987, with a wire-frame figure that made its own decisions about when to reach toward an animated bird, and then smoothly animated the appropriate behaviors and transitions between them. Several years later, Prof. Perlin decided to take these research results and push them forward, and in 1994 submitted a film to the SIGGRAPH Electronic Theatre. This film showed a continuously animated shaded 3D dancer, displayed on a single-processor machine, in a real-time performance created by a piano player and director improvising together along a storyline. This led to a great deal of interest, and increasingly sophisticated versions of this work were demonstrated at SIGGRAPH 95, 96, 97, 98, and 99 under the name "Improvisational Animation." During this time, significant project contributions were made by Athomas Goldberg, Clilly Castiglia, Duane Whitehurst, Jon Meyer, Troy Downing, Eric Singer, and Sabrina Liao. In 1999 the Improv system, developed through these projects, was spun off by the NYU Center for Advanced Technology. The company, Improv Technologies, is headed by Athomas Goldberg.

At the MRL, work in Responsive Animated Characters continues, including characters for the web, handheld devices, live performance, and virtual worlds - as well as tools for content creators, at all skill levels, working in these contexts.

This work also influences the MRL's engineering efforts. For example, recent research in autostereoscopic displays was motivated by the desire to let these characters engage their audience more fully. The Improv system has also been used in engineering contexts for robot control.

Responsive Face

Some of the MRL's most recent work has been on expressive characters with minimal geometry - suitable for the web or handheld devices. This face implements a small number of degrees of freedom, but can still express a significant subset of Paul Ekman's Facial Action Coding System. It was shown at SIGGRAPH 2000, running in Java on an HP Jornada. Chris Poultney and Lisa Mackie worked with Prof. Perlin on this project.
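
The Responsive Face's actual parameter set is not listed here; the sketch below simply assumes five degrees of freedom (brow raise, brow furrow, eyelid closure, mouth opening, and smile) and shows how a few FACS-style expressions can be stated as points in that small space and blended by interpolation. The presets and the Action Unit references in the comments are illustrative approximations, not the project's data.

    // Minimal sketch, not the Responsive Face source: a handful of degrees of
    // freedom combined to approximate several FACS-style expressions.
    public class MiniFace {
        // Each value in [0, 1]; 0 = neutral.
        double browRaise, browFurrow, eyeClose, mouthOpen, smile;

        void set(double br, double bf, double ec, double mo, double sm) {
            browRaise = br; browFurrow = bf; eyeClose = ec; mouthOpen = mo; smile = sm;
        }

        // A few expression presets, each just a point in the 5-DOF space
        // (the Action Unit notes are rough analogies, not exact mappings).
        void surprise() { set(1.0, 0.0, 0.0, 0.7, 0.1); }   // cf. FACS AU 1+2+5+26
        void anger()    { set(0.0, 1.0, 0.2, 0.1, 0.0); }   // cf. FACS AU 4+7+23
        void joy()      { set(0.2, 0.0, 0.3, 0.2, 1.0); }   // cf. FACS AU 6+12

        // Expressions blend by interpolating the parameters, as with body motion.
        void blendToward(MiniFace target, double a) {
            browRaise  += a * (target.browRaise  - browRaise);
            browFurrow += a * (target.browFurrow - browFurrow);
            eyeClose   += a * (target.eyeClose   - eyeClose);
            mouthOpen  += a * (target.mouthOpen  - mouthOpen);
            smile      += a * (target.smile      - smile);
        }

        public static void main(String[] args) {
            MiniFace face = new MiniFace();      // starts neutral
            MiniFace target = new MiniFace();
            target.surprise();
            face.blendToward(target, 0.5);       // halfway from neutral to surprise
            System.out.printf("browRaise=%.2f mouthOpen=%.2f%n", face.browRaise, face.mouthOpen);
        }
    }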

Responsive Face Demo

Dancer and Body Guy

The dancer was first shown at the SIGGRAPH 99 Web 3D Roundup. Her simple, expressive full-body design was implemented entirely in Java 1.0. How might children or novice users design their own characters like this? Body Guy demonstrates an intuitive, constraint-based approach to character authoring for children.

Body Guy Demo

Leon at SIGGRAPH 99

Improv Technologies (an MRL/CAT spin-off) was launched at the SIGGRAPH 99 Electronic Theatre, where its interactive character Leon served as Master of Ceremonies. This project was a collaboration between Improv Technologies, Mitch Butler of Flimsy Entertainment (who created Leon for SIGGRAPH 98's "The Smell of Horror"), and Clilly Castiglia of the MRL.

Aria II

The MRL has a history of collaboration with the Integrated Systems Laboratory at the University of Sao Paulo, especially sound researcher and composer Ruggero Ruschioni, lab director Prof. Marcelo Zuffo, and parallel architecture expert Prof. Sergio Takeo Kofuji. For SIGGRAPH 96, our labs worked together to create Aria - a demonstration in the Digital Bayou that allowed visitors to conduct the tone and tempo of a virtual opera singer. In 1998 and 1999 more advanced versions of Aria were created and shown in venues ranging from the Itau Cultural museum in Sao Paulo to SuperComp 99 in Portland, Oregon. Currently, the two labs are working together with Itau Cultural, artist Ricardo Ribenboim, and composer Wilson Sukorski on an opera featuring responsive animated characters.

Sid and the Penguins

At the SIGGRAPH 98 Electronic Theatre, the MRL presented a live performance by a troupe of virtual actors, staged entirely within a web browser. This project was a collaboration involving dozens of undergraduate computer science and animation students, as well as faculty from both disciplines. It tells the story of the lovable Sid and his interaction with a group of dancing penguins.

Willy

In 1998, working with NYU Music Technology professor Robert Rowe, the MRL created Willy - a saxophone player who improvises music and movement in response to a jazz piano player. At their first rehearsal, Clilly Castiglia remembers the piano player jumping back from his keyboard to say, "This is the first time in years someone's really listened to me." Willy was presented in a series of performances at Lincoln Center, as well as at conferences such as ISEA and ICMA.

Wendy

At SIGGRAPH 97, Wendy helped the MRL introduce the then-new Java/VRML version of Improv. Her click-controls are still one of the best introductions to our motion layering and blending techniques.

SIGGRAPH 96 Bayou

For the SIGGRAPH 96 Digital Bayou, the MRL created "Botanica Virtual." This room-size installation allowed a single user (wearing a stereoscopic carnival mask) to interact with mysterious bayou denizens while a larger audience watched on a projection screen. All graphics, behavior, and environmental sound were produced in real time on Silicon Graphics and Apple computers.

SIGGRAPH 95 Characters

At SIGGRAPH 95, the MRL offered audiences a number of ways to interact with Responsive Animated Characters. Using voice recognition, they could play "Simon Says" with the comical Otto. Motion tracking made it possible for participants to turn themselves into virtual bats, flapping their arms to fly around a castle inhabited by interacting Improv characters. The more severe Gregor character from this scene, together with Otto and the Improv system of this period, made possible Barbara Hayes-Roth's "Master/Servant Scenarios" at Stanford University.

Gregor and Otto also starred in a number of important experiments at the MRL, using relatively simple techniques to achieve engaging emotional effects.

SIGGRAPH 94 Dancer

This 1994 demonstration, shown in the SIGGRAPH Electronic Theatre, marked the initiation of Responsive Animated Character research as a major focus of the Media Research Laboratory. Its visual and musical story is of a dancer who becomes more comfortable and free in her movements until she breaks the bounds of conventional movement altogether. It was produced in real time on a single-processor machine.

Patents

Improv Patent

Publications

Responsive Actors in Shared Virtual Worlds
K. Perlin
2000 International Conference on Virtual Worlds and Simulation, San Diego.

Improvisational Animation
K. Perlin, A. Goldberg
1999 International Conference on Virtual Worlds and Simulation, San Francisco.

Texturing and Modeling; A Procedural Approach, Second Edition
D. Ebert et al.
AP Professional; Cambridge, 1998
Chapter entitled: Noise, Hypertexture, Antialiasing and Gestures

Layered Compositing of Facial Expression
K. Perlin
ACM SIGGRAPH 97 Technical Sketch

Improv: A System for Scripting Interactive Actors in Virtual Worlds
K. Perlin, A. Goldberg
Computer Graphics; Vol. 29, No. 3, 1996

Real Time Responsive Animation with Personality
K. Perlin
IEEE Transactions on Visualization and Computer Graphics; Vol. 1, No. 1, 1995