In honor of this weekend’s release of The Hobbit: The Desolation of Smaug, part of the franchise that has arguably done more to make motion capture a household term than any other, we’re looking into how motion and performance capture technology could be less out of reach than you’d think for a non-blockbuster.

In 2012, the unthinkable nearly happened. With the support of 20th Century Fox, co-star James Franco, and critics’ groups across the country, British actor Andy Serkis almost landed an Oscar nomination for his performance as the computer-generated ape Caesar in Rise of the Planet of the Apes, a role that defied traditional acting categories: neither Serkis’ actual face nor his body ever appeared on film.

Through the art of performance capture, Serkis created quite the spectacle, bringing a range of emotion to the animated chimp who slyly outwits his captors and sparks a revolution. Channeling the almost-tangible texture of his off-screen persona, Serkis moved audiences with Caesar’s energy and excitability on screen, his frustration and sadness, and ultimately, his rage.

Serkis says, “I’ve never drawn a distinction between acting in live action or on stage, and acting in a performance-captured role. For me, the only difference is the way in which the performance is recorded.” Although the role ultimately failed to earn Academy Award consideration, the campaign for Andy Serkis created a serious debate about the validity of performance capture and the future of technology, acting, and moviemaking as a whole.

Motion capture (“mocap” for short) and performance capture, the art of recording the movement of objects, animals, and people, are nothing new. The technology has been around since the early ’70s, but it wasn’t fully incorporated into feature films until Total Recall (1990), The Lawnmower Man (1992), and Lost in Space (1998). Though the two terms are often used synonymously, motion capture typically records the movement of the body, while performance capture also captures subtle features like the face and fingers. Today, the technology can be seen everywhere, from District 9 to Real Steel to The Avengers and Avatar, whose sequel will pioneer a new kind of underwater motion capture.

Outside of feature film, motion capture is heavily utilized in video games like Batman: Arkham Origins, Metal Gear Solid V, and inFamous: Second Son. Even the recently shelved game Star Wars 1313 will find new life after Disney’s acquisition of Lucasfilm as a post-production movie tool for motion capture. Combined with real-time motion capture, the technology will “take the post out of post-production,” as the CG can be captured during normal filming, eliminating much of the separate modeling and animation process.

So, with all the buzz surrounding motion and performance capture today, it’s no surprise that many companies such as PhaseSpace, Mocapone, Reallusion, and Centroid are popping up. These groups aim to educate, train, provide facilities and soundstages for projects, and create software and tools for students and filmmakers to streamline efforts through animation, pre-visualization, and post-production.

Says Phil Stilgoe, Operations Director for Centroid: “Filling in pixels and green screen shots is one of the biggest uses of motion and performance capture, but you also have pre-visual for live action films so that directors and DPs can work out their cameras. And thanks to Andy Serkis, you now have principal characters being developed with performance capture as opposed to just crowds and backgrounds.”

Across all of these applications, advances in technology and falling costs are making it feasible for independent filmmakers to experiment and deliver impressive results. Using software packages like iClone or iPi Soft along with Sony PlayStation Eye cameras, indie filmmakers can track 3D body movement and produce 3D animation without expensive facility space, lighting, backgrounds, sensor suits with reflective markers, or teams of engineers. The $10-20K motion capture systems of today easily outperform the $100-200K systems of five years ago.

“Motion capture is in a rapid innovative period and techno-friendly filmmakers have lots of options to become mad scientists,” says John Martin, VP of Product Marketing for Reallusion. “With very little investment up front, indie filmmakers can get production-quality results, reach further than ever before through social media, and command and view performances and direction in real-time.”

Some of the most immediate benefits that motion and performance capture offer independent productions include:

• One set: Only one stage or room is needed to capture all kinds of motions for a project. With the latest advances, you can even go outdoors and capture everything in a more natural environment.
• Lower costs: Motion and performance capture reduce the need for costumes, actors, and sets. One actor can play multiple roles within a single film, and there’s no need for special lighting, colors, or filters when filming. Costumes, makeup, body size, and age can all be adjusted on the fly.
• High volume: Motion and performance capture tools offer virtually limitless options during and after filming for body rotation, complex movements, and camera angles.
• Rapid results: It takes far less time than traditional 3D animation, as actual motions can be captured and downloaded to a computer and viewed in real time.

“Performance capture is moving very quickly with great hardware innovations and software implementations,” says Martin. “The trend is moving toward more motion-engaging content via TV, video games, and how advertisers interact on digital signage. There will be improved functionality, hybrids, and higher quality results, such as motion sensors that can capture point cloud data and make 3D models instantly just by scanning an object, room, or person – all of which are a plus for filmmakers and innovators.”

Adds Stilgoe, “Video games used to be our bread and butter, but now game companies have built their own in-house studios, so feature and indie films are where we’re seeing the biggest growth. So much more is being shot against the green screen. No doubt about it, [Serkis’ most famous role] Gollum really put performance capture on the map as a performance tool rather than an afterthought that a technician did with extras.”

And speaking of Andy Serkis? The face of motion capture, who helped bring to life popular characters in The Lord of the Rings, King Kong, The Adventures of Tintin, Rise of the Planet of the Apes, and more, has invested his own equity (along with Jonathan Cavendish) in the Imaginarium, a full-fledged production studio dedicated to the development, education, and perfection of performance capture technology.

Says Serkis, “The Imaginarium is an amazing playground—a cross between a creative lab for furthering the art and craft of performance capture, and a full service production company. We service other people’s films, video games and next generation story content. But we’re also building our own tools and driving our own projects.”

In addition to collaborating with current filmmakers like Peter Jackson, James Cameron, and Rupert Wyatt, and training a new generation of performance capture experts and actors, the Imaginarium is currently developing an updated take on the classic George Orwell fable, Animal Farm. “It’s going to be completely performance capture driven. Every single character on stage with a director,” says Serkis.

Fast becoming a moviemaking necessity, motion and performance capture are helping to bring other worlds and characters to life. With the latest advances in technology, cameras, and software, and the steady reduction in costs, indie filmmakers can now experiment and push the boundary between live acting and animation even further. As Serkis puts it, “It’s such an enormous part of the future because it stands right in the center of convergent space between film, video games, and live theater.” We’d go one further and call it the convergence between reality and make-believe itself. MM

This article appears in MovieMaker‘s Fall 2013 issue.