Press Releases

 

For Immediate Release -- August 7, 2012

 

Researchers applied Hollywood's "Avatar"-style motion capture techniques to reveal new insights into Olympic gold medalists in London

After six months of rigorous research, Manhattan Mocap, LLC, together with The New York Times and the NYU Movement Lab, released its first analysis of three-time gold medalist Dana Vollmer of the US Olympic swim team, silver medalist Abby Johnston of the US Olympic diving team, and Nick McCrory, who competes this week in the 10-meter platform-diving event for the US team in London. Researchers studied these athletes last spring as they trained in pools across the United States, deploying cutting-edge motion capture technology similar to the visual-effects systems used in movies like "Avatar".

The researchers achieved a technical breakthrough with AquaCap(TM), a first-of-its-kind underwater capture system that can accurately analyze Dana Vollmer's famous butterfly stroke and underwater dolphin kick. Vollmer's dolphin-style propulsion helped her win her first gold medal and break the world record on July 29, when she became the first woman to swim the 100-meter butterfly in under 56 seconds. The New York Times published Manhattan Mocap's first AquaCap(TM) analysis, revealing Vollmer's performance and the subtle speed variations within a stroke cycle. Additional simulations, to be released soon, will analyze Vollmer's underwater dolphin kick and compare it to real dolphins swimming underwater. The studies, done in collaboration with a water-simulation team at NYU, show how closely Vollmer replicates dolphins' underwater performance.
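For readers curious about what a stroke-cycle speed analysis involves, here is a minimal sketch in Python. It is not Manhattan Mocap's actual code; the marker layout, array format, and frame rate are assumptions made for illustration, and the sample data is synthetic.

```python
import numpy as np

# Hypothetical illustration: estimating a swimmer's instantaneous speed
# across a stroke cycle from motion-capture marker positions.
# `hip_positions` is an (N, 3) array of 3D hip-marker coordinates in meters,
# sampled at `fps` frames per second (names and layout are assumptions,
# not Manhattan Mocap's actual data format).

def speed_per_frame(hip_positions: np.ndarray, fps: float) -> np.ndarray:
    """Return instantaneous speed (m/s) between consecutive frames."""
    deltas = np.diff(hip_positions, axis=0)     # frame-to-frame displacement
    distances = np.linalg.norm(deltas, axis=1)  # Euclidean distance per frame
    return distances * fps                      # distance per 1/fps seconds

# Synthetic example: 120 frames of forward motion at roughly 2 m/s,
# with a small sinusoidal speed variation within the stroke cycle.
fps = 120.0
t = np.arange(120) / fps
x = 2.0 * t + 0.05 * np.sin(2 * np.pi * t)
positions = np.column_stack([x, np.zeros_like(x), np.zeros_like(x)])

speeds = speed_per_frame(positions, fps)
print(f"mean: {speeds.mean():.2f} m/s, "
      f"min: {speeds.min():.2f} m/s, max: {speeds.max():.2f} m/s")
```

Run on real capture data, the same per-frame computation would expose exactly the kind of within-stroke speed fluctuations described above.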

Soon, the research team will also release previously unseen angles and analyses of the 3-meter and 10-meter somersaults and other diving techniques of Nick McCrory and Abby Johnston, reflecting technical breakthroughs in motion capture at a scale never attempted before. More information: http://manhattanmocap.com/olympics2012

Motion capture records the movements of individuals wearing suits covered with light-reflecting markers, enabling digital recording of their actions. Software then translates these movements into digital models for the 3D animation often used in video games and movies such as "Avatar" and "Iron Man". More sophisticated computer-vision technology, by contrast, can track and record these movements directly from video, without the use of motion capture suits.
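As a small illustration of one step in such a pipeline (a sketch only, not the lab's actual software), the Python snippet below derives a joint angle from three reflective-marker positions; the marker names and coordinates are invented for the example.

```python
import numpy as np

# Hypothetical illustration of one step in a motion capture pipeline:
# deriving a joint angle from three reflective-marker positions.

def joint_angle(a: np.ndarray, b: np.ndarray, c: np.ndarray) -> float:
    """Angle at joint `b` (in degrees) formed by markers a-b-c."""
    u, v = a - b, c - b
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip guards against floating-point values just outside [-1, 1].
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))

# Example: shoulder, elbow, and wrist markers captured in one frame (meters).
shoulder = np.array([0.0, 1.4, 0.0])
elbow    = np.array([0.3, 1.1, 0.0])
wrist    = np.array([0.6, 1.3, 0.0])
print(f"elbow angle: {joint_angle(shoulder, elbow, wrist):.1f} degrees")
```

Repeating this computation for every joint in every frame is what turns raw marker trajectories into the animated skeletal models used in games and films.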

Manhattan Mocap LLC is a new spinoff of the NYU Movement Lab, a leader in motion capture for sports, entertainment, and scientific research. Previous projects include an analysis of New York Yankees pitcher Mariano Rivera (http://movement.nyu.edu/rivera/), a gesture analysis of New York Philharmonic conductor Alan Gilbert (http://manhattanmocap.com/conductor), and a new method to identify and compare the body language of different speakers, which the researchers call "body signatures".

That project, titled GreenDot, employs motion capture, pattern recognition, and "intrinsic biometrics" techniques (http://movement.nyu.edu/). In 2008, Manhattan Mocap's results showed that actress Tina Fey, who was widely praised for imitating Republican vice-presidential nominee Sarah Palin's voice and appearance, also effectively channeled the Alaska governor's body language.

For more information, see: http://manhattanmocap.com and http://movement.nyu.edu.

 

Press contact: info@manhattanmocap.com

 

 

