For Immediate Release -- August 6, 2012
Researchers apply Hollywood's "Avatar"-style motion capture techniques to reveal new insights into Olympic gold medal winners in London
After six months in the making, Manhattan Mocap, LLC, together with The New York Times and the NYU Movement Lab, releases its first analysis of three-time gold medalist Dana Vollmer of the US Olympic swim team and silver medalist Abby Johnston of the US Olympic dive team, as well as Nick McCrory, who will compete this coming week in 10-meter platform diving for the US Olympic team in London. A research team followed these athletes through the spring during their training in pools across the United States and deployed cutting-edge motion capture technology, similar to the special effects systems used in movies like "Avatar."
One particular technical breakthrough has been achieved with a first-of-its-kind underwater capture system, AquaCap(TM), that can accurately measure Dana Vollmer's famous butterfly stroke and underwater dolphin kick. Vollmer's distinctive dolphin kick helped her win her first gold medal in London and break the world record on Sunday, July 29, becoming the first woman to swim the 100-meter butterfly in under 56 seconds. The New York Times has already released the first analysis from Manhattan Mocap's AquaCap(TM), showing Dana's performance and the subtle speed variations within a stroke cycle. New insights will be released soon, including detailed water simulations of Dana Vollmer's underwater dolphin kick and a comparison to real dolphins swimming underwater. This work was done together with a water simulation team at NYU and shows how closely Vollmer's motion approaches that of a dolphin. Furthermore, the team will release previously unseen angles and analyses of 3-meter and 10-meter somersaults and other dives by Nick McCrory and Abby Johnston, another technical breakthrough in motion capture never before attempted at this scale. More information can be found at http://manhattanmocap.com/olympics2012
Motion capture records the movements of individuals who wear suits covered in light-reflecting markers. It then translates these movements into digital models for 3D animation, often used in video games and movies such as "Avatar" and "Iron Man." More sophisticated computer-vision technology, by contrast, can track and record these movements directly from video, without motion capture suits.
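For readers curious how recorded marker positions become the speed curves described above, the following is a minimal, hypothetical sketch in Python. It is not Manhattan Mocap's actual pipeline; the function name, data layout, and 120 Hz frame rate are illustrative assumptions only.

```python
# Illustrative sketch (assumed setup, not the AquaCap pipeline): estimate a
# swimmer's instantaneous speed from 3D marker positions captured each frame.
import numpy as np

FRAME_RATE_HZ = 120  # assumed capture rate for illustration

def speed_profile(marker_positions: np.ndarray) -> np.ndarray:
    """marker_positions: (num_frames, 3) array of x, y, z in meters.
    Returns speed in meters per second for each frame-to-frame step."""
    displacements = np.diff(marker_positions, axis=0)   # motion between frames
    distances = np.linalg.norm(displacements, axis=1)   # Euclidean distance per step
    return distances * FRAME_RATE_HZ                    # convert to m/s

# Example: a marker advancing 2 cm per frame along x corresponds to 2.4 m/s at 120 Hz.
positions = np.cumsum(np.tile([0.02, 0.0, 0.0], (10, 1)), axis=0)
print(speed_profile(positions))  # ~[2.4, 2.4, ...]
```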
Manhattan Mocap LLC is a new spinoff of the NYU Movement Lab, a leader in motion capture for sports, entertainment, and scientific research. Previous projects include an analysis of New York Yankees pitcher Mariano Rivera (link), gesture analysis of New York Philharmonic conductor Alan Gilbert (link), and a new method to identify and compare the body language of different speakers, a trait the team calls "body signatures." Titled "GreenDot," that project employs motion capture, pattern recognition, and "Intrinsic Biometrics" techniques (link). In 2008, its results showed that actress Tina Fey, who was widely praised for imitating Republican vice-presidential nominee Sarah Palin's voice and appearance, also effectively channeled the Alaska governor's body language.