<html><head></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; -webkit-line-break: after-white-space; "><div><blockquote type="cite"><div style="margin-top: 0px; margin-right: 0px; margin-bottom: 0px; margin-left: 0px;">This is a reminder that there will be a talk of broad interest by Chris
Bregler from NYU in the PIXL lunch talk series</div><div bgcolor="#ffffff" text="#000000">
at noon on Monday in CS 402. All are welcome.<br>
<p class="smallheader"><br>
<b>From Eye-Balls to Ball-Games: Next-Gen Motion Capture for
Science and Entertainment</b><br>
<em>Chris Bregler, NYU<br>
</em></p><p><strong>Abstract</strong><br>
This talk will cover several research projects centered around the
use of vision and motion capture for animation, recognition, and
gaming. These projects span human movements as diverse as subtle
eye blinks, lip motions, spine deformations, walks, and dances;
subjects ranging from politicians to baseball pitchers; and the
production of the largest motion-capture game to date. The technical content of the
talk focuses on the trade-off between data-driven models of human
motion and analytically derived, perceptually driven models
developed with dancers, animators, linguists, and other domain experts.
This is demonstrated by sub-pixel tracking in Hollywood
productions, reading the body language of public figures,
visualizing the pitches of NY Yankees pitcher Mariano Rivera, and the
making of crowd-mocap games in various cultures.</p>
<hr noshade="noshade">
</div>
</blockquote></div><br></body></html>