Posted Feb 06 2019
A collaboration between the labs of Princeton professors Mala Murthy and Joshua Shaevitz, led by graduate student Talmo Pereira, has gone a step further, using the latest advances in artificial intelligence and neural networks to automatically track animals’ individual body parts in existing video. Their new tool, LEAP Estimates Animal Pose (LEAP), can be trained in a matter of minutes to track an animal’s individual body parts across millions of frames of video with high accuracy, without the need for any physical markers or labels.
LEAP has received funding from Princeton’s Intellectual Property Accelerator Fund, which provides up to $100,000 to faculty-led teams looking to make their discoveries and inventions available to the public via licensing, startups or entrepreneurial ventures. Awarded to promising technologies with the potential to benefit society and spur the economy, the funding helps bridge the gap between laboratory research and the development needed to move promising ideas into the global marketplace. It enables researchers to conduct proof-of-concept studies, gather additional data, make prototypes or otherwise demonstrate the potential applications of their technology.
With support from the IPA fund, the team plans to improve the software for use by the broader research community, enabling new research in fields ranging from neuroscience to ecology.