• The researchers on the project include (from left): Michail Kislin, a postdoctoral research associate; Lindsay Willmore, a graduate student; Professor Joshua Shaevitz; Professor Sam Wang; Talmo Pereira, a graduate student; and Professor Mala Murthy. Photo by: Denise Applewhite, Office of Communications

  • In the News: Collaboration between Murthy, Shaevitz and Wang labs

    A collaboration between the labs of Princeton professors Mala Murthy, Joshua Shaevitz, and Samuel Wang has gone a step further, using the latest advances in artificial intelligence and neural networks to automatically track animals’ individual body parts in existing video.
     
    Their new tool, LEAP Estimates Animal Pose (LEAP), can be trained in a matter of minutes to automatically track an animal’s individual body parts over millions of frames of video with high accuracy, without the need for any physical markers or labels.
     
    “This is a flexible tool that can in principle be used on any video data,” said Talmo Pereira, a PNI graduate student who is the first author on the paper. “The way it works is to label a few points in a few videos and then the neural network does the rest. We provide an easy-to-use interface for anyone to apply LEAP to their own videos, without needing any prior programming knowledge.”
     
    The paper detailing the new technology will be published in the January 2019 issue of the journal Nature Methods, but its open-access preprint, released in May, has already led to the software being adopted by a number of other labs.
     
    Watch the videos and find out more: