Podcast

Training AI to read animal facial expressions, NIH funding takes a big hit, and why we shouldn’t put cameras in robot pants

13 February 2025
Listen to the episode on your favorite platforms:
  • Apple Podcasts
  • Spotify
  • Castbox
  • Pocket Casts
  • Stitcher
  • iHeart
  • PlayerFM
  • Overcast
  • Castro
  • RadioPublic

First up this week, International News Editor David Malakoff joins the podcast to discuss the big change in NIH’s funding policy for overhead or indirect costs, the outrage from the biomedical community over the cuts, and the lawsuits filed in response.


Next, what can machines understand about pets and livestock that humans can’t? Christa Lesté-Lasserre, a freelance science journalist based in Paris, joins host Sarah Crespi to discuss training artificial intelligence on animal facial expressions. Today, this approach can be used to find farm animals in distress; one day it may help veterinarians and pet owners better connect with their animal friends.


Finally, Keya Ghonasgi, a postdoctoral fellow at the Georgia Institute of Technology, talks about a recent Science Robotics paper on the case against machine vision for the control of wearable robotics. It turns out the costs of adding video cameras to exoskeletons—such as loss of privacy—may outweigh the benefits of having robotic helpers on our arms and legs. 


This week’s episode was produced with help from Podigy.


About the Science Podcast


Authors: Sarah Crespi; Christa Lesté-Lasserre; David Malakoff



Duration: 00:40:05