MIT researchers have added blind locomotion to their cheetah robot, making it capable of handling unexpected obstacles while moving fast.
The new function will allow the Cheetah 3 robot to leap and gallop across rough terrain, climb a staircase littered with debris, and quickly recover its balance when suddenly yanked or shoved.
“There are many unexpected behaviors the robot should be able to handle without relying too much on vision,” says the robot’s designer and associate professor, Sangbae Kim. “Vision can be noisy, slightly inaccurate, and sometimes not available, and if you rely too much on vision, your robot has to be very accurate in position and eventually will be slow. So we want the robot to rely more on tactile information. That way, it can handle unexpected obstacles while moving fast.”
The robot also comes with improved hardware, including an expanded range of motion compared to its predecessor, Cheetah 2, which allows it to stretch backwards and forwards and twist from side to side, much like a cat limbering up to pounce.
Behind the robot's advanced prowess are two new algorithms developed by Kim's team: a contact detection algorithm and a model-predictive control algorithm.
The contact detection algorithm helps the robot determine the best time to transition a leg between swing and step by constantly calculating three probabilities for each leg: the probability that the leg is in contact with the ground, the probability of the force generated once the leg hits the ground, and the probability that the leg is in midswing.
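The idea can be illustrated with a toy probability-fusion sketch. This is not MIT's actual estimator, and the thresholds, gait-phase prior, and force model below are all invented for illustration: a scheduled gait phase gives a prior on contact, a joint-torque-based force estimate gives a likelihood, and the two are fused into a single contact probability per leg.

```python
# Toy sketch of probabilistic contact detection for one leg.
# Illustrative only: the actual Cheetah 3 estimator and its
# parameters are not reproduced here.

def contact_probability(phase, force_estimate, force_threshold=40.0):
    """Fuse a gait-phase prior with an estimated foot force (newtons).

    phase: position in the leg's gait cycle in [0, 1); the first half
           is assumed to be the scheduled stance phase.
    Returns an estimate of P(contact) in [0, 1].
    """
    # Prior from the scheduled gait: high during stance, low during swing.
    prior = 0.9 if phase < 0.5 else 0.1
    # Likelihood from the force estimate: a clamped soft step function.
    likelihood = min(max(force_estimate / force_threshold, 0.05), 0.95)
    # Simple Bayesian-style fusion (normalized product of the two cues).
    p = prior * likelihood
    q = (1 - prior) * (1 - likelihood)
    return p / (p + q) if (p + q) > 0 else 0.5

# A leg near the end of scheduled stance that still feels large force
# is almost certainly in contact; a mid-swing leg with little force is not.
print(round(contact_probability(phase=0.45, force_estimate=60.0), 3))
print(round(contact_probability(phase=0.75, force_estimate=5.0), 3))
```

Fusing a prior with a measurement in this way is what lets the robot react when a foot hits debris earlier or later than the gait schedule expects: a strong force reading can override the scheduled swing phase.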
The model-predictive control algorithm, on the other hand, predicts how much force a given leg should apply once it has committed to a step. It calculates the predicted positions of the robot's body and legs a half-second into the future, should a certain force be applied by any given leg as it makes contact with the ground.
According to Kim, the predictive algorithm is designed to make these calculations for each leg every 50 milliseconds, or 20 times per second.
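The prediction step described above can be sketched with a point-mass model. This is a minimal illustration, not MIT's controller: the mass value, the vertical-only dynamics, and the function name are assumptions, but the 0.5-second horizon and 50-millisecond interval match the figures cited above.

```python
# Toy sketch of the prediction step in model-predictive control:
# given a candidate vertical ground-reaction force, roll a point-mass
# model forward 0.5 s in 50 ms steps. Illustrative only.

MASS = 41.0      # assumed mass in kg (Cheetah 3 is roughly 90 lb)
G = 9.81         # gravitational acceleration, m/s^2
DT = 0.05        # 50 ms control interval
HORIZON = 10     # 10 steps x 50 ms = 0.5 s lookahead

def predict_height(z0, vz0, fz):
    """Predict body height over the horizon for a constant vertical force fz (N)."""
    z, vz = z0, vz0
    trajectory = []
    for _ in range(HORIZON):
        az = fz / MASS - G        # net vertical acceleration
        vz += az * DT             # semi-implicit Euler integration
        z += vz * DT
        trajectory.append(z)
    return trajectory

# A force exactly matching the robot's weight holds the body level.
level = predict_height(z0=0.45, vz0=0.0, fz=MASS * G)
print(round(level[-1], 3))  # 0.45
```

A real MPC would run this kind of rollout for many candidate forces every 50 milliseconds and pick the one whose predicted trajectory best tracks the desired body motion; the sketch shows only the forward-prediction half of that loop.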
Image, video and content: Sangbae Kim et al/Jennifer Chu/MIT News