A Keen-Eyed Robot Goes to Work for a Paralyzed Veteran
In December of 2016, a team of researchers showed up at Romy Camargo's house with a better-than-average holiday gift. The front of the nondescript silver box lowered—like one of those spaceship doors from Star Wars, minus the dramatic clouds of vapor—to reveal a fetching robot, with cameras for eyes and a flatscreen for a hat. With the assistance of its human handlers, the Human Support Robot, as Toyota calls it, wheeled into Camargo's home on a mission: to support the quadriplegic veteran and in the process pave the way for truly useful care robots.
First, though, the HSR had to surmount a host of obstacles.
Robots already work well in uniform environments. That's why self-driving cars are so promising: Urban planners have certain rules for signage, for instance, that the car can read. And industrial bots have already taken over factory floors. “Robotics works very well in a more manufacturing situation, where you can make the environment very static,” says Allison Thackston, a roboticist at the Toyota Research Institute who's helping develop the HSR. “You can structure it in a very specific way and the robot can do the same repeatable action over and over again very quickly and very accurately.”
But a human home is anarchy. Even if you’re working with a cookie-cutter floor plan in a McMansion development, what’s inside the home changes day by day or hour by hour. So the HSR, a wheeled robot with a single arm outfitted with a gripper to snag objects like bottles and even a vacuum to suction-grasp pieces of paper, has to adapt to the chaos. To find its way around, it uses 3-D cameras and lasers—just like a self-driving car, only necessarily far more cautious with its speed. And for the time being, it has to identify objects in Camargo's home using QR codes.
For Camargo—who was shot in the neck during combat operations in Afghanistan in 2008—that means he can use a mouth stick (think of it like one of those pens that work on touchscreens, only longer and held in the mouth) to navigate a special interface and command the robot to, say, fetch a bottle of water. The operators have already given the robot a hint as to which room it will find the bottle in, so then it's a matter of the HSR getting there and recognizing the QR code it's after. After getting a good grasp, the robot makes its way back to Camargo.
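That fetch routine—go to the hinted room, scan for the object's QR tag, grasp it, return to the user—can be sketched as a simple state machine. This is an illustrative sketch only, not Toyota's actual software: the robot interface (`go_to`, `scan_for_tag`, `grasp_tagged_object`) and the `SimulatedRobot` stub are hypothetical names invented for the example.

```python
from enum import Enum, auto

class FetchState(Enum):
    NAVIGATE = auto()   # drive to the room the operators hinted at
    SEARCH = auto()     # pan the camera until the QR tag is decoded
    GRASP = auto()      # close the gripper on the tagged object
    RETURN = auto()     # bring it back to the user
    DONE = auto()

def run_fetch(room_hint, target_tag, robot):
    """Run a simplified fetch task and return the states visited."""
    state = FetchState.NAVIGATE
    visited = []
    while state is not FetchState.DONE:
        visited.append(state.name)
        if state is FetchState.NAVIGATE:
            robot.go_to(room_hint)
            state = FetchState.SEARCH
        elif state is FetchState.SEARCH:
            if robot.scan_for_tag(target_tag):
                state = FetchState.GRASP
        elif state is FetchState.GRASP:
            robot.grasp_tagged_object(target_tag)
            state = FetchState.RETURN
        elif state is FetchState.RETURN:
            robot.go_to("user")
            state = FetchState.DONE
    return visited

class SimulatedRobot:
    """Stand-in robot so the sketch runs without hardware."""
    def go_to(self, destination): pass
    def scan_for_tag(self, tag): return True
    def grasp_tagged_object(self, tag): pass
```

Running `run_fetch("kitchen", "bottle_01", SimulatedRobot())` walks the pipeline through navigate, search, grasp, and return in order.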
The bot can pull off other tricks like opening doors. “I would be there by the window outside my door, and it would come up to me and do facial recognition,” Camargo says. “And then it would just go backwards and open the door, which is another thing I didn't have to have my nurse around for.” That means more freedom both for Camargo and for his nurse, who can instead concentrate on more complex tasks. The HSR, after all, isn’t meant to be a replacement for caregivers, but a complement to their services.
So the HSR requires humans to order it around. But you can imagine a day when robots will, for instance, pick up clutter on their own so the elderly don’t trip. And just as the HSR has to adapt to unique environments, it also has to adapt to unique users. For now, that's a matter of its handlers programming it for different situations. But in the future, expect bots to more actively adapt to our various needs.
- Matt Simon
Human needs won't just be dictated by age or health, because not everyone is going to be comfortable with a robot autonomously wandering their home. “If we have the robot that has the capability to both identify, manipulate, and navigate to a high level of autonomy, then we might pair that with what the user is comfortable with,” says Doug Moore, who's also working on the HSR. “An elderly person might be less or more comfortable with more autonomy versus a younger person that's more comfortable with technology, right.”
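The pairing Moore describes—capping what the robot does on its own at whatever the user is comfortable with—amounts to taking the lower of two settings. A minimal sketch, with hypothetical level names invented for illustration:

```python
# Hypothetical autonomy levels, ordered from least to most independent.
AUTONOMY_LEVELS = ["teleoperated", "confirm_each_step", "fully_autonomous"]

def effective_autonomy(robot_capability: str, user_comfort: str) -> str:
    """Return the lower of the robot's capability and the user's comfort."""
    capability = AUTONOMY_LEVELS.index(robot_capability)
    comfort = AUTONOMY_LEVELS.index(user_comfort)
    return AUTONOMY_LEVELS[min(capability, comfort)]
```

A fully autonomous robot paired with a user who wants to approve each action would run in `confirm_each_step` mode.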
For now, Toyota will keep testing the HSR, experimenting with different hardware. On the software side, it'll keep training the robot to better recognize facial cues so it can tell, for example, exactly when its subject is ready to sip some water. And Camargo, for his part, will keep inviting the robot back. Sadly, though, the HSR won't be roaming your halls anytime soon—it'll take some time to get to the market.
These are early days in robotics, after all. This is all about experimenting with the limits and potential of a flexible platform, because the machines of the future won’t be one-size-fits-all. As much as they’ll have to adapt to us, we’ll have to adapt to them. You might, for instance, take a test to determine what kind of driver you are and how your self-driving car should mimic that. And one day, it might be something like an HSR that you customize to fit your housekeeping needs. Jetsons, here we come.