Researchers at Duke University have come up with an algorithm that helps robots see the world a little more like a human does.

Robots perceive the world through sensors, such as cameras and depth sensors, which give them basic depth and shape information. That is good enough for structured industrial settings, but it falls short in less structured environments, like houses and offices.

To do better, a robot has to generalize from objects it has already seen in order to recognize and differentiate similar items it encounters for the first time.
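The article does not describe the algorithm itself, so the following is only a loose illustration of the general idea of matching a newly observed shape against learned shape categories. It learns a low-dimensional "shape subspace" per object class from flattened voxel grids and assigns a new observation to the class whose subspace reconstructs it best; all names and data here are hypothetical, not the Duke team's actual method.

```python
import numpy as np

def fit_class_subspace(examples, n_components=2):
    """Learn a low-dimensional shape subspace for one object class
    from flattened voxel grids (one training example per row)."""
    mean = examples.mean(axis=0)
    centered = examples - mean
    # Principal directions of within-class variation via SVD.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def reconstruction_error(observation, mean, basis):
    """How poorly the observed shape fits a class's subspace."""
    coeffs = basis @ (observation - mean)
    reconstruction = mean + basis.T @ coeffs
    return np.linalg.norm(observation - reconstruction)

def classify(observation, class_models):
    """Pick the class whose subspace reconstructs the shape best."""
    errors = {name: reconstruction_error(observation, mean, basis)
              for name, (mean, basis) in class_models.items()}
    return min(errors, key=errors.get)

rng = np.random.default_rng(0)
# Toy "voxel grids": mugs concentrate occupancy in the first half
# of the vector, bowls in the second half (purely synthetic).
mugs  = np.hstack([rng.uniform(0.8, 1.0, (20, 4)),
                   rng.uniform(0.0, 0.2, (20, 4))])
bowls = np.hstack([rng.uniform(0.0, 0.2, (20, 4)),
                   rng.uniform(0.8, 1.0, (20, 4))])
models = {"mug": fit_class_subspace(mugs),
          "bowl": fit_class_subspace(bowls)}

# A never-before-seen object that resembles the mugs.
new_object = np.array([0.9, 0.85, 0.95, 0.9, 0.1, 0.05, 0.15, 0.1])
print(classify(new_object, models))  # "mug" on this toy data
```

Real systems work from far richer partial 3D observations, but the basic pattern is the same: compress prior experience of each category, then score a new shape against every category at once.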

The algorithm was developed by a team led by Ben Burchfiel, who says it has many practical applications.

“We really want, eventually, systems like this to be in people's houses doing things. We always use the example of 'we want a robot that can make you tea,' or 'we want a robot that can do the dishes.' So we're really focused on having robots understand the shape of objects around them.”

Results of the research show that robots know what they're looking at 75 percent of the time, 25 percent better than before Burchfiel's innovation. But he says there's still a long way to go before we can put a robot in a home and expect it to perform well.

