Humans, Robots Will Read Your Feelings Through Body Language

Body language says a lot about a person. If your guest says he is comfortable but you see him sitting with his arms crossed and jaw clenched, you will doubt his honesty. As humans, we have spent our whole lives making keen observations that teach us what body language conveys. A robot, on the other hand, might believe the guest is comfortable simply because he said so out loud. With advancements in computer vision and facial recognition technology, however, robots are now starting to pick up on subtle body movements.

In Order To Co-Exist, Robots Need To Understand Social Cues

Researchers at Carnegie Mellon University created a body-tracking system to address this gap. OpenPose tracks body, hand, and facial movement in real time, using computer vision and machine learning to process video frames. Because it can track the movements of multiple people simultaneously, it promises to improve human-robot interaction and open the way for more augmented reality games and intuitive user interfaces.
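
To make the idea concrete, here is a minimal sketch of single-camera pose estimation in the style the article describes, using OpenCV's DNN module with the pretrained COCO body model from the CMU OpenPose release. The model file paths, the 18-part COCO layout, and the confidence threshold are assumptions; this simplified version also keeps only the strongest peak per body part, so it tracks one person rather than a crowd:

```python
# Minimal sketch: running an OpenPose-style pose network on webcam frames
# with OpenCV's DNN module. Assumes the pretrained COCO model files from
# the CMU OpenPose release have been downloaded locally; the paths below
# are placeholders.
import cv2

PROTO = "pose_deploy_linevec.prototxt"   # assumed local path
WEIGHTS = "pose_iter_440000.caffemodel"  # assumed local path
N_PARTS = 18                             # COCO body-keypoint layout

net = cv2.dnn.readNetFromCaffe(PROTO, WEIGHTS)
cap = cv2.VideoCapture(0)                # a single ordinary camera

while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    # The network expects a normalized 368x368 input blob.
    blob = cv2.dnn.blobFromImage(frame, 1.0 / 255, (368, 368),
                                 (0, 0, 0), swapRB=False, crop=False)
    net.setInput(blob)
    out = net.forward()                  # shape: (1, channels, H', W')

    # Keep only the highest-confidence location in each part's heatmap.
    for part in range(N_PARTS):
        heatmap = out[0, part, :, :]
        _, conf, _, point = cv2.minMaxLoc(heatmap)
        if conf > 0.1:                   # confidence threshold, tuned by eye
            x = int(w * point[0] / out.shape[3])
            y = int(h * point[1] / out.shape[2])
            cv2.circle(frame, (x, y), 4, (0, 255, 0), -1)

    cv2.imshow("pose", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```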

If a robot tracking the user's head, torso, and limbs sounds advanced, consider that OpenPose can also track individual fingers. To make this happen, the researchers used Panoptic Studio, a dome lined with 500 cameras that captured body poses from a wide variety of angles. These images were used to build the training data set for the system.

Those images were then passed through a keypoint detector, which identified and labeled the body parts. OpenPose also learns to associate each body part with a specific individual, which makes it possible to track multiple people at once without confusion about whose hand is where.
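
In the full system, that association step scores candidate limb connections with learned part-affinity fields. The sketch below shows only the first stage, peak-finding in a part's heatmap, using a toy two-person heatmap invented for illustration; the affinity-field scoring itself is omitted:

```python
# Sketch of the keypoint-detection stage: find every local maximum in a
# part's heatmap, so that several people can each contribute one keypoint.
# OpenPose then scores candidate limb connections with learned
# part-affinity fields; that association step is only gestured at here.
import numpy as np
import cv2

def find_peaks(heatmap, threshold=0.1):
    """Return (x, y, score) for each local maximum above threshold."""
    # Non-maximum suppression: a pixel is a peak if it equals the local max.
    local_max = cv2.dilate(heatmap, np.ones((3, 3), np.uint8))
    peaks = (heatmap == local_max) & (heatmap > threshold)
    ys, xs = np.nonzero(peaks)
    return [(int(x), int(y), float(heatmap[y, x])) for x, y in zip(xs, ys)]

# Toy heatmap with two "people": two blurred bright spots for one body part.
hm = np.zeros((46, 46), np.float32)
hm[10, 12] = 1.0
hm[30, 35] = 1.0
hm = cv2.GaussianBlur(hm, (7, 7), 0)
hm /= hm.max()

print(find_peaks(hm))   # two candidate keypoints, one per person
```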

Initially, the images from the dome were captured in 2D, but the researchers converted them into 3D so the body-tracking algorithm could learn what poses look like from different angles. This lets the system recognize what a person's hand looks like even when part of it is occluded. And because OpenPose has all this training data to rely on, it can run with a single camera and a laptop instead of a camera-lined dome, making the technology far more accessible.
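
The basic geometric idea behind that 2D-to-3D step can be shown with just two views. The projection matrices and pixel coordinates below are invented examples (in the dome they would come from calibrating hundreds of cameras), and OpenCV's triangulatePoints stands in for whatever multi-view reconstruction the researchers actually used:

```python
# Sketch of lifting a 2D keypoint into 3D by triangulating it from two
# calibrated camera views, the same basic idea the dome applies across
# hundreds of views. The projection matrices here are made-up examples;
# in practice they come from camera calibration.
import numpy as np
import cv2

# Two 3x4 projection matrices (intrinsics * extrinsics), assumed known.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])  # shifted camera

# The same keypoint (e.g. a wrist) as detected in each camera's image,
# given as 2xN arrays of image coordinates.
pt1 = np.array([[0.5], [0.25]])
pt2 = np.array([[0.25], [0.25]])

homog = cv2.triangulatePoints(P1, P2, pt1, pt2)   # 4xN homogeneous coords
xyz = (homog[:3] / homog[3]).ravel()
print(xyz)   # estimated 3D position of the keypoint, here (2, 1, 4)
```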

OpenPose is not alone: scientists are working on other empathic robot systems that can read gesture cues. One example is Forpheus, a robot that does more than just play table tennis; it reads body language to gauge its opponent's ability and offers advice and encouragement. "It will try to understand your mood and your playing ability and predict a bit about your next shot," said Keith Kersten of Omron Automation, the Japanese company that developed Forpheus.

According to the researchers behind OpenPose, this type of machine learning technology can be used to facilitate all sorts of interactions between humans and machines. It could improve VR experiences, for instance, by detecting users' finger movements without any extra hardware such as gloves or stick-on sensors.

It is possible to imagine a future in which robots are companions at home and at work, and these advancements would let humans interact with them far more naturally. You could tell a home robot to pick something up simply by pointing at it, and the machine would understand what you mean.
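
As a toy illustration of that last scenario, one simple way a robot could resolve a pointing gesture is to extend the elbow-to-wrist ray from its tracked keypoints until it hits the floor plane. The keypoint coordinates below are invented, and a real system would need depth or multi-view data to obtain them:

```python
# Toy sketch: turning tracked arm keypoints into a pointed-at location by
# extending the elbow-to-wrist ray until it hits the floor plane (y = 0).
# The 3D keypoints below are invented; a real system would take them from
# a pose tracker like OpenPose plus depth or triangulation.
import numpy as np

elbow = np.array([0.3, 1.2, 0.0])   # meters, in robot/world coordinates
wrist = np.array([0.5, 1.0, 0.3])

direction = wrist - elbow
if direction[1] >= 0:
    raise ValueError("arm is not pointing downward toward the floor")

# Solve elbow.y + t * direction.y == 0 for the ray parameter t.
t = -elbow[1] / direction[1]
target = elbow + t * direction
print(f"pointing at floor position x={target[0]:.2f}, z={target[2]:.2f}")
```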
