Fintech PR

Robots Learn to Speak Body Language?


Body language is very valuable in everyday conversation and can help clarify a particular individual's intentions.

For example, if someone walks fast without raising their head or constantly scans the environment, that person likely knows where they are going. Other, less noticeable body language cues can help humans determine whether a person is happy or sad.

It is easy for humans to pick up on these body language cues because we belong to the same species and can read emotions from those movements. The difference is that humans are no longer the only ones who can accomplish this task: robots, too, have learned to speak body language.

A robot that can read your body language is one of the developments to look out for in 2020 because of its advanced technology. How does it work? Here is an explanation:

Carnegie Mellon University’s OpenPose

Researchers at Carnegie Mellon University made this robot a reality by building a body-tracking system.

They named the technology OpenPose: software that detects body movements. To develop it, they used CMU's Panoptic Studio, a dome fitted with 500 cameras.

The dome was essential for capturing body poses from many different angles, with the intention of gleaning enough imagery to create a data set.


Once that data set is built, the Panoptic Studio is no longer required: a single camera will do. With the data set, you only need a laptop and that one camera, which makes the system far more mobile and accessible.
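As a rough illustration (not CMU's actual code), a single-camera pose system ultimately produces a set of body keypoints per frame, each with a detector confidence score. The detector below is a hypothetical stand-in for the real neural network; the data structure and the confidence filtering are the part being illustrated:

```python
from typing import List, NamedTuple

class Keypoint(NamedTuple):
    x: float     # pixel column in the camera frame
    y: float     # pixel row in the camera frame
    conf: float  # detector confidence in [0, 1]

def detect_pose(frame) -> List[Keypoint]:
    """Hypothetical stand-in for a real pose detector such as OpenPose.

    A real system would run a neural network on the camera frame; here we
    return a fixed, made-up three-keypoint pose for illustration.
    """
    return [
        Keypoint(320.0, 120.0, 0.95),  # head
        Keypoint(310.0, 300.0, 0.88),  # torso
        Keypoint(305.0, 480.0, 0.40),  # foot (low confidence)
    ]

def reliable_keypoints(pose: List[Keypoint], threshold: float = 0.5) -> List[Keypoint]:
    """Keep only the keypoints the detector is reasonably sure about."""
    return [kp for kp in pose if kp.conf >= threshold]

pose = detect_pose(frame=None)  # one frame from the single camera
print(len(reliable_keypoints(pose)))  # the low-confidence foot is dropped
```

Downstream logic would then reason over these filtered keypoints rather than raw pixels, which is what makes a laptop and one camera enough.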

What does it detect?

Artificial Intelligence systems have already mastered picking up on big movements, such as the strides an individual makes while walking. Some technologies also try to make athletes' movements more efficient by monitoring their performance and giving pointers on where they could improve.

Those are all great achievements for the AI sector, but they don't quite match the technology behind the OpenPose system.

Frank Tudor, a science and technology essay writer for an online academic writing service, sheds some more light on it: "The system detects even the smallest movements and tracks each pose made by the individual subjected to it. That includes movements of the fingers and the body language expressed in the face and head. Even the most subtle movement of any visible body part can be recorded and analyzed by OpenPose."
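One simple way to see how "even the smallest movements" become detectable is to compare the same keypoints across two consecutive frames. This sketch (my illustration, not OpenPose's algorithm) computes per-keypoint displacement and flags motions that are small but nonzero:

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def keypoint_motion(prev: List[Point], curr: List[Point]) -> List[float]:
    """Euclidean displacement of each keypoint between two frames.

    prev and curr list the same keypoints, in the same order.
    """
    return [math.hypot(cx - px, cy - py)
            for (px, py), (cx, cy) in zip(prev, curr)]

# Two synthetic frames: the first keypoint (e.g. a shoulder) stays put,
# the second (e.g. a fingertip) moves only a few pixels.
frame_a = [(100.0, 200.0), (150.0, 210.0)]
frame_b = [(100.0, 200.0), (153.0, 214.0)]

motion = keypoint_motion(frame_a, frame_b)
subtle = [i for i, d in enumerate(motion) if 0 < d < 10]  # small but nonzero
print(subtle)  # only the fingertip registers a subtle movement
```

Tracking these small displacements over time is what lets a system notice fidgeting fingers or a slight head turn that a coarse, whole-body tracker would miss.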

Applications of this technology

Technology like the stride measurement mentioned above has already been used by law enforcement agencies and the sports industry. Imagine how much value this system would add to their plethora of technological resources.

For example, FBI analysts could use this technology to notice even the smallest movements made by a suspect and uncover more data from their body language.

Home robot developers, meanwhile, could use OpenPose to create machines that understand what their owners want just by analyzing their body language. Those are just a few examples; there are many more applications of this technology, especially in augmented reality and any other interaction between robots and humans.
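For a home robot, "understanding" body language ultimately means mapping a pose to an intent. The toy rule below is purely illustrative (the coordinate values and the one-gesture vocabulary are my assumptions, not part of OpenPose):

```python
def interpret_gesture(wrist_y: float, shoulder_y: float) -> str:
    """Toy rule: a wrist held well above the shoulder reads as a raised hand.

    Image coordinates grow downward, so 'above' means a smaller y value.
    The 50-pixel margin is an arbitrary threshold for illustration.
    """
    if wrist_y < shoulder_y - 50:
        return "hand_raised"  # e.g. the owner wants the robot's attention
    return "neutral"

print(interpret_gesture(wrist_y=100.0, shoulder_y=220.0))  # hand_raised
print(interpret_gesture(wrist_y=215.0, shoulder_y=220.0))  # neutral
```

A real system would classify many such cues at once, but the principle is the same: geometric relations between keypoints stand in for the meaning of a gesture.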

Potential capabilities of this technology


OpenPose's technology is heading toward a point where there is minimal difference between robots and humans. What separates the two is that humans have emotions while robots don't. Now that robots are capable of picking up on body language, they can learn to read emotions from it.

Once robots learn how to read emotions, the next step would be processing them and reacting accordingly. Obviously, we are far away from having psychologist robots, but this technology could help robots understand emotions better.

How accessible is this technology?

Understandably, a lot of people might want to get their hands on this technology for various applications. Some might want to use it for recreational purposes while others need it for professional usage. Whatever your case may be, the technology has been made available to the general public.

The researchers made the system's code open-source so others could experiment with it. You can extend the software to create a fully fledged system with a particular purpose in mind. The researchers' generosity will help this technology grow and become more easily accessible in the near future.

The bottom line

The time when robots understand emotions is nearer than it was a decade ago. Technological developments such as this one are bringing society much closer to a world where robots and humans work as one. Undoubtedly, this is a step in the right direction, because OpenPose could help many different industries develop systems that analyze human behavioral patterns.

About the Author:

Justin is a marketing specialist and blogger from Leicester, UK. When not working and rooting for Leicester FC, he likes to discuss new trends in digital marketing and share his own ideas with readers on different blogs and forums. Currently, he is working as a content marketer at uk.bestessays.com.
