Eerily smart new robots can read human body language

Do you think you’re subtle in how you communicate at work? In a few years, you may not be able to hide what you’re really thinking from advanced computer programs attuned to the messages our bodies give off.

Often it comes down to the signals your body gives off, sometimes without you knowing it, and technology has made leaps and bounds in reading them.

Researchers at Carnegie Mellon University’s Robotics Institute have gotten a computer to decipher the body language of multiple people on live video, “including, for the first time, the pose of each individual’s hands and fingers,” according to the school.

The robots are only going to get smarter.

How do we know? Because the team also posted its computer code online to spur further advances and applications, and released the data.

So to whom do we owe our thanks? To a dome-shaped facility called the Panoptic Studio, which can build 3D models of humans.

TechCrunch had the lowdown on what’s inside and what it can do: it houses “480 VGA cameras and 31 HD cameras as well as 10 Kinect sensors” and can construct “wireframe models of participants inside the dome” to let computers in on what our bodies are saying.

CMU says the work in the studio led to the ability to detect the poses of a group of people using a single camera and a laptop.
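To make the idea concrete, here is a minimal, hypothetical sketch (not CMU’s actual code) of one common step in this kind of pose detection: systems like this typically produce a “confidence heatmap” for each body keypoint, and turning a heatmap into a pixel coordinate is a simple maximum search.

```python
import numpy as np

def heatmap_to_keypoint(heatmap, threshold=0.1):
    """Return (x, y, confidence) for the strongest response, or None.

    Illustrative only: real pose estimators refine this with sub-pixel
    interpolation and part-association steps, but the core idea is the same.
    """
    idx = np.unravel_index(np.argmax(heatmap), heatmap.shape)
    conf = heatmap[idx]
    if conf < threshold:
        return None  # no keypoint confidently detected in this map
    y, x = idx  # heatmaps are indexed (row, column) = (y, x)
    return (int(x), int(y), float(conf))

# Toy heatmap: a single strong response at (x=3, y=1).
hm = np.zeros((4, 5))
hm[1, 3] = 0.9
print(heatmap_to_keypoint(hm))  # (3, 1, 0.9)
```

Running one such search per keypoint map, frame by frame, is how a single camera and a laptop can report where each person’s joints are.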

The research team was scheduled to present its findings at the Computer Vision and Pattern Recognition conference (CVPR 2017) in Honolulu, which runs July 21-26.

Check out how it works


Yaser Sheikh, associate professor of robotics, commented on the research in a statement.

“We communicate almost as much with the movement of our bodies as we do with our voice…But computers are more or less blind to it,” Sheikh said.

What this could mean for your life

TechCrunch highlighted what this system could be used for, while noting that it “isn’t exactly ready for using at the Super Bowl or your local Denny’s.”

“Interestingly the system can also be used to help patients with autism and dyslexia by decoding their actions in real time. Finally a system like this can be used in sports by scanning multiple participants on a playing field and see where every player was at any one time,” the article says.

Technology is getting a handle on more than we ever could have predicted, even the subtle ways we communicate.