Researchers at Carnegie Mellon University in America have developed computer code that can help robots understand human body language! You know how sometimes you say one thing but mean another? For example, if you went to bed at midnight but are trying to fool your mum or dad into thinking you slept earlier, you might say, “Oh! I was in bed at 9 pm!” But your body language might give you away if you smiled a little smile or winked at your younger sister.
Computers currently cannot read body language, so they wouldn’t be able to catch your smile and your wink and tell that you were fibbing. They would take you at your word and record that you slept at 9 pm. But your parent would definitely be able to catch you out!
The researchers at Carnegie Mellon have used a recording studio called the Panoptic Dome, fitted out with 500 cameras, to record human movement and turn it into two-dimensional video images. The code uses these recordings of many different people to teach the computer how to read movements both small and large. It can apparently even break movement down to each finger and the mouth.
The researchers say they are hopeful that once this code is fully developed, it can help self-driving cars better predict what pedestrians are going to do next and improve their accuracy at avoiding them. We are sure there will be many, many more uses. The researchers have shared the software with other research groups who can develop it further.