Facebook is teaching robots to appear more human-like. The company’s AI lab has developed a bot that can analyze the facial expressions of the person it’s interacting with, and adjust its own expressions in response.
Yes, Facebook doesn’t just want its AI-trained bots to know how humans speak—it wants them to understand our faces, too.
In a newly published paper, the company’s AI researchers detail their efforts to train a bot to mimic human facial expressions during a conversation.
The algorithm was trained on Skype recordings so that it could learn, and then mimic, how humans adjust their expressions in response to each other. To make this tractable, the algorithm divided the human face into 68 key points that it monitored throughout each Skype conversation, according to the research, which Facebook presented at the International Conference on Intelligent Robots and Systems in Vancouver, Canada, from September 24-28.
The system learned behaviours such as naturally producing nods, blinks and various mouth movements to show it is engaged with the person it is talking to. The bot then applied what it had learned, choosing in real time the most appropriate facial response to the person in front of it.
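Facebook has not released the model itself, but the pipeline described above, tracking 68 facial key points per frame and generating a response expression in real time, can be sketched in broad strokes. The sketch below is purely illustrative: the `ListenerSketch` class, the exponential-smoothing response rule, and all parameter names are assumptions for the sake of the example, not the paper’s actual architecture.

```python
import numpy as np

NUM_LANDMARKS = 68  # the number of facial key points tracked in the research


def normalize_face(landmarks):
    """Centre a (68, 2) array of landmark coordinates and scale it to
    unit size, so expressions can be compared across face positions
    and sizes."""
    pts = np.asarray(landmarks, dtype=float)
    pts = pts - pts.mean(axis=0)      # remove head position
    scale = np.linalg.norm(pts)       # remove face size
    return pts / scale if scale > 0 else pts


class ListenerSketch:
    """Toy stand-in for the learned model: keeps an exponentially
    smoothed running state of the speaker's normalized landmarks and
    returns it as the bot's 'response' expression for each frame.
    (Hypothetical; the real system learns its responses from data.)"""

    def __init__(self, alpha=0.3):
        self.alpha = alpha                           # smoothing factor
        self.state = np.zeros((NUM_LANDMARKS, 2))    # current expression

    def react(self, speaker_landmarks):
        face = normalize_face(speaker_landmarks)
        # Blend the new observation into the running state, so the
        # bot's expression tracks the speaker's with a slight lag.
        self.state = self.alpha * face + (1 - self.alpha) * self.state
        return self.state
```

In a real system, the per-frame landmarks would come from a face tracker running on the video feed, and the response rule would be a trained model rather than simple smoothing; the shapes and data flow are the same.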
The researchers then tested the bot on a group of human judges, who were asked to “look at how the (bot’s) facial expressions are reacting to the user’s, particularly whether it seemed natural, appropriate and socially typical”, and to decide whether or not it seemed to be engaged in the conversation.
The researchers say the natural next step is to have the bot interact with a human in the real world.
The research is still in its early days, but the scientists say they hope their work will inspire other groups to explore similar problems.