Innovation

How companies are figuring out what you’re really thinking

A new crop of startups is using facial-recognition software to uncover consumers’ true reactions

Woman making various faces at the camera (iStock)

Businesses are understandably obsessed with knowing what their customers are thinking and how they feel. Unfortunately, traditional tools for getting feedback aren’t foolproof, because people don’t always say what’s actually on their minds. But their facial expressions can give them away, and startups are developing technology to read human emotions better than the naked eye can.

Typically, we’re able to detect so-called macro expressions, which last between half a second and four seconds. The right software, however, can also read micro expressions, which last as little as 1/25th of a second yet can be just as revealing. Companies such as Affectiva Inc. and Eyeris have developed algorithms that analyze faces and look for patterns that predict emotional reactions and behaviour. Affectiva, a spinoff of MIT’s Media Lab, claims to have built the world’s largest database of facial expressions and their corresponding emotions, and is using its discoveries to help media companies, market research firms and brands get more detailed consumer insights. The technology can be deployed in focus groups, for example: participants’ reactions to an ad or product concept under evaluation are caught on camera and analyzed in real time. “Moderators can see, for example, that John in the back row wasn’t responding that well to the content 20 seconds in, and it allows them to probe a little deeper and ask John a question,” says Gabi Zijderveld, vice-president of marketing and product strategy at Affectiva.
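For readers curious what that kind of real-time analysis looks like under the hood, here is a minimal sketch. It assumes a generic webcam, OpenCV’s stock face detector and a hypothetical classify_emotion() stand-in for a trained model; it is not Affectiva’s or Eyeris’s actual software.

```python
# Illustrative sketch only: detect faces in a live webcam feed and pass each
# face to a placeholder emotion classifier. classify_emotion() is a stand-in
# for the kind of trained model companies like Affectiva or Eyeris would supply.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(face_img):
    """Placeholder: a real system would run a trained model here and return
    scores for emotions such as joy, surprise or disgust."""
    return {"joy": 0.0, "surprise": 0.0, "disgust": 0.0}

capture = cv2.VideoCapture(0)  # default webcam
while capture.isOpened():
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Scan each frame for faces; at typical webcam frame rates this runs many
    # times per second, which is how fleeting expressions get caught at all.
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        scores = classify_emotion(gray[y:y + h, x:x + w])
        print(scores)  # e.g. feed a moderator's real-time dashboard
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
capture.release()
```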

Kellogg’s, for example, used Affectiva’s software to help devise an ad strategy for its Crunchy Nut cereal. The cereal maker tested several versions of the same commercial, each featuring a different animal. While viewers laughed most at the version featuring a snake, Affectiva’s software revealed they weren’t as engaged when they watched it a second time. Kellogg’s chose to spend more money promoting a version featuring an alien instead, because it held up better on repeat viewing. Zijderveld says that ad helped drive sales of Crunchy Nut.

The media and advertising space is not the only area where this facial expression–detecting technology is being applied. Video-chat and instant-messaging service ooVoo, based in New York, has partnered with Affectiva to include the technology in its cross-platform video chat application. When ooVoo users join a chat, they can gauge how other participants are feeling by viewing a dashboard of their emotion metrics. “The essence of communication is interaction, and they say about 85% is non-verbal,” says JP Nauseef, managing director of ooVoo Labs. “This partnership [with Affectiva] allows us to start delivering a deeper, richer experience for our users to let them communicate in a more intimate way with their network around the world.” Nauseef sees potential for ooVoo in distance learning, where an educator could determine whether students understand a question or are simply too nervous to ask for clarification. The technology could even help people separated from their loved ones. “If I’m talking to my nine-year-old son when I’m on a trip, I want to know if he’s sad or worried about something,” says Nauseef.

Eyeris, another company in this space, has already sold its flagship software, EmoVu, to federal law-enforcement agencies for use in interrogations. “We work with multiple three-letter word agencies,” says Modar Alaoui, Eyeris founder and CEO. The company is now looking to embed EmoVu into consumer electronics such as mobile devices, gaming consoles, computers and televisions. For instance, EmoVu could tap into a smart TV’s camera to monitor viewers, measure how they respond to content and learn their preferences in order to recommend other programs to watch. If this sounds creepy, rest assured that EmoVu would be a feature users have to choose to turn on. “We don’t think our technology is any more privacy invasive than Bluetooth,” Alaoui says, since both require activation by the user.

Alaoui sees applications beyond the home, too, and says the technology could be useful for hospitals. “Think about, for example, a waiting room that has a camera there and is trying to measure every person’s level of pain or stress and prioritizing them based on these factors,” he says. The possibilities for this kind of software will only grow the more we rely on technology. Both Affectiva and Eyeris say including their technology in vehicles is in the works. “We strongly believe cars will be able to detect emotion and interact with us,” says Zijderveld. “If you’re yawning or falling asleep, the car will know that.”
