Emotient is perfecting the science of measuring emotions through facial expression recognition, a nascent field with applications in both the retail and healthcare industries. The company uses a proprietary algorithm that analyzes the involuntary movements of facial muscles and delivers data on what those movements indicate about hard-wired feelings.

Marian Bartlett, PhD, co-founder and lead scientist at Emotient (which rhymes with “quotient”), became interested in facial expressions in the early 1990s. While working on her doctorate in cognitive science and psychology at UC San Diego, she explored how biological vision, where images are processed by the brain, can intersect with computer vision, the perception and understanding of images by a machine.

Over the next 10 years, Bartlett collaborated with several researchers in UCSD’s Machine Perception Laboratory, including Terry Sejnowski, PhD, Ian Fasel, PhD, and Javier Movellan, PhD. The four scientists would eventually join forces to launch Emotient in 2012.

“We started applying these ideas to facial expression recognition, which is different from recognizing facial identity,” Bartlett said. “The way the face moves and changes gets at what the person’s underlying emotions might be. As I learned more about facial expressions, I saw the potential for so many signals in the face that reveal people’s gut reactions and predict their decisions.”

The company’s flagship product, called Emotient API, provides anonymous data on how groups of individuals react to a product or service. Using a camera-enabled device or an external webcam, Emotient API captures facial “microexpressions” like the raising or lowering of the eyebrows, wrinkling of the nose, pursing of the lips, and dropping of the jaw.

“The real commercial value is anonymized aggregate analyses of how groups and sub-groups react to stimuli like products or merchandise,” said Emotient CEO Ken Denman. Individuals’ identities are masked, so there’s no invasion of privacy. “We can measure biological tendencies and markers that are highly accurate over large groups of people,” he added.
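
To make the idea of anonymized aggregate analysis concrete, here is a minimal sketch in Python. It is purely illustrative and assumes per-frame emotion scores have already been produced by an expression detector; it is not Emotient’s actual API, and the field names and demographic segments are invented for the example. The point is that group-level averages can be computed from records that carry no identifying information.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical per-frame output from an expression detector. Each record
# carries only a demographic segment and emotion scores; no name, no face
# image, no identifier of any kind.
frames = [
    {"segment": "female", "joy": 0.80, "surprise": 0.10},
    {"segment": "male",   "joy": 0.45, "surprise": 0.30},
    {"segment": "female", "joy": 0.90, "surprise": 0.06},
]

def aggregate(frames):
    """Average each emotion score per segment (anonymized aggregate)."""
    buckets = defaultdict(lambda: defaultdict(list))
    for frame in frames:
        for key, value in frame.items():
            if key != "segment":
                buckets[frame["segment"]][key].append(value)
    return {segment: {emotion: round(mean(scores), 2)
                      for emotion, scores in emotions.items()}
            for segment, emotions in buckets.items()}

print(aggregate(frames))
# e.g. {'female': {'joy': 0.85, 'surprise': 0.08}, 'male': {'joy': 0.45, 'surprise': 0.3}}
```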

The company got its first infusion of cash from a venture capitalist named Seth Neiman, who wanted to develop a company in the area of facial expression analysis. After meeting with the UCSD Machine Perception Laboratory group in 2012, Neiman invested $2 million. In February 2014, Emotient closed a $6 million Series B round of financing. Altogether, Emotient has been granted two patents, has filed two provisional patent applications, and holds several copyrights and trademarks.

Commercial uses for the company’s products continue to grow. Emotient API detects and tracks customers’ reactions to new products, displays, and signage, gauging their preferences with scientifically valid results. Procter & Gamble has used the software during focus group studies. In healthcare, Emotient’s technology can provide actionable intelligence on a patient’s condition.

“Spontaneous emotion has its own pathway to facial muscles,” Bartlett explained. “It’s subcortical vs. cortical, and the subcortical pathway is faster. Flashes of expression give a brief window into the emotional system before the logic system takes over.”

The company’s Facial Action Coding System (FACS) measures primary emotions like anger, contempt, disgust, fear, joy, sadness, and surprise. It can also register an overall positive, negative, or neutral response. Emotient has partnered with iMotions, Inc., a biometric research firm, to deliver the results in a desktop application.
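
As a rough illustration of how per-emotion scores might be rolled up into the positive, negative, or neutral reading mentioned above, the short sketch below groups the seven primary emotions into valence buckets. The grouping, the threshold, and the function itself are assumptions made for the example, not Emotient’s or iMotions’ published method.

```python
# Illustrative valence roll-up over the seven primary emotions named above.
# Treating surprise as positive and using a 0.2 threshold are assumptions.
POSITIVE = {"joy", "surprise"}
NEGATIVE = {"anger", "contempt", "disgust", "fear", "sadness"}

def overall_valence(scores, threshold=0.2):
    """Return 'positive', 'negative', or 'neutral' from emotion scores in [0, 1]."""
    pos = sum(v for k, v in scores.items() if k in POSITIVE)
    neg = sum(v for k, v in scores.items() if k in NEGATIVE)
    if max(pos, neg) < threshold:
        return "neutral"
    return "positive" if pos >= neg else "negative"

print(overall_valence({"joy": 0.7, "anger": 0.1, "sadness": 0.05}))  # positive
```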

During the 2014 Super Bowl, Emotient joined with Interscope Research to deliver real-time feedback on focus group responses to television commercials broadcast during the game. The results, aggregated by gender, showed a definite divide between male and female preferences for ads touting beer, cars, and yogurt. But everyone seemed to love Stephen Colbert’s “Wonderful Pistachios” spot; it ranked #1 among both men and women on the “joy” response scale.


Emotient
4435 Eastgate Mall, Suite 320, San Diego, CA 92121
Email: Info@emotient.com
URL: http://www.emotient.com
Founded: 2012
Employees: 8

Ken Denman – President, CEO
Edward Colby – SVP, Product & Business Development
Stephen Ritter – SVP, Product Development
Marian Bartlett, PhD – Founder, Lead Scientist
Ian Fasel, PhD – Founder, Lead Developer
Javier R. Movellan, PhD – Founder, Lead Researcher
Dan Nguyen – VP, Products
Vikki Herrera – VP, Corporate Marketing

Financing: Private funding

Technology Innovator:

Javier R. Movellan, PhD
Institute for Neural Computation