News | October 23, 2019

How and Why Companies Will Engineer Your Emotions

News article by Richard Johnson, published in IEEE Spectrum.


As technology becomes more physically and psychologically intimate, demand has grown for systems that can infer human emotional states. The term “affective computing” was coined in 1995 by Professor Rosalind Picard, founder and director of the affective computing research group at the MIT Media Lab. Recognizing the extent to which emotions govern our lives, she set out to advance the concept of “engineering emotion.”

What is affective computing?

Affective computing systems are being developed to recognize, interpret, and process human experiences and emotions. They all rely on extensive human behavioral data, captured by various kinds of hardware and processed by an array of sophisticated machine learning software applications.

AI-based software lies at the heart of each system’s ability to interpret and act on users’ emotional cues. These systems identify and link nuances in behavioral data with the associated emotion.

The most obvious types of hardware for collecting behavioral data are cameras and other scanning devices that monitor facial expressions, eye movements, gestures, and postures. This data can be processed to identify subtle micro-expressions that a human assessor might struggle to spot consistently. [ . . . ]
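In essence, the software maps a vector of measured behavioral features to an emotion label. Here is a minimal sketch of that idea using invented facial-geometry features (mouth curvature, brow raise, eye openness) and a toy nearest-centroid classifier; production systems instead train deep neural networks on large labeled datasets, but the input-to-label mapping is the same in principle:

```python
import math

# Hypothetical training samples: (mouth_curvature, brow_raise, eye_openness).
# These values are invented for illustration, not drawn from any real dataset.
TRAINING = {
    "happy":     [(0.9, 0.4, 0.6), (0.8, 0.5, 0.7)],
    "surprised": [(0.3, 0.9, 0.9), (0.2, 0.8, 1.0)],
    "neutral":   [(0.1, 0.2, 0.5), (0.0, 0.3, 0.6)],
}

def centroid(samples):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(samples)
    return tuple(sum(v[i] for v in samples) / n for i in range(len(samples[0])))

# Precompute one prototype vector per emotion label.
CENTROIDS = {label: centroid(samples) for label, samples in TRAINING.items()}

def classify(features):
    """Return the emotion label whose centroid is nearest in Euclidean distance."""
    return min(CENTROIDS, key=lambda label: math.dist(features, CENTROIDS[label]))

print(classify((0.85, 0.45, 0.65)))  # → happy
```

The feature names and thresholds here are assumptions; in a real pipeline the vector would come from a face-landmark detector running on camera frames, and the classifier would be learned rather than hand-built.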