Emotional AI: Ethics, Policy, Art, Culture

Website led by Professor Andrew McStay of Bangor University (Wales, UK). It contains social science insights, project details, art projects, analyses and reports on technologies that detect and interact with emotional life.


What is Emotional AI?

Emotional AI derives from affective computing techniques and advances in machine learning and artificial intelligence (AI). It is a weak form of AI: these technologies aim to read and react to emotions through text, voice, computer vision and biometric sensing, but they do not have sentience or emotional states themselves. The following techniques are used to try to sense and discern people’s emotions and expressions:

  • Online sentiment analysis: this analyses online language, emojis, images and video for evidence of moods, feelings and emotions. 
  • Facial coding of expressions: the effectiveness of its methodology is debatable, but this analyses faces from a camera feed, a recorded video file, a video frame stream or a photo.
  • Voice analytics: the emphasis is less on analysing natural spoken language (what people say) than on how they say it. Analysed elements include the rate of speech, increases and decreases in pauses, and tone.
  • Eye-tracking: this measures point of gaze, eye position and eye movement.
  • Wearable devices: these sense galvanic skin response (GSR), which indicates emotional stimulation; muscle activity and muscle tension (electromyography); heart rate (blood volume pulse); skin temperature; heartbeats and rhythms (electrocardiogram); and respiration rate. Skull caps and headwear measure brain activity (electroencephalography).
  • Gesture and behaviour: cameras track the hands, face and other parts of the body that are said to communicate particular messages and emotional reactions.
  • Virtual Reality (VR): here a person is immersed in a synthetic environment. This allows remote viewers to understand and feel-into what the wearer is experiencing. Headwear may also contain EEG sensors and sensors that register facial movement.
  • Augmented Reality (AR): this is where reality is overlaid with additional computer-generated input (such as graphics). Remote viewers can track attention, reactions and interaction with digital objects.
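To make the first technique above concrete, here is a minimal sketch of lexicon-based sentiment scoring. The toy lexicon and function names are illustrative assumptions, not any particular product's method; commercial systems use trained models and also handle emojis, images and context.

```python
# Toy polarity lexicon (illustrative only; real systems use trained
# models covering far larger vocabularies, emojis and context).
TOY_LEXICON = {"love": 1.0, "great": 0.8, "happy": 0.7,
               "sad": -0.7, "awful": -0.9, "hate": -1.0}

def sentiment_score(text: str) -> float:
    """Average polarity of recognised words; 0.0 if none are found."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    hits = [TOY_LEXICON[w] for w in words if w in TOY_LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

print(sentiment_score("I love this, it is great!"))   # positive score
print(sentiment_score("This is awful and I hate it")) # negative score
```

Even this simple sketch shows why such inferences are contestable: irony, negation ("not great") and cultural context all defeat word-level scoring, which is one reason the effectiveness of emotion detection is debated throughout this site.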