TED 2018: Technology reveals fear and other emotions

Poppy Crum (image: Bret Hartman/TED)
Prof Crum believes we should give away more data

The fear levels of an audience have been measured to show how machines are beginning to reveal people's deepest feelings.

The demonstration was part of a talk given by Dolby Labs chief scientist Poppy Crum at the TED conference in Vancouver.

The ability to hide emotions is becoming "a thing of the past", she said.

The professor believes this could usher in an era of empathetic technology.

"We like to believe we have cognitive control over what someone else knows, sees, understands about our own internal states - our emotions, insecurities, bluffs or trials and tribulations," she explained.

"But technologies can already distinguish a real smile from a fake one."

She added that this went far beyond recording people's actions via a camera or microphone. To demonstrate this, she revealed that the carbon dioxide her audience was exhaling was being monitored.

"There are tubes in the theatre - lower to the ground since CO2 is heavier than air," she explained.

"They're connected to a machine that lets us measure with high precision the continuous differential concentration of carbon dioxide."

Prof Crum then showed a real-time data visualisation in which changes in the concentration of the gas in the room appeared as larger, deeper-coloured clouds.

"It takes about 20 to 30 seconds for the CO2 from your reactions to reach the machine. You can see where some of us jumped as a deep red cloud. It's our collective suspense creating a spike in CO2," she said.

Thermal imaging (image: Dolby Laboratories)
Thermal signatures give away how people are feeling, said Prof Crum.

In her day job at Dolby Labs, the neurophysiologist has spent the last few years studying how people watch movies.

Willing volunteers are hooked up to electroencephalogram (EEG) caps, heart rate monitors, thermal imaging cameras and skin response sensors in order to observe the biophysical and emotional response that humans experience while watching videos.

"The dynamics of our thermal signature give away changes in how hard our brains are working, how engaged or excited we might be in the conversation we are having, and even whether we're reacting to an image of fire as if it were real," she said.

"We can actually see people giving off heat on their cheeks just looking at a picture of flame."

In the future Prof Crum thinks that similar tech could improve people's daily lives.

Hearing aids might detect when the wearer is stressed and adjust their volume accordingly. Personal assistants in the home could pick up on their owner's mood. And sensors could alert teachers when pupils are struggling to understand a lesson.

Speech analysis technologies are already being developed to provide insights into people's mental and physical health, Prof Crum added. She gave three examples:

  • machine learning can analyse the words people use to predict whether someone is likely to develop psychosis
  • dementia and diabetes can be revealed by alterations in the "spectral colouration" of a person's voice. The term refers to the way the frequencies and amplitudes of a sound can be represented on a graph (see the sketch after this list)
  • linguistic changes associated with Alzheimer's disease can appear more than 10 years before a clinical diagnosis
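She did not spell out the analysis pipelines behind these examples. As a rough illustration of the second one, "spectral colouration" can be read as the shape of a voice recording's frequency content over time - a spectrogram - which a minimal sketch can compute like this. The file name and window parameters are assumptions, not part of any clinical tool.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

# Hypothetical sketch: compute a spectrogram of a voice recording, i.e. how the
# frequencies and amplitudes of the sound vary over time. The file name and the
# window parameters are illustrative assumptions.

rate, samples = wavfile.read("voice_sample.wav")   # assumed recording
if samples.ndim > 1:
    samples = samples.mean(axis=1)                 # mix to mono if needed

freqs, times, power = spectrogram(samples, fs=rate, nperseg=1024, noverlap=512)

# The "colouration" is the distribution of energy across frequencies at each moment;
# shifts in that distribution over time are what such analyses look for.
print(f"{len(freqs)} frequency bins x {len(times)} time steps")
print("Peak frequency in the first window: %.1f Hz" % freqs[np.argmax(power[:, 0])])
```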

In an era when many are considering cutting down their digital footprint, Prof Crum urged the opposite approach.

"If we share more data with transparency and effective regulations, it can help create empathetic technologies that can respond in a more human way that improves our lives," she said.