Purpose

Emotions are fundamental to human experience. As such, they have strong implications for our well-being and are related to many psychiatric disorders, such as major depressive disorder. An understanding of the neural mechanisms behind emotions is important for developing new therapies for emotion-related disorders. However, because emotions are complex and internal, it is difficult to identify their neural correlates and computational mechanisms.

Here we will focus on emotional face-to-face interaction. Faces contain emotional information that is highly relevant for social interaction: seeing emotional expressions on other people's faces triggers our own emotions, which in turn lead us to produce facial expressions of our own. We will study face-to-face interaction and develop a computational model that provides information about internal emotional states based on externally measured facial expressions.

We will non-invasively record facial expressions and other physiological parameters in healthy human participants while they view images with emotional content. Facial movements will be recorded with video cameras, and state-of-the-art deep learning methods will extract a high-dimensional description of these movements. Physiological parameters include heart rate, pupil diameter, and electrodermal activity, which are known to correlate with internal emotional states. We will also ask participants to rate the stimuli and/or their own feelings while viewing them.

Images of facial expressions are known to elicit facial expressions and, correspondingly, emotions. We will first test how well faces elicit facial reactions compared with other visual stimuli. We will then study how participants' facial reactions can be influenced by manipulating the appearance of the face and the contextual information. Next, we will create computational models that identify internal emotional states from the measured parameters. These models can predict behavioral reactions to as-yet untested stimuli or contexts; validating such predictions in further experiments will prove the validity of the computational model for face-to-face interaction and as a readout of internal emotional states. Analyzing the model will give us mechanistic insights into how the brain generates facial expressions and processes emotions. Reported feelings will be used to further validate the results; however, neither the model nor the core paradigm depends on this self-report. In the future, this will allow the paradigm and model to be applied to invasive research with non-human primates to test the mechanisms of the computational model.
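To make the modeling idea concrete, the following is a minimal sketch (in Python) of one possible read-out model; it is not the study's actual analysis pipeline. It assumes a high-dimensional facial-movement descriptor per trial (as might come from a deep network), three physiological measures, and participant ratings as a stand-in for the internal emotional state, and it fits a cross-validated linear read-out so that predictions on held-out trials mimic generalization to untested stimuli. All variable names, dimensionalities, and the use of a single rating dimension are illustrative assumptions, and synthetic data replace real recordings.

    # Sketch only: one way a computational model could map measured facial-expression
    # features and physiological signals to an estimate of an internal emotional state.
    import numpy as np
    from sklearn.linear_model import RidgeCV
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for real recordings: a 128-dimensional facial-movement
    # descriptor per trial plus heart rate, pupil diameter, and electrodermal activity.
    n_trials = 200
    facial_features = rng.normal(size=(n_trials, 128))
    physiology = rng.normal(size=(n_trials, 3))
    X = np.hstack([facial_features, physiology])

    # Participant ratings serve as a proxy target for the internal state
    # (here a single hypothetical rating per trial, generated synthetically).
    true_weights = rng.normal(size=X.shape[1])
    ratings = X @ true_weights + rng.normal(scale=5.0, size=n_trials)

    # Linear read-out with cross-validated regularization; predictions on held-out
    # trials stand in for testing the model on untested stimuli or contexts.
    model = RidgeCV(alphas=np.logspace(-3, 3, 13))
    predicted = cross_val_predict(model, X, ratings, cv=5)
    print("held-out correlation:", np.corrcoef(ratings, predicted)[0, 1].round(2))

A linear read-out is only the simplest choice here; the study's computational models could equally well be nonlinear or dynamical, and the real pipeline would use measured rather than synthetic features.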

Condition

Eligibility

Eligible Ages
Between 18 and 100
Eligible Genders
All
Accepts Healthy Volunteers
Yes

Study Design

Phase
Study Type
Observational

More Details

Status
Recruiting
Sponsor
Rockefeller University

Study Contact

Recruitment Office
800-782-2737
rucares@rockefeller.edu

Notice

Study information shown on this site is derived from this institution's local clinical trials team. The listing of studies provided is not certain to be all studies for which you might be eligible. Furthermore, study eligibility requirements can be difficult to understand and may change over time, so it is wise to speak with your medical care provider and individual research study teams when making decisions related to participation.