TY - JOUR
T1 - A uniform human multimodal dataset for emotion perception and judgment
AU - Sun, Sai
AU - Cao, Runnan
AU - Rutishauser, Ueli
AU - Yu, Rongjun
AU - Wang, Shuo
N1 - This research was supported by the AFOSR (FA9550-21-1-0088), NSF (BCS-1945230, IIS-2114644), NIH (R01MH129426), and Dana Foundation (to S.W.), and FRIS Creative Interdisciplinary Collaboration Program, Tohoku Initiative for Fostering Global Researchers for Interdisciplinary Sciences (TI-FRIS), Operational Budget of President’s Discretionary Funds at Tohoku University, and Japan Society for the Promotion of Science Grant-in-Aid for Early-Career Scientists (No. 22K15626) (to S.S.). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Publisher Copyright:
© 2023, The Author(s).
PY - 2023/11/7
Y1 - 2023/11/7
N2 - Face perception is a fundamental aspect of human social interaction, yet most research on this topic has focused on single modalities and specific aspects of face perception. Here, we present a comprehensive multimodal dataset for examining facial emotion perception and judgment. This dataset includes EEG data from 97 unique neurotypical participants across 8 experiments, fMRI data from 19 neurotypical participants, single-neuron data from 16 neurosurgical patients (22 sessions), eye tracking data from 24 neurotypical participants, behavioral and eye tracking data from 18 participants with ASD and 15 matched controls, and behavioral data from 3 rare patients with focal bilateral amygdala lesions. Notably, participants from all modalities performed the same task. Overall, this multimodal dataset provides a comprehensive exploration of facial emotion perception, emphasizing the importance of integrating multiple modalities to gain a holistic understanding of this complex cognitive process. This dataset serves as a key missing link between human neuroimaging and neurophysiology literature, and facilitates the study of neuropsychiatric populations.
UR - http://www.scopus.com/inward/record.url?scp=85175992849&partnerID=8YFLogxK
U2 - 10.1038/s41597-023-02693-z
DO - 10.1038/s41597-023-02693-z
M3 - Journal article
C2 - 37935738
AN - SCOPUS:85175992849
SN - 2052-4463
VL - 10
JO - Scientific Data
JF - Scientific Data
IS - 1
M1 - 773
ER -