Taylor & Francis Group
pcem_a_1683516_sm2115.docx (11.67 MB)

Do you hear what I see? An audio-visual paradigm to assess emotional egocentricity bias

Journal contribution
Posted on 2019-11-01 by Mariana von Mohr, Gianluca Finotti, Klaudia B. Ambroziak and Manos Tsakiris

We often use our own emotions to understand other people’s emotions. However, the emotional egocentricity bias (EEB), namely the tendency to rely on one’s own emotional state when relating to others’ emotions, can hinder this process, especially when the two emotional states are incongruent. We built on the classic EEB task to develop a new version that is easier to implement and control. Unlike the original EEB task, which relies on a combination of private (e.g. touch) and public (e.g. vision) sensory information, our task (AV-EEB) uses audio-visual stimuli to evoke congruent or incongruent emotions in participants. Auditory and visual signals are both public, in that they can be shared among individuals, which makes the task easier to implement and control. We provide lab-based and online validations of the AV-EEB and demonstrate a positive relationship between the EEB and negative social potency. This new, easily implemented version of the EEB task can accelerate the investigation of egocentricity biases across several research areas.

Funding

M. Tsakiris is supported by the European Research Council Consolidator Grant for the INtheSELF project [grant number ERC-2016-CoG-724537] under the EU Horizon 2020 programme, and by the NOMIS Foundation Distinguished Scientist Award.

Cognition & Emotion