Intermodal matching of emotional expressions in young children with autism.

Abstract: This study examined the ability of young children with autism spectrum disorders (ASD) to detect affective correspondences between facial and vocal expressions of emotion using an intermodal matching paradigm. Four-year-old children with ASD (n = 18) and their age-matched, normally developing peers (n = 18) were presented with pairs of videotaped facial expressions accompanied by a single soundtrack matching the affect of one of the two facial expressions. In one block of trials, the emotions were portrayed by the children's mothers; in another block of trials, the same emotion pairs were portrayed by an unfamiliar woman. Findings showed that children with ASD were able to detect the affective correspondence between facial and vocal expressions of emotion portrayed by their mothers, but not by a stranger. Furthermore, in a control condition using inanimate objects and their sounds, children with ASD also showed a preference for sound-matched displays. These results suggest that children with ASD do not have a general inability to detect intermodal correspondences between visual and vocal events; however, their ability to detect affective correspondences between facial and vocal expressions of emotion may be limited to familiar displays. [Copyright Elsevier]