Incorporating structured emotion commonsense knowledge and interpersonal relation into context-aware emotion recognition.

  • Additional Information
    • Abstract:
      Traditional emotion recognition technology often focuses on human biometrics such as facial expressions or body postures. However, psychological research shows that context also plays an important role in perceiving the emotions of others. Existing context-based methods rely heavily on the semantic features of images; they neither account for the interrelationships between objects nor exploit external knowledge, even though such knowledge is likely to be very helpful for perceiving emotion. In this paper, by incorporating external structured emotion commonsense knowledge, we propose two methods for constructing emotion knowledge graphs from the objective text of images and design a multi-modal emotion recognition model. The model has three branches: one focuses on human biometrics, and the other two employ emotion knowledge graphs to perceive emotion from contextual information. Before constructing the emotion knowledge graphs, we convert the visual content into text to obtain concise yet ample contextual information covering objects, scenes, and the relationships between objects; this reduces redundant and invalid information. The structured emotion commonsense knowledge is then integrated into the objective text via word sharing, yielding a large-scale emotion knowledge graph built over all valid words (LEKG) and a small-scale emotion knowledge graph built over the document itself (TEKG). We further propose two fusion modules: one attention-based, the other a deep reasoning module that incorporates interpersonal relations. Extensive experiments on the benchmark EMOTIC dataset show that our method outperforms state-of-the-art approaches, with clear advantages on global context-aware tasks. [ABSTRACT FROM AUTHOR]
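      The abstract describes linking words from an image's objective text to external emotion commonsense knowledge "via word sharing" to form a document-level graph (TEKG). A minimal sketch of that idea, assuming a toy emotion lexicon and a simple shared-concept linking rule (both illustrative, not the authors' actual data or algorithm):

```python
from collections import defaultdict

# Toy emotion-commonsense lexicon: word -> associated emotion concepts.
# These entries are hypothetical stand-ins for a real knowledge base.
EMOTION_LEXICON = {
    "smile": {"joy"},
    "party": {"joy", "excitement"},
    "rain": {"sadness"},
    "umbrella": {"sadness"},
}

def build_tekg(object_words):
    """Link words from one document that share an emotion concept."""
    graph = defaultdict(set)
    # Keep only words covered by the lexicon (drops redundant/invalid info).
    valid = [w for w in object_words if w in EMOTION_LEXICON]
    for i, a in enumerate(valid):
        for b in valid[i + 1:]:
            if EMOTION_LEXICON[a] & EMOTION_LEXICON[b]:  # shared concept
                graph[a].add(b)
                graph[b].add(a)
    return dict(graph)

# "car" has no lexicon entry, so it is filtered out before linking.
graph = build_tekg(["rain", "umbrella", "car", "smile", "party"])
print(graph)  # rain<->umbrella (sadness), smile<->party (joy)
```

      The paper's LEKG would instead be built once over all valid words in the corpus; the same word-sharing rule applies, only the word set changes.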
    • Copyright:
      Copyright of Applied Intelligence is the property of Springer Nature and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)