Exploitation of Contextual Affect-Sensing and Dynamic Relationship Interpretation.
- Author(s): LI ZHANG
- Source: Computers in Entertainment. Dec 2010, Vol. 8 Issue 3, p18:1-18:16. 16p. 1 Diagram, 1 Chart.
- Abstract:
Real-time contextual affect detection from open-ended text-based dialogue is challenging but essential for building effective intelligent user interfaces. In our previous work, an affect-detection component was developed and embedded in an intelligent agent interacting with human-controlled characters improvising loose scenarios. The affect-detection module is capable of detecting 25 basic and complex emotions by analyzing individual turn-taking input alone, without any contextual inference. In this article, we report developments on equipping the intelligent agent with the abilities to interpret dynamic inter-relationships between improvisational human-controlled characters and to perform contextual affect sensing, based on the discussion topics, the improvisational "mood" a character has created, relationship interpretation between characters, and the most recent affect profiles of other characters. Evaluation results on the updated affect-detection component are also reported. Overall, the performance of the contextual affect sensing and dynamic relationship interpretation is promising. The work contributes to the journal themes on affective computing, human-robot/agent interaction, and narrative-based interactive theatre. [ABSTRACT FROM AUTHOR]
- Copyright:
Copyright of Computers in Entertainment is the property of Association for Computing Machinery and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)