Cognition Guided Human-Object Relationship Detection.

  • Author(s): Zeng Z; Dai P; Zhang X; Zhang L; Cao X
  • Source:
    IEEE transactions on image processing : a publication of the IEEE Signal Processing Society [IEEE Trans Image Process] 2023; Vol. 32, pp. 2468-2480. Date of Electronic Publication: 2023 May 05.
  • Publication Type:
    Journal Article
  • Language:
    English
  • Additional Information
    • Source:
      Publisher: Institute of Electrical and Electronics Engineers
      Country of Publication: United States
      NLM ID: 9886191
      Publication Model: Print-Electronic
      Cited Medium: Internet
      ISSN: 1941-0042 (Electronic)
      Linking ISSN: 1057-7149
      NLM ISO Abbreviation: IEEE Trans Image Process
      Subsets: MEDLINE
    • Publication Information:
      Original Publication: New York, NY : Institute of Electrical and Electronics Engineers, 1992-
    • Abstract:
      Human-object relationship detection reveals the fine-grained relationships between humans and objects, supporting comprehensive video understanding. Previous approaches are mainly built on object features and relation features, without exploiting information specific to humans. In this paper, we propose a novel Relation-Pose Transformer (RPT) for human-object relationship detection. Inspired by the coordination of eye-head-body movements in cognitive science, we employ the head pose to find the crucial objects that humans focus on, and use the body pose, with its skeleton information, to represent multiple actions. A spatial encoder then captures spatially contextualized information about each relation pair by integrating the relation features and pose features, and a temporal decoder models the temporal dependencies of the relationships. Finally, multiple classifiers predict the different types of relationships. Extensive experiments on the Action Genome benchmark validate the effectiveness of the proposed method, which achieves state-of-the-art performance compared with related methods. (A hedged code sketch of this pipeline follows the record below.)
    • Publication Date:
      Date Created: 2023-04-28; Date Completed: 2023-05-08; Latest Revision: 2023-05-08
    • DOI:
      10.1109/TIP.2023.3270040
    • PubMed ID (Accession Number):
      37115831
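
The abstract describes a concrete pipeline: pose features fused with relation features, a spatial encoder over the relation pairs of each frame, a temporal decoder across frames, and one classifier per relationship type. Below is a minimal PyTorch sketch of that flow. It is not the authors' implementation: every dimension, layer count, the fusion scheme (concatenation plus a linear projection), the decoder's use of its own input as memory, and the placeholder class counts are assumptions; only the overall data flow follows the abstract.

import torch
import torch.nn as nn

class RPTSketch(nn.Module):
    # NOT the authors' code: a sketch of the pipeline named in the abstract.
    def __init__(self, relation_dim=512, pose_dim=256, d_model=512,
                 relation_class_counts=(3, 6, 17)):  # placeholder class counts
        super().__init__()
        # Fuse relation features with (head + body) pose features.
        # Assumed fusion: concatenation followed by a linear projection.
        self.fuse = nn.Linear(relation_dim + pose_dim, d_model)
        # Spatial encoder: contextualizes the relation pairs within each frame.
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.spatial_encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        # Temporal decoder: models dependencies of each pair across frames.
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead=8, batch_first=True)
        self.temporal_decoder = nn.TransformerDecoder(dec_layer, num_layers=2)
        # One classifier per relationship type (e.g. Action Genome's
        # attention / spatial / contacting relationships).
        self.heads = nn.ModuleList(
            nn.Linear(d_model, c) for c in relation_class_counts)

    def forward(self, relation_feats, pose_feats):
        # relation_feats: (T, N, relation_dim), pose_feats: (T, N, pose_dim)
        # for T frames, each with N human-object pairs.
        x = self.fuse(torch.cat([relation_feats, pose_feats], dim=-1))
        # Spatial encoding: attend over the N pairs within each frame.
        x = self.spatial_encoder(x)            # batch = frames, seq = pairs
        # Temporal decoding: attend over the T frames of each pair.
        x = x.transpose(0, 1)                  # (N, T, d_model)
        x = self.temporal_decoder(x, x)        # input as its own memory (assumption)
        # Per-type relationship logits, each of shape (N, T, num_classes).
        return [head(x) for head in self.heads]

# Toy usage: 8 frames, 4 human-object pairs per frame.
model = RPTSketch()
logits = model(torch.randn(8, 4, 512), torch.randn(8, 4, 256))
print([tuple(l.shape) for l in logits])  # [(4, 8, 3), (4, 8, 6), (4, 8, 17)]

Feeding the fused sequence to the decoder as both target and memory is just one way to read "temporal decoder" from the abstract; the paper may instead use learned queries or cross-attention to other features.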