Campos V, Giró-i-Nieto X, Jou B, Torres J, Chang S-F. Sentiment concept embedding for visual affect recognition. In Multimodal Behavior Analysis in the Wild. 1st ed. Elsevier; 2018.

Abstract

Automatic sentiment and emotion understanding of general visual content has recently garnered much research attention. However, the large visual variance associated with high-level affective concepts presents a challenge when designing systems with high-performance requirements. One popular approach to bridge the “affective gap” between low-level visual features and affective semantics consists of using Adjective Noun Pair (ANP) semantic constructs for concepts, e.g., “beautiful landscape” or “scary face”, which act as a mid-level representation that can be recognized by visual classifiers while still carrying an affective bias. In this work, we formulate the ANP detection task in images over a continuous space defined by an embedding that captures the inter-concept relationships between ANPs. We show how the compact representations obtained from the embedding extend the discrete concepts in the ontology and can be used for improved visual sentiment and emotion prediction, as well as new applications such as zero-shot ANP detection.
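
To make the embedding-based formulation concrete, the following is a minimal sketch, not the chapter's actual pipeline, of how ANP detection (including zero-shot detection of concepts outside the training ontology) could be framed as similarity search in a shared embedding space. All function names, dimensions, and data below are hypothetical placeholders.

```python
import numpy as np

def l2_normalize(x, axis=-1, eps=1e-8):
    """Normalize vectors to unit length so dot products become cosine similarities."""
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def rank_anps(image_embedding, anp_embeddings):
    """Rank ANP concepts for one image by cosine similarity in the embedding space.

    image_embedding : (d,)   projection of the image into the concept space
    anp_embeddings  : (K, d) one row per ANP (e.g. "beautiful landscape")
    Returns ANP indices sorted from most to least similar, plus the scores.
    """
    img = l2_normalize(image_embedding)
    anps = l2_normalize(anp_embeddings)
    scores = anps @ img              # cosine similarities, shape (K,)
    return np.argsort(-scores), scores

# Zero-shot detection follows the same idea: an ANP absent from the training
# ontology can still be scored, provided an embedding vector for it is available.
image_vec = np.random.rand(300)                     # hypothetical image projection
unseen_anp = np.random.rand(300)                    # hypothetical unseen concept
score = float(l2_normalize(image_vec) @ l2_normalize(unseen_anp))
print(score)                                        # similarity to the unseen ANP
```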