Facial Expression and Pose Recommender System Using Stereotypical Trait Identification
DOI: https://doi.org/10.32473/flairs.38.1.138994
Keywords: Facial Landmark, personality traits, Pose, Recommender
Abstract
The study of facial features and their association with personality traits has long been a subject of interest across disciplines, including psychology and anthropology. Despite limited scientific support for direct correlations between facial structure and personality, people often intuitively associate certain facial features with specific character traits. Building on this idea, this paper presents a facial expression and pose recommender system that utilizes early face recognition techniques, specifically facial landmark-based methods, to guide users in optimizing their expressions and positioning for different social contexts. The system relies on the detection of facial landmarks - predefined points on the face that correspond to key features such as the eyes, eyebrows, nose, mouth, and jawline - coupled with geometric distance measurements and angular comparisons. Although these methods, including Active Shape Models (ASM) and Active Appearance Models (AAM), are sensitive to pose variation, lighting, and aging effects, they remain effective in applications such as character trait identification. Pose and expression variations are pivotal in the recommender system: the proposed system computes the relative distances between a user's facial landmarks and those of stereotypical faces, providing tailored recommendations for adjustments in facial expression and positioning. These recommendations are designed to enhance user presentation in various scenarios, including public speaking, social interactions, and professional engagements, by aligning facial features with traits such as confidence, intelligence, and trustworthiness.
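The core comparison the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the landmark indexing, the interocular normalization, and the trait templates are all assumptions introduced here for the example, and real systems use full landmark schemes (e.g. 68-point models) rather than a handful of toy points.

```python
import numpy as np

def normalize_landmarks(pts):
    """Center landmarks and scale by interocular distance so that
    comparisons are invariant to translation and face size.
    Assumes pts[0] and pts[1] are the eye centers (hypothetical
    indexing; real landmark schemes differ)."""
    pts = np.asarray(pts, dtype=float)
    centered = pts - pts.mean(axis=0)
    inter_ocular = np.linalg.norm(pts[1] - pts[0])
    return centered / inter_ocular

def trait_distance(user_pts, stereotype_pts):
    """Mean Euclidean distance between corresponding normalized
    landmarks; smaller means the user's facial geometry is closer
    to the stereotypical-trait template."""
    u = normalize_landmarks(user_pts)
    s = normalize_landmarks(stereotype_pts)
    return float(np.linalg.norm(u - s, axis=1).mean())

def recommend(user_pts, templates):
    """Rank hypothetical trait templates by geometric closeness,
    nearest first, as a basis for adjustment recommendations."""
    scores = {trait: trait_distance(user_pts, pts)
              for trait, pts in templates.items()}
    return sorted(scores.items(), key=lambda kv: kv[1])
```

In use, the nearest template indicates which stereotypical trait the current expression and pose most resemble, and the per-landmark differences (before averaging) indicate which features to adjust, for example raising the eyebrows or widening the mouth corners toward the target template.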
License
Copyright (c) 2025 Jinhwi Lee, Teryn Cha, Sung-Hyuk Cha

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.