Midnightsun Posted March 5, 2022

Researchers are working on a new data-driven model and algorithm to identify the perceived emotions of individuals based on their walking styles.

"Forest" by Simon Lehmann (Pixabay)

Researchers at the University of North Carolina at Chapel Hill and the University of Maryland at College Park are working on a new data-driven model and algorithm to identify the perceived emotions of individuals based on their walking styles.

Exploiting gait features to classify emotional state

From RGB videos of an individual walking, the team extracted the subject's walking gait as a series of 3D poses. The aim was to exploit these gait features to classify the person's emotional state into one of four emotions: happy, sad, angry, or neutral. The researchers' perceived-emotion recognition approach is based on deep features learned via a long short-term memory (LSTM) network trained on labeled emotion datasets.

A representation of the algorithm that identifies the perceived emotions of individuals based on their walking styles. Given an RGB video of an individual walking (top), the researchers extracted the walking gait as a series of 3D poses (bottom). They used a combination of deep features learned via an LSTM and affective features computed from posture and movement cues, and classified them with a Random Forest classifier into basic emotions (e.g., happy, sad). Credit: Randhavane et al., Fair Use.

The team then combined these deep features with affective features computed from the gaits using posture and movement cues. The combined features are classified by a Random Forest classifier (a type of machine-learning algorithm). The team showed that this mapping between the combined feature space and the perceived emotional state identifies perceived emotions with 80.07% accuracy. In addition to classifying discrete emotion categories, the algorithm also predicts perceived valence and arousal values from gaits.

Visualization of the motion-captured gaits of four individuals with their classified emotion labels. Videos of 248 motion-captured gaits were shown to participants in a web-based user study to generate labels, which the researchers used for training and validation. Credit: Randhavane et al., Fair Use.
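For anyone curious how the pieces fit together, here is a minimal, hypothetical sketch of the pipeline described above: an LSTM turns a gait (a sequence of 3D poses) into a deep feature vector, hand-crafted affective features are tacked on, and a Random Forest does the classification. This is not the authors' code; the encoder size, the posture/movement cues, and all the data below are made-up placeholders (in the paper the LSTM is actually trained on labeled emotion data rather than used untrained).

```python
# Hypothetical sketch only: feature definitions and dimensions are assumptions,
# not the researchers' actual implementation.
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestClassifier

EMOTIONS = ["happy", "sad", "angry", "neutral"]

class GaitLSTM(nn.Module):
    """Encodes a gait sequence of 3D joint positions into a fixed-size vector."""
    def __init__(self, num_joints=16, hidden_size=128):
        super().__init__()
        self.lstm = nn.LSTM(input_size=num_joints * 3, hidden_size=hidden_size,
                            batch_first=True)

    def forward(self, poses):            # poses: (batch, timesteps, num_joints * 3)
        _, (h_n, _) = self.lstm(poses)
        return h_n[-1]                   # (batch, hidden_size) deep gait feature

def affective_features(poses):
    """Toy posture/movement cues (assumed): sway, mean joint speed, motion extent."""
    diffs = np.diff(poses, axis=0)
    return np.array([
        poses[:, :3].std(),              # stand-in for a posture cue (torso sway)
        np.abs(diffs).mean(),            # stand-in for a movement cue (joint speed)
        poses.max() - poses.min(),       # overall extent of the motion
    ])

# Fake data: 248 gaits of 60 frames x 16 joints x 3 coordinates, random labels.
encoder = GaitLSTM()
gaits = [np.random.randn(60, 16, 3).astype(np.float32) for _ in range(248)]
labels = np.random.randint(0, len(EMOTIONS), size=len(gaits))

# Concatenate deep LSTM features with affective features for each gait.
X = []
for gait in gaits:
    seq = torch.from_numpy(gait.reshape(1, 60, -1))
    deep = encoder(seq).detach().numpy().ravel()
    X.append(np.concatenate([deep, affective_features(gait.reshape(60, -1))]))

# Classify the combined feature vectors with a Random Forest.
clf = RandomForestClassifier(n_estimators=100).fit(np.array(X), labels)
print(EMOTIONS[clf.predict(np.array(X[:1]))[0]])
```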
jiggubhai Posted March 5, 2022

1 minute ago, TOM_BHAYYA said: If Pushpa is always angry, then

we should show balio's walking style to the AI.. it might go mad
Midnightsun Posted March 5, 2022 (Author)

Just now, jiggubhai said: we should show balio's walking style to the AI.. it might go mad
Midnightsun Posted March 5, 2022 (Author)

2 minutes ago, TOM_BHAYYA said: If Pushpa is always angry, then

That's a zombie walk