Built on powerful new technologies, our platform gives clients quick, accurate insights they can act on to improve their video content.
Viewer responses are recorded through their webcams, with their consent, while they watch a video. The recorded responses are streamed to our cloud servers where they are securely processed. The results are aggregated and reported on the Youfirst dashboard in near real time. Youfirst.studio may be integrated into your usual questionnaire or work as a standalone platform.
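To illustrate the aggregation step, here is a minimal sketch, assuming frame-level results arrive as (viewer, video second, emotion scores) records; the function name and record shape are hypothetical, not part of the Youfirst API.

```python
from collections import defaultdict
from statistics import mean

def aggregate_by_second(records):
    """Average each emotion's probability across viewers, per video second.

    `records` is assumed to be an iterable of tuples:
    (viewer_id, video_second, {emotion_name: probability}).
    """
    buckets = defaultdict(lambda: defaultdict(list))
    for _viewer, second, scores in records:
        for emotion, p in scores.items():
            buckets[second][emotion].append(p)
    # Collapse each per-second bucket into mean probabilities.
    return {
        second: {emotion: mean(ps) for emotion, ps in emotions.items()}
        for second, emotions in buckets.items()
    }
```

A dashboard could then plot these per-second averages as a timeline of the audience's emotional response to the video.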
Long-standing research has established that six basic emotions are universally recognized by humans: anger, disgust, fear, happiness, sadness, and surprise. Because the expression of these basic emotions is consistent across people, computer vision and machine learning algorithms can be trained to recognize them accurately.
The Youfirst platform is powered by an emotion AI that tracks 68 key points of the face, mainly around the eyes, nose, and mouth. For added robustness, the AI also detects wrinkles and shading in the nasolabial, forehead, and other regions. From these inputs, our industry-leading classifier computes the probability of each emotional facial expression (anger, disgust, fear, happiness, sadness, surprise, neutral) on a scale from 0 to 1.
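The classification step described above can be sketched as follows. This is a simplified illustration, not the production model: `model` stands in for any trained classifier that maps the flattened 68-point landmark vector to one raw score per expression class, and a softmax converts those scores into probabilities on a 0-to-1 scale.

```python
import math

# The seven expression classes scored by the classifier (from the text).
CLASSES = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def softmax(logits):
    """Convert raw classifier scores into probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify_expression(landmarks, model):
    """Score one face given its 68 (x, y) landmark points.

    `model` is a placeholder callable returning one raw score per class.
    """
    assert len(landmarks) == 68, "expected 68 facial key points"
    features = [coord for point in landmarks for coord in point]
    return dict(zip(CLASSES, softmax(model(features))))
```

Because the probabilities sum to 1, each frame yields a full distribution over the seven classes rather than a single hard label, which is what makes per-second averaging across viewers meaningful.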
Function of emotions
Emotions serve as shortcuts in our decision-making process and are hardwired into our brains at birth. They help us make complex decisions and are often rationalized only after the fact. Furthermore, emotions are closely tied to specific action tendencies. Because emotions are reflected in facial expressions, analyzing a person's facial expression can reveal their likely action tendencies.