
Your testing studio for videos

This is the place to be for anyone with a video strategy. Test any video with real people. A powerful emotion AI puts an actionable report in your hands within hours.



Case study:
Audience Emotions Moment to Moment

In February 2017, the entertainment talk show Golden Times, produced by RTVS, the national TV broadcaster of Slovakia, was tested with Youfirst, an emotion AI platform that tracks the facial expressions of viewers watching a video from the comfort of their homes. Youfirst works with established ESOMAR agencies or directly with influencers to analyze video content, and so far it has analyzed over 2 billion frames.

Design of a facial coding study

Youfirst uses computer vision and machine learning algorithms to determine emotions from the facial expressions of viewers. Individual viewings are aggregated to form a statistically relevant picture of the emotional experience at each moment in the video. The emotional states that are tracked represent the universal building blocks of human emotional responses - anger, disgust, fear, happiness, sadness, and surprise*.
* Ekman, P. (2001). Facial expressions. In Blakemore, C. & Jennett, S. (Eds.), Oxford Companion to the Body. London: Oxford University Press.
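
To make the aggregation step concrete, here is a minimal sketch that averages hypothetical per-viewer, per-frame emotion scores into an audience-level curve for a single emotion. The record fields and the plain mean are illustrative assumptions, not a description of Youfirst's actual pipeline.

```python
import statistics
from collections import defaultdict

# Hypothetical per-frame scores: one record per viewer per timestamp,
# with a probability-like score for each tracked emotion.
frames = [
    {"viewer": "v01", "t": 12.0, "happiness": 0.62, "surprise": 0.10},
    {"viewer": "v02", "t": 12.0, "happiness": 0.40, "surprise": 0.05},
    {"viewer": "v01", "t": 12.5, "happiness": 0.71, "surprise": 0.08},
    {"viewer": "v02", "t": 12.5, "happiness": 0.55, "surprise": 0.04},
]

def emotion_curve(frames, emotion):
    """Average one emotion across all viewers at every timestamp,
    yielding a moment-to-moment curve for the whole audience."""
    by_time = defaultdict(list)
    for f in frames:
        by_time[f["t"]].append(f.get(emotion, 0.0))
    return {t: statistics.mean(scores) for t, scores in sorted(by_time.items())}

print(emotion_curve(frames, "happiness"))  # {12.0: 0.51, 12.5: 0.63} (approximately)
```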


Emotional facial expressions are determined by recognizing specific combinations of facial muscle movements called Action Units. Above, the Action Units are highlighted and labeled for clarity.
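
As an illustration of how such combinations can be encoded, the sketch below uses one commonly cited FACS-style mapping from Action Units to the six basic emotions and reports which prototypical combinations are fully present among a set of detected AUs. The exact combinations and matching logic Youfirst uses are not stated in the case study, so both should be read as assumptions.

```python
# A commonly cited FACS-style mapping from Action Units (AUs) to the six basic
# emotions. Illustrative only -- not necessarily the combinations Youfirst uses.
EMOTION_AUS = {
    "happiness": {6, 12},           # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},        # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},     # brow raisers + upper lid raiser + jaw drop
    "fear":      {1, 2, 4, 5, 7, 20, 26},
    "anger":     {4, 5, 7, 23},
    "disgust":   {9, 15},           # nose wrinkler + lip corner depressor
}

def match_emotions(detected_aus):
    """Return the emotions whose prototypical AU combination is fully present."""
    return [e for e, aus in EMOTION_AUS.items() if aus <= set(detected_aus)]

print(match_emotions({6, 12, 25}))  # ['happiness']
```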

According to predefined sample criteria, we invited a group of respondents from an ESOMAR online panel to take part in the facial coding study. After respondents agreed to the purpose and methodology of the study, an automated check was run on the webcam, the recording environment, and the speakers. Our data was collected from 50 one-hour-long recordings and supplemented with responses to a qualitative questionnaire. All of this was done from the comfort of the viewer's home, with no need to install any additional software.


An automated test of recording conditions runs on the viewer's computer.
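
A rough idea of what such a pre-flight check might look like is sketched below, assuming OpenCV is available and a single frame can be captured from the webcam. The brightness thresholds and the single-face requirement are illustrative choices, not Youfirst's actual checks (which also cover the speakers).

```python
import cv2
import numpy as np

def check_recording_conditions(frame_bgr):
    """Rough pre-flight check on one webcam frame: is the image bright enough,
    and is exactly one face visible? Thresholds here are illustrative."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    brightness = float(np.mean(gray))

    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    return {
        "bright_enough": 60.0 <= brightness <= 200.0,
        "one_face_visible": len(faces) == 1,
    }

# Example: grab one frame from the default webcam and run the check.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()
if ok:
    print(check_recording_conditions(frame))
```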

Outcome

The software performs an emotional analysis frame by frame, which produces a large amount of data; in this particular study, we gathered a total of 3 million frames. Each frame carries information about the six basic emotions and the distance and angle of the face, along with socio-demographic data and responses to the questionnaire.
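
As a sketch of the shape of that per-frame data, the record below bundles the six emotion scores, basic face geometry, and viewer-level context into one structure; the field names and units are assumptions, not Youfirst's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class FrameRecord:
    """Illustrative shape of one analyzed frame (names and units are assumed)."""
    viewer_id: str
    timestamp_s: float
    # Scores for the six basic emotions, e.g. probabilities in [0, 1].
    anger: float
    disgust: float
    fear: float
    happiness: float
    sadness: float
    surprise: float
    # Face geometry relative to the camera.
    face_distance_cm: float
    face_yaw_deg: float
    face_pitch_deg: float
    # Viewer-level context attached to every frame.
    demographics: dict = field(default_factory=dict)
    questionnaire: dict = field(default_factory=dict)
```

Here, we present the three areas of interest specified by RTVS: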

1) What is the chemistry of the relationship between the two hosts? Building the emotional engagement of the viewer

Human emotions are built on stories and our interpretations of those stories. From the emotional data, we can conclude that emotional engagement increases naturally with a good narrative and dramatic build-up. If a narrative is discontinuous, the emotion curve does not reach the desired levels. Discontinuity may be caused by a lack of authenticity, problems with understanding, or poor pacing.

It was the uneasy impression left on viewers and the uncoordinated interactions between the hosts that prevented the audience from stepping out of the smiling zone and into a state of amusement. The positive emotion curve jumped up here and there; however, there was never a moment when the hosts were able to build upon the guests' narratives or each other's.

2) What are the critical moments we should be aware of? Beware of the different stages of channel surfing.

Reacting emotionally signals that something personally significant is going on, so failing to evoke emotions in the audience means losing them. There are three critical moments to avoid, depending on when they happen: an unengaging beginning, a mid-program lull, and a lackluster ending. If you lose your audience with an unengaging beginning, you risk them looking elsewhere for another program to entertain them for the evening; they may, however, return at a later time. If you lose your audience during a mid-program lull, they leave with an opinion that may be unfavorable. Finally, a lackluster ending can be avoided by building hype and excitement at the very end, ensuring that the audience comes back for the next episode - do not let them slip back to neutral!

The analysis uncovered a number of critical moments in the program when the engagement level was low. The beginning of the program and the last quarter did not evoke emotional reactions. Additionally, the musical insertions that were supposed to serve as entertainment did the opposite, provoking frowns.


3) What are the power moments and how should we build upon them? Use power moments for trailer creation

Youfirst is able to automatically detect moments where the audience had a more intense response, also known as power moments. Using Youfirst, it is even possible to identify exact quotes that led to these power moments. Such emotionally charged segments are great for trailers and targeted campaigns.
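
One simple way to operationalize power-moment detection - purely as an illustration, since the case study does not describe Youfirst's actual method - is to flag spans where the audience-level intensity curve rises well above its typical level:

```python
import statistics

def power_moments(curve, k=1.0):
    """Find spans where audience-level emotion intensity rises well above its
    typical level. `curve` is a list of (timestamp, intensity) pairs; the
    z-score threshold `k` is an illustrative choice."""
    values = [v for _, v in curve]
    threshold = statistics.mean(values) + k * statistics.pstdev(values)

    spans, start = [], None
    for t, v in curve:
        if v >= threshold and start is None:
            start = t                      # a power moment begins
        elif v < threshold and start is not None:
            spans.append((start, t))       # and ends when intensity drops again
            start = None
    if start is not None:
        spans.append((start, curve[-1][0]))
    return spans

curve = [(0, 0.1), (1, 0.1), (2, 0.6), (3, 0.7), (4, 0.1), (5, 0.1)]
print(power_moments(curve))  # [(2, 4)]
```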

Our study identified one guest in particular - a gentlemanly and witty older actor and humorist, ML - who was able to engage the audience. Here are two amusing remarks identified by Youfirst.

"You look wonderful... younger indeed. But how MA rejuvenated!", talking to one of the hosts who used to have a much older co-host, MA.

"Are we going to get some money for this?", ML asking about being paid for successfully guessing the year when certain scenes were recorded.

Unfortunately, the program hosts failed to build upon these gags.


Successful implementation of results

The results of the study yielded specific proposals for changes, which were used in a refresh of the broadcast program. The dramaturgy of the program was changed and now supports interaction and storytelling, and ML became a host of the program. Finally, the musical insertions were completely rearranged due to their unengaging quality.


ML = Milan Lasica, who became a regular host of the talk show Golden Times

After the program was refreshed in line with the conclusions and recommendations of the facial coding study, the performance of the talk show Golden Times continued to grow. Comparing viewers' behavior between Fall 2016 and Spring 2017:

Mean Viewership: +9%

More people watch the talk show Golden Times.

Audience Loyalty: +14%

Viewers watch the talk show Golden Times for a longer period.

Source: PMT/TNS.sk


Benefits of a facial coding study

Emotional data offers a moment-to-moment analysis of spontaneous reactions. All of this happens in near real time, because the audience is reached on their own devices and the analysis is performed automatically by software. Both the quality and the intensity of the emotions are identified, in addition to targeting specifications.