Unlike our competitors, we model emotions not as discrete categories but as a continuous space of expressions that can be visualized as a disk. We measure three emotional attributes, or dimensions:
- Arousal (passive to energetic)
- Valence (negative to positive)
- Intensity (deviation from neutral)
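To illustrate, the disk can be read as polar coordinates: valence and arousal give the direction of the expression, and intensity is the radial distance from the neutral center. The sketch below shows this mapping; the function name and the unit-disk normalization are our own assumptions for illustration, not the product API.

```python
import math

def to_disk(valence: float, arousal: float) -> tuple[float, float]:
    """Map valence and arousal (each in [-1, 1]) to a point on the unit disk.

    Intensity is the distance from the neutral center (0, 0), clipped so
    the point never leaves the disk.
    """
    intensity = min(1.0, math.hypot(valence, arousal))  # deviation from neutral
    angle = math.atan2(arousal, valence)                # direction of the expression
    return intensity * math.cos(angle), intensity * math.sin(angle)

# A neutral face sits at the center of the disk.
print(to_disk(0.0, 0.0))
# A strongly positive, energetic expression is clipped to the disk's edge.
print(to_disk(0.9, 0.9))
```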
Additionally, we measure the head pose in terms of roll, pitch, and yaw angles.
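Roll, pitch, and yaw are Euler angles of the head's rotation relative to the camera. A minimal sketch of extracting them from a 3x3 rotation matrix is shown below; it assumes the common Z-Y-X (yaw-pitch-roll) convention, which may differ from the convention the product actually uses.

```python
import math

def euler_from_rotation(R):
    """Extract (roll, pitch, yaw) in radians from a 3x3 rotation matrix,
    assuming the Z-Y-X (yaw-pitch-roll) convention."""
    pitch = -math.asin(max(-1.0, min(1.0, R[2][0])))  # clamp guards rounding error
    roll = math.atan2(R[2][1], R[2][2])
    yaw = math.atan2(R[1][0], R[0][0])
    return roll, pitch, yaw

# Identity rotation: a head facing the camera straight on, all angles zero.
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print(euler_from_rotation(identity))
```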
To achieve high accuracy and robustness, our algorithms are trained on a database of over 300,000 face images covering a wide range of expressions from hundreds of people.
For a single person, we provide instantaneous measurements, trends of emotional attributes over time, and aggregated statistics over time (e.g. in the form of a heatmap of the distribution), as shown in the examples below.
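A distribution heatmap of this kind boils down to binning the per-frame measurements into a 2D grid of counts. The sketch below, using hypothetical names and a grid over the valence-arousal plane, shows one way such an aggregation could be computed.

```python
from collections import Counter

def valence_arousal_heatmap(samples, bins=8):
    """Aggregate (valence, arousal) samples, each in [-1, 1], into a
    bins x bins grid of counts -- the raw data behind a distribution heatmap."""
    grid = Counter()
    for v, a in samples:
        # Map [-1, 1] to a bin index in [0, bins - 1]; clamp the upper edge.
        i = min(bins - 1, int((v + 1.0) / 2.0 * bins))
        j = min(bins - 1, int((a + 1.0) / 2.0 * bins))
        grid[(i, j)] += 1
    return grid

# Two similar positive/energetic frames fall near each other; one
# negative/passive frame lands in a different region of the grid.
samples = [(0.5, 0.2), (0.6, 0.3), (-0.4, -0.7)]
print(valence_arousal_heatmap(samples))
```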
For multiple people, we can also provide comparative analytics, e.g. how different people react to the same situation, in addition to the individual measurements. An example from one of the recent US presidential candidate debates is shown below.
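Comparative analytics of this kind reduce each person's time series to the same summary statistic so reactions can be placed side by side. A minimal sketch, using a hypothetical input format of per-person (timestamp, valence) tracks over the same event:

```python
from statistics import mean

def compare_reactions(tracks):
    """Given {person: [(timestamp, valence), ...]} recorded over the same
    event, return each person's mean valence for side-by-side comparison."""
    return {person: mean(v for _, v in samples)
            for person, samples in tracks.items()}

# Two viewers watching the same debate segment: one trends positive,
# the other negative.
tracks = {
    "viewer_a": [(0.0, 0.2), (1.0, 0.4), (2.0, 0.6)],
    "viewer_b": [(0.0, -0.1), (1.0, -0.3), (2.0, -0.2)],
}
print(compare_reactions(tracks))
```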
Finally, for larger crowds, aggregated statistics across all detected faces can be extracted. An example from a trial deployment in a subway station, covering 64,000 faces detected over a period of two weeks, is shown below.