Innovation to Sense and React to the World
With over 10 years of R&D and commercial collaboration in human–computer interaction, our technology discerns emotions with high accuracy and naturally extracts concealed feelings.
A person's emotional state gauges several manifestations held in an individual's or group's mind and intentions. We index and study emotions in outpatients and non-patients to examine the prevalence of sentiment and the co-occurrence of perceived intentions and sentiment.
Our emotion analytics runs on popular platforms: any device with a camera, as well as wearable optical devices. Its low computational cost requires less than 1 GHz of processing on the edge, with minimal bandwidth. A robust SDK with API libraries enables third-party developers to create applications for growing demand.
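To illustrate the kind of lightweight per-frame inference such an edge deployment implies, here is a minimal sketch. The emotion label set, the `classify_frame` interface, and the raw scores are illustrative assumptions, not the actual Opsis SDK.

```python
import math

# Illustrative emotion categories; the real SDK's label set is an assumption here.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def softmax(scores):
    """Numerically stable softmax over raw classifier scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify_frame(scores):
    """Map one frame's raw class scores to a (label, probability) prediction."""
    probs = softmax(scores)
    best = max(range(len(probs)), key=lambda i: probs[i])
    return EMOTIONS[best], probs[best]

# Hypothetical frame whose raw scores favour "happiness" (index 3).
label, p = classify_frame([0.1, -1.2, 0.0, 2.5, -0.3, 0.4, 1.0])
```

Only a softmax and an argmax are needed per frame once features are extracted, which is consistent with the sub-1 GHz edge budget described above.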
Further affective computing research covers multimodal speech, physiological signals, and gestures to capture the emotional state of individuals and large groups with refined classification.
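One common way to combine such modalities is late fusion: each modality produces its own class-probability vector, and a confidence-weighted average yields the final estimate. The modality names and weights below are hypothetical, chosen only to sketch the idea.

```python
def fuse_modalities(predictions, weights):
    """Late fusion: confidence-weighted average of per-modality probability vectors.

    predictions: dict mapping modality name -> list of class probabilities
    weights:     dict mapping modality name -> confidence weight (any positive scale)
    """
    total_w = sum(weights.values())
    n = len(next(iter(predictions.values())))
    fused = [0.0] * n
    for modality, probs in predictions.items():
        w = weights[modality] / total_w  # normalize so fused probs sum to 1
        for i, p in enumerate(probs):
            fused[i] += w * p
    return fused

# Hypothetical three-class (positive / neutral / negative) outputs per modality.
face = [0.7, 0.2, 0.1]
speech = [0.5, 0.3, 0.2]
gesture = [0.6, 0.3, 0.1]
fused = fuse_modalities(
    {"face": face, "speech": speech, "gesture": gesture},
    {"face": 0.5, "speech": 0.3, "gesture": 0.2},
)
```

Because each input is a valid probability distribution and the weights are normalized, the fused output remains a valid distribution.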
Applications and Case Studies
Publications on facial expression analysis technology by Opsis co-founder Prof. S. Winkler
S. Winkler, L. Zhang, S. Peng.
PersEmoN: A deep network for joint analysis of apparent personality, emotion and their relationship.
IEEE Transactions on Affective Computing, to appear.
S. Winkler, V. Vonikakis, D. Neo.
MorphSet: Augmenting categorical emotion datasets with dimensional affect labels using face morphing.
eprint arXiv:2103.02854, March 2021.
S. Winkler, V. Vonikakis.
Identity-invariant facial landmark frontalization for facial expression analysis.
Proc. IEEE International Conference on Image Processing (ICIP), Abu Dhabi, UAE, Oct. 25-28, 2020.
S. Winkler, S. Peng, L. Zhang, Y. Ban, M. Fang.
A deep network for arousal-valence emotion prediction with acoustic-visual cues.
One Minute Gradual (OMG) Emotion Behavior Challenge (1st/2nd place), eprint arXiv:1805.00638, May 2018.
S. Winkler, V. Vonikakis, Y. Yazıcı, V. D. Nguyen.
Group happiness assessment using geometric features and dataset balancing.
Proc. 18th ACM International Conference on Multimodal Interaction (ICMI), Emotion Recognition in the Wild Challenge (2nd place), Tokyo, Japan, Nov. 12-16, 2016.
S. Winkler, H.-W. Ng, V. D. Nguyen, V. Vonikakis.
Deep learning for emotion recognition on small datasets using transfer learning.
Proc. 17th ACM International Conference on Multimodal Interaction (ICMI), Emotion Recognition in the Wild Challenge (3rd place), Seattle, WA, Nov. 9-13, 2015.