Do you store facial data?
No. We do all the analysis on the end user’s device to maintain privacy. We don’t store faces and only output the numerical results of the emotional analysis.
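As a rough illustration of what "only the numerical results" could mean, here is a minimal sketch of per-expression intensity scores leaving the device. All names and the score range here are illustrative assumptions, not Elevate's actual API or output format.

```python
# Hypothetical sketch: on-device analysis emits only numeric intensity
# scores per expression -- no images or facial data are stored or sent.
# Function and field names are assumptions for illustration only.

def summarize_frame(intensities: dict) -> dict:
    """Clamp raw intensity scores to [0, 1] and round for reporting."""
    return {expr: round(min(max(score, 0.0), 1.0), 2)
            for expr, score in intensities.items()}

scores = summarize_frame({"joy": 0.82, "surprise": 0.15, "anger": -0.01})
print(scores)  # only these numbers leave the device, never facial imagery
```

Clamping and rounding at the edge keeps the reported payload small and strictly numeric, consistent with the privacy claim above.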
Does everyone in the meeting need to install the app?
No. Elevate only needs to be installed on the PC of one of the participants in the meeting.
Which facial expressions / emotions does Elevate measure?
We measure 8 core facial expressions, scored by intensity, to get quantitative data on how a person is feeling: