Say you’re in a situation where it behooves you to keep a poker face: at an auto dealership negotiating the price of the new car of your dreams, or sitting across from your boss making a case for a promotion. Your heart is pounding and sweat is forming in the creases of your palms, yet you need to keep yourself in check and not betray your excitement or apprehension. How skilled are we, really, at masking our emotions? Emotions affect everyone in daily life, play a key role in non-verbal communication, and are essential to understanding human behavior. A growing area of research is geared toward classifying and analyzing human emotions, and the methods for acquiring this data have become increasingly reliable and sophisticated. How do these studies work, and how is something as subtle as the smallest twitch of a facial expression accurately recorded?
Facial “emotion reading” software such as FaceReader detects minute variations in facial expressions by analyzing over 500 key points on the face, and has been “trained” with more than 10,000 manually annotated images to improve the accuracy of its facial model. Using this vast classification library, FaceReader distills the input into six basic, universal emotions (happy, sad, angry, surprised, scared, and disgusted), plus a neutral state. Facial expression data is gathered non-invasively in real time via a still or video camera, so no electrodes or leads are necessary.
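To make the idea of “distilling” expression data concrete, here is a minimal sketch in Python. It is purely illustrative and is not FaceReader’s actual API or algorithm: it assumes some upstream classifier has already produced an intensity score between 0 and 1 for each basic emotion, and simply picks the dominant one, falling back to neutral when nothing rises above a (hypothetical) threshold.

```python
# Illustrative sketch only -- not the FaceReader API. Assumes a classifier
# has already produced an intensity score (0..1) for each basic emotion.
EMOTIONS = ["happy", "sad", "angry", "surprised", "scared", "disgusted"]

def dominant_expression(scores, neutral_threshold=0.2):
    """Pick the strongest basic emotion for one frame, falling back to
    'neutral' when no expression rises above the threshold."""
    label = max(EMOTIONS, key=lambda e: scores.get(e, 0.0))
    if scores.get(label, 0.0) < neutral_threshold:
        return "neutral"
    return label

# Example frame: a mild smile dominates the other expressions.
print(dominant_expression({"happy": 0.71, "surprised": 0.12}))  # -> happy
print(dominant_expression({"sad": 0.05}))                       # -> neutral
```

Real systems report the full vector of intensities per frame rather than a single label, which is what makes the continuous-signal view described below possible.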
But these refinements are more than skin deep; additional tools for further quantifying facial expression data are readily available. “Action Units” is the term used to describe the activity of individual facial muscles or muscle groups. The FaceReader Action Units add-on module automatically analyzes a selection of 20 common Action Units (such as raising of the cheeks, wrinkling of the nose, dimpling, and lip tightening) to measure affective attitudes such as interest, boredom, and confusion.
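A toy sketch can show how combinations of Action Units might be mapped to coarse affective attitudes. The AU numbers below follow the standard Facial Action Coding System (FACS) labels, but the attitude rules are invented for the example and are not FaceReader’s actual mapping:

```python
# Hypothetical illustration of the Action Unit idea. AU numbers and names
# follow FACS; the attitude rules are made up for this example and do not
# reflect FaceReader's real classification logic.
AU_NAMES = {
    4: "brow lowerer",
    6: "cheek raiser",
    9: "nose wrinkler",
    23: "lip tightener",
}

def affective_attitude(active_aus):
    """Map a set of active Action Units to a coarse attitude label."""
    if {4, 23} <= active_aus:
        return "confusion"   # lowered brows plus tightened lips
    if 6 in active_aus:
        return "interest"    # raised cheeks
    return "boredom"

print(affective_attitude({4, 23}))  # -> confusion
```

The appeal of Action Units is exactly this compositionality: a small vocabulary of muscle movements combines into a much richer space of measurable attitudes.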
Facial expressions can be visualized as bar graphs, in a pie chart, or as a continuous signal. Integrating FaceReader with AcqKnowledge software and the MP160 System extends this functionality: not only is the data automatically synchronized with other signals (ECG, fEMG, EDA, etc.), but the recorded facial video can be monitored in the FaceReader display for real-time feedback.
Perhaps with a bit of practice, you can pull off that promotion and be able to afford that new car without tipping your hand!
For more information about FaceReader software, please consider signing up for our free webinar on September 27, 2018, “Using Facial Expressions to Understand Behavior,” with BIOPAC CEO Frazer Findlay.
BIOPAC offers a wide array of wired and wireless equipment that can be used in your research. For more information on solutions for recording and analyzing signals such as ECG, heart rate, respiration, and more using any of the platforms mentioned in this blog post, visit the individual application pages on the BIOPAC website.