Smile, and the world will smile with you*


In the early 1970s I received a small grant to conduct a consultation on the air with a volunteer couple in Victoria. This was something very new. I was broadcasting from a room at the BC Tel (now Telus) office in Vancouver, and the couple was receiving and responding from a similar technical setup in Victoria. In those days TV transmissions had to be line-of-sight (i.e., the transmitting and receiving stations had to be in view of each other). There were four technicians in my room twirling dials, counting down to when I could start, and a similar crew was handling the Victoria side. A small TV screen in front of me showed the couple, and they saw me on their small screen. It was all very exciting and, I thought, very successful. I was able to take a sexual history from the volunteer couple over the air, 50 km away, and I could read their emotions on their faces! I could see them move their eyes, I saw their smiles, I observed them being somewhat bashful and contemplative, and at one point there were tears in the eyes of the woman volunteer, which I saw very clearly.

In spite of my enthusiasm, and to my great disappointment, my plan to do more of these long-distance on-air consultations was canceled. The line-of-sight obstacle and the need for a special technical room for the broadcast were certainly issues at that time, but apparently the bigger issue was the lack of privacy and confidentiality: a room full of technicians was not thought to be an appropriate setting for sharing medically related information. How things have changed over the years with our telephones and other long-distance communication tools, including current video technology, which have become vital doctor-patient communication tools in the current pandemic. Reading our patients’ feelings from their facial expressions and body movements has always been important in an interview, but one consideration has come to the fore: how certain can we be in our interpretations of our patients’ faces on our small screens?

For millennia, facial expressions—whether happiness, sadness, anger, fear, disgust, or surprise—have served as clues to a person’s emotional state. Assumptions made based on those clues influence our assessment of our patients and our reactions to their presentations. Similar assumptions may influence legal judgments, educational approaches, security measures, and the development of artificial intelligence devices. The problem is that available evidence suggests that our assumptions are just assumptions. Deducing emotions from facial movements is proving to be a risky and often faulty exercise.

A recent review in the journal Psychological Science in the Public Interest suggests an urgent need for research that examines how people move their faces to express emotions and how people detect or perceive instances of emotion in one another.[1] The authors suggest that four criteria must be met to justify the claim that a facial expression reveals a person’s emotional state: reliability, specificity, generalizability, and validity.

Research using multidisciplinary methods, including neuroscience, machine learning, and computational modeling, is needed to assess facial movements, posture and gait, tone of voice, autonomic nervous system changes, and the contexts of varied life situations.

In an accompanying article,[2] a group of researchers, including Jessica L. Tracy of the Department of Psychology at the University of British Columbia, propose the development of a comprehensive atlas of human emotions. They argue that models relying only on the basic emotions of happiness, sadness, fear, disgust, anger, and surprise do not portray the variability of emotion-related responses, nor do they account for expressions that vary culturally and are situation specific.

As Hamlet tells us, beware: “One may smile, and smile, and be a villain.” There is a lot of work still to be done.
—George Szasz, CM, MD

References
1.    Barrett LF, Adolphs R, Marsella S, et al. Emotional expressions reconsidered: Challenges to inferring emotions from human facial movements. Psychol Sci Public Interest 2019;20:1-68.
2.    Cowen AS, Sauter D, Tracy JL, Keltner D. Mapping the passions: Toward a high-dimensional taxonomy of emotional experience and expression. Psychol Sci Public Interest 2019;20:69-90.

* Jack Ellison, poet 
 


This post has not been peer reviewed by the BCMJ Editorial Board.

