At Applied Influence Group we know the importance of understanding someone else's perspective in order to communicate effectively and, ultimately, influence. We first understand the culture and environment a person is from, then examine that individual's interpersonal communication, giving us a solid basis for achieving our objective. There is a minefield of studies and research on behavioural analysis to navigate, but one body of work we frequently find ourselves referencing is that of Paul Ekman.


In the 1960s Ekman led an expedition into the jungle of Papua New Guinea to test whether the facial expressions of emotion he had studied elsewhere were universal. He found a tribe completely untouched by Western civilisation and showed them a series of images of white faces exhibiting different emotions. He confirmed that six facial expressions of emotion were universal. Joy, sadness, fear, surprise, anger and disgust were the original six he identified; following further research he later added a seventh, contempt. Ekman's study proved extremely lucrative as he sold his emotion-detection method to various law enforcement agencies and later to the corporate world. He even set up the Ekman Institute, specialising in the delivery of coaching and training in emotional skills and credibility analysis.


Oscar Schwartz's article in the Guardian, 'Don't look now: why you should be worried about machines reading your emotions', examines how emotion detection has turned into a $20bn industry and looks at some of the pitfalls of applying Ekman's research to artificial intelligence.


The article is enlightening and interesting, particularly when Schwartz cites Lisa Feldman Barrett's challenges to Ekman's research. She, like us, argues that emotions are complicated and depend on circumstance, culture and environment. She points out that a person might be saying one thing but exhibiting another, and that Ekman's science does not account for this. That is true, but it is precisely why we refer to Ekman's micro-expression research: micro expressions provide a strong indicator of what is going on emotionally even when it is being suppressed, which addresses the 'smiling whilst plotting my downfall' scenario that Barrett raises.


The article then moves on to how the research is being built into AI. We have seen some great technology using this facial recognition process, mainly around digital advertising boards that collect data on demographics and on individuals' reactions to whatever is being displayed. This helps advertising companies refine their product and ensure they are eliciting the right reaction from their target audience.
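As a rough illustration of how such a board might work, the sketch below pairs OpenCV's standard face detector with a placeholder emotion classifier and tallies reactions over a stretch of video. This is a minimal sketch, not any vendor's actual system: the classify_emotion function is a hypothetical stand-in for whatever pre-trained model would really be deployed, and the camera source and frame count are arbitrary.

```python
# Minimal sketch of an emotion-tallying loop for a digital advertising board.
# Assumptions: opencv-python is installed; classify_emotion is a hypothetical
# stand-in for a real pre-trained emotion classifier.
import cv2
from collections import Counter

EKMAN_LABELS = ["joy", "sadness", "fear", "surprise", "anger", "disgust"]

def classify_emotion(face_pixels):
    """Hypothetical stand-in: a real deployment would run a trained model
    on the cropped face. Returns a fixed label so the sketch runs end to end."""
    return "joy"

def tally_reactions(video_source=0, max_frames=300):
    # OpenCV ships a Haar cascade for frontal face detection.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    capture = cv2.VideoCapture(video_source)
    counts = Counter()
    for _ in range(max_frames):
        ok, frame = capture.read()
        if not ok:
            break
        grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Classify each detected face and tally the predicted emotion.
        for (x, y, w, h) in detector.detectMultiScale(grey, 1.3, 5):
            counts[classify_emotion(grey[y:y + h, x:x + w])] += 1
    capture.release()
    return counts  # aggregate reaction data fed back to the advertiser

if __name__ == "__main__":
    print(tally_reactions())
```

The point of the aggregation step is that the advertiser sees a distribution of reactions rather than any one person's face, which is exactly the kind of demographic-and-reaction data the boards described above collect.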


Meredith Whittaker, co-director of the New York University-based research institute AI Now, talks about the dangers of matching the research to technology and the harmful results it can produce. She raises some healthy and relevant observations about the subject, but her claim that AI built on Ekman's outdated science will lead to social harm is slightly extravagant and unnecessary. Our heritage is born of working in military intelligence, and in that world you assess the reliability of the information you have, then make an assessment and a decision based on what different sources are telling you. AI can assist and inform a decision being made by a recruiter or a school worker, but anyone deciding whether someone should get a job based solely on a facial expression surely should not be qualified to make that decision.


In the case of Ekman's research there are many counter-claims to his findings, and people will continue to misuse the work, looking to it to tell them things they already want to conclude. As the adage goes, all models are wrong, but some are useful. As long as you are aware of this and know the pitfalls, his research remains extremely useful in understanding the true emotions and reactions of an individual.