Updated: Oct 23, 2019
I am lucky enough to be editing this blog on the shores of Lake Windermere, as I have travelled up to the Lake District for a British Exploring Society training weekend. This is part of our preparation for an expedition to the Canadian Yukon in Summer 2019.
Recently I have started reading a book by Matthew Syed, called “Black Box Thinking.” Syed is a journalist and was previously the England table tennis number 1 for a decade! I have had his book on my ever-growing list of “must-reads” for some time and finally got around to starting it a few weeks ago.
It discusses an interesting concept known as “Cognitive Dissonance,” which I think is really important to be aware of, both in day-to-day life and when working in healthcare, whether in a hospital, a GP practice or out on an expedition. It is equally important in other industries and for anyone who holds a position of responsibility, whether at home or at work.
So what is Cognitive Dissonance?
To find the answer to this we have to go back to the 1950s and the work of an American researcher called Festinger, and his subsequent book entitled “When Prophecy Fails.” This is explained in the excellent short video below:
Cognitive dissonance is the uncomfortable feeling that results from holding conflicting beliefs and behaviours.
There are 4 main ways that an individual can resolve their dissonance. Take the example of unhealthy eating: I am overweight but I would like to eat a doughnut. My options are (https://en.wikipedia.org/wiki/Cognitive_dissonance):
1. Change the behaviour or the cognition ("I'll eat no more of this doughnut.")
2. Justify the behaviour or the cognition, by changing the conflicting cognition ("I'm allowed to cheat my diet every once in a while.")
3. Justify the behaviour or the cognition by adding new cognitions ("I'll spend thirty extra minutes at the gymnasium to work off the doughnut.")
4. Ignore or deny information that conflicts with existing beliefs ("This doughnut is not a high-sugar food.")
This is well illustrated in this study published in 2013 regarding cognitive dissonance and the health beliefs of smokers:
"A telephone survey was conducted of nationally representative samples of adult smokers from Canada, the USA, the UK and Australia from the International Tobacco Control Four Country Survey. Smokers were followed across three waves (October 2002 to December 2004), during which they were asked to report on their smoking-related beliefs and their quitting behaviour."
"Rationalisations are highest among smokers when they are smoking and lowest when they have quit, but importantly these rationalisations return to original levels, or close to original levels, when a quit attempt fails...Smoking cessation campaigns might benefit from targeting these rationalisations rather than simply trying to provide people with information about the health effects of smoking alone."
Cognitive dissonance drives a phenomenon called “error justification,” and examples of it are widespread throughout recent history.
Syed uses the example of the second Iraq war. Prior to the invasion, Tony Blair and George W Bush were sure that they would find WMDs in Iraq, insisting that Saddam Hussein was definitely producing them.
As we know, the invasion happened and no WMDs were found. Initially, the explanation was that a "team of experts" was needed to find them. One year later, when this team hadn’t found any WMDs, Blair said that “they could have been removed, they could have been hidden or they could have been destroyed.”
How is this relevant to healthcare?
Syed begins his book by discussing the tragic death of Elaine Bromiley, who went into hospital for a routine day-case operation and died as a result of what is known as a “can’t intubate, can’t ventilate” scenario.
If you don't know of this sad story, watch this short video below that was made with the permission of her husband Martin Bromiley.
Martin Bromiley, who features in the video above, was a pilot and had extensive experience of human factors, and the challenges involved with acknowledging and responding to failure. He investigated what had happened and applied lessons learnt from the aviation industry.
I first heard about this case during medical school, and there is now much more emphasis upon human factors and communication amongst teams, throughout both undergraduate and postgraduate medical education.
Consider the same scenario if Mr Bromiley had not come from an aviation background. The doctors may have said something like:
“there was a complication…a technical error…an unanticipated outcome…we did our best”
This is a form of self-justification or “re-framing.”
As Syed points out, if we take these comments at face value they seem designed to deliberately deceive bereaved relatives. However, from what we have learnt, we can see that they may instead be the product of cognitive dissonance, occurring on a subconscious level.
Having an awareness of how deeply this is embedded in the medical profession, and in our everyday lives, can help us to develop self-awareness and to provide better care to our patients.
I hope that this has given you a flavour of a much more complex topic. To find out more about some of the areas touched on in this blog, check out the book Black Box Thinking.
Don't forget to follow online and on social media.
Until next time, take care