Heuristics!

2022-04-19

A deep dive into how we think about diagnoses - the use of heuristics in clinical reasoning

In this third and final blog post centred on the diagnostic process, we will explore the concepts discussed in the third paper published by Whitehead, Canfield, Johnson, O’Brien and Malik (2016) (https://doi.org/10.1177/1098612X16643251) around heuristics and illness scripts.

 

So, what are heuristics?

 

“Heuristics” can be defined as mental shortcuts, and they can act to support either System 1 or System 2 thinking.

 

There are many different kinds of heuristics, some of which can also be classed as biases (although, when used as heuristics, they should give rise to correct diagnostic decisions more often than not!). Names and descriptions of the different heuristics can be found in full in the paper, including ‘availability’, ‘representativeness’ and ‘familiarity’ (relating to System 1 thinking), and ‘anchoring and adjustment’, ‘means-end and hill-climbing’ and ‘progress-monitoring’ (relating to System 2 thinking).

 

But for our purposes today, we will simply note that these shortcuts are used often and effectively by skilled clinicians to respond efficiently to clinical conundrums.

 

“When you hear hoof beats think horses, not zebras”

 

Why is it important to be aware of the heuristics that you are using?

 

Well, sometimes (for example, when you are on safari in Africa) using a heuristic that leads you to think that the hoofbeats belong to horses is going to be misleading. However, if you are aware of the heuristic(s) you are using and are able to deploy some reflection, you will surmise that in this case (as you are ambling along an African savanna) zebras might indeed be a possibility worth considering.

 

As we gain experience, the knowledge stored in long-term memory is used to drive diagnoses through the use of heuristics. To the untrained observer, it may appear that the senior clinician has regressed to simple pattern recognition (and is just lucky to get it correct!) but, for those excellent senior diagnosticians, quite a different process is being employed. Prior cases, along with multiple alternate modes of learning and deep reflection, create a catalogue of long-term memories that can then be called upon quickly, through heuristics and partial or complete ‘illness scripts’, to work through initial presentations efficiently using System 1 thinking.

Note that it is not a passive process that leads to this point! Vital to the success of the senior clinician’s strategy (both for the case at hand and for coding this information for future cases) is the subsequent use of effortful System 2 thinking, which may itself be based on heuristics that drive general diagnostic strategies, alongside other evidence-based strategies such as diagnostic algorithms. Use of these techniques reinforces or refutes the progress made through the deployment of heuristics and scripts in System 1 thinking, by calling out and managing any cognitive error that has occurred.

 

If this sounds familiar, it is! We are talking about the same pattern that we discussed in the last post, but now with some deeper understanding of the mechanisms that drive the processes.

 

So, to end our series on diagnosis, we return to Kahneman’s central message, which is simply that when our minds are left to their own devices, they engage in a number of erroneous ways of thinking. If we want to make better decisions, we need to be aware of these systematic errors and develop workarounds.

 

Our take home messages are:

  • Know your cognitive biases,
  • Use them to your advantage (trust your intuition and don’t be afraid of System 1 thinking), and
  • Always ensure that you have safeguards in place (System 2 thinking) to reduce the effect of bias and the errors it may introduce.


“The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little.”

- Daniel Kahneman, Thinking, Fast and Slow.