Cognitive error!


A deep dive into how we think through diagnoses - cognitive error

We hope that you have been enjoying thinking about ‘thinking’ after the last blog post. Yes, we are talking “metacognition” for all of those nerds out there (we, of course, use ‘nerd’ as the greatest term of endearment…)


Our last blog post summarised the first of three papers published by Canfield and colleagues, and covered the basics of System 1 (immediate and unconscious) and System 2 (effortful and analytical) thinking. This second post covers the second paper in the series and explores cognitive error.


So, what IS cognitive error?


In the paper it is described as “flawed clinical reasoning, due to faulty knowledge, data gathering and synthesis”. Cognitive error is often driven by ‘biases’, which can occur with both types of thinking (although it is most common with System 1 and flawed System 2 thinking).


So, let’s cover a few of the more important cognitive biases that might trip you up in your diagnoses…


The first is ‘confirmation’ bias. This is where you tend to interpret, recall and focus on information that confirms your prior or early beliefs (preconceptions) about a case. This bias can result in the discounting of other possibilities once you latch on to an early diagnosis.


Similar but different is ‘availability’ bias. When this bias occurs, diagnoses or events that have greater ‘availability’ in your memory (because you have seen these cases recently, or they were associated with an unusual or emotional presentation) are more likely to be called on than others.


Another important bias is ‘anchoring’ bias, where a single piece of information (often the first) is relied on too heavily when making decisions, acting as an anchor to which all other information is referenced, rather than as part of the big picture. This bias also causes us to stick with a diagnosis, despite additional information (including lack of treatment response) that might discredit it.


Finally, ‘gambler’s fallacy’ is where we think that a certain event or diagnosis is more or less likely to happen based on the outcome of previous events (every vet in the history of the world has had ‘runs’ of certain presentations, breeds or owners). While this may very well be true for infectious or environmental diseases, most presentations can and should be considered as independent events.


No doubt some, if not all, of these biases are familiar to you in some way.


Can you recall instances when you have fallen prey to one or more? Or, conversely, can you think of times where the use of these biases has actually helped you to work through a diagnosis? Yes, it’s true that while biases can mislead us, they can also provide “shortcuts” for thinking about a diagnosis, leading to rapid results. BUT, because these results aren’t always correct, the next step is to harness our knowledge about biases and systems thinking. By being aware of their presence, we can minimise their misleading effects while capitalising on their benefits.


To do this, Canfield and colleagues discuss a simple “de-biasing” strategy that focuses on asking yourself two questions when you become aware that you may have fallen prey to bias in your diagnostic process:


  • What do I need to do next to support my presumptive diagnosis?

(…and how do I ensure that this supporting evidence is objective and accurate?)


  • If my diagnosis is wrong, what other possibilities exist?

(…and how can I remain both open-minded AND sceptical to ensure that what I have read or heard is NOT misleading me?)



If you consistently ask yourself these simple questions in relation to each case you address, you will be well on the way to managing cognitive error.


So, the take home messages here are:

  1. be AWARE of the biases that you are (or might be) operating under;
  2. take time to REFLECT on your cases and how you approach them (any great diagnostician is a great reflector and accepts that they will be wrong from time to time);
  3. ensure that you use trained System 2 thinking to support any System 1 thinking; and
  4. introduce some ACCOUNTABILITY, seek ADVICE and develop CHECKLISTS where necessary to support these approaches.


In short, if you ensure that all processes that you utilise combine a healthy dose of objectivity, open-mindedness and scepticism you can’t go wrong (OK, you can always go wrong, but the probability will be vastly reduced!).


Stay tuned for our third and final blog in this series that will discuss the use of mental shortcuts (heuristics) and illness scripts in diagnostic reasoning.