A Beautiful Mind - Mental Modeling / Cognitive Bias

One of the greatest wonders and mysteries of the human body is the brain. A small organ weighing 3.3 lbs, the brain defines who we are as human beings. It’s an amazing network of neurons that allows us to have complex thoughts and make split-second decisions. However, these neurons are just itching to “mess you up!” …. well, that got dark quickly!

Have you ever wondered why you are more likely to get in an accident only a couple of miles from your home? Or why you are more likely to ask for directions in an unfamiliar place? Have you ever wondered why you sometimes never had to think about how to treat the patient in front of you, you just did it? Or how about those times you completely missed your patient's diagnosis? This is our Beautiful Mind at work (did you just get as excited as I do when one of the characters in the movie says the name of the movie….in the movie?)

Cue the dramatic movie music!

Let me explain what is going on…

Mental Modeling

Mental models are how we understand the world. Not only do they shape what we think and how we understand but they shape the decisions we make. A mental model is simply a representation of how something works. We cannot keep all of the details of the world in our brains, so we use models to simplify the complex into understandable and organizable chunks.

Most of our initial education as EMTs, Paramedics, Nurses, and Physicians is based around learning Mental Models of different conditions and disease processes.

If you see A, B, and C... it is CHF and you DO 1, 2, 3.

These mental models continue with us once we get into the field as we do our best to just not kill anyone… I see A, B, and C... it is CHF and I will do 1, 2, 3.

The 12-Lead shows a Normal Heart Axis, so it's definitely not Ventricular Tachycardia ;) Brian Behn

*Please read his blog after this one to see why you need to change your mental model for Wide Complex Tachycardias: https://www.foamfratblog.com/post/wide-complex-tachycardia-for-losers-also-tips-for-winning-global-thermonuclear-warfare

Where else do we see these mental models? We see them visualized in our protocols and the alphabet-soup algorithms: a simplified way of working through a particular problem.

Hypertension + Bloody Frothy Sputum = pg. 65 Congestive Heart Failure... I must do 1, 2, 3.
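That protocol-page shortcut is essentially a lookup table. Here's a minimal Python sketch of the idea, with made-up findings and placeholder treatment steps (this illustrates the shortcut itself, not actual clinical guidance):

```python
# A mental model as a lookup table: match a pattern of findings, get a
# canned diagnosis and treatment plan. All findings and steps here are
# invented placeholders for illustration - not clinical guidance.

PROTOCOLS = {
    frozenset({"hypertension", "bloody frothy sputum"}): (
        "Congestive Heart Failure",
        ["do 1", "do 2", "do 3"],
    ),
}

def system_1_lookup(findings):
    """Return (diagnosis, steps) if the findings exactly match a known pattern, else None."""
    return PROTOCOLS.get(frozenset(findings))

match = system_1_lookup(["hypertension", "bloody frothy sputum"])
miss = system_1_lookup(["hypertension", "clear lungs"])
```

The speed comes from the exact-match lookup; the danger is everything that doesn't fit the pattern. In this sketch `miss` is simply `None`, but in real life System 1 rarely admits a miss; it forces the findings into the nearest pattern instead.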

As we progress as clinicians, our experiences will continue to alter these mental models, and we will continue to use them to guide our treatment. Mental models can lead to lightning-fast decisions and interventions. Unfortunately, our patients rarely read our textbooks or know our specific mental models (no two mental models are the same) to prepare for their emergency. Mental models can cause you to fall into that elusive TUNNEL VISION trap!

To prevent cognitive overload, our brains have two operating systems for making decisions, most often described as System 1 and System 2. I like to think of them as the tortoise and the hare.

We make roughly 35,000 decisions a day! They have different levels of importance and difficulty, from everyday needs to what is going on with our patients and how we plan to treat them.

Our System 1 (the hare) is good at making quick decisions and can help us rapidly assess and treat our patients. But these decisions come at a cost. The hare (System 1) cheats and uses shortcuts as it races to the finish line. These shortcuts are called heuristics, and they rely on the immediate cases that come to a given person's mind. Put more simply: we treat information that springs to mind quickly as more significant. So when we have to make a decision, we automatically think about related events or situations, and as a result we might judge those events to be more frequent than they really are. We "build" heuristics by reviewing the information at hand and connecting it to our experience. Heuristics are strategies derived from previous encounters with similar problems: trying to solve a problem based on experience instead of theory.
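The availability shortcut can be sketched in a few lines of Python. This toy model (the case labels are invented) just asks "what have I seen most lately?" and calls that the answer:

```python
# Toy model of the availability heuristic: judge likelihood by whatever
# springs to mind most readily - here, the cases seen most recently.
# Case labels are invented for illustration.

from collections import Counter

recent_cases = ["CHF", "CHF", "asthma", "CHF", "COPD"]

def availability_guess(cases):
    """Pick whichever diagnosis dominates recent memory - fast, but biased."""
    return Counter(cases).most_common(1)[0][0]

guess = availability_guess(recent_cases)
```

Three of the last five calls were CHF, so the next dyspneic patient "must" be CHF too. That is exactly the judgment the availability heuristic quietly makes for you.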

Thinking like a hare (System 1) can be very effective in providing the quick treatment that our patients need! And we use this system more often than not in our practice. System 1 can be described as our "gut feeling."

However, speed doesn't always equal accuracy! And it can result in us falling into that elusive tunnel vision trap, also known as cognitive bias.

Cognitive Bias

The definition of cognitive bias is as follows: a systematic pattern of deviation from norm or rationality in judgment. Yup, that is as clear as mud!

It's actually a little easier to understand cognitive bias by discussing some of the different types. There are a bunch of cognitive biases; I am only going to discuss the four that we as medical providers succumb to most.

Availability bias- An assumption that what most readily comes to mind is most relevant

The badness of this type of bias is that we simply choose a patient's diagnosis based on the first thing that comes to mind, or what we believe is the most common thing.

The patient says she is having the worst headache of her life… so it's OBVIOUSLY a Subarachnoid Hemorrhage!

If it walks like a duck… talks like a duck… it MUST be a duck!

Anchoring heuristic- Fixation on initial impressions

The badness of this type of bias is that we either fail to incorporate additional information into the diagnostic process or we make all additional information fit the initial diagnosis.

The patient's hypotension MUST be from the Subarachnoid Hemorrhage that just caused them to herniate…. see my other blog post "The Curious Case of the Brain and the Octopus Trap" (yes, that was just some shameless self-promotion at its best!): https://www.foamfratblog.com/post/the-curious-case-of-the-brain-and-the-octopus-trap

Blind obedience- Undue importance given to expert opinion or test results

The badness of this type of bias is trusting without verifying, and just assuming that since someone is an "expert," they MUST be right! Additionally, test results are just one part of a larger picture. Finally, just as you can have cognitive bias, so can the expert.

Premature closure- Acceptance of a diagnosis before it is fully verified.

The badness here is that as providers we commit to a presumed diagnosis and stop thinking about anything else or doing further evaluations.

Soooooooooo… what can we do to combat System 1 thinking and cognitive bias and become the trusty tortoise (System 2 thinking)?


Metacognition refers to a person’s ability to regulate their thinking and learning and consists of the self-assessment skills: planning, monitoring, and evaluating.

During patient contact- Stay present in the moment. Actually think about what you're doing before doing it, instead of doing things out of a gut feeling. If you feel yourself turning into a robot, or begin to just assume the patient's presentation is due to the initial diagnosis, stop and THINK!

After the call- Use something called a Cognitive Autopsy: examining your own thought process after the call. This isn't the same as talking about the call immediately following it.

First step- Write down everything you can recall about the call: ambient conditions, when it occurred in your shift, things the patient said or did, the assessment findings, and how the patient responded to your treatment.

Second step- Get in touch with the people who were on the call with you (partner, receiving physician, etc.). Get their recollection of the call and compare notes.

Were some things you recalled as fact actually perception, or implied by your subconscious?
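Comparing notes is really just a set comparison. Here's a toy Python sketch (all call details invented) of what step two surfaces:

```python
# Toy sketch of step two of a cognitive autopsy: compare your recollection
# of a call against a partner's and flag the discrepancies.
# All call details are invented for illustration.

my_recall      = {"patient denied chest pain", "BP 88/60", "gave fluid bolus"}
partner_recall = {"patient reported chest pressure", "BP 88/60", "gave fluid bolus"}

agreed       = my_recall & partner_recall   # likely fact
only_mine    = my_recall - partner_recall   # possibly perception, not fact
only_partner = partner_recall - my_recall   # things I missed or misheard
```

Anything that lands in `only_mine` is exactly the kind of "fact" worth re-examining: did the patient really say that, or did your mental model fill it in?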

Cognitive autopsy is designed to expose the gaps between what actually happened and what your brain recorded, so you can see where bias crept into your thinking.


Never Stop Gathering Information

We have a tendency to work through our assessment, or continue to gather information, only up until the point it fits our "gut feeling" or the reason we were originally dispatched. From that point we dive into our mental models/shortcuts to begin treating the patient. While this is great for providing time-sensitive treatment, it can unfortunately start our path into cognitive bias and tunnel vision. Remember: assessment and information gathering is a never-ending process, and diagnoses CAN and DO change from what you were originally dispatched for or from someone's initial thoughts.

Food for thought- Do you think it actually helps or hurts us to be told an initial diagnosis by dispatch?

For me, I am actually okay with not knowing a diagnosis on dispatch. Just tell me if there are any issues with the patient's ABCs that I need to be prepared to treat; I'll figure out the rest when I get there. However, I have worked with people who get irritated when they are not told what they are going for. And if they are told, their faces are in the protocol book for that specific diagnosis. For me, that's the beginning of cognitive bias and tunnel vision. While they may be ready to provide speedy treatment based on the protocol they just read, they can easily fall into many of the cognitive biases we talked about above because of steps they took prior to patient contact.

Take short breaks during a call to think and reflect

Now I am not saying stop what you're doing, step away from your patient, and grab a bite to eat…

I am saying to stop and run through your differential diagnoses (decide to add or subtract), your assessment findings, and the patient's response to your treatment(s).

These brief pauses allow your brain to catch up, and allow you to focus on "Thinking about Thinking."

So now go forth and be the tortoise!!! #bethetortoise


Cognitive bias and the problem of misdiagnosis: Avoiding the big miss. (2020, March 15). Retrieved February 9, 2021, from https://www.challengercme.com/articles/2020/02/cognitive-bias-clinical-misdiagnosis-avoiding-the-big-miss-clinician-reasoning-biases-examples/

Improve with Metacognition. (n.d.). Enhancing medical students' metacognition. Retrieved February 9, 2021, from https://www.improvewithmetacognition.com/enhancing-medical-students-metacognition/

Kahneman fast and slow thinking: System 1 and 2 explained. (2019, November 8). Retrieved February 9, 2021, from https://suebehaviouraldesign.com/kahneman-fast-slow-thinking/

McDaniel, R. (2020, March 27). Metacognition. Retrieved February 9, 2021, from https://cft.vanderbilt.edu/guides-sub-pages/metacognition/

Morgenstern, J. (2019, September 22). Cognitive errors in medicine: The common errors. Retrieved February 9, 2021, from https://first10em.com/cognitive-errors/

Quirk, M. E. (2006). Intuition and metacognition in medical education: Keys to developing expertise. New York, NY.