
Cognitive Dispositions to Respond

Reviewed and revised 8 December 2014

OVERVIEW

  • Cognitive Dispositions to Respond (CDRs) or cognitive biases are:  “Predictable tendencies, or biases, to react to contextual clues that are largely unconscious and may contribute to flaws in reasoning; a mental state that embraces a variety of terms, often with negative connotations (e.g. heuristics, biases, sanctions, fallacies and errors) that have been described in psychology and medicine literature.”
  • The term CDR was initially coined as it carries fewer negative connotations than ‘cognitive bias’
  • This page is derived from the work of Pat Croskerry (see references)

CLASSIFICATION OF CDRs

A single CDR may fall into more than one category.

  • Error of over-attachment to a particular diagnosis
    • e.g. Anchoring, Confirmation bias, Premature closure, Sunk costs
  • Error due to failure to consider alternative diagnoses
    • e.g. Multiple alternatives bias, Representativeness restraint, Search satisficing, Sutton’s slip, Unpacking principle, Vertical line failure
  • Error due to inheriting someone else’s thinking
    • e.g. Diagnosis momentum, Framing effect, Ascertainment effect, Bandwagon effect
  • Errors in prevalence perception or estimation
    • e.g. Availability bias, Ambiguity effect, Base-rate neglect, Gambler’s fallacy, Hindsight bias, Playing the odds, Posterior probability error, Order effects
  • Errors involving patient characteristics or presentation context
    • e.g. Fundamental attribution error, Gender bias, Psych-out error, Triage cueing, Contrast effect, Yin-yang out
  • Errors associated with physician affect, personality, or decision style
    • e.g. Commission bias, Omission bias, Outcome bias, Visceral bias, Overconfidence/under-confidence, Vertical line failure, Belief bias, Ego bias, Sunk costs, Zebra retreat

TYPES OF CDRs

Aggregate bias

  • AKA ecological fallacy
  • when physicians believe that aggregate data, such as those used to develop clinical guidelines, do not apply to individual patients (especially their own), they are invoking the aggregate fallacy. The belief that their patients are atypical or somehow exceptional might lead to errors of commission, e.g., ordering x-rays or other tests when guidelines indicate that none are required.

Ambiguity effect

  • ambiguity is associated with uncertainty. The ambiguity effect arises when decision makers avoid options for which the probability is unknown. In considering options on a differential diagnosis, for example, this would be illustrated by a tendency to select options for which the probability of a particular outcome is known over an option for which the probability is unknown. The probability might be unknown because of a lack of knowledge, or because the means to obtain the probability (a specific test or imaging) is unavailable.

Anchoring

  • AKA tram-lining, jumping to conclusions, first impression
  • the tendency to lock on perceptually to salient features in the patient’s initial presentation too early in the diagnostic process, and then to fail to adjust this initial impression in the light of later information. This CDR might be severely compounded by confirmation bias.

Ascertainment bias

  • AKA seeing what you expect to find, response bias
  • occurs when a physician’s thinking is shaped by prior expectation; stereotyping and gender bias are good examples.

Attentional bias

  • the tendency to believe there is a relationship between two variables when instances are found of both being present. More attention is paid to instances where both are present than to instances where one is present and the other absent.
  • e.g. Saint’s triad

Availability bias

  • AKA common things are common, the sound of hoof-beats means horses not zebras, out of sight out of mind (non-availability), recency effect
  • the disposition to judge things as being more likely, or frequently occurring, if they readily come to mind.
  • Thus, recent experience with a disease might inflate the likelihood of it being diagnosed. Conversely, if a disease has not been seen for a long time (is less available) it might be underdiagnosed.
  • Availability is not limited to recency, however; there may be other reasons why something readily comes to mind (e.g. a particularly vivid prior experience)

Bandwagon effect

  • the tendency for people to believe and do certain things because many others are doing so. Group-think is an example, and it can have a disastrous impact on team decision making and patient care.

Base-rate neglect

  • AKA representativeness exclusivity
  • the tendency to ignore the true prevalence of a disease, either inflating or reducing its base-rate and distorting Bayesian reasoning (a worked example is sketched below). However, in some cases clinicians might (consciously or otherwise) deliberately inflate the likelihood of disease, such as in the strategy of “rule out worst case scenario” to avoid missing a rare but significant diagnosis.
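
To make the Bayesian point concrete, the sketch below works through a hypothetical calculation (the sensitivity, specificity and prevalence figures are illustrative only, not drawn from Croskerry): even a fairly accurate test gives only a modest post-test probability when the base rate is low, and neglecting that base rate grossly inflates the apparent likelihood of disease.

```latex
% Hypothetical figures (illustration only): sensitivity 0.95, specificity 0.90,
% true prevalence (base rate) P(D) = 0.01.
P(D \mid +) = \frac{P(+ \mid D)\,P(D)}{P(+ \mid D)\,P(D) + P(+ \mid \bar{D})\,P(\bar{D})}
            = \frac{0.95 \times 0.01}{0.95 \times 0.01 + 0.10 \times 0.99} \approx 0.09

% Neglecting the 1% base rate (implicitly treating P(D) as 0.5) inflates the estimate:
P(D \mid +) \approx \frac{0.95 \times 0.5}{0.95 \times 0.5 + 0.10 \times 0.5} \approx 0.90
```

The same positive result therefore implies roughly a 9% versus 90% probability of disease, depending entirely on whether the low prevalence is taken into account.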

Belief bias

  • the tendency to accept or reject data depending on one’s personal belief system, especially when the focus is on the conclusion and not the premise or data. Those trained in logic and argumentation appear less vulnerable to this bias.

Blind spot bias

  • the general belief people have that they are less susceptible to bias than others, due mostly to the faith they place in their own introspections. The bias appears to be universal across all cultures.

Commission bias

  • AKA actions speak louder than words
  • results from the obligation towards beneficence, in that harm to the patient can only be prevented by active intervention. It is the tendency towards action rather than inaction. It is more likely in overconfident physicians. Commission bias is less common than omission bias.

Confirmation bias

  • AKA following hunches, pseudodiagnosticity, positive testing, effort after meaning, relevance bias
  • the tendency to look for confirming evidence to support a diagnosis rather than look for disconfirming evidence to refute it, despite the latter often being more persuasive and definitive.

Congruence bias

  • similar to confirmation bias but refers more to an over-reliance on direct testing of a given hypothesis and a neglect of indirect testing. Again it reflects an inability to consider alternative hypotheses.

Contrast effect

  • occurs when the value of information is enhanced or diminished through juxtaposition to other information of greater or lesser value. Thus, if an emergency physician was involved in a multiple trauma case and subsequently saw a patient with isolated extremity injury, there might be a tendency to diminish the significance of the latter.

Diagnostic momentum

  • AKA Diagnostic creep
  • Once diagnostic labels are attached to patients they tend to become stickier and stickier. Through intermediaries (patients, paramedics, nurses, physicians), what might have started as a possibility gathers increasing momentum until it becomes definite and all other possibilities are excluded.

Ego bias

  • In medicine, ego bias is systematically overestimating the prognosis of one’s own patients compared with that of a population of similar patients. More senior physicians tend to be less optimistic and more reliable about patients’ prognoses, possibly reflecting reverse ego bias.

Feedback sanction

  • A form of ignorance trap and time-delay trap CDR. Making a diagnostic error might carry no immediate consequences, as considerable time can elapse before the error is discovered (if ever), or poor system feedback processes prevent important information on decisions from getting back to the decision maker. The particular CDR that failed the patient persists because of these temporal and systemic sanctions.

Framing effect

  • How diagnosticians see things might be strongly influenced by the way in which the problem is framed, e.g. physician’s perceptions of risk to the patient might be strongly influenced by whether the outcome is expressed in terms of the possibility that the patient might die or that they might live. In terms of diagnosis, physicians should be aware of how patients, nurses and other physicians frame potential outcomes and contingencies of the clinical problem to them.

Fundamental attribution error

  • AKA Judgmental behaviour, negative stereotyping
  • The tendency to be judgmental and blame patients for their illnesses (dispositional causes) rather than examine the circumstances (situational factors) that might have been responsible. In particular, psychiatric patients, minorities, and other marginalized groups tend to suffer from this CDR. Cultural differences exist in terms of the respective weights attributed to dispositional and situational causes.

Gambler’s fallacy

  • AKA Monte Carlo fallacy, sequence effect, law of averages
  • Attributed to gamblers, the fallacy is the belief that if a coin is tossed 10 times and is heads each time, the 11th toss has a greater chance of being tails (even though a fair coin has no memory; see the illustration below). An example would be a physician who sees a series of patients with chest pain in a clinic or the ED, diagnoses all with an acute coronary syndrome, and assumes the sequence will not continue. Thus, the pretest probability that a patient will have a particular diagnosis might be influenced by preceding, but independent, events.
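
A minimal worked illustration of the underlying point (assuming a fair coin and independent tosses): the run of heads carries no information about the next toss.

```latex
% Independent tosses of a fair coin: the conditional probability equals the unconditional one.
P(T_{11} = \text{tails} \mid T_1 = \dots = T_{10} = \text{heads}) = P(T_{11} = \text{tails}) = 0.5
```

By the same reasoning, a run of chest-pain patients with acute coronary syndrome does not alter the pretest probability for the next, independent patient.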

Gender bias

  • AKA sex discrimination
  • The tendency to believe that gender is a determining factor in the probability of diagnosis of a particular disease when no such pathophysiological basis exists. Generally, it results in an overdiagnosis of the favoured gender and an underdiagnosis of the neglected gender.

Hindsight bias

  • AKA “knew it all along”, retroscope analysis, outcome knowledge, creeping determinism, wisdom after the fact.
  • Knowing the outcome might profoundly influence perception of past events and prevent a realistic appraisal of what actually occurred. In the context of diagnostic error, it might compromise learning through either an underestimation (illusion of failure) or overestimation (illusion of control) of the decision maker’s abilities.

Information bias

  • The tendency to believe that the more evidence one can accumulate to support a decision, the better. It is important to anticipate the value of information and whether it will be useful or not in making a decision, rather than to collect information simply because we can, for its own sake, or out of curiosity.

Multiple alternatives bias

  • AKA status quo bias, wallpaper phenomenon
  • A multiplicity of options on a differential diagnosis might lead to significant conflict and uncertainty. The process might be simplified by reverting to a smaller subset with which the physician is familiar, but this might result in inadequate consideration of other possibilities. One such strategy is the three-diagnosis differential: “it is probably A, but it might be B, or I don’t know (C)”. Although this approach has some heuristic value, if the disease falls into the C category and is not pursued adequately, the chance that a serious diagnosis will be made is minimized.

Omission bias

  • AKA tincture of time, watchful waiting, expectant approach, temporizing, let well enough alone
  • The tendency toward inaction, rooted in the principle of non-maleficence. In hindsight, events that have occurred through the natural progression of a disease are more acceptable than those that might be attributed directly to the action of the physician. The bias might be sustained by the reinforcement often associated with not doing anything, but it might prove disastrous. Omission biases typically outnumber commission biases.

Order effects

  • AKA primacy, recency
  • Information transfer follows a U-shaped function: the tendency to remember the beginning (primacy effect) or the end (recency effect) is referred to as the serial position effect. The primacy effect might be augmented by anchoring. In transitions of care, where information transferred from patients, nurses or other physicians is being evaluated, care should be taken to give due consideration to all information, regardless of the order in which it is presented.

Outcome bias

  • AKA Chagrin factor, value bias
  • The tendency to opt for diagnostic decisions that will lead to good outcomes, rather than those associated with bad outcomes, thereby avoiding the chagrin associated with the latter. It is a form of value bias in that physicians might express a stronger likelihood in their decision making for what they hope will happen rather than for what they really believe might happen. This might result in serious diagnoses being minimized.

Overconfidence bias

  • There is a universal tendency to believe that we know more than we do. Overconfidence reflects a tendency to act on incomplete information, intuitions or hunches. Too much faith is placed in opinion instead of carefully gathered evidence.

Playing the odds

  • AKA Frequency gambling, law of averages, odds judgments
  • The tendency in equivocal or ambiguous presentations to opt for a benign diagnosis on the basis that it is significantly more likely than a serious one.

Posterior probability error

  • AKA history repeats itself
  • Occurs when a physician’s estimate of the likelihood of disease is unduly influenced by what has gone before for a particular patient. It is the opposite of the Gambler’s fallacy in that the physician is gambling on the sequence continuing, e.g. if a patient presents to the office five times with a headache and it is correctly diagnosed as migraine on each visit, it is the tendency to diagnose migraine on the sixth visit.

Premature closure

  • AKA counting chickens before they’re hatched
  • A powerful CDR accounting for a high proportion of missed diagnoses. It is the tendency to apply premature closure to the decision making process, accepting a diagnosis before it has been fully verified. The consequences of the bias are reflected in the maxim “when the diagnosis is made, the thinking stops.”

Psych-out error

  • Psychiatric patients appear to be particularly vulnerable to the CDRs described in this list, and to other errors in their management, some of which might exacerbate their condition. They appear especially vulnerable to fundamental attribution error. In particular, co-morbid medical conditions might be overlooked or minimized. A variant of psych-out error occurs when serious medical conditions (e.g. hypoxia, delirium, metabolic abnormalities, CNS infections, head injuries) are misdiagnosed as psychiatric conditions.

Representativeness restraint

  • AKA prototypical error, similarity-matching error, persistent forecasting error
  • Drives the diagnostician toward looking for prototypical manifestations of disease: “if it looks like a duck, walks like a duck, quacks like a duck, then it is a duck”. Yet restraining decision making along these pattern-recognition lines leads to atypical variants being missed.

Search satisficing

  • AKA bounded rationality, keyhole viewing
  • Reflects the universal tendency to call off a search once something is found. Co-morbidities, second foreign bodies, other fractures, and co-ingestants in poisoning might all be missed.

Sutton’s slip

  • AKA going for the obvious, going where the money is, Occam’s razor mistake, KISS error
  • Takes its name from the apocryphal story of the Brooklyn bank robber Willie Sutton who, when asked by the judge why he robbed banks, is alleged to have replied, “Because that’s where the money is!”. The diagnostic strategy of going for the obvious is referred to as Sutton’s law. The slip occurs when possibilities other than the obvious are not given sufficient consideration.

Sunk costs

  • The more clinicians invest in a particular diagnosis, the less likely they might be to release it and consider alternatives. This entrapment form of CDR is more familiar in investment and financial contexts; for the diagnostician, however, the investment is time and mental energy and, for some, ego might be a precious investment. Confirmation bias might be a manifestation of such an unwillingness to let go of a failing diagnosis.

Triage cueing

  • AKA Geography is destiny
  • The triage process occurs throughout the health care system, from the self-triage of patients to the selection of a specialist by the referring physician. In the emergency department, triage is a formal process that results in patients being sent in particular directions, which cue their subsequent management. Many CDRs are initiated at triage, leading to the maxim: “geography is destiny”.
  • Once a patient is referred to a specific discipline, the bias within that discipline to look at the patient only from their own perspective is referred to as déformation professionnelle.

Unpacking principle

  • AKA support theory, discounting the unspecified possibilities
  • Failure to elicit all relevant information (unpacking) in establishing a differential diagnosis might result in significant possibilities being missed. If patients are allowed to limit their history giving, or physicians otherwise limit their history taking, unspecified possibilities might be discounted.

Vertical line failure

  • AKA Thinking in silos, thinking in grooves, thinking inside the box
  • Routine repetitive tasks often lead to thinking in silos: predictable orthodox styles that emphasize economy, efficacy and utility. Though often rewarded, the approach carries the inherent penalty of inflexibility. In contrast, lateral thinking styles create opportunities for diagnosing the unexpected, rare or esoteric.
  • An effective lateral thinking strategy is simply to pose the question, “What else might this be?”.

Visceral bias

  • AKA Countertransference, emotional involvement
  • The influence of affective sources of error on decision-making has been widely underestimated. Visceral arousal leads to poor decisions. Countertransference, involving both negative and positive feelings towards patients, might result in diagnoses being missed.

Yin-yang out

  • AKA serum rhubarbs, standing stool velocities
  • When patients have been subjected to exhaustive and unavailing diagnostic investigations, they are said to have been worked up the yin-yang. The yin-yang out is the tendency to believe that nothing further can be done to throw light on the dark place where, and if, any definitive diagnosis resides for the patient, i.e. the physician is let out of further diagnostic effort. This might ultimately prove to be true, but to adopt the strategy at the outset is fraught with a variety of errors.

Zebra retreat

  • AKA courage of convictions
  • Occurs when a rare diagnosis (zebra) figures prominently on the differential diagnosis but the physician retreats from it for various reasons:
    • perceived inertia in the system and barriers to obtaining special or costly tests
    • self-consciousness and underconfidence about entertaining a remote and unusual diagnosis, and about gaining a reputation for being esoteric
    • fear of being seen as unrealistic and wasteful of resources
    • under- or overestimation of the base rate for the diagnosis
    • a busy ED, where the anticipated time and effort to pursue the diagnosis might dilute the physician’s conviction
    • coercive pressure from team members to avoid wasting the team’s time
    • the inconvenience of the time of day or weekend, and difficulty gaining access to specialists
    • unfamiliarity with the diagnosis, making the physician less likely to go down an unfamiliar road
    • fatigue or other distractions that tip the physician toward retreat
  • Any one, or a combination, of these reasons might result in a failure to pursue the initial hypothesis.

References and Links

Journal articles and textbooks

  • Croskerry P, Cosby KS, Schenkel SM, Wears RL. Patient Safety in Emergency Medicine. Lippincott Williams & Wilkins, 2009. [Google books preview]
  • Croskerry P. Diagnostic Failure: A Cognitive and Affective Approach. In: Henriksen K, Battles JB, Marks ES, Lewin DI, editors. Advances in Patient Safety: From Research to Implementation (Volume 2: Concepts and Methodology). Rockville (MD): Agency for Healthcare Research and Quality (US); 2005 Feb. [pubmed] [free full text]

