by Zachary Wright


Someone I know, who is a very vocal LGBTQ+ advocate, once got into an argument with me about concepts behind gender.  I did everything I could to understand where they were coming from, seeing as this is a complex topic with differing perspectives.  However, they were trying to convince me (among other things) that gender is a social construct.  Having studied the issue, I pushed back, arguing that gender is more complicated than that and certainly isn’t a mere social construct.  Despite my best efforts, however, my research and perspectives fell on deaf ears.  Instead, my friend seemed to cling to the ideas of the people they agreed with.  I’ll be sure to go over LGBTQ+ perspectives another day.  My purpose in telling this story is to give an example of what researchers and psychologists call cognitive bias, which we’ll discuss today.

A few episodes ago, I explained logical fallacies – errors during logical reasoning – that can lead to incorrect conclusions.  Today, we’ll be talking about the equivalent of that, but in regard to psychological biases that we sometimes employ.  Cognitive biases are “errors in thinking that affect people’s decision-making in virtually every situation” (1).  Unlike logical fallacies, cognitive biases are negative brain processes (usually coming from intuition) that can affect our thoughts and lead us to wrong conclusions.  In other words, logical fallacies relate to arguments in the way that cognitive biases relate to intuitive brain processes.  Critical thinkers must recognize cognitive biases in their thinking in order to more objectively analyze the data they interact with and make more informed decisions.  Today, we’ll go over where cognitive biases come from, explore some examples of them, and then discuss some principles that can help us avoid them.  Let’s take a look.

Heuristics:  The Brain’s “Easy Way Out”

Before we explore examples of cognitive biases, we must first understand where they come from.  Most documented cognitive biases come from “mental shortcuts” in our minds.  Psychologists call these shortcuts “heuristics.”  One group of researchers described heuristics in the following way:

A heuristic is a mental shortcut that allows an individual to make a decision, pass judgment, or solve a problem quickly and with minimal mental effort. While heuristics can reduce the burden of decision-making and free up limited cognitive resources, they can also be costly when they lead individuals to miss critical information or act on unjust biases. (2)

Heuristics help us make decisions and arrive at conclusions quickly and easily.  They take information that seems familiar and draw a connection to something we’ve observed in the past.  These shortcuts often lead to good results… except when they don’t.

Some basic examples show how heuristics can sometimes mislead us.  For instance, consider this video below:

If you’re like me, you initially see a series of dots moving around in a circle.  However, that’s not REALLY what’s going on.  It’s a series of points lighting up and growing dimmer in a specific pattern.  Our brains provide the illusion of a line of dots moving in a circle.  This example is one of many optical illusions that employ heuristics (3).  Another example of a heuristic is the “inattentional blindness” phenomenon.  Here’s another video example of a cognitive bias stemming from a heuristic (4).

In both instances, our brains are either making connections based on information that isn’t present or ignoring information that is present.  The accepted psychological theory states that these intuitive (and I use that term very deliberately in light of my previous article) heuristics arose through evolutionary processes and natural selection (5).  In other words, we’ve inherited instincts from our ancestors that protected us, enhancing our chances of survival.  This detail will be important later.

It almost goes without saying that the power of heuristics is nothing short of incredible, and one thing I want to stress here is that pattern recognition isn’t a bad thing.  I’ll be discussing how drawing inappropriate patterns can lead us to jump to conclusions, but not all heuristically-based decisions are bad ones.  One pair of researchers noted that “for many decisions, the assumptions of rational models are not met, and it is an empirical (after the fact) rather than an a priori (before the fact) issue how well cognitive heuristics function in an uncertain world” (6, parentheses added).  In other words, we often don’t know how well our heuristically-based decisions worked until after the fact.  However, they sometimes have unintended consequences that can cause problems in our thinking, as we’ll soon see.

Cognitive Bias:  Bad Processes, Worse Outcomes

There are multiple types of cognitive biases, each with observable patterns.  They chiefly involve drawing patterns, prioritizing information that correlates with what we already know, focusing on dominant or impactful data, and ignoring seemingly irrelevant information (7).  I’ll be sure to link a more comprehensive list of cognitive biases below, as we won’t be able to discuss each one today.  However, we’ll cover some of the most important ones, which may help outline some principles for overcoming cognitive biases.

The Anchoring Bias is “the common human tendency to rely too heavily on the first piece of information offered (the ‘anchor’) when making decisions” (8).  When we encounter information without thinking critically about it, we tend to allow that source to color how we see other sources of information.  For example, imagine someone learning from their parents that “the Mormons are just a cult.”  From then on, they cling to that information, allowing it to color how they discuss the church with the missionaries.  The problem with this kind of heuristic is evident:  It inhibits valuable analysis and thought.  It hyper-prioritizes information we already know at the expense of additional information that may relate to the issue.

The Fundamental Attribution Error is the “tendency for people to over-emphasize dispositional or personality-based explanations for behaviors observed in others while under-emphasizing the situational explanations” (9).  In other words, we tend to attribute certain behaviors to personality rather than to the surrounding circumstances.  Consider the hypothetical case of Member #1 and Member #2.  Member #2 is in a hurry and comes across as rude and impatient to Member #1.  Member #1 is offended at his impatience and then classifies Member #2 as an impatient person.  In reality, he’s in a hurry to get home to take care of his sick family.  As you can see, the situation was mostly governing Member #2’s behavior, not necessarily his personality as a whole.  Our heuristics focus on the dominant information in front of us and jump to conclusions as a result.

The Availability Bias indicates that “The more available and relevant information there is, the more likely the event is judged to be” (10).  Put another way, we tend to prioritize more available information, even if it doesn’t tell the complete story.  For instance, there is a lot more critical information about the church than positive information in places such as social media, where people talk about bad experiences they had in the church.  That information is often more accessible, and more advertised, than information about how members of the church end up being more generous (11), more pro-social (12), and having overall higher well-being (13) than most other populations of people.  This isn’t to say that it’s a contest.  However, it goes to show that there is evidence against the idea that church members aren’t happy; it’s just less available, so fewer people outside the church know about it.  This bias stems from our intuition’s dismissal of seemingly absent or irrelevant information.

The Bias Blind Spot is the finding “that most people believe they are less biased than their peers” (14).  In other words, we can often point out other people’s biases more easily than we notice our own.  Consider these statements (based on actual statements by critics):

  • “You can get unbiased information about the church from books like Under the Banner of Heaven by Jon Krakauer”
  • “You can get the REAL Mormon history from *Insert critic here*”

The problem with these statements is that they appeal to an idea of purity that doesn’t really exist.  There is no such thing as an “unbiased” source.  As you can see, it’s easier for people to ignore bias in favor of what they believe, because our mental heuristics tend to lean toward the information we already know.

Confirmation Bias is characterized by the “seeking or interpreting of evidence in ways that are partial to existing beliefs” (15).  Put simply, it’s only looking for things that support your opinion.  Examples can be found in critics of the church who associate only with people who agree with their assessments of church history while ignoring or dismissing organizations such as FAIR.  This is problematic because it once again prioritizes what people already believe at the expense of data that contradicts it.  This bias is prevalent (even among members), so watch for it.

The Framing Bias “refers to the observation that the manner in which data is presented can affect decision making” (16).  The way that information is initially presented to us can affect how we process that information and can lead us to incorrect conclusions (i.e., we make a decision based on how the issue was framed).  For example, “You belong to a manipulative cult” differs greatly from “You belong to a high-demand religion.”  (Lots of high-demand groups exist, whether secular, like the Navy SEALs, or religious, like orders of nuns.)  This heuristic pulls on the idea that our brains focus on dominant, impactful data.  “Cult” is a very charged word often used with a negative connotation, so hearing it leads to conclusions that may or may not be accurate.  After all, depending on how you use the word, you could describe all religions as cults.

The Hindsight Bias is “the tendency, after an event has occurred, to overestimate the extent to which the outcome could have been foreseen” (17).  This one is a little more well-known through the phrase “Hindsight is 20/20,” but its status as a cognitive bias is well-deserved.  It is often used against prophets and the mistakes they make.  It may be easy for us to say, “How could Brigham Young have said X?  Isn’t it obvious that wasn’t true?”  However, this just shows our bias: we have more information than Brigham Young did, and we’ve been able to draw patterns and make connections in ways that he may not have been able to.  In such discussions, we have to remember that we are all human, and we all make mistakes.  I try to be as charitable to people (past or present) as I imagine Jesus will be toward me on Judgment Day.

Loss/Regret Aversion Bias was described by one researcher as “an expression of fear” (18).  Fear of what?  Well, fear of regret, pain, and loss.  This can manifest itself in a few different ways.  Perhaps a member is scared to take a serious look at church history because they’re afraid of losing their testimony.  Perhaps a critic of the church is scared of regretting their decision to leave, and so they double down on their criticisms.  These fears, even if somewhat justified, can impede our ability to understand the world around us.  Our brains tend to focus on dominant information and – let’s be honest with ourselves – we remember our failures.  Our mistakes and failures present themselves as very dominant information in our minds (19).  Keep this in mind as you’re making decisions.

The Stereotyping Bias is probably more familiar to most people.  A more technical definition of “stereotype” is “[the] category-based beliefs about a group that also involve affective–evaluative loading and behavioral tendencies” (20).  In other words, we put people with certain characteristics in a category in our minds, based on patterns we see.  For example, a critic might say something like “Mormons are raised to be bigoted and judgmental.”  Among other things, this is a stereotype placed on us by other people (21).  This bias comes from our tendency to over-correlate data; it’s easy for us to categorize other people as “being a certain way,” sometimes without even realizing it.

Finally, in a similar vein, we have the In-group Bias, which is described as the willingness of members of a group to “treat their own group superior than others” (22).  In other words, we’re willing to hold those we don’t like to a higher standard than those we do like.  For example, I’ve seen instances where critics of the church tend to give the benefit of the doubt to other critics, but not to those who defend the church.  When members of the church don’t know something, it’s because we’re “ignorant,” but you’d be hard-pressed to find a member of that same community calling another critic “ignorant.”  Again, this kind of practice goes back to how people focus on dominant information, and for many people who have left the church, there are a lot of criticisms that are dominant in their minds.

Now, before I move on, I’d like to make one disclaimer:  I am not saying that those who have left the church are more susceptible to cognitive biases than those who have not.  With just as much ease, I can find several instances where members of the church employ the same kinds of cognitive errors.  However, I do think that these examples demonstrate some of the ways that cognitive biases can affect how we interpret information, and consequently how members and non-members alike can end up being treated in negative ways.  It will always be in our best interest to avoid unnecessarily ostracizing other people.  See if you can identify cognitive biases in yourself, make note of which ones tend to cause you the most trouble, and see if you can find ways to correct them.  As you’ll soon see, the first step in combating cognitive biases is to identify them, and I hope that these examples can help you do that.

How to Improve

This prompts the question:  How can we avoid cognitive biases?  There are many sources out there that provide insight into how you can prevent (or at least diminish) cognitive biases from affecting your life.  We won’t be able to go over all of them today, and even if we could, how those practices apply varies from person to person.  Instead, I’ll cover a few principles that I think can help mitigate the effects of cognitive biases in our lives.  Hopefully that will give you a sense of what you can do to reduce their influence, at least a little bit.

Right off the bat, it’s helpful to slow down the decision-making process when possible.  As mentioned before, heuristics are basically “mental shortcuts” that help us make decisions quickly without spending too much time and energy thinking about them.  However, if we don’t need to make decisions quickly, then there’s no reason to take the shortcut.  Take some time to do research, and to “study it out in your mind” (23).  By taking more time, you’ll be more likely to avoid impulsive decisions.  If you’re able to, allow your intuition to be supported by other epistemic sources such as reason.

So, what should we do while we’re taking this “time out,” so to speak?  Before we can answer that, let’s take a step back for a moment.  We know that cognitive biases come from heuristics, which help us make decisions quickly.  It stands to reason, then, that in order to limit the cognitive biases we’re employing, we have to first recognize which heuristics are at play.  Asking yourself questions like “Am I focusing on some information at the expense of other information?” or “Is there some kind of information that I’m missing?” can be helpful when analyzing which biases are at work.  By reviewing the presuppositions we hold, it’s easier to see how our intuition may be helping us or hurting us.

It’s also important to look at the issue from different perspectives.  Seeking further information, as well as receiving feedback, may help reduce the effects of cognitive biases (24).  When you find something that confuses you, don’t just take the critics’ word for it; look at what FAIR has to offer, or consult other experts in different areas.  Not only is there likely to be information you didn’t know (thus helping your intuition recognize true patterns and adjust to dominant information), but it can also be valuable to analyze issues alongside people with different perspectives (25).  It becomes easier to verify your information as well, so this practice of looking at what other people have to say can be useful in fact-checking your data, which will always serve critical thinkers well.


In conclusion, heuristics are widely versatile tools that can help us solve problems in powerful ways.  However, just like any tool, they can be misused, and they can sometimes get in the way of our ability to arrive at correct conclusions and solve problems.  Cognitive biases come in many shapes and sizes, but they don’t have to define our thought processes.  There are good ways to keep these mental shortcuts from disrupting our analyses.  It’s imperative to recognize that cognitive biases will never truly disappear.  They will always be a part of us, and no one is impervious to their influence (26).  Again, heuristics are important; even if we could ignore them, we don’t have the time or energy to function well in this modern world without them.  Instead, it serves us to be patient with ourselves as we sort out the good from the bad.  As we analyze where our intuition can sometimes lead us astray, it will become easier for us as critical thinkers to solve problems and avoid common pitfalls that inhibit good reasoning.  In short, learning how (and how not) to use heuristics can help us improve how we learn, how we use our agency, and eventually help us become the kind of thinkers and believers I think God wants us to be.


  6. Gigerenzer, G., & Gaissmaier, W. (2011). Heuristic decision making. Annual review of psychology, 62, 451–482.
  7. Korteling, J. E., Brouwer, A. M., & Toet, A. (2018). A Neural Network Framework for Cognitive Bias. Frontiers in psychology, 9, 1561.
  9. A more charitable view of this cognitive bias can be found here.
  13. Not that things like “happiness” are a contest.
  15. Nickerson, R. S. (1998a). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220. 
  16. Robinson, J. D. (2016). Using blind reviews to address biases in medical malpractice. Blinding as a Solution to Bias, 181–193.  
  19. Huang, S., Stanley, M. L., & De Brigard, F. (2020). The phenomenology of remembering our moral transgressions. Memory & Cognition, 48, 277–286. This is especially true for moral failures.
  20. Fiske, S. T., & Tablante, C. B. (2015). Stereotyping: Processes and content. In M. Mikulincer, P. R. Shaver, E. Borgida, & J. A. Bargh (Eds.), APA handbook of personality and social psychology, Vol. 1. Attitudes and social cognition (pp. 457–507). American Psychological Association.
  21. Derek Westra’s talk “Whistleblowers in the Last Days” goes over how the media portrays stereotypes, and sometimes falsehoods, about members of the church.  It’s accessible online for free (you’ll need to log in first).
  22. Abbink, K., & Harris, D. (2019). In-group favouritism and out-group discrimination in naturally occurring groups. PloS one, 14(9), e0221616.
  23. D&C 9:8
  24. Bojke, L., Soares, M., Claxton, K., et al. (2021). Developing a reference protocol for structured expert elicitation in health-care decision-making: a mixed-methods study. Southampton (UK): NIHR Journals Library. (Health Technology Assessment, No. 25.37.) Chapter 6, Reviewing the evidence: heuristics and biases.

Zachary Wright was born in American Fork, UT.  He served his mission speaking Spanish in North Carolina and the Dominican Republic.  He currently attends BYU studying psychology, but loves writing and studying LDS theology and history.  His biggest desire is to help bring people closer to each other, and ultimately closer to God.

The post By Study and Faith – Episode 6: Cognitive Biases appeared first on FAIR.
