
When I first started studying psychology and apologetics, I thought people were rational beings. I quickly discovered that we are not as rational as we think, but it took years of studying bias and of interacting with people before I realized just how far from rational we are. Here's the thing though: we can be rational, but when it comes to religion, politics, or any other emotionally charged topic, being rational takes a lot of hard work. We need the patience to withhold premature judgments, the courage to confront our emotions and challenge the standard views of our social groups, and the humility to admit we might be wrong or ignorant.
What's unique about this list compared to others you might find on the internet is that my illustrations present each factor in the context of apologetics, and I've cross-referenced each factor with related factors and fallacies.
*Image caption: See egocentric bias.*
Please let me know if you think others should be added or if something is unclear. This list is meant to be a reference for myself and anyone else who wants to use it. New terms are in blue.
- Related to liking, first impressions, intuition
- Same or nearly the same as intuitive cognitive style
- Related to an appeal to emotions
- Related to reactance (opposite)
- Same or nearly the same as arbitrary coherence.
- Related to framing and priming.
Apophenia - the tendency to see patterns, meaning, or connections in randomness. Essentially, this is seeing shapes in clouds or finding hidden codes in the Bible. In apologetics, believers are accused of this when they claim there is design in the universe. However, the same critique can be aimed at evolution so both sides need to make a case that they are not falling victim to this bias.
- Same or nearly the same as hyperactive agency detection device (HADD), agenticity, patternicity, the clustering illusion, hot-hand fallacy, Texas sharpshooter fallacy, and pareidolia.
- Related to the false cause fallacy (aka causal fallacy), gambler's fallacy
Arbitrary Coherence - the tendency to form a coherent view or argument based on an arbitrary value. Once an arbitrary value is accepted, people tend to act coherently based on that value.
- Same or nearly the same as the anchoring effect.
- Related to framing and priming.
Attribution Errors - An attribution error is any time we blame or credit one thing when it was really a different thing. Usually this happens because several factors go together and we simply choose the one that is most apparent to us. This is extremely common in our world. When Christians are unloving, people often attribute this to Christianity rather than the person's personality, culture, a specific brand of Christianity, potential misunderstandings, their own defense mechanisms, or several other potential factors.
- Same or nearly the same as fundamental attribution error, placebo effect, classical conditioning
- Related to the availability heuristic, the representativeness heuristic
- Same or nearly the same as base-rate fallacy.
- Related to the false-consensus effect, cherry-picking (fallacy), selective attention, selective perception
Backfire Effect - When a person moves farther away from a view after hearing an argument for it. This is probably the best explanation for why neither person usually changes their mind when debating religion, politics, or other heated topics.
- Same or nearly the same as belief perseverance and group polarization.
- Related to belief bias, confirmation bias, reactance.
Bandwagon Effect - The tendency to prefer popular options. People might be hesitant to become a committed Christian because they don't see many other people who are.
- Same or nearly the same as an appeal to the majority.
- Related to the bystander effect, false-consensus effect, individualism (contrasts), mere exposure, reactance (contrasts), optimal distinctiveness.
Base Rate Fallacy - The tendency to ignore the base (average) probability of something occurring in favor of new or readily available information. An example of this is when someone points to a mutation as evidence for evolution but neglects the extremely low average rate of beneficial mutations, especially beneficial mutations that insert new information into the genome (see the sketch after this entry).
- Same or nearly the same as the availability heuristic
- Related to the cherry-picking (fallacy), hot-hand fallacy, regression to the mean, representativeness heuristic.
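To make the underlying arithmetic concrete, here is a minimal Bayes' rule sketch in Python of how ignoring a low base rate inflates confidence. The numbers are hypothetical, chosen only to make the effect visible; they are not taken from any study.

```python
# Minimal illustration of base-rate neglect via Bayes' rule.
# All numbers are hypothetical and chosen only to make the effect visible.

base_rate = 0.01      # P(condition): the low base rate we tend to ignore
hit_rate = 0.90       # P(evidence | condition)
false_alarm = 0.09    # P(evidence | no condition)

# Law of total probability: overall chance of seeing the evidence
p_evidence = hit_rate * base_rate + false_alarm * (1 - base_rate)

# Bayes' rule: chance the condition holds, given the evidence
posterior = (hit_rate * base_rate) / p_evidence

print(f"P(condition | evidence) = {posterior:.2f}")  # ~0.09, far below the 0.90 hit rate
```

Even with a 90% hit rate, the low base rate drags the real probability down to about 9%. Focusing on the vivid new evidence while ignoring the base rate is exactly the error this entry describes.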
Belief Bias - The tendency to evaluate arguments based on what we already believe rather than on the strength of the premises. In other words, to rationalize, ignore, or misunderstand arguments that would disprove what we already believe. If we believe a conclusion, we tend to deny any premises that do not support that conclusion without carefully considering them.
- Same or nearly the same as rationalization.
- Related to affirming the consequent (fallacy), belief perseverance, confirmation bias, and straw-man fallacy.
Belief Perseverance - The tendency to maintain a belief even in the face of contrary evidence.
- Same or nearly the same as the backfire effect and group polarization.
- Related to belief bias, confirmation bias
Blindspot Bias or Bias Blindspot – The tendency for people to see themselves as less susceptible to biases than other people. This one is more likely to affect people with high IQ or education. Anecdotally, I've noticed that people who have converted or deconverted as an adult tend to be guilty of this by thinking they have transcended bias and no longer need to reevaluate their views.
- Same or nearly the same as "I'm not biased" bias, self-serving bias.
- Related to belief bias, belief perseverance, confirmation bias, intelligence
Bystander Effect - The tendency to not respond to a situation when there is a crowd of people also not responding. When we see a car on the side of the road, we don't stop to help because nobody else is stopping to help. This effect happens because we don't want to stand out, rationalize that we might not be needed, or we actually think we might be wrong and everyone else is right. In apologetics and theology, this is when we see other people accepting sinful behaviors so we don't try to stop it (if and when we are in the proper role to do so).
- Same or nearly the same as an appeal to the majority (fallacy) or bandwagon fallacy.
- Related to the availability heuristic, false-consensus effect, normalization, the spotlight effect, systematic desensitization.
Classical Conditioning - This is the type of conditioning where a neutral stimulus is paired with a positive or negative stimulus and our brains automatically link the two together so that we come to expect they will go together. This has actually been used to develop superstitions in pigeons (and I think other animals) and is the likely mechanism for sexual fetishes. The key is that this is an automatic process; however, it might show itself through conscious means. Someone with a legitimate superstition probably has very convincing reasons to explain how it works even though everyone else recognizes there is no connection between the superstition and the outcome.
- Same or nearly the same as attribution errors
- Related to superstition, fetishes, misperceptions
- Same or nearly the same as
- Related to
Cognitive Ease - We are more likely to accept something or like it if it is easy to process. This includes the content, the way the content is presented, and the medium used to present it. Using a clear font when writing, speaking loud enough for people to hear, high-resolution video, and simplifying a complex concept are just a few ways to take advantage of this bias.
- Same or nearly the same as disfluency (opposite)
- Related to mere exposure, optimal distinctiveness
Cognitive Dissonance - The uncomfortable feeling we get when we have inconsistent beliefs or when our actions do not align with our beliefs. When our actions and beliefs are inconsistent, we usually change our beliefs to align with our actions, because our actions are more observable and we don't want people to recognize our hypocrisy. This can be used in apologetics to show that a person's moral concerns (environmentalism, politics, human rights, etc.) do not align with their beliefs because there is no objective morality without God.
- Same or nearly the same as
- Related to
Commitment Bias - The tendency to stick with what we're already doing or already believe even when new evidence suggests we should change. Anyone with a firm commitment to their current beliefs about God is susceptible to this.
- Same or nearly the same as escalation of commitment, hasty generalization (fallacy), premature cognitive commitment, sunk cost fallacy.
- Related to appeal to authority, backfire effect, foot-in-the-door technique, self-herding, status-quo bias
Compensatory Control - When we lose control in one situation or domain, we try to compensate by gaining control in another area. During an election year when there is political uncertainty, religious people tend to view God as being more in control than during non-election years. We gain compensatory control through work, routine, parenting, and many other domains.
- Same or nearly the same as
Confirmation Bias - This has become a fairly broad term to describe any bias, action, or thing that helps us confirm what we already believe. It can take the form of looking only for confirmatory evidence (as opposed to evidence that could potentially disprove our view), forgetting or ignoring evidence that doesn't support our view, or interpreting evidence in a twisted way to fit our view.
- Same or nearly the same as belief bias, belief perseverance, myside bias
- Related to the availability heuristic, backfire effect, cherry-picking (fallacy), and pretty much everything else.
Conformity Bias - This is the tendency we have to conform to what other people think, say, or do. We are social creatures and most people don't like going against the crowd. People may conform so that they do not appear to be different or because they might doubt themselves and think that others are right. In situations where there is strong pressure to conform, just one other person who dissents is enough to strengthen anyone else who might be doubting the group. So if you are in a setting where people are bashing Christianity, you can be the voice of dissent (respectfully) which can give strength to others who might have the same convictions but lack the confidence to say something. Likewise, if you present a good case to several people that Christianity is true, but one non-believer objects (even if the objections are really bad), that will likely strengthen others in their unbelief.
- Same or nearly the same as an appeal to the majority, bystander effect, herding
- Related to agreeableness, an appeal to authority, liking, in-group/out-group bias, social proof, groupthink, reactance (opposite)
Context - We all recognize when a Bible verse is taken out of context, but we often don't realize that context shapes our every thought. Our culture, the weather, the people we're with, the song on the radio, how much money we make, and everything else you can think of is a part of the context that shapes our thoughts. The effect can be so strong that even some optical illusions are culturally dependent. Obviously, not everything has an effect at all times, and not always to the same degree, but there's always the chance that it might. For instance, if you just listened to an annoying song on the radio, you might be more likely to reject what I say here than had you listened to nothing or a song you like. The more we recognize the potential that contextual factors can influence our thoughts about something, the more open we should be to other views and the better we should become at discerning truth.
- Same or nearly the same as
- Related to conformity, authority, in-group/out-group bias, social proof, liking, empathy.
Contrast Effect - The tendency to judge something in comparison to something that came immediately before it. If you give an argument or a presentation after someone else, the quality of what you say will be judged in comparison to the person who spoke before you. This can help and hurt in apologetics depending on the person who went before you. This can apply to the quality of your videos, the design of your website, interviews, in-person or online conversations, and just about anywhere else.
- Same or nearly the same as
- Related to anchoring, arbitrary coherence
- Related to superstition, classical conditioning, operant conditioning, confirmation bias, belief perseverance, data fishing
![]() |
For more weird and spurious correlations, go to https://www.tylervigen.com/spurious-correlations |
Correspondence Bias - This is very close to the fundamental attribution error (FAE). In FAE, we have a tendency to attribute situational factors to the person (thinking the guy who cut me off is a jerk rather than avoiding an object in the road), whereas correspondence bias is when, even though a situational constraint is known, we still attribute the behavior to the person. The simplest example: if I am assigned a view to defend in a debate competition, people will likely think I actually hold the view I am defending even though it was randomly assigned. I see this in religious dialogue when someone corrects a bad argument for a view both people hold, and others then assume the person correcting the argument holds the opposing view.
- Same or nearly the same as fundamental attribution error (FAE)
- Related to correlation/causation, false cause fallacy, attribution errors, reactance, framing, hasty generalization (fallacy), self-serving bias
Decision Fatigue - As we make more and more choices throughout the day, we become more mentally fatigued and less willing to put in the cognitive effort to make careful decisions. Whether this effect exists is highly debated; a recent paper suggests it does exist, just not as broadly as originally thought. In apologetics, this may come into play if you ask too many hard questions of someone. They may get tired of answering and stop trying, in which case they may leave, resort to name-calling, or answer without thinking (see other biases).
- Same or nearly the same as ego depletion
- Related to
Decoy Effect - When there are two competitive options, the decoy option is like one of them but less desirable, making the one it resembles seem best. For instance, if I am selling you a burger with fries for $5 and a chicken sandwich with fries for $5, I can add a decoy to make one sell better than the other. If I have a bunch of burgers about to go bad, I can add the option of a burger alone for $4.50, making the burger with fries seem most desirable. I suspect this might be part of why some people are spiritual but not religious. They are essentially choosing between atheism, religion without rules, and religion with rules, and for many people, organized religion serves as the decoy that nudges them toward spiritualism rather than atheism. I should note that this is just a hypothesis of mine, or a potential application of this effect.
- Same or nearly the same as
- Related to anchoring, dilution effect
- Related to liking, decoy effect, appeal to authority, confirmation bias, belief perseverance, Forer effect
- Related to the affect heuristic, cognitive style, appeal to emotions.
- Same or nearly the same as identifiable victim effect
- Related to vividness, vagueness
Dunning-Kruger Effect - My favorite bias because I think it explains so much of the world. This is the tendency for people with minimal knowledge or experience in an area to have extremely high confidence in their ability in that area. As they gain genuine expertise, their confidence dips before starting to climb again. This is apologetics in a nutshell: almost everyone thinks they are an expert on religion and science, so when you try to have an apologetics conversation, they are unwilling to listen or consider what is said because they view themselves as the expert. The original paper for this is called "Unskilled and Unaware of It."
- Same or nearly the same as
- Related to humility (opposite), straw-man (fallacy)

- Related to
- Related to self-serving bias, the false-consensus effect, the Dunning-Kruger effect, pride, confirmation bias, hyperactive agency detection device (HADD)
- Same or nearly the same as mere ownership effect.
- Related to commitment bias
False Consensus Effect - The tendency to think something is more common or widely shared than it really is. The classic example is premarital sex in high school. People, especially high school students, think "everyone is doing it," but research shows that more than half of high school students are still virgins when they graduate. The most prevalent example for apologetics is the tendency people have to think scientists or intellectuals are more atheistic than they really are.
- Same or nearly the same as
- Related to the availability heuristic, appeal to the majority (fallacy)
- Same or nearly the same as
- Related to Mandela effect, misinformation effect, mere exposure, vividness/vagueness, imagination inflation, confirmation bias, critical lure, deja vu, flashbulb memories
- Same or nearly the same as the mere exposure effect, the availability heuristic
- Related to optimal distinctiveness
- Same or nearly the same as
- Related to intuition, thinking style, liking, aesthetics
- Same or nearly the same as the availability heuristic, cherry-picking (fallacy)
- Related to affect heuristic, confirmation bias, red herring (fallacy)
Forer Effect - The tendency for people to accept very broad or generalized statements about their personality as being uniquely true of them, as opposed to recognizing they are largely true of most people. This basically explains the current trendiness of the enneagram, even though it is not scientifically valid. This may play a role in apologetics because people might be susceptible to viewing themselves in ways that could be beneficial or harmful for apologetics conversations. Priming people to view themselves as rational, careful thinkers, respectful people, and so on can help set up conversations so they are more effective.
- Same or nearly the same as the Barnum effect
- Related to the availability heuristic, confirmation bias, self-serving bias
Framing Effect - The way something is presented, or framed, can affect our conclusions about it. For instance, "99% effective" sounds better than "only fails 1% of the time." In one of my presentations, I show a clip from Brain Games where a cop asks witnesses how fast a car was going when it bumped/smashed into the other vehicle. By changing just one word, witnesses report drastically different speeds. In apologetics, our words matter. When we frame another worldview as ridiculous, those who agree with us and some in the middle will likely find it very convincing; however, unbelievers will feel as though we're not honestly representing their view and disregard what we say. Another example is how we present Christianity. Do we present it in a positive light so people want to follow it, or are we simply known for all the things we're against? People are more prone to accept something, or at least listen, when it is presented in a mostly positive way (not to say you can't or shouldn't mention the struggles of being a Christian).
- Same or nearly the same as
- Related to affect heuristic, appeal to emotions (fallacy), arbitrary coherence, fundamental attribution error (FAE)
Functional Fixedness - This is the tendency for people to view something only in terms of its intended purpose, preventing us from seeing alternative uses. The ability to break this pattern is what made MacGyver popular. In other words, this bias is thinking inside the box, so the antidote (as if it's just that easy) is to think outside the box. In apologetics, I find people sometimes have a fixed view of what Christianity is or what their identity is ("I'm a doctor and doctors aren't religious") which prevents them from seeing Jesus. If you notice this might be an issue, it's easy to overcome as long as you don't point it out in a condescending way.
- Same or nearly the same as
- Related to the availability heuristic, creativity, confirmation bias
Fundamental Attribution Error (FAE) - This is when we make an error in attributing something to someone or something. Usually, it's used to refer to blaming people (personal attribution) for things that were not within their control, or not completely within their control, and then associating the act with their character. If you cut someone off in traffic, even if it was an emergency or you unknowingly swerved into their lane, they will likely blame you for it, and if you have a Jesus sticker on your car, they'll pass that judgment on to Him. Similarly, if you make a mistake about a fact (or they think you made a mistake), they will attribute that to your character and probably your intelligence. This is why it's extremely important to be careful with our words, fact-check everything, and speak to others with grace.
- Same or nearly the same as correspondence bias
- Related to false cause fallacy (aka causal fallacy), confirmation bias, hasty generalization (fallacy), self-serving bias
- Same or nearly the same as the backfire effect
- Related to affect heuristic, commitment bias, conformity, groupthink, ingroup/outgroup bias, liking, obedience to authority
Groupthink - When groups have a strong desire to conform or be unified, they have a tendency to accept ideas too quickly and without critical thought, leading to bad decisions. This can also happen when the group leader or the environment punishes dissent. This differs from in-group bias or group polarization in that it stems from a desire within the group to get along and an unwillingness to risk the consequences of dissent (notice how I didn't name specific errors I think many Christians make theologically 😉). This is hard to get around on social media because if someone does speak out against their side, they are criticized by both sides. If others see someone do it, they often don't want to stick their neck out in support, so they might passively like something or just refuse to comment on it.
- Same or nearly the same as conformity
- Related to commitment bias, false-consensus effect, hasty generalization (fallacy), in-group/out-group bias
Halo Effect - This is the tendency for us to globalize a positive attribute of a person from one domain to another. For instance, if someone is physically attractive (or any other noticeable positive attribute), we're more likely to rate them as more intelligent, more competent, more trustworthy, and so on. While this effect solely focuses on positive attributes, the same applies to negatives so that if a negative attribute stands out to someone, they are more likely to apply that to us in other domains too. This is why it's extremely important in apologetics and evangelism to make good first impressions with people, to speak respectfully, dress and look respectable (not to be confused with being vain), be kind, and so on. People are much more likely to listen to apologetic arguments or the gospel if something about us (or many things) stands out as being very positive.
- Same or nearly the same as
- Related to affect bias, availability heuristic, Dunning-Kruger effect, first impressions, hasty generalization (fallacy), liking, representativeness heuristic
Hawthorne Effect - This is when people change their behavior when they're aware of being watched or think they're being watched. It's obvious that this happens, at least to some degree, but people sometimes underestimate how big the effect is and how easy it is to invoke it. Some studies have found that simply putting a picture of a face in certain places can get people to act better, although the effectiveness of this small of an intervention is debated. This is seen in public conversations such as on social media or other public venues because people are more likely to stick with their group's views on a topic rather than seriously consider other views for fear of being condemned by their group.
- Same or nearly the same as
- Related to group polarization, ingroup/outgroup bias, self-serving bias
Hedonic Treadmill/Adaptation - The tendency for people to adapt to things that are enjoyable so that they become the new expected standard. For instance, if you won the lottery, you would be ecstatic, but you would slowly return to your previous level of happiness, and worse, expect your quality of life to always remain the same, so that you would be disappointed getting less than you had previously. If you buy a nice car, you will get used to its comforts and advantages so that when it comes time for a new car, you will expect the same or better, even if you don't really need all the luxuries. The application here is more theological than apologetic. When we become accustomed to the comforts of American life, we tend to be calloused toward people around the world who aren't so well-off, and we become unwilling to make sacrifices in our lives for them. This affects how people view us when we do apologetics, but also the amount of time we spend studying or doing apologetics (or Bible study, prayer, etc.). When we become accustomed to Netflix xx hours per week, it's hard to give that up so that we can read, do evangelism, serve the poor, etc. We then rationalize that we somehow deserve such rest because we work so hard at other times (we should indeed rest, but we don't need nearly as much as the American lifestyle affords us).
- Same or nearly the same as
- Related to anchoring, halo effect, identifiable victim bias, foot-in-the-door technique, routinization, normalization, desensitization.
Herding - The tendency to follow the crowd as if we are a herd. This is a very useful heuristic, especially in unfamiliar places (e.g. traveling to a new country), but it can often lead to false conclusions. Many people have false views about Christianity because of this. They get their theology from popular media sources, leading them to think Christianity is intellectually bankrupt and faith is blind, and so they just go along with the crowd. This especially relates to sexuality and gender.
- Same or nearly the same as the appeal to the majority, bandwagon fallacy, false-consensus effect
- Related to in-group/out-group bias, self-herding
Hindsight Bias - The tendency to look at events from the past as having been obviously predictable. In other words, we look at past things with blinders on due to changes in culture or increased knowledge. We look at slavery as wrong today, and rightfully so, but because that judgment is so culturally ingrained in us, skeptics sometimes have a hard time understanding the slavery discussed in the Bible.
- Same or nearly the same as
- Related to affect bias, availability heuristic
Hot-Hand Fallacy - This is often discussed in terms of basketball, which is how it was discovered. Researchers found that when a player is perceived to be on a hot streak, observers think that player is more likely to make the next shot; however, the data shows this is not the case. The relevance of this to apologetics is that our immediate intuitions are not always correct. When looking at rates of abortion, effects of gun control, crime and religiosity, and so on, we can't just cite a statistic and give a simple explanation (this goes for people on all sides). Sometimes the obvious conclusion is correct, but we need to look around for other data and the best explanation.
- Same or nearly the same as apophenia, regression to the mean.
- Related to blindspot bias
- Related to drop-in-the-bucket effect, vividness, vagueness
Ikea Effect - When we place more value or importance on something that we build. In apologetics, if you tell someone the evidence for Christianity and give them the answer, they are likely to feel like your answer is not as good as theirs because they did not come up with it, and therefore, they will resist you. A better approach might be to tell people of certain facts or create hypotheticals based on the facts and then ask them to come to a conclusion based on those facts.
- Same or nearly the same as the endowment effect, mere ownership effect, not-invented-here (NIH) effect
- Related to the genetic fallacy
The Illusion of Control - The tendency to overestimate our ability to control things. This affects Christians who might think they have more control over another person's beliefs than they actually do. It also seems to be part of the equation for people who push for government laws restricting behaviors that do not align with Christianity, on the assumption that such laws will be more effective than they actually are. This is not to say there isn't a place for laws restricting certain behaviors; it's the overestimation of their effectiveness that is the bias.
- Same or nearly the same as
- Related to apophenia, compensatory control, the gambler's fallacy, superstition
Illusory Truth Effect - The tendency to believe false information is true after hearing it over and over again. Essentially, it's not the correctness that is remembered, but the content, so when people recall it, they remember it as true, unless of course they explicitly recognized it as false and argued against it. The main takeaway for apologetics is that people are likely to reject apologetic arguments when they're new to them. This means we don't need to be pushy or overbearing with people. We can and should take the long view and give them a little something to chew on time and time again. The goal isn't to get them to believe false information, but to help remove an emotional barrier to something that seems new and strange.
- Same or nearly the same as mere exposure effect
- Related to the appeal to the majority, the false-consensus effect
Imagination Inflation - When we imagine something, we are more likely to remember that thing as something that actually happened. This is a legitimate objection atheists raise against the resurrection, and it needs to be seriously dealt with by believers. However, the mechanisms of this effect are not nearly powerful enough to explain the resurrection, because it would need to be argued that the resurrection was first correctly remembered as not happening, then imagined by someone (presumably multiple people), that this person or group didn't discuss the resurrection idea they concocted for a long period of time, and that when they finally did recall it, they remembered what they imagined rather than what really happened.
- Same or nearly the same as
- Related to false memories
Inattentional Blindness or Selective Attention - Strictly speaking, this is more of a perceptual error: when we are so fixated on one thing, we miss surrounding cues. If you've ever seen the gorilla basketball video, that's an example of this. However, the same general thing occurs when we are so sure we are correct about something that we just blatantly miss or don't remember things that oppose our view. This is likely why atheists so often use incorrect definitions of faith or repeat the same misunderstandings about the kalam (e.g. who created God) even after they've been corrected. The correction just doesn't register with them because they're so sure they're right. On the other hand, I see apologists get so fixated on pedantic details of an objection to Christianity that they lose the whole point of what the other person was saying.
- Same or nearly the same as
- Related to the availability heuristic, belief bias, confirmation bias
- Same or nearly the same as reactance
- Related to reactance, conformity, collectivism
- Related to the time perspective, money, backfire effect
In-group/Out-group Biases - These are two different biases, but they're often used to describe the same thing and used interchangeably. The reason is that they're just two sides of the same coin: we are biased in favor of our own group and against other groups. When someone from our own group does something good, we apply it generally to the group as the norm and claim it as an example of the individual's character or competence. When someone from the in-group does something bad, we rationalize it or blame it on the individual. The opposite happens with the out-group. When a member of the out-group does something good, we apply it only to the individual as an exception to the norm, or we try to explain it away as a product of the situation or as not all that good after all. When someone from the out-group does something bad, we apply it to the whole group and view it as the norm. Apologetics is a quintessential example of in-group/out-group behaviors. People on all sides are guilty of jumping on the bandwagon of bad arguments and rationalizing. About the only thing you can do is be aware of this bias so you can try to avoid falling into it yourself, and work on building relationships with people in the out-groups so they don't view you as an adversary.
- Same or nearly the same as self-serving bias (but applied to groups)
- Related to the appeal to the majority (fallacy), the bandwagon effect, fundamental attribution error, liking
- Related to belief bias, confirmation bias, personality, stereotyping, pattern recognition or apophenia
- Related to compensatory control, the illusion of control, HADD
- Same or nearly the same as
- Related to context, egocentric bias, and time perspective
- Same or nearly the same as
- Related to affect bias, the halo effect, ingroup/outgroup biases
- Related to depression, affect heuristic, appeal to emotion, liking
- Same or nearly the same as negativity bias
- Related to the affect heuristic, availability heuristic, belief bias, confirmation bias, endowment effect, mere ownership effect, sunk cost fallacy, and status quo bias
- Related to survivor bias, egocentric bias, self-serving bias, in-group/out-group bias, attribution errors, just world beliefs, HADD, superstition, false cause fallacy, correlation/causation
- Same or nearly the same as DRM procedure, false memories
- Related to imagination inflation, egocentric bias
Mere Exposure - This is the tendency to be more willing to accept things that we've been exposed to before. In other words, new things (like evidence for Christianity) are strange to us and seem unlikely to be true so we reject them. Don't expect people, even other Christians, to accept the arguments for Christianity the first time they hear them. They'll likely need several exposures to the idea of rational faith and evidence before they'll be open to accepting it.
- Same or nearly the same as
- Related to optimal distinctiveness
Mere Ownership Effect - The tendency to overvalue items that we own. This is studied with physical objects by seeing how much people will buy and sell things for, but there's no reason the same effect doesn't apply to things like worldviews. This is likely one of the many reasons it's hard for people to change their beliefs, even on small topics.
- Same or nearly the same as the endowment effect
- Related to Ikea effect, not-invented-here (NIH) effect, enactment effect
Misinformation Effect - This is when information received after an event affects our memory of the event. The classic example of this effect is from a 1974 study that showed people a film of a car crash. Participants were asked how fast the car was going when it either collided, bumped, contacted, hit, or smashed into the other vehicle. This change of a single word affected their estimates of the car's speed, and one week later, those in the smashed condition were more likely to say they saw broken glass. This is a potential argument against the resurrection; however, this effect cannot explain something as big as a person rising from the dead, or the fact that the NT authors witnessed several other miracles and did miracles themselves.
- Same or nearly the same as
- Related to DRM procedure, false memories, imagination inflation, Mandela effect
- Related to rationalization, cognitive dissonance
- Same or nearly the same as confirmation bias
- Related to belief bias, belief perseverance
- Same or nearly the same as mere exposure effect, systematic desensitization, habituation
Not-Invented-Here (NIH) Effect - This is when we reject an idea or undervalue something because we did not come up with it. Simply put, people don't like to be wrong and don't like it when someone else knows more than them. A great way to get people to think they've come up with an idea of their own is to ask leading questions, preferably about things they likely haven't thought about before. This is different than trying to ask questions that trap a person in a corner, but with this effect, you want to use less direct questions and don't answer the question yourself (or point out how their views are contradictory). This can also be used with the mere exposure effect because once people are aware of an idea, even if they can't name it, their brain may unconsciously stumble upon it through repeated conversations.
- Same or nearly the same as the Ikea effect
Obedience - People are generally more obedient to authority figures than they would be otherwise. There are certainly many exceptions to this and other factors to consider so this one can be hard to use in practical situations. The best way to use this is to understand who you're talking to and present yourself in a way that they would consider an authority. In most cases, this means dressing and speaking professionally, but with some people, it might mean presenting yourself as somewhat of a rebel. Additionally, citing people or institutions the other person considers authoritative can also be effective. If the person is an atheist, try to cite what other atheists have said that supports what you are trying to say.
Openness (Personality) - Openness to experience is one of the five major personality factors that affect people and it plays an important role in decision making. People who are high in openness are likely to be more liberal and more likely to consider and accept alternative viewpoints, but they may be too comfortable with contradictions or ambiguity and unwilling to commit to a view. People low in openness are unlikely to even consider new evidence presented to them in the first place, and even if you can get them to consider it, actually changing views will be extremely hard for them.
- Related to confirmation bias, belief perseverance
- Related to classical conditioning, in-group/out-group bias, liking, negativity bias
Placebo Effect - This is when something affects us simply because we believe it will affect us. The obvious example is drug testing: when sugar pills (placebos) are given as medication, the sugar pills actually improve people's health; however, this effect is much broader than just medicine. Believers seem to be especially susceptible to this effect. The placebo effect can largely explain why things like amber beads, essential oils, and other homeopathic remedies seem effective (obviously with rare exceptions where clinical trials have shown them to be effective). The importance of understanding the placebo effect for apologetics is mostly for recognizing how non-believers might view us when we mistake the placebo effect for a real effect. If I am talking to a thoughtful atheist and I tell her that I was healed by essential oils, she will likely think it was really just the placebo effect and that I am too biased or ignorant to know better. The same is true for many claims people make about prayer. It's extremely hard to convince people to change their views on something, but it's even more difficult if they think you are unintelligent.
- Same or nearly the same as
- Same or nearly the same as self-serving bias
- Same or nearly the same as out-group bias
Priming - Priming is when our brains are prepared (primed) to think a certain way or about a certain topic. If you've ever done the trick where you ask someone to say "ten" 10 times and then ask them what soda cans are made out of, they are likely to say tin instead of aluminum because their brain is primed to think of things like ten, and since tin sounds like ten and is a metal like aluminum, it pops into their minds right away. The importance of this for apologetics is understanding how our environment, single words, or just about anything else can prime the person we are talking to, which will affect the way they engage our arguments. We can prime them to be argumentative and intuitive, which would not be helpful, or we can prime them to be open and thoughtful.
- Same or nearly the same as
- Same or nearly the same as
- Related to revenge,
- Related to backfire effect, forgiveness (opposite), reactance, retributive justice
*Image caption: Making people want to punch you is not a good persuasion tactic.*
- Related to authority, compliance, backfire effect, obedience, revenge, retributive justice
Regression to the Mean - This is a statistical phenomenon. Imagine you flip a coin a billion times. What will the result be for heads and tails? Likely it will be almost exactly 50%. However, if you look at a random stretch of flips within that billion, there might be instances of 10 or even 20 heads in a row (see the simulation sketch after this entry). Our lives are composed of billions of moments and opportunities for apparently rare things to happen, so we need to be careful not to draw generalized conclusions from these statistically inevitable but rare events. Believers are particularly guilty of this when they claim every coincidence in life is the active work of God. Non-believers may not know the term regression to the mean, but they'll still view you and your claims as nonsensical. I'm not saying you shouldn't recognize God in your life, but you should be aware of how others might view what you say.
- Same or nearly the same as apophenia, pareidolia
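Here is a minimal simulation sketch of the coin-flip point above (Python; one million flips rather than a billion, purely to keep the run fast): the overall share of heads lands very near 50% even though surprisingly long streaks appear inside the sequence.

```python
import random

random.seed(0)          # fixed seed so the run is reproducible
n = 1_000_000           # a million flips stands in for the "billion" above

flips = [random.random() < 0.5 for _ in range(n)]

heads_share = sum(flips) / n

# Find the longest run of consecutive heads in the sequence.
longest = current = 0
for is_heads in flips:
    current = current + 1 if is_heads else 0
    longest = max(longest, current)

print(f"share of heads:       {heads_share:.4f}")  # very close to 0.5000
print(f"longest heads streak: {longest}")          # typically around 20 for a million flips
```

Long streaks are statistically inevitable in a long enough sequence, which is exactly why a run of apparent coincidences is weak evidence on its own.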
Representativeness Heuristic - This is a mental shortcut that leads us to conclude that a specific example that matches our schema is more likely than a general example. Here's the classic illustration of this heuristic. Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations. Which is more probable? a.) Linda is a bank teller or b.) Linda is a bank teller and is active in the feminist movement. Many people say "b" because Linda matches their schema for a feminist, but "a" is more likely, since "a" can be correct without "b" being correct but not the other way around (see the sketch after this entry).
- Related to the base-rate fallacy, hasty generalization (fallacy), schemas, archetypes, prejudice, in-group/out-group bias
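The point behind the Linda problem can be put in one line of probability theory: whatever numbers you assume, P(A and B) can never exceed P(A). A tiny Python sketch, with made-up probabilities purely for illustration:

```python
# The conjunction rule: P(A and B) can never exceed P(A).
# The probabilities below are invented purely for illustration.

p_teller = 0.05                 # P(A): Linda is a bank teller
p_feminist_given_teller = 0.95  # P(B | A): even if this is nearly certain

p_both = p_teller * p_feminist_given_teller  # P(A and B)

assert p_both <= p_teller  # holds for any numbers we could pick
print(f"P(a) = {p_teller:.3f}, P(a and b) = {p_both:.3f}")  # option (a) is always at least as likely
```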
Routinization or Foot-in-the-door - This is when we agree to or accept something small, that small thing becomes our new standard or expectation, then something more is added and we go along with that as well, and things continue to escalate. This is a classic sales tactic, but it also explains how people might do things that seem morally disgusting, even to the person who did them. In apologetics, this technique can be used to build bridges and rapport with people for having conversations. Start by finding places where you agree and then move slowly from there. In most cases, people will not be open to hearing what you say if you give them too much at once, so using this technique can keep you from overwhelming the other person.
- Same or nearly the same as habituation, escalation of commitment
- Related to the hedonic treadmill, systematic desensitization, normalization, mere exposure effect, anchoring, door-in-the-face, commitment bias, obedience, self-herding, optimal distinctiveness
- Same or nearly the same as
- Related to unity and social proof (somewhat opposite), not-invented-here, loss aversion, in-group/out-group bias, optimal distinctiveness.
Schemas - If you've watched anything from Jordan Peterson you've probably heard him talk about Jung's idea of an archetype. This is very similar. It's a mental representation of something in an idealized form. This relates to decision-making because we have a very hard time understanding, remembering, and accepting things that don't fit into our schemas. For instance, if someone's schema for religious belief includes blind faith, they will likely have a very hard time understanding how evidence could even be applied to religious belief. They almost certainly won't accept any evidence provided at that point either. They need time to come to terms with the new information and adjust their schema first (unfortunately, many people won't adjust it and will just forget or ignore the new information).
- Same or nearly the same as cherry-picking (fallacy)
- Same or nearly the same as cherry-picking (fallacy)
- Related to the availability heuristic, belief bias, confirmation bias, expectancy bias, selective attention, identifiable victim effect
Self Herding - We tend to follow the crowd in many of our decisions, and the more like us the crowd is, the more likely we are to follow...and nobody is more like us than our own self. When we act in a particular way, we are likely to act that way in the future because we're merely following our own lead. This could be good or bad depending on the situation, but I'm guessing it's more likely to be bad. Let's say you think something is a sin, but you give in to temptation and do it anyway. Now that you've done it, the next time you are tempted, your past experience of doing it will increase the chances that you do it again, perhaps even more frequently. This will lead to cognitive dissonance, which often means the beliefs, not the actions, will change. Anecdotally, it seems like this process is a contributing factor in deconversion for a lot of people.
- Same or nearly the same as
- Related to foot-in-the-door, routinization, desensitization, habituation, hedonic adaptation, cognitive dissonance, optimal distinctiveness.
Self-Fulfilling Prophecy - When we view ourselves in a certain way, we are more likely to act in ways that align with that view. Likewise, when someone else has certain expectations of us, they are likely to treat us in ways that align with those expectations. In both cases, the results usually align with the expectations. In apologetics, if we treat someone like an enemy and combatant, they are likely to respond as an enemy would and get defensive rather than listen to the evidence. On the other hand, treating people as though they are a friend is much more likely to spawn a cooperative environment where the other person listens (which means you need to listen too).
- Same or nearly the same as The Pygmalion effect
- Related to expectancy bias, the Hawthorne effect, self-esteem, identity, in-group/out-group bias, liking, unity
Self-Serving Bias - It could be argued that every bias in reasoning is some version of this bias. We have a strong tendency to favor ourselves over others. We think we're less biased than others, more likely to have success in the future, more capable than others, and so on. At the same time, we're more likely to explain away bad things we do, or that happen to us, as due to circumstances rather than our own inability. When you engage someone with apologetic arguments, keep in mind that they may do or say some pretty extreme things in order to save face and avoid damaging their pride. At the same time, remember that you are likely to do the same. Hopefully, your knowledge of this can help inject some humility into the conversation.
- Same or nearly the same as
- Related to fundamental attribution error, in-group/out-group bias, confirmation bias, backfire effect, and just about everything else.
Sensory Adaptation or Desensitization - When we are bombarded with a strong stimulus, our minds adapt to it and we eventually filter it out and don't notice it unless someone mentions it or there is some change. The interesting thing is that even though we may not consciously recognize the stimulus, it can still influence our decisions. A disgusting smell or environment is likely to make people less receptive to accepting new views, although this is probably a pretty small effect. The more likely scenarios are the broader examples of this effect listed below as nearly the same. See those factors for more information.
- Same or nearly the same as hedonic adaptation (treadmill), systematic desensitization, habituation
- Related to
Sexual Arousal - It's probably not surprising that being sexually aroused affects our decision making, but what might be surprising is the extent to which it does (see Tables 2, 3, and 4 of this study). Thankfully, this doesn't have a huge application to apologetics (although I'm sure there are ways it can apply), but it's interesting to be aware of for understanding how our reasoning works.
- Same or nearly the same as
- Related to appealing to emotion
Sleep Deprivation - It's probably common sense that sleep deprivation can affect our reasoning pretty badly; what is not common sense is just how easily it can affect us and the wide variety of ways it does. Sleep debt is cumulative, so if you don't get enough sleep for several nights in a row (even if it's only a little sleep debt each night), the negative effects compound. When we're tired, we're more likely to dismiss things without careful thought, and even if we do think about something, it won't be the highest quality thinking, which is what is necessary for apologetics discussions. Check out my article on sleep for more information.
- Same or nearly the same as
- Related to fatigue, ego depletion, stress
- Same or nearly the same as
- Same or nearly the same as system justification
- Related to sunk cost fallacy, loss aversion, compensatory control, cognitive dissonance, the endowment effect


Stereotypes - It's no secret that we stereotype people, and sometimes we even hold what are considered positive stereotypes. Unfortunately, the stereotypes we hold about groups of people affect the way we evaluate their arguments. If we think a certain group of people is less intelligent, we will not take their arguments seriously and will be more likely to ignore them and be condescending, so the stereotyped person has to work extra hard to make their case. This often works against Christians, since they are often thought of as dumb; however, I see it play out the other way too, when educated Christians think people with less education are dumb. A positive stereotype, like stereotyping a certain group of people as kind, also has negative effects on how their evidence is evaluated. If our schema for that person's group is kind, then we tend to think their decisions are more emotional than rational, and so we discount them. If they act differently than the stereotype, then we also tend to think they're less intellectually competent because now they're just trying to be different and are overly attached to doing so. At the same time, a group of people we stereotype as highly intelligent will be more influential even if their arguments are worse.
- Related to mere exposure effect, discrimination
- Same or nearly the same as
- Related to regression to the mean, false cause fallacy, egocentric bias, luck/coincidence
Terror Management Theory (Mortality Salience) - Terror management theory (TMT) says that we fear death, and as a result, all of our decisions and actions are attempts to avoid death (or thinking about it). Mortality salience is just the thought of death so that when mortality salience is high, we are likely to be more affected by thoughts of death than when it is low. In most cases, thoughts of death cause us to stick closer to our in-group and be more narrow-minded, which is not what we want for apologetics. However, it can work in weird ways sometimes because talking to someone about their impending death can also get them to try to prevent it, making them more open to the idea of eternal life through Jesus.
- Same or nearly the same as
- Related to time perspective, affect bias
- Same or nearly the same as intuition, appeal to emotion
- Related to priming, IQ, personality
- Same or nearly the same as
- Related to context, hindsight bias, memory, negativity and positive biases, egocentric bias
- Related to authority, credibility, confirmation bias
Vividness & Vagueness - These two adjectives are opposites, but they often work in similar ways. If something is vivid and detailed, we're more likely to assume it's true. However, with our own beliefs, we often have very vague notions of how they fit together and we don't recognize inconsistencies. We also have a tendency to attribute profoundness to vagueness even when it might be complete rubbish (see this scientific article on pseudo-profound bulls**t).
- Related to flashbulb memories, identifiable victim effect
Von Restorff Effect - Also known as the isolation effect, this describes our tendency to notice and better remember things that are different or stand out in some way. This seems obvious, but the implications may not be. If you're having a conversation and you say one wrong thing and 100 good things, the one wrong thing stands out and is what the other person will likely remember most. This is why a single good line can win a debate (the Reagan/Mondale debate is the perfect example).
- Related to the availability heuristic, cherry-picking (fallacy), confirmation bias
- Same or nearly the same as
- Related to psychological distance, affect bias
Recommended Resources
Good list of biases with much more in-depth descriptions
The Decision Lab and Short Cuts
Books to help further understand how these factors affect our decisions.
Thinking, Fast and Slow and Noise: A Flaw in Human Judgment by Daniel Kahneman
Predictably Irrational by Dan Ariely
The Righteous Mind by Jonathan Haidt
Books to help overcome psychological barriers when doing apologetics and evangelism.
Influence: The Psychology of Persuasion and Pre-Suasion by Robert Cialdini
How to Win Friends and Influence People by Dale Carnegie
And just for good measure, here's a satirical list of biases that describe some current cultural tendencies, some of which relate to the biases listed above. A few particularly good ones are Evopsychophobia, Implicit ESP delusions, Subjectiphilia, and Wokanniblism.
Orwelexicon for Bias