Monday, February 3, 2020

Engineering Psychology for Apologetics

Apologists tend to think like engineers in the sense that we're very analytical and logical, which also means that we often misunderstand people who don't think this way. Nearly every apologist goes through a phase where we think, "If I just show people the evidence, then they'll believe." Some give up when they realize it's not that simple, some get wise and change their methods, and some never figure this out.

Since so many of us think like engineers, why not embrace this thinking style and view ourselves as conversation engineers? The field of engineering psychology, or human factors psychology, attempts to improve the design of machines or systems so that people can use them better. The better the design, the more likely people are to seek out the product, use it, and recommend it to others. As conversation engineers, we can design spiritual conversations that lead to a better experience for the people we talk to.

The success of Apple has largely been attributed to the superior design of their products. Apple engineers and designers have made products that are easy to use, making it an enjoyable experience for customers, which leads to increased sales and brand loyalty. As apologists, we can use these same engineering principles to improve our outcomes. Our product is the evidence for Christianity, which the users (non-apologists and non-Christians) will be more likely to use and trust if we can give them a positive experience.

Basic Concepts
For now, there are three basic concepts to understand. These are affordances, signifiers, and feedback. Affordances are what the product can be used for and, to some extent, depend on the person. A chair affords sitting, but a really big or heavy chair affords moving only to strong people or groups of people. Signifiers are signals about what a product can do (what it affords). An arrow on a dial signifies which way the dial can turn.

Feedback is a little broader than people might initially think. It is a signal about whether an operation has been activated or whether the product is working as intended. Your phone may beep to let you know you've pressed a button, or it may provide different types of beeps to tell you something went wrong. Feedback can be thought of as a type of signifier that comes after a function rather than before it and gives information about the product, not the user.

Application to Apologetics
In apologetics dialogues, our product is evidence, and our goal with it is to bring people closer to Jesus, strengthen religious belief, grow the church, give Christians confidence, inspire awe, and so on. In human factors terms, these are the affordances. However, apologetics can also have bad affordances, just as a phone can (porn, distractions, being used for target practice, and so on). Apologetics, when done wrong, can push people away from Jesus, drive people to anti-intellectualism, or make Christianity seem narrow and self-righteous.

If we use signifiers and feedback in the right ways, we can help prevent the negative affordances of apologetics. Signifiers can be obvious or subtle. Wearing Christian or apologetics clothing and posting Bible verses on social media are obvious, but doing so may provide feedback to people with unintended negative consequences. I'm not saying don't do this, but be aware that there could be negative consequences to it. To overcome potential negative effects, use positive rather than negative signifiers (e.g. more positive Bible verses than critical ones).

Subtle signifiers can also be useful. Wearing colors like blue, white, or green can make you seem more approachable and open, so people are more likely to engage in conversation with you and to listen when they do. Colors like black and red may work against you in this way. The same goes for your website, ministry branding, and so on.

The language we use can be a powerful signifier. If we talk about people in a condescending or critical way, this tells people we aren't open to understanding them and don't really care about others, so they will avoid us, or if they do engage, the conversation will be fruitless. Likewise, when we talk about how busy we are, we send the signal that we're not open to taking an interest in others, which creates a barrier to relationships.

Feedback is perhaps the most dangerous area for apologists. When our feedback is limited to variations of "that's wrong and here's why," we are not going to win people over. Again, this signals a lack of concern for others, even if we are correct. Instead, our forms of feedback should send the message that we care about others and are open to considering their point of view (even if we've already considered it 100 times and know it's false).

How can we do that? Rather than engaging people in a back-and-forth, point-counterpoint type of conversation, we need to take a greater interest in others. Ask people about their lives and what motivates them. When they do say something factually or logically incorrect, try paraphrasing so you can ensure you've understood them or thank them for sharing their views with you. Ask follow-up questions to find out more about their beliefs or their lives, not just questions aimed at trapping them.

You may be thinking that this doesn't sound like typical feedback and you're right. This is feedback about yourself, not about them. You are sending the other person the message that you can listen to them without arguing, that you care about them, and that you're a rational, kind, and considerate person. By sending them this type of feedback, they will be much more open to listening when you do present evidence.

Conclusion
This is just the tip of the iceberg on this topic. I want to encourage you to check the resources below to go deeper or check back here as I will almost certainly write more on this topic. The big take-away is that we want the people we engage with to have a positive experience so that they want more of it. We need to consider the other person's experience over our own desire to present evidence. If they have a positive experience, they will be much more likely to listen to the evidence and come back to you in the future or search for more evidence on their own.

References
I highly recommend the book The Design of Everyday Things to help understand design concepts. Even though it's not an apologetics or religious book, if you think through it and do the work of applying it to apologetics yourself, it can be very helpful.

I also recommend the Great Courses class How Colors Affect You: What Science Reveals. This can help you be more effective at apologetics through the clothes you wear, how you design your website, your ministry logo, and so on.

Thursday, January 23, 2020

List of Psychological Biases for Apologetics

In one of my apologetics presentations, I ask the audience to shout out as many logical fallacies as they can in 15 seconds and they usually list about seven. I then ask them to shout out all the psychological biases they can, and they almost always say confirmation bias and nothing else. People generally seem more aware of fallacies and correctly recognize them as errors in reasoning, but few people are aware of the huge number of psychological factors that affect our every decision.

When I first started studying psychology and apologetics, I thought that people were rational beings. I quickly discovered that we are not as rational as we think. However, it took years of studying bias and interacting with people before I realized that people are far from rational. Here's the thing though: we can be rational, but when it comes to topics like religion, politics, or any other emotionally charged topic, it requires a lot of hard work to be rational. We need the patience to withhold premature judgments, we need the courage to confront our emotions and challenge the standard view of our social groups, and we need the humility to admit we might be wrong or ignorant.

Below is a list of all the broadly accepted psychological factors I could find that influence our reasoning, usually in a non-rational way. Most, if not all, of these biases are unconscious, so we cannot even know if they are affecting us. We know they exist because of clever experiments by psychologists. These are the factors we must overcome when we make decisions and the factors we need to help others avoid when doing apologetics. What's especially interesting is that after people are made aware of these factors, they almost always say the factors didn't affect them, but the data do not lie.
What's unique about this list compared to others you might find on the internet is that I use illustrations that present these in the context of apologetics and I've cross-referenced each factor with related ones and fallacies.

Please let me know if you think others should be added or if something is unclear. This list is meant to be a reference for myself and anyone else who wants to use it.

Affect Heuristic - The tendency to rely on our current emotions to make quick decisions. Disgust is particularly powerful for moral decisions. When we make decisions based on our emotions, we usually come up with post hoc (after the fact) reasons for our decision.

Same or nearly the same as the intuitive cognitive style

Related to an appeal to emotions

Anchoring Effect - When we have a value or representation in our mind, it becomes the standard by which we judge other options, even if it's arbitrary. In other words, this value or belief becomes your anchor for judging other things in relation to it.

Same or nearly the same as arbitrary coherence.

Related to framing and priming.

Apophenia - the tendency to see patterns, meaning, or connections in randomness. Essentially, this is seeing shapes in clouds or finding hidden codes in the Bible. In apologetics, believers are accused of this when they claim there is design in the universe. However, the same critique can be aimed at evolution so both sides need to make a case that they are not falling victim to this bias.

Same or nearly the same as agenticity, patternicity, the clustering illusion, hot-hand fallacy, and pareidolia.

Related to the false cause fallacy (aka causal fallacy), gambler's fallacy

Arbitrary Coherence - the tendency to form a coherent view or argument based on an arbitrary value. Once an arbitrary value is accepted, people tend to act coherently based on that value.

Same or nearly the same as the anchoring effect.

Related to framing and priming.

Availability Heuristic - The tendency to make decisions or draw conclusions based on the data that we hear about most often or most recently instead of a systematic comparison of all the data.

Same or nearly the same as base rate fallacy.

Related to the false-consensus effect, cherry-picking (fallacy).

Backfire Effect - When a person moves farther away from a view after hearing an argument for it. This is probably the best explanation for why neither person usually changes their mind when debating religion, politics, or other heated topics.

Same or nearly the same as belief perseverance and group polarization.

Related to belief bias, confirmation bias, reactance.

Bandwagon Effect - The tendency to prefer popular options. People might be hesitant to become a committed Christian because they don't see many other people who are.

Same or nearly the same as an appeal to the majority.

Related to the bystander effect, false-consensus effect, individualism (contrasts), mere exposure, reactance (contrasts).

Base Rate Fallacy - The tendency to ignore the base (average) probability of something occurring in favor of new or readily available information. An example of this is when someone points to a mutation as evidence for evolution but neglects the extremely low average rate of beneficial mutations, especially beneficial mutations that insert new information into the genome. See the short worked example after this entry.

Same or nearly the same as the availability heuristic

Related to the cherry-picking (fallacy), hot-hand fallacy, regression to the mean, representativeness heuristic.
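
To make the arithmetic concrete, here is a minimal sketch in Python of how a low base rate can swamp an otherwise impressive-sounding piece of evidence. The numbers are made up purely for illustration; they are not figures from any study.

```python
# Illustrative base-rate calculation using Bayes' theorem.
# All numbers below are hypothetical, chosen only to show the effect.

def posterior(base_rate, true_positive_rate, false_positive_rate):
    """P(condition | positive evidence) via Bayes' theorem."""
    p_evidence = (true_positive_rate * base_rate
                  + false_positive_rate * (1 - base_rate))
    return (true_positive_rate * base_rate) / p_evidence

# A test that is right 90% of the time sounds convincing on its own...
print(posterior(base_rate=0.01, true_positive_rate=0.9, false_positive_rate=0.09))
# ...but with a 1% base rate, a positive result still leaves only about a 9%
# chance the condition is actually present. Ignoring the 1% is the base rate fallacy.
```

The point of the sketch is simply that the prior (base) rate has to be part of the calculation; evidence that is "90% reliable" can still leave the conclusion unlikely.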

Belief Bias - The tendency to evaluate arguments based on what we already believe rather than the strength of the premises. In other words, to rationalize, ignore, or misunderstand arguments that would disprove what we already believe. If we believe a conclusion, we will deny any premises that do not support it without careful consideration of them.

Same or nearly the same as rationalization.

Related to affirming the consequent (fallacy), belief perseverance, confirmation bias, and straw-man fallacy.

Belief Perseverance - The tendency to maintain a belief even in the face of contrary evidence.

Same or nearly the same as the backfire effect and group polarization.

Related to belief bias, confirmation bias

Blindspot Bias - The tendency for people to see themselves as less susceptible to biases than other people. This one is more likely to affect people with high IQ or education. Anecdotally, I've noticed that people who have converted or deconverted as adults tend to be guilty of this by thinking they have transcended bias.

Same or nearly the same as self-serving bias.

Related to belief bias, belief perseverance, confirmation bias.

Bystander Effect - The tendency not to respond to a situation when a crowd of people is also not responding. When we see a car on the side of the road, we don't stop to help because nobody else is stopping to help. This effect happens because we don't want to stand out, we rationalize that we might not be needed, or we think we might be wrong and everyone else is right. In apologetics and theology, this is when we see other people accepting sinful behaviors and so we don't try to stop them (if and when we are in the proper role to do so).

Same or nearly the same as an appeal to the majority (fallacy) or bandwagon fallacy.

Related to the availability heuristic, false-consensus effect, normalization, the spotlight effect, systematic desensitization.

Cognitive Ease - We are more likely to accept or like something if it is easy to process. This includes the content, the way the content is presented, and the medium used to present it. Using a clear font when writing, speaking loudly enough for people to hear, using high-resolution video, and simplifying a complex concept are just a few ways to take advantage of this bias.

Same or nearly the same as

Related to mere exposure

Cognitive Dissonance - The uncomfortable feeling we get when we have inconsistent beliefs or when our actions do not align with our beliefs. When our actions and beliefs are inconsistent, we usually change our beliefs to align with our actions because our actions are more observable, so people won't recognize our hypocrisy. This can be used in apologetics to show that a person's moral concerns (environmentalism, politics, human rights, etc.) do not align with their beliefs because there is no objective morality without God.

Same or nearly the same as

Related to

Commitment Bias - The tendency to stick with what we're already doing or already believe even when new evidence suggests we should change. Anyone with a firm commitment to their current beliefs about God is susceptible to this.

Same or nearly the same as escalation of commitment, hasty generalization (fallacy), premature cognitive commitment, sunk cost fallacy.

Related to appeal to authority, backfire effect, foot-in-the-door technique, self-herding, status-quo bias

Compensatory Control - When we lose control in one situation or domain, we try to compensate by gaining control in another area. During an election year when there is political uncertainty, religious people tend to view God as being more in control than during non-election years. We gain compensatory control through work, routine, parenting, and many other domains.

Same or nearly the same as ...


Related to attachment

Confirmation Bias - This has become a fairly broad term to describe any bias, action, or thing that helps us confirm what we already believe. It can take the form of looking only for confirmatory evidence (as opposed to evidence that could potentially disprove our view), forgetting or ignoring evidence that doesn't support our view, or interpreting evidence in a twisted way to fit our view.

Same or nearly the same as belief bias, belief perseverance, myside bias

Related to the availability heuristic, backfire effect, cherry-picking (fallacy), and pretty much everything else.

Contrast Effect - The tendency to judge something in comparison to something that came immediately before it. If you give an argument or a presentation after someone else, the quality of what you say will be judged in comparison to the person who spoke before you. This can help and hurt in apologetics depending on the person who went before you. This can apply to the quality of your videos, the design of your website, interviews, in-person or online conversations, and just about anywhere else.

Same or nearly the same as

Related to anchoring, arbitrary coherence

***** 1st New Additions *****

Decision Fatigue - As we make more and more choices throughout the day, we become more mentally fatigued and less willing to put in the cognitive effort to make careful decisions. Whether this effect exists is highly debated. A recent paper suggests it does exist, just not as broadly as originally thought. In apologetics, this may come into play if you ask too many hard questions of someone. They may just get tired of answering and stop trying, in which case they may just leave, resort to name-calling, or answer without thinking (see other biases).

Same or nearly the same as ego depletion

Related to

Decoy Effect - When there are two competitive options, the decoy option is like one of them but less desirable, making the one it resembles seem best. For instance, if I am selling you a burger with fries for $5 and a chicken sandwich with fries for $5, I can add a decoy to make one sell better than the other. If I have a bunch of burgers about to go bad, I can add the option of a burger only for $4.50, making the burger with fries seem most desirable. I suspect this might be part of why some people are spiritual but not religious. They are essentially choosing between atheism, religion without rules, and religion with rules, and for many people, organized religion serves as the decoy that nudges them toward spirituality rather than atheism. I should note that this is just a hypothesis of mine, or a potential application of this effect.

Same or nearly the same as

Related to anchoring

Drop-in-the-Bucket Effect - The tendency to do nothing when our resources cannot make a significant impact on fixing a problem. I sometimes fail to account for this when I talk about adoption. I cite the vast numbers of kids who need help, which is a problem no single person can fix, and so people aren't motivated to get involved. If I focused more on individual children who need help, people would be more likely to be moved to do something to help that child. In apologetics, this is important to remember when you talk about moral issues. People will be much more concerned if there is an identifiable victim.


Same or nearly the same as identifiable victim effect

Related to vividness, vagueness

Dunning-Kruger Effect - My favorite bias because I think it explains so much of the world. This is the tendency for people with minimal knowledge or experience in an area to have extremely high confidence in their ability in that area. As they gain genuine expertise, their confidence dips before starting to climb again. This describes apologetics perfectly. Almost everyone thinks they are an expert on religion and science, so when you try to have an apologetics conversation, they are unwilling to listen or consider what is said because they view themselves as the expert. The original paper on this is titled "Unskilled and Unaware of It."

Same or nearly the same as

Related to humility (opposite), straw-man (fallacy),

Ego Depletion - See the comment above for decision fatigue; they're essentially the same. The only possible difference is that ego depletion is stated in terms of willpower and compares it to a muscle that can be fatigued in the short term but trained to grow stronger over time. The issue with this effect is that it doesn't always show up when expected, which has led some to say it's not a real effect. The research shows that it can easily be overcome, so if that's the case, is it a real thing? The solution seems to be that it affects whether we decide to put forth cognitive effort for a decision. If we do put in the effort, there's no effect, but if we decide not to put in the effort, we become very prone to any number of the other biases listed here.

Same or nearly the same as decision fatigue

Related to

Endowment Effect - The tendency to overvalue something we own simply because it's ours. Our stuff has memories and emotions attached to it which other people don't see or value. The basic idea seems to apply to worldviews, religious practices, personal sins, and social groups too. These things are ours and are part of us and we don't want to give them up easily.

Same or nearly the same as mere ownership effect.

Related to commitment bias

False Consensus Effect - The tendency to think a belief or behavior is more common than it really is. The classic example is premarital sex in high school. People, especially high school students, think "everyone is doing it," but the research shows that more than half of high school students are still virgins when they graduate. The most prevalent example for apologetics is the tendency people have to think scientists or intellectuals are more atheistic than they really are.

Same or nearly the same as

Related to the availability heuristic, appeal to the majority (fallacy)

Focusing Effect - The tendency for people to focus on a small detail or one aspect of something instead of the overall picture. In common words, it's missing the forest for the trees. This happens in apologetics when people get hung up on details that are often irrelevant or are unwilling to move beyond a certain issue. For instance, a skeptic may focus so heavily on evil that they are unwilling to recognize the broader point that there is no such thing as evil without a moral lawgiver or that there are other arguments that show God exists.

Same or nearly the same as the availability heuristic, cherry-picking (fallacy)

Related to affect heuristic, confirmation bias, red herring (fallacy)

Forer Effect - The tendency for people to accept very broad or generalized statements about their personality as being uniquely true of them rather than recognizing that they are largely true of most people. This basically explains the current trendiness of the enneagram, even though it is not scientifically valid. This may play a role in apologetics because people might be led to view themselves in ways that could be beneficial or harmful for apologetics conversations. Priming people to view themselves as rational, careful thinkers, respectful people, and so on, can help set up conversations so they are more effective.

Same or nearly the same as the Barnum effect

Related to the availability heuristic, confirmation bias, self-serving bias

Framing Effect - The way something is presented, or framed, can affect our conclusions about it. For instance, "99% effective" sounds better than "fails 1% of the time." In one of my presentations, I show a clip from Brain Games where a cop asks witnesses how fast a car was going when it bumped/smashed into the other vehicle. By changing just one word, witnesses report drastically different speeds. In apologetics, our words matter. When we frame another worldview as ridiculous, those who agree with us and some in the middle will likely find it very convincing; however, unbelievers will feel as though we're not honestly representing their view and disregard what we say. Another example is how we present Christianity. Do we present it in a positive light so people want to follow it, or are we simply known for all the things we're against? People are more prone to accept something, or at least listen, when it is presented in a mostly positive way (not to say you can't or shouldn't mention the struggles of being a Christian).

Same or nearly the same as

Related to affect heuristic, appeal to emotions (fallacy), arbitrary coherence, fundamental attribution error (FAE)

Functional Fixedness - The tendency for people to view something only in terms of its intended purpose, preventing us from seeing alternative uses. The ability to break this pattern is what made MacGyver popular. In other words, this bias is thinking inside the box, so the antidote (as if it's just that easy) is to think outside the box. In apologetics, I find people sometimes have a fixed view of what Christianity is or what their identity is ("I'm a doctor and doctors aren't religious"), which prevents them from seeing Jesus. If you notice this might be an issue, it's easy to overcome as long as you don't point it out in a condescending way.

Same or nearly the same as

Related to the availability heuristic, creativity, confirmation bias

Fundamental Attribution Error (FAE) - This is when we make an error in attributing the cause of someone's behavior. Usually, it refers to blaming people (personal attribution) for things that were not within their control, or not completely within their control, and then associating the act with their character. If you cut someone off in traffic, even if it was an emergency or you unknowingly swerved into their lane, they will likely blame you for it, and if you have a Jesus sticker on your car, they'll pass that judgment onto Him. Similarly, if you make a mistake about a fact (or they think you made a mistake), they will attribute that to your character and probably your intelligence. This is why it's extremely important to be careful with our words, fact check everything, and speak to others with grace.

Same or nearly the same as correspondence bias

Related to false cause fallacy (aka causal fallacy), confirmation bias, hasty generalization (fallacy), self-serving bias

Group Polarization - The tendency for the views of two groups to move further apart after discussing a topic. The obvious example is politics. Let's say a group of Democrats and Republicans have slightly different views on a topic when they start a conversation about it. After the conversation, they will likely move further apart. This happens for a variety of reasons, some rational and some not. Talking about the issue may help them think about it more, helping them realize their previous view was inconsistent or poorly thought out. However, it may also be due to knee-jerk reactions against the other group, an unwillingness to compromise and seem weak, following a charismatic leader, or many other reasons. In apologetics, this can happen in group discussions between Christians and other groups or during debates. To overcome this, it's important to be respectful of others and build relationships so they don't view you as an enemy who needs to be defeated. It's very hard for someone to agree with a person they view as an enemy, even when it's common sense. We have an automatic reaction to disagree with enemies or people we don't like.

Same or nearly the same as the backfire effect

Related to affect heuristic, commitment bias, conformity, groupthink, ingroup/outgroup bias, liking, obedience to authority

Groupthink - When groups have a strong desire to conform or be unified, they have a tendency to accept ideas too quickly and without critical thought, leading to bad decisions. This can also happen when the group leader or the environment punishes dissent. This is different from ingroup bias or group polarization in the sense that it stems from a desire within the group to get along and an unwillingness to risk the consequences of dissent (notice how I didn't name specific errors I think many Christians make theologically 😉). This is hard to get around on social media because if someone does speak out against their side, they are criticized by both sides. If they see someone else do it, they often don't want to stick their neck out in support, so they might passively like something or just refuse to comment on it.

Same or nearly the same as conformity

Related to commitment bias, false-consensus effect, hasty generalization (fallacy), ingroup/outgroup bias

***** 2nd New Additions *****

Halo Effect - This is the tendency for us to globalize a positive attribute of a person from one domain to another. For instance, if someone is physically attractive (or any other noticeable positive attribute), we're more likely to rate them as more intelligent, more competent, more trustworthy, and so on. While this effect solely focuses on positive attributes, the same applies to negatives so that if a negative attribute stands out to someone, they are more likely to apply that to us in other domains too. This is why it's extremely important in apologetics and evangelism to make good first impressions with people, to speak respectfully, dress and look respectable (not to be confused with being vain), be kind, and so on. People are much more likely to listen to apologetic arguments or the gospel if something about us (or many things) stands out as being very positive.

Same or nearly the same as

Related to affect bias, availability heuristic, Dunning-Kruger effect, first impressions, hasty generalization (fallacy), liking, representativeness heuristic

Hawthorne Effect - This is when people change their behavior when they're aware of being watched or think they're being watched. It's obvious that this happens, at least to some degree, but people sometimes underestimate how big the effect is and how easy it is to invoke it. Some studies have found that simply putting a picture of a face in certain places can get people to act better, although the effectiveness of this small of an intervention is debated. This is seen in public conversations such as on social media or other public venues because people are more likely to stick with their group's views on a topic rather than seriously consider other views for fear of being condemned by their group.

Same or nearly the same as

Related to group polarization, ingroup/outgroup bias, self-serving bias

Hedonic Adaptation/Treadmill - The tendency for people to adapt to things that are enjoyable so that it becomes their new expected standard. For instance, if you won the lottery, you would be ecstatic but you would slowly return to your previous levels of happiness, and worse, expect your quality of life to always remain the same so that you would be disappointed getting less than you had previously. If you buy a nice car, you will get used to the comforts and advantages of it so that when it comes time for a new car, you will expect the same or better, even if you don't really need all the luxuries. The application is more theological than apologetics. When we become accustomed to the comforts of American life, we tend to be calloused toward people around the world who aren't so well-off and we become unwilling to make sacrifices in our life for them. This affects how people view us when we do apologetics, but also the amount of time we spend studying or doing apologetics (or Bible study, prayer, etc.). When we become accustomed to Netflix xx hours per week, it's hard for us to give that up so that we can read, do evangelism, serve the poor, etc. We then rationalize that we somehow deserve such rest because we work so hard at other times (we should indeed rest, but we don't need nearly as much as the American lifestyle affords us).

Same or nearly the same as

Related to anchoring, halo effect, identifiable victim bias.

Herding - The tendency to follow the crowd as if we are a herd. This is a very useful heuristic, especially in unfamiliar places (e.g. traveling to a new country), but it can often lead to false conclusions. Many people have false views about Christianity because of this. They get their theology from popular media sources, leading them to think Christianity is intellectually bankrupt and faith is blind, and so they just go along with the crowd. This especially relates to sexuality and gender.

Same or nearly the same as the appeal to the majority, bandwagon fallacy, false-consensus effect

Related to ingroup/outgroup bias, self-herding

Hindsight Bias - The tendency to look at events from the past as having been obviously predictable.  In other words, we look at past things with blinders on due to changes in culture or increased knowledge about something. We look at slavery as wrong today, and rightfully so, but because it's so culturally ingrained in us, skeptics sometimes have a hard time understanding slavery that is discussed in the Bible.

Same or nearly the same as

Related to affect bias, availability heuristic

Hot-Hand Fallacy - This is often discussed in terms of basketball, which is how it was discovered. Researchers found that when a person is perceived to be on a hot streak, observers think that person is more likely to make the next shot; however, the data show this is not the case. The relevance of this to apologetics is that our immediate intuitions are not always correct. When looking at rates of abortion, effects of gun control, crime and religiosity, and so on, we can't just cite a statistic and give a simple explanation (this goes for people on all sides). Sometimes the obvious conclusion is correct, but we need to look around for other data and the best explanation for something. See the short simulation sketch after this entry.

Same or nearly the same as apophenia, regression to the mean.

Related to blindspot bias
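
As a small illustration of why streaks alone don't prove a hot hand, here is a quick simulation sketch in Python. It is purely illustrative and is not the analysis from the original study: even a shooter whose makes are independent coin flips will regularly produce streaks that look meaningful.

```python
# Quick simulation: how often does a purely random 50% shooter
# produce a streak of 5 or more makes in a 20-shot game?
# Purely illustrative; this is not the original hot-hand study's analysis.
import random

def has_streak(shots, length=5):
    streak = 0
    for made in shots:
        streak = streak + 1 if made else 0
        if streak >= length:
            return True
    return False

random.seed(0)
games = [[random.random() < 0.5 for _ in range(20)] for _ in range(10_000)]
rate = sum(has_streak(g) for g in games) / len(games)
print(f"Share of random 20-shot games with a 5+ make streak: {rate:.0%}")
# Roughly a quarter of purely random games contain a streak this long,
# so seeing a streak is not by itself evidence of a "hot hand."
```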



Identifiable Victim Effect - The tendency to be more compassionate toward, and give more resources to, a single person in need rather than a large group of people in need. Even though it seems like we should be more heart-broken over a million starving people than just one, the research shows we are more likely to act and give more for a single person than for a group. This is why many charities will feature a single person in need in their ads rather than a whole group. For apologetics, talking about the 120+ million people killed by atheists in the 20th century is less powerful than focusing on a single victim in greater detail; ideally, both points would be made for the greatest impact.

Same or nearly the same as

Related to closeness, drop-in-the-bucket effect, vividness, vagueness


Ikea Effect - When we place more value or importance on something that we build ourselves. In apologetics, if you tell someone the evidence for Christianity and hand them the answer, it will feel less valuable to them than a conclusion they came up with themselves, and therefore they will resist you. A better approach might be to tell people certain facts or create hypotheticals based on the facts and then ask them to come to a conclusion based on those facts.

Same or nearly the same as the endowment effect, mere ownership effect, not-invented-here (NIH) effect

Related to the genetic fallacy

Illusion of Control - The tendency to overestimate our ability to control things. This affects Christians who might think they have more control over another person's beliefs than they actually do. This seems to be part of the equation for people who want to push for government laws restricting behaviors that do not align with Christianity, with the assumption that the laws will be more effective than they actually are. This is not to say there isn't a place for laws restricting certain behaviors; it's the overestimation of the effectiveness of these laws that is the bias.

Same or nearly the same as

Related to apophenia, compensatory control, gambler's fallacy, superstition

Illusory Truth Effect - The tendency to believe false information is true after hearing it over and over again. Essentially, it's not the correctness that is remembered, but the content, so when people recall it, they remember it as true, unless of course they explicitly recognized it as false and argued against it. The main takeaway for apologetics is that people are likely to reject apologetic arguments when they're new to them. This means we don't need to be pushy or overbearing with people. We can and should take the long view and give them a little something to chew on time and time again. The goal isn't to get them to believe false information, but to help remove an emotional barrier to something that seems new and strange.

Same or nearly the same as mere exposure effect

Related to the appeal to the majority, false-consensus effect

Imagination Inflation - We tend to slowly exaggerate past events more and more over time. This is why we joke about fishermen bragging about the giant fish they caught way back in the day. This is a legitimate objection to the resurrection that atheists raise and that believers need to deal with seriously. However, the mechanisms of this effect are not nearly powerful enough to explain the resurrection. The inflation happens along a slowly progressing continuum, and the resurrection requires large leaps of imagination.

Same or nearly the same as

Related to false memories

Inattentional Blindness or Selective Attention - Strictly speaking, this is more of a perceptual error: when we are so fixated on one thing, we miss surrounding cues. If you've ever seen the gorilla basketball video, that's an example of this. However, the same general thing occurs when we are so sure we are correct about something that we just blatantly miss or don't remember things that oppose our view. This is likely why atheists so often use incorrect definitions of faith or repeat the same misunderstandings about the kalam (e.g. who created God) even after they've been corrected. The correction just doesn't register with them because they're so sure they're right. On the other hand, I see apologists get so fixated on pedantic details of an objection to Christianity that they lose the whole point of what the other person was saying.

Same or nearly the same as

Related to the availability heuristic, belief bias, confirmation bias

Individualism - This is more of a cultural or personality factor, but it definitely biases our decision making. For people in individualistic cultures or people high on an individualistic scale (similar to reactance), anything that appears to violate their personal freedom will be viewed negatively. Politically, this correlates with libertarian and conservative views, which is where I get the sense that many apologists align. This should cause some apologists to question whether they've really based some of their theological and political views on Jesus or if it's more based on their desire for individualism. I'm not saying their views are wrong; only that they should be carefully scrutinized. In apologetics, this is often the underlying reason people so strongly revolt against God's moral standard, because they don't want to be told what to do, even by an all-knowing, all-loving God.

Same or nearly the same as reactance

Related to reactance, conformity, collectivism

Ingroup/Outgroup Biases - These are two different biases, but they're often used to describe the same thing and used interchangeably. The reason is that they're just two sides of the same coin. We are biased in favor of our own group and against other groups. When someone from our own group does something good, we apply it generally to the group as the norm and claim it as an example of the individual's character or competence. When someone from the ingroup does something bad, we rationalize it or blame it on the individual. The opposite happens with the outgroup. When a member of the outgroup does something good, we apply it only to the individual as an exception to the norm, or we try to explain it away as a product of the situation or as not all that good after all. When someone from the outgroup does something bad, we apply it to the whole group and view it as the norm. Apologetics is a quintessential example of ingroup/outgroup behaviors. People on all sides are guilty of jumping on the bandwagon of bad arguments and rationalizing. About the only thing you can do is be aware of this bias so you can try to avoid falling into it yourself and work on building relationships with people in the outgroups so they don't view you as an adversary.

Same or nearly the same as self-serving bias (but applied to groups)

Related to the appeal to the majority (fallacy), the bandwagon effect, fundamental attribution error, liking

Liking - When we like someone, we're much more likely to listen to them, be persuaded by them, give them the benefit of the doubt, and so on. If you want to be a more effective apologist, be kind and respectful of others and they will be much more likely to listen.

Same or nearly the same as

Related to affect bias, the halo effect, ingroup/outgroup biases

Loss Aversion - The tendency for potential losses to play a bigger role in our decisions than potential gains. This is likely why people are more willing to settle with what they have than risk losing it for something better. Worldviews are an excellent example. If a person has a worldview that seems to work, and they have a group of friends who share that worldview, they are not easily going to risk giving that up for Christianity. They will fight to show that their worldview is better and even if you can show Christianity is a better worldview, they may not be willing to accept it.

Same or nearly the same as negativity bias

Related to the affect heuristic, availability heuristic, belief bias, confirmation bias, endowment effect, mere ownership effect, sunk cost fallacy, and status quo bias

Mandela Effect - This is when someone misremembers something, such as Nelson Mandela dying (which is how the effect got its name), and then a large number of people believe it. It has spread to broad cultural norms and details of pop culture. There are actually several online tests you can take to demonstrate this effect and give you a better idea of what it is. Here's just one of them. This is perhaps one of the strongest arguments against Christianity, specifically the resurrection, but critics don't use it. It's better than the swoon theory, hallucination theory, and all other attempts to explain away the resurrection, but it still falls short. This effect can't explain the eye-witness accounts or the reports of Paul and the apostles performing miracles in the name of Jesus, and it doesn't take into account the memory ability of people in the first century.

Same or nearly the same as DRM procedure, false memories

Related to imagination inflation

Mere Exposure - This is the tendency to be more willing to accept things that we've been exposed to before. In other words, new things (like evidence for Christianity) are strange to us and seem unlikely to be true so we reject them. Don't expect people, even other Christians, to accept the arguments for Christianity the first time they hear them. They'll likely need several exposures to the idea of rational faith and evidence before they'll be open to accepting it.
Same or nearly the same as

Related to

Mere Ownership Effect - The tendency to overvalue items that we own. This is studied with physical objects by seeing how much people will buy and sell things for, but there's no reason the same effect doesn't apply to things like worldviews. This is likely one of the many reasons it's hard for people to change their beliefs, even on small topics.

Same or nearly the same as the endowment effect

Related to Ikea effect and not-invented-here (NIH) effect.

Misinformation Effect - This is when information after an event affects our memory of the event. The classic example of this effect is from a 1974 study that showed people a film of a car crash. Participants were asked how fast the car was going when it either collided, bumped, contacted, hit, or smashed into the other vehicle. This change of a single word affected their estimates of the car's speed, and one week later, those in the smashed condition were more likely to say they saw broken glass. This is a potential argument against the resurrection; however, this effect cannot explain something as big as a person rising from the dead or the fact that the NT authors witnessed several other miracles and performed miracles themselves.

Same or nearly the same as

Related to DRM procedure, false memories, imagination inflation, Mandela effect

Myside Bias - The tendency to favor evidence and arguments that support what a person already believes. My favorite study on this asked participants to state whether deductive syllogisms were valid and they scored around 70%, but when asked to do the same for abortion syllogisms opposing their own view, they dropped to about 40%. When doing apologetics, you need to find ways to bring up and discuss topics in a safe and unemotional way so that people will be willing to think instead of rejecting it without much thought.

Same or nearly the same as confirmation bias

Related to belief bias, belief perseverance
-----------------------------------------------------------------------------------------------------------------------
There are some recommended resources below for further information; otherwise, this is the temporary end of this list. I will post new biases every few days until the list is done, with a description of what's been added.




Recommended Resources
Good list of biases with much more in-depth descriptions
The Decision Lab

Books to help further understand how these factors affect our decisions.
Thinking, Fast and Slow by Daniel Kahneman
Predictably Irrational and The Upside of Irrationality by Dan Ariely
The Righteous Mind by Jonathan Haidt

Books to help overcome psychological barriers when doing apologetics and evangelism.
Influence: Science and Practice (also as Influence: The Psychology of Persuasion) and Pre-Suasion by Robert Cialdini
How to Win Friends and Influence People by Dale Carnegie

And just for good measure, here's a satirical list of biases that describe some current cultural tendencies, some of which relate to the biases listed above. A few particularly good ones are Evopsychophobia, Implicit ESP delusions, Subjectiphilia, and Wokanniblism.
Orwelexicon for Bias

Saturday, January 18, 2020

The Coddling of the Righteous Mind

I've been meaning to read Jonathan Haidt's books, The Righteous Mind and The Coddling of the American Mind, for quite a while. In fact, I considered applying to work with him for my PhD because his research is so relevant for apologetics. I finally got around to reading both of these books, and they were great. I wish I had read them much sooner.

The Righteous Mind discusses the science of moral decision making, which relates to our political and religious views. This is extremely useful for apologetics because if we better understand how people have come to their decisions on different issues, we can approach the topic with arguments that the other person will value. Haidt shows that there are five different domains that are used for moral decisions: Care/Harm, Fairness/Cheating, Loyalty/Betrayal, Authority/Subversion, and Sanctity/Degradation. He also proposes the possibility of a sixth domain, Liberty/Oppression, which seems to be accepted now (the book was published in 2012).

The Coddling of the American Mind is a response to the political climate on many college campuses today. Haidt and Lukianoff (who's actually the first author but is lesser known) discuss the various cultural shifts that have led to a generation that is unable to cope with diversity of thought or the challenges of life. The book discusses changes in parenting practices, the effects of social media, and the negative effects of cultural maxims such as "trust your feelings." This book is helpful for apologetics, specifically for anyone who wants to reach Gen Z, and is also extremely helpful for parents.

Both books discuss interesting scientific research mixed with real-life events that often make the headlines. For this reason, they were enjoyable to listen to and easy to comprehend as audiobooks. My guess is that anyone who reads or listens to them will learn quite a lot about why people are the way they are, and it won't feel like a chore either. I highly recommend these books to all apologists, especially anyone who does college ministry (e.g. Ratio Christi chapter directors). Additionally, parents should read The Coddling of the American Mind. Even though it's not a parenting book, per se, it will be as helpful as, if not more helpful than, most parenting books.

Both books have websites with additional resources and information for people who want to go further. TheCoddling.com and RighteousMind.com

Tuesday, January 14, 2020

The giraffe's neck: evidence for evolution or design?


If you've done any investigation into the debate between evolution and intelligent design (or creation), you've probably heard about the giraffe's neck. Not only do both sides claim it in favor of their position, but they often tout it as irrefutable proof that they are correct. How could this be and which side is right?


Let me start by explaining why both sides of the debate claim it for their side. The evolutionists look at the recurrent laryngeal nerve, which travels from the head all the way down the neck, under the aortic arch, and back up to the larynx. It's a 15 ft nerve that only needs to be about a foot, if that. Evolutionists say if this was designed, it was done poorly and not what is expected, but if it evolved from a shorter-necked species, it's exactly what is expected.

On the other hand, those in the intelligent design camp look at the features in the neck that protect the giraffe's brain. Because giraffes are so large and their heads are so high, their hearts have to pump blood very powerfully to get the blood up there, but when they bend down to drink, gravity pulls the blood down rapidly and would flood the brain. To prevent death every time they drink, they have an advanced pressure control system which is far too complex to have evolved all at once (irreducible complexity); therefore, the best inference is that it was designed by an intelligent being.

The arguments from both sides are more complex than I presented them and both sides have counter-arguments to the opposing claims so don't think my caricature settled the matter. My point isn't to argue for either side, but to help people see or present the argument with less bias.

It's easy to poke holes in any claim. That's why we have the flat earth society, Holocaust deniers, and so many other off-the-wall beliefs that are semi-normal in our society. We have to look at both sides of an argument in comparison to one another in order to avoid our own biases.

In this case, what is really being compared is whether God created something that appears poorly designed or whether random mutations were able to produce an incredibly complex system all at a single time. In other words, based on our current knowledge, the choice is between something that seems odd or unlikely, but not impossible (God's design seems poor), and something that seems statistically impossible (irreducible complexity).

I admit that this seems to handicap evolution from the get-go, but there are two points to consider. The first is that we should avoid making decisions based on a single instance if we can. So maybe you agree that the design argument is better in the case of the giraffe, but for the rest of biology, evolution makes more sense. The second point is that there are other domains that are equally biased against God or Christianity (historical claims of the Bible, the cosmological argument, etc.).

If we know both sides of the evolution debate and approach the topic with an open and honest mindset, we have to admit that it's a hard choice. Both sides make a pretty good case and both sides have issues. The case of the giraffe is a fun illustration because we see the trade-offs in a single species, but this method of comparison should not be limited in this way. We should apply it to the evidence and limitations for each argument in the whole debate. Hopefully, by framing the debate in this way, we can make a more rational choice or present the topic in a way that helps others make a more rational choice.


Romans 1:4

I learned a valuable lesson from this verse, completely unrelated to what the verse actually says! It's also a lesson I already knew and should have been more careful to pay attention to.

When I first set out to memorize this verse, I read several translations and the Greek, which is good and what I should have done; however, none of them seemed to make much sense so I gravitated toward the one that made the most sense to me on the surface (NLT) before studying the verse in depth. As a result, I got it wrong.

My initial translation was:
"Jesus Christ, our Lord, was declared the Son of God when He was resurrected from the dead by the power of the Holy Spirit." 
There are three subtle errors here. I mistranslated "declared" (ὁρισθέντος), "by the power of" (ἐν δυνάμει), and "Holy Spirit" (πνεῦμα ἁγιωσύνης). What is a better translation? After studying it all last week, I think the best way to translate this verse is:
"Jesus Christ, our Lord, was appointed the Son of God in power according to the Spirit of holiness by resurrection from the dead."
The problem with the correct translation is that its meaning is not immediately clear. It's filled with vague or ambiguous terms that don't make sense without further study. I'm going to walk through these specific translation issues to explain what each phrase means.

Was appointed
Jesus was always the Son of God. There's no question about that. So then what does it mean that He was appointed by the resurrection? Before the resurrection, Jesus was fully human, meaning He had human constraints, at least to some degree. The resurrection formally appointed Jesus as Son of God in a unique way. In other words, the resurrection was the official beginning of a new age, which is nothing new to Christians.

In power
The debate on this verse is about what "power" modifies. In other words, is it best translated:
  1. Appointed with power
  2. Son of God in power (or powerful Son of God)
  3. Power of the Holy Spirit
Due to word order and parallelism with verse 3 (in contrast to "descendant of David"), I chose "Son of God in power." The resurrection was the event that ushered in Jesus' full power, above and beyond the power He had prior to the resurrection.

Spirit of holiness
Most translations say "Spirit of holiness" while some say "Holy Spirit," but they are essentially the same thing because the capital S on Spirit means it is referring to the Holy Spirit. The other option is to translate it as "spirit of holiness," which would refer to Jesus' spirit in contrast to His flesh, the state He lived in from birth to crucifixion.

Many of the commentaries point to the parallelism with v. 3 (in contrast to "according to the flesh") and the fact that πνεῦμα ἁγιωσύνης is never used in the rest of the NT to refer to the Holy Spirit as arguments against this being a reference to the Holy Spirit, but then they go on to say it is a reference to the Holy Spirit without giving strong arguments for it. The best case seems to be parallels in the OT Septuagint for Holy Spirit (Ps 51:11, Isa 63:10-11), which seem best translated that way, but the Greek is not exactly the same (πνεῦμα τὸ ἅγιον vs. πνεῦμα ἁγιωσύνης), and that also seems to be an anachronistic translation or understanding of the OT.

It seems best not to take this as a reference to the Holy Spirit, but I don't think I understand the nuances well enough to be rationally justified in disagreeing with the majority of NT (and OT) scholars on this. For this reason, I hesitantly accept this as a reference to the Holy Spirit rather than to Jesus' spirit, but either way, the translation stays the same except for the capitalization.

If this were a more theologically important verse, I would spend more time studying it, but since it's not, I think it's best to hold this one loosely and move on to the next passage.

Monday, January 13, 2020

Review: Awkward: The Science of Why We're Socially Awkward and Why That's Awesome

Awkward: The Science of Why We're Socially Awkward and Why That's Awesome by Ty Tashiro
My rating: 5 of 5 stars

I stumbled on this book on Amazon and thought it might be interesting and it certainly was, although I'm not sure everyone will think so.

The book is well written and uses a good combination of personal anecdotes and scientific discoveries to inform the reader about why people are awkward and how they can be less awkward. The author seems like the perfect guy to have written this book because he was an awkward kid, seemed to admit he is still awkward, and has a PhD in psychology.

The reason I thought this book was so interesting is that I am one of the awkward people it talks about. I could relate very well to the descriptions of awkward people and the things they do. I laughed when he mentioned awkward people putting cognitive effort into setting the microwave in the most efficient way because that's exactly what I do.

What I think many people, particularly awkward people, will find helpful are the chapters that offer advice about how to overcome some social awkwardness. I think awkward people will like this book because it will likely give them a sense that there are people who understand them, and it can give them hope for being understood by those close to them. For non-awkward people, this book may seem strange, and you might find it odd that people are really like what he describes. Still, the book is written in a way that you should enjoy it, especially if you can think of a co-worker, child, or other family member who is awkward, because it will help you understand them better.

This book wasn't life-changing, but I can imagine that it might be for some people. Even if it's not life-changing, it was enjoyable to listen to and offered insight that helps in understanding other people. For that reason, I recommend it to anyone interested in understanding others.

View all my reviews

Thursday, January 9, 2020

The apologist's super-secret weapon

You have to meet people where they are
or they'll never get to where you are.

Part I of this article discussed paraphrasing as a secret weapon that anyone can use in apologetics or evangelism. All you have to do is repeat what the other person said in your own words. When done correctly, this method is so effective that oftentimes the other person will reveal some personal details about their thoughts, beliefs, or actions, which can lead to some awkward moments.

Knowing this will happen is helpful to prepare you, but you also need to respond the correct way so that you don't shut down the conversation and damage the relationship. To help you be prepared, some possible confessions that come to mind are past or current affairs, murder, abortion or paying for someone to have one, having an intersex condition, homosexuality or same-sex attraction, being transgender, past sexual abuse, being in an open marriage, rape, war crimes, time in prison, drug use, and many more.

Whatever you think is the worst possible thing a person could do, imagine someone confessed that to you and keep it in mind as you read the rest of this article (this is assuming they've served their time or are no longer a threat in potentially dangerous or criminal situations).

Super-Secret Tip
This "bonus" tool is always a good thing to practice, but it's mostly necessary because paraphrasing is so effective. If and when someone shares something with you that's very private, you have to prevent yourself from reacting with a negative tone of voice, look, or otherwise critical manner. If you show any trace of disapproval, you will shut down the conversation immediately and probably damage the relationship.

Psychologists call this unconditional positive regard. No matter what the person says, you need to respond positively or at the very least, neutrally. You don't have to agree with the person or condone their actions, which is going to be very tough for some of you to understand, but you still need to react in a non-negative fashion.

The hardest thing to control for most people in these situations will be your facial expressions, but you'll also have to watch your body language and keep yourself from blurting something out, including a laugh or audible gasp. An easy way to respond verbally is by paraphrasing, thanking the person for trusting you enough to share such personal details, or by asking how they feel about whatever they just revealed.

Conclusion
When someone shares an embarrassing or controversial detail about their life, they almost always already know you disapprove or might think they're abnormal, so you don't need to respond with critical comments. This is true even when every bone in your body is telling you to point out what you think their errors are (and it's probably your gut reaction telling you to do this, not the Holy Spirit). Instead of correcting or debating, focus on making sure you understand their story and that they feel safe. Once you've done this, you will begin to earn the right to respond, which is necessary if you actually want your words to make a positive impact on the other person.

We all have issues, some more severe than others and some just more taboo in our culture. Expecting someone to have their issues worked out before you will accept them is hypocritical and the exact opposite of what Jesus did. It's also a highly ineffective strategy, and it damages the reputation of all Christians.

Check out my articles on persuasive apologetics for more tips on increasing your effectiveness.

The apologist's secret weapon

I'm currently reading the new edition of Tactics, which is a must-read if you want to learn how to have better conversations with unbelievers. The books Relational Apologetics and Influence: Science and Practice (non-Christian book) are also great for evangelism and apologetics. These books are wonderful, but it takes a lot of studying to remember all the different methods and a lot of practice to be able to effectively put them to use.

What if there were a very simple method you could use in every situation that would help your conversations be more fruitful, lead to future opportunities, prevent misunderstandings, and take the pressure off yourself? Obviously, if there were such a method, we'd all use it, at least if we knew about it, which is why I'm telling you about it!

Clinical psychologists, and all others who do counseling, are trained in methods that help facilitate conversations. They're able to create an environment that helps people feel a greater sense of trust and connection with the counselor so they will be more willing to talk about personal topics and will be more open to receiving advice from the counselor. This is exactly the type of situation we want to create in apologetic dialogues so we should learn from their methods.

Even though it's extremely important to learn as much as you can for apologetics, applying this technique is even more crucial because the answers can be found later and, more importantly, people aren't looking just for answers. This is abundantly clear from psychology. People want answers, but they also want connection. They want to express their views to another person, be understood, and feel respected (That's also great relationship, parenting, and leadership advice. You can pay any time 😉). To do this, you don't need to have answers. You just need to listen and understand.

Secret #1
The technique I am referring to, which is used extensively by psychologists, is paraphrasing. When using this method, you don't need any knowledge about anything related to apologetics. Answers are good, and we should have them, but the point is that we don't need them in order to have effective conversations with people. All you need to do is to listen carefully to the other person and repeat back to them what they said in your own words.

By paraphrasing or summarizing what the other person says, you're ensuring that you've properly understood their point of view, which will prevent both sides from getting frustrated at how "dumb" the other person is. When you do this, you will build trust with the other person so they'll be more likely to listen to you when you do give a response and more likely to have future conversations with you. As a bonus, it will also buy you more time to think of a response so you won't have to tune them out as you formulate a response in your head.

This method is arguably more important in online conversations, even though it will drastically slow things down. When we can't hear a person's tone of voice or see facial expressions and body language, online conversations can very easily become uncharitable and degenerate into a cesspool of linguistic muck. Paraphrasing solves this issue most of the time, but it's hard, and sometimes too effective. Isn't that a nice problem to have?

The reason it's hard is that it requires a lot of restraint to refrain from giving a response to an objection you've practiced a hundred times in your head. It's very tempting, and I don't even do it as often as I should. It's easy to think we understand and then respond, only to see the conversation take what seems like a sharp turn off a cliff.

The other "issue" is that paraphrasing is sometimes so effective that people will often overshare very personal details about their life or beliefs with you, leading to some potentially awkward moments. When this happens, you have to use the apologist's super-secret weapon (it's a separate article for clarity and to limit the length here) so you don't damage the relationship.

Conclusion
Once you understand, you can choose to respond, or not, depending on the situation. You are probably going to err in one direction or the other, either by talking too much (and being seen as too pushy) or by not talking enough. If I had to choose one, I'd err on the side of talking too little because this opens more opportunities in the future and, as Koukl says in Tactics, puts a stone in their shoe (because people typically don't spend time thinking about what annoying people told them in an argument).

As I've heard so many apologists say, particularly Ravi Zacharias, our goal in apologetics is to win the person, not the argument. Paraphrasing can help us win the person and never lose an argument. Be sure to read about the super-secret weapon and other scientific methods of persuasion to take you to the next level.

Sunday, January 5, 2020

What you don't know about biological sex


Sexuality, biological sex, and gender are the hot topics of the day, and unfortunately, nearly everyone has an over-simplified view of each topic. For this article, I'm only going to focus on biological sex, and for 99% of people reading this, my guess is that there's a lot more to it than you realize. I have no intent to change anyone's mind on any theological implications, but instead, I want to help people understand the topic so they can think more clearly, speak more intelligently, and be less insulting to people who don't fit the standard categories.

When people talk about biological sex, they typically refer to what is considered "normal": an XY male with a penis or an XX female with a vagina, but there's more to it than that. As many as 1.7% of the population is intersex, or what is sometimes referred to as having a disorder of sexual development (DSD), although some consider this term disrespectful (the term hermaphrodite is considered derogatory, too). I am using the term intersex in the broadest sense, to refer to any instance where a person's biology does not align with what is typically considered male or female. This is the best definition I've seen, and it comes from an 11-year-old intersex kid in this TED Talk.
One of the reasons people are not more aware of these differences is that many intersex people are unaware of it themselves. Many don't find out until puberty or until they struggle with infertility, but some go their whole life without finding out. In these cases, only genetic testing could tell them they're intersex.

Below is a list and brief description of the lesser-known variations of sex. As you read these, think about whether you would consider someone with such a condition male or female, how they should describe their gender identity, and what it would mean for them to be homo- or heterosexual.

Sex Chromosomes: XO 
(Turner syndrome (TS); 45,X; or 45,X0)
Females most commonly have two X chromosomes, but for women with Turner syndrome, all or part of one of the sex chromosomes is missing. This affects about 1 in every 5,000 girls. Some people have no signs of it, while others might have various physical effects such as a wide neck, low-set ears, heart abnormalities, delayed growth, and more.

Sex Chromosomes: XXX
(Triple X Syndrome; Trisomy X; or 47,XXX)
Just as some women have only one X chromosome, in a slightly more common variation (1 in 1,000), women have three X chromosomes. Women with this genetic variation are at higher risk for behavioral problems, learning disabilities (e.g., ADHD), mental health problems, and some physical problems such as seizures, flat feet, and more. The symptoms are usually mild and often even non-existent.

Sex Chromosomes: XX
(XX Male Syndrome; 46,XX testicular disorder; XX sex reversal)
We all "know" that men have XY chromosomes and women have XX chromosomes, but about 1 in 20,000 people with a male appearance has two X chromosomes. They are usually shorter than the average male but look like a male in every way. They are infertile but otherwise, they usually have a normal functioning penis. Some will have smaller or undescended testes, a urethra hole on the underside of the penis, or ambiguous genitalia.

Sex Chromosomes: XY
(Swyer Syndrome; Gonadal Dysgenesis; 46,XY Complete Gonadal Dysgenesis (CGD); 46,XY Sex Reversal; XY Female Type)
About 1 in 80,000 people have Swyer syndrome, which is when someone looks female and has female genitalia but has XY chromosomes. Although they do not have ovaries, they can become pregnant and give birth via implantation. Most people with Swyer syndrome don't show symptoms until they fail to go through puberty or get a period, which will only happen with hormone therapy.

Sex Chromosomes: XXY
(Klinefelter Syndrome (KS); 47,XXY Syndrome; XXY Syndrome; XXY Trisomy)
This is the most common variation, affecting about 1 in 650 newborn boys. Instead of a female having an extra X chromosome as with triple X syndrome, here a male has an extra X chromosome (XXY instead of XY). People with KS have few to no symptoms (estimates are that 75% of people with it never know). Individuals with this are often taller but may have weaker muscles, decreased testosterone, delayed puberty, breast enlargement, a small penis, undescended testes, and other related effects. As many as 10 percent of people with KS have autism.

Sex Chromosomes: XYY
(47,XYY Syndrome; Jacob's Syndrome; XYY Karyotype; YY Syndrome)
About 1 in 1,000 boys are born with an extra Y chromosome. These males are typically taller than average and may have an enlarged head or teeth, flat feet, widely spaced eyes, or other physical effects, but many have no noticeable symptoms. They're also at an increased risk of having ADHD, autism spectrum disorder, anxiety, and depression.

Sex Chromosomes: XXYY
(48,XXYY Syndrome; XXYY Syndrome)
This condition affects about 1 in 18,000-40,000 newborn boys, who will have the appearance of males. People with it have an average height of 6'4". Other symptoms include infertility, intellectual disabilities, decreased testosterone, reduced body hair, poor muscle development, and breast enlargement.

Sex Chromosomes: XXXY
(48,XXXY Syndrome; XXXY Males; XXXY Syndrome)
People with this chromosomal condition look like males. They tend to be taller than average but usually have less testosterone, which can lead to enlarged breasts, a smaller penis, undescended testes, and incomplete puberty. They're usually infertile and often have intellectual disabilities. This affects about 1 in 17,000-50,000 newborn boys.

Sex Chromosomes: XXXXY
(49,XXXXY Syndrome; 49,XXXXY Chromosomal Anomaly; Chromosome XXXXY Syndrome; XXXXY Aneuploidy; XXXXY Syndrome)
This condition is sometimes called Klinefelter syndrome (see above) because it affects people similarly, but the effects are more severe and wide-ranging. It affects about 1 in 90,000 newborn boys, so it's quite rare. All people with this condition are infertile.

Sex Chromosomes: XXXYY
(49,XXXYY Syndrome; XXXYY Syndrome)
This is extremely rare, with only a handful of known cases. Some symptoms are severe intellectual disability, facial deformities, ambiguous genitalia, small penis or testes, enlarged (male) breasts, and more.

Sex Chromosomes: XXYYY
I couldn't find much on this. I think this genetic arrangement is more theoretical than actual, but there is at least one known case of it. It is likely similar to the cases discussed above, but probably has more severe symptoms similar to XXXYY.

Clitoromegaly
This is an enlarged clitoris, which may not sound very severe, but in some cases, it can be so large it looks like a penis. This is a symptom of XX male syndrome and other conditions discussed in this article. A measurement scale was, and still is, used to measure clitoris/penis length to determine whether a baby with ambiguous genitalia should have surgery to "fix" their genitals to look like a traditional male or female. This also essentially determines how they are raised.

Congenital Adrenal Hyperplasia (CAH)
(Subtypes: 21-hydroxylase deficiency (most common form); 11-beta-hydroxylase deficiency; 17-alpha-hydroxylase deficiency; 3-beta-hydroxysteroid dehydrogenase deficiency; Congenital adrenal hyperplasia due to cytochrome P450 oxidoreductase deficiency; Congenital lipoid adrenal hyperplasia)
This condition can affect males and females, but females usually experience more severe symptoms. It affects the production of three different hormones, one of which is the male sex hormone androgen, which is why it relates to this article. It can cause either sex to have ambiguous genitalia. In less severe forms, males can have a small penis and testes and females can have an enlarged clitoris.


Androgen Insensitivity Syndrome (AIS)
(Androgen Receptor Deficiency, Androgen Resistance syndrome, AR Deficiency, Dihydrotestosterone Receptor (DHTR) Deficiency)
This is a rare condition (1 in 25,000-50,000) that affects sexual development before and after puberty in genetic males. Their bodies do not respond to the male sex hormone androgen, so they develop external female sex characteristics but have no uterus or ovaries, and so they cannot give birth the way the XY females described above can. They are typically raised female and have a vagina. However, there is also partial androgen insensitivity (Reifenstein syndrome), which exists on a continuum and is less severe. People with this could appear as male or female, but even when appearing as a male, they are usually infertile and tend to have enlarged breasts.

5-Alpha-Reductase Deficiency
(Familial Incomplete Male Pseudohermaphroditism (type 2), Pseudovaginal Perineoscrotal Hypospadias, PPSH)
This is similar to AIS (above), but instead of an insensitivity to androgens, the body lacks the enzyme that converts testosterone into the more potent male hormone dihydrotestosterone (DHT). People born with this often have female genitalia at birth and are raised as females. However, they might also have ambiguous genitalia or a small penis. During puberty, increases in male hormones often give those raised as females the appearance of more male-like secondary sex characteristics (deeper voice, greater muscle mass, etc.). Those raised as males will likely have decreased body and facial hair and will likely be infertile, at least without reproductive assistance. Most are raised as females, but many eventually adopt a male gender identity in adolescence.

Conclusion
This article merely laid out the biological facts, which are only a starting point for talking about gender identity and sexuality. In one study, over half (52%) of people with a disorder of sexual development identified as something other than heterosexual, which should cause us to ask: what does it even mean for people with these genes to be heterosexual, homosexual, or something else? Is an XX male a homosexual if he is attracted to XX females or XY males?

Even though intersex conditions are somewhat rare in the general population, intersex people may make up a large percentage of people who are not heterosexual or who identify with a different gender. I say may because there is limited research on this, and what does exist shows a wide range of correlations between intersex conditions, homosexuality, and gender dysphoria.

Hopefully, this gave you something to think about and has challenged you, at the very least, to be a little more careful and respectful about how you speak about sexuality, gender, and biological sex. If I said anything in this article in an offensive way, I apologize. I tried to speak respectfully, but I also tried to use common language so people would get what I'm saying. If I failed at either one, please let me know.

References
I relied heavily on these websites for this information and would recommend you start there for further research.
Genetics Home Reference
RareDiseases.info.nih.gov
National Organization for Rare Disorders (NORD)
Mayo Clinic

Here are some other useful resources. If you can't access peer-reviewed journals through Google Scholar, try using Sci-Hub.

APA.org article that gives a brief overview of some of these conditions
-Ahmed, S. F., Morrison, S., & Hughes, I. A. (2004). Intersex and gender assignment; the third way? Archives of Disease in Childhood, 89(9), 847-850.
-Blackless, M., Charuvastra, A., Derryck, A., Fausto‐Sterling, A., Lauzanne, K., & Lee, E. (2000). How sexually dimorphic are we? Review and synthesis. American Journal of Human Biology, 12(2), 151-166.
-Furtado, P. S., Moraes, F., Lago, R., Barros, L. O., Toralles, M. B., & Barroso Jr, U. (2012). Gender dysphoria associated with disorders of sex development. Nature Reviews Urology, 9(11), 620.
-Jones, T., Hart, B., Carpenter, M., Ansara, G., Leonard, W., & Lucke, J. (2016). Intersex: Stories and Statistics from Australia. Open Book Publishers.
-Meyer-Bahlburg, H. F. (1994). Intersexuality and the diagnosis of gender identity disorder. Archives of Sexual Behavior, 23(1), 21-40.
-Sax, L. (2002). How common is intersex? A response to Anne Fausto‐Sterling. Journal of Sex Research, 39(3), 174-178.