Explanations of Basic Fallacies

Prepared by T. Gracyk   


Ad Hominem (Personal Attack) 


Anecdotal Evidence 

Affirming the Consequent 

Appeal to Authority 

Appeal to Common Belief 

Appeal to Common Practice 
(Bandwagon Fallacy) 

Appeal to Fear (Scare Tactics) 

Appeal to Good Intentions 

Appeal to Ignorance 
(Shifting the burden of proof)

Appeal to Loyalty 

Appeal to Pity 

Appeal to Tradition 

Attacking the Person (ad hominem)

Begging the Question 

Biased sample 

Black-and-White Fallacy 

Causal fallacies 

Changing the Subject (red herring) 

Circularity (Arguing in a circle) 

Coincidence (fallacy of coincidence) 

Common Practice 

Complex Question 

Confirming Evidence 

Denying the Antecedent 

Egocentrism 

Equivocation 

Ethnocentrism 

False analogy 

False dilemma 

Hasty Generalization 


Loaded Question 

Middle Ground Fallacy 

Misuse of Authority 

Misuse of Hypothesis 

No True Scotsman Fallacy 

Overlooking a Common Cause 

Phony Refutation 

Post Hoc 

Poverty of Aspect 

Prejudicial Language 

(includes nationalism & flag-waving) 

Red Herring (Changing the Subject)

Reversing cause and effect 


Slippery Slope 

Straw Man 

Wishful Thinking 



Ad Hominem (Personal Attack or Attacking the Person)

The fallacy of responding to an opponent's argument by changing the subject to the person who gave the argument, introducing the false assumption that a person of this sort cannot offer an argument worth considering. One can deflect attention from the arguer's position by shifting to discussion of the arguer's personality, character, associates, motives, intentions, qualifications, and so on.

Example: "Ignore what Professor Schiff says about the origins of the Old Testament. I happen to know that Schiff is an atheist."

When we attack the person, we ignore the content of the argument. But an argument is a relationship between premises and conclusion, and the argument's soundness (or lack of soundness) is completely independent of who is giving the argument. "There is nothing to what Smith says about the Nazi death camps; just remember that his family included prominent Nazis" may sound convincing, but this refusal to look at Smith's evidence is a refusal to evaluate the soundness of the argument. (After all, Smith may be ashamed of his relatives and might be confirming the existence of the death camps against anti-Semitic opponents who argue that the camps are a hoax.) 

Given this definition, the fallacy is restricted to cases where there is an opponent and where one is responding to that opponent. 

Recognition of the fallacy of attacking the person DOES NOT DENY OR CONFLICT WITH the legitimate need to evaluate the source of information that is put into an argument. Consider these two cases: 

  • "In thinking about Green's conclusions about human impacts on the Amazon basin, don't accept Green's claim that the region has only two million people. Green used an encyclopedia from 1955 that he bought in a junk store." 
  • "Don't accept Johnson's predictions about productivity in her report. I work at the desk next to her and I heard her saying she couldn't get the research done on time, so she just made up new numbers after looking at last year's report." 

These responses to Green and Johnson are perfectly appropriate, since they are calling arguments into question by questioning sources of information in the arguments. They are pointing to flaws in the argument by discussing sources of specific premises, not dismissing the whole argument by appeal to facts about Green and Johnson.

Similarly, discussing someone's trustworthiness or expertise is always relevant if we are evaluating testimony. 

A special case of Ad Hominem is phony refutation, in which one dismisses an argument or position by citing inconsistency between the speaker's words and actions. Inconsistency is a fallacy when the inconsistency is between two parts of an argument, but inconsistency between words and actions may be due to understandable moral weakness or to a legitimate change in one's position. 

Example: "How can you advise me to wait until I'm older? I happen to know that you were pretty wild at my age." 

Example: "How can you tell me not to smoke? You smoke two packs a day."

Example: "How can you support a crackdown on underage drinking? I remember a time when you thought that such policies were a waste of money." 

Yes, and the first speaker may regret that "wild" youth and may be offering you hard-earned advice that is worth your consideration. The smoker who advises you against smoking may have tried to quit many times, and is warning you of what you face if you make the mistake of starting. The third speaker may now understand, through personal experience, that underage drinking is a serious social problem, and has revised their thinking about the topic. 

The Latin name for phony refutation is Ad Hominem Tu Quoque. Informally, it's known as the "You Too Fallacy."



Ambiguity

As with many of the fallacies, ambiguity is only a fallacy if we first establish that it takes place within a context of reasoning! People make ambiguous statements all the time, but it doesn't have the status of a fallacy unless they are engaged in reasoning (in leading someone to a conclusion).

The fallacy requires the following:

  • A word or phrase is used that has two or more distinct meanings.
  • We can paraphrase both, that is, we can say how the two uses differ.
  • We cannot tell which of the meanings is meant by the arguer.

When we cannot "pin down" the meaning of a premise due to its ambiguous wording, we must suspend judgment on its truth, making the argument unsound.

Example: "Why are you opposed to my decision? You didn't object that time I did almost the same thing."

Since we can't tell what was done or when (the speaker could be directing us to any number of things), we don't know what the decision is being compared to, and the attempt to argue by analogy is undercut by the fallacy of ambiguity.

When the distinct meanings are generated by a punctuation error or error in grammatical construction, the fallacy is technically known as the fallacy of amphiboly.

Warning: There is no standard agreement on where to draw the line between ambiguity and equivocation. Many logicians treat the fallacy of ambiguity more like the way I have treated the fallacy of equivocation.


Anecdotal Evidence 
When generalizing, the fallacy of drawing on limited personal experience or a vivid example as the basis of the generalization. The arguer places too much trust in the example or personal experience even though it is not really representative. Because the sample size will be inadequate and because the sample itself will be chosen to confirm the intended conclusion, anecdotal evidence makes generalizing unsound.

Example: "I don't care what the experts say. Someone who lives in my hometown burned to death in a car accident because he couldn't get out of his seat belt. Anyone who wears a seat belt is just asking for trouble."

Real Example (NY Times) -- A response to an essay on the value of college majors that promote critical thinking:

In 2006, I graduated from a respected 4 year university with a degree in Political Science, emphasis Middle Eastern Affairs. After failing to find a job in my field, and many years working at a bank, I'm back in school studying engineering. So much for critical thinking, no money in it.

NY Times, online comment by gitrjoda, March 5, 2011



Affirming the Consequent 
The invalid argument pattern in which the second premise affirms the consequent of a conditional. This pattern is fallacious.

Argument form:

1. If A then B

2. B     

   A


1. If Sam is tired then he'll fail the exam.

2. Sam failed the exam.   

   Sam was tired.

It's invalid because the premises might be true while the conclusion is false: the consequent might be true for some reason other than the antecedent. (In this example, maybe Sam was well rested but didn't study, and failed anyway.)
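The invalidity of this pattern can be checked mechanically by enumerating all truth values of A and B and looking for a counterexample, that is, an assignment that makes both premises true while the conclusion is false. A minimal sketch in Python:

```python
from itertools import product

def implies(p, q):
    """Truth-functional 'if p then q'."""
    return (not p) or q

# Search for a counterexample to "If A then B; B; therefore A":
# an assignment making both premises true and the conclusion false.
counterexamples = [
    (a, b)
    for a, b in product([True, False], repeat=2)
    if implies(a, b) and b and not a
]

print(counterexamples)  # [(False, True)]: A false, B true
```

The single counterexample is exactly the Sam case: Sam is not tired (A false), yet he fails the exam anyway (B true).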


Appeal to Common Belief 

Any argument that defends a belief by pointing out how many other people have the same belief. But consensus does not make something true. Just remember that even today, huge numbers of people remain ignorant of basic science and think that the earth is the center of the universe. The fact that most Americans believe in angels doesn't make angels real; the fact that most Americans believe that JFK was a great president doesn't prove that he was.

There is only a subtle difference between this fallacy and appeal to common practice.


Appeal to Common Practice (or Bandwagon Fallacy, or fallacy of mindless conformity)

Any argument that defends or recommends a behavior by pointing out how many other people have the same behavior.

Remember when you told your mother, "But I have to have it! All the other kids have one!"? She probably answered, "If all the other kids jumped off a bridge, would you do that, too?" 

Just because large numbers of people do something is not particularly good evidence that you should, too.

Real Example (Reader's Digest, October 2002): an advertisement whose sole selling point is that the product is the "#1 DVD."

There is only a subtle difference between this fallacy and appeal to common belief. Common practice is an appeal to people's behavior, while common belief is an appeal to their opinions.

The following advertisement is not a fallacy of appeal to common practice:

   Japanese Steak House & Sushi Bar

"The Best Tasting Show In Town"

2001 Diner's Choice Award
Best Japanese
              MSP Magazine

Sushi Bar Buffet


This advertisement does not tell us to patronize the place because it is popular. Instead, it tells us that we should go there because an authority (MSP Magazine) says it's the area's best Japanese restaurant.


Appeal to Fear, Appeal to Force (Scare Tactics)

An argument in which a strong emotion, in this case fear of what will happen if we disagree, is offered by the arguer as a reason to agree. Basically, the arguer says "Agree with me or else something bad will happen to you." Example: "How can you say that Bill Clinton was a sleaze? Remind me of that next time you want to borrow my car!" (There is no relationship between the two things, and the threat of no car is being used to force agreement.)

Keep in mind that fear can be a good reason to do something. I floss my teeth because I'm afraid that not doing so will lead to gum disease, and there is a real and pre-existing relationship between these things. But fear cannot itself be a reason in the necessary sense of evidence: fear cannot be a reason to agree with someone. In the fallacy, the arguer creates a threat to force agreement, where the arguer has no legitimate right to do so. 

Real Example: Dr. Ned Buyukmihci raised questions about the way animals were being used in research at the University of California, Davis. He received a written statement from C. A. Hjerpe, director of the facility (the Veterinary Medical Teaching Hospital), in which Hjerpe wrote, "one of the best arguments against your position is that 22 members of the veterinary staff are hunters. Do you want to antagonize all these people, and for what useful purpose? Wake up and smell the coffee." Here, Hjerpe used a threat to try to get Buyukmihci to agree to prevailing views.

Real Example (from the Star Tribune, December 3, 2003): 
In a surprising move that would reverse a nearly century-old ban on capital punishment in Minnesota, Gov. Tim Pawlenty vowed Tuesday to push for restoring the death penalty after learning that the suspect in the Dru Sjodin abduction is a repeat sex offender. "As a Minnesotan, as the governor, as the parent of two young daughters . . . I have had it with sexual predators and individuals who are repeat offenders," Pawlenty said at the State Capitol. "I'm fed up." In certain cases of murder or attempted murder, he said, "I support the death penalty." 

ANALYSIS: Notice that Pawlenty is calling for the death penalty to protect his two daughters, based on the mere arrest of a SUSPECT who is not yet charged with a crime. There is no reason to believe that capital punishment will protect his daughters, or anyone else, given the fact that sexual predation is not punishable by death anywhere in the United States. Pawlenty is trying to gain political support for himself by scaring people.

The Latin name for this fallacy is Argumentum Ad Baculum. 


Appeal to Good Intentions 

An argument that foolishly assumes that good intentions excuse all behavior. It often takes the form, "but he/she meant well." But as the saying goes, "The road to hell is paved with good intentions." Example: "I just can't believe that our government knew about Nazi death camps and did nothing about it. FDR must have had some good reason for it."

A variation of it is the foolish assumption that a good person can neither do nor approve of anything bad: "Brutus is one of the nicest people I've ever met, so it just can't be true that he beats his children!"

Real example: Cameron Helder, father of domestic terrorist Lucas Helder, made this statement about his son: "I really want you to know that Luke is not a dangerous person," Cameron Helder said, choking back tears. "I think he's just trying to make a statement about the way our government is run. I think Luke wants people to listen to his ideas, and not enough people are hearing him, and he thinks this may help."
In other words, his bombs injured six people, but he meant well. 


Appeal to Ignorance (Shifting the Burden of Proof)

Being ignorant is not a fallacy. We're all ignorant about a great many things.

The appeal to ignorance or argument from ignorance is a special case of false dilemma. The arguer assumes that every claim is either known as true or known as false, overlooking the agnostic position that we sometimes don't know if a specific claim is true or false!

The arguer does not spell out the assumption, but it is implicit in the argument, which proceeds from a premise that acknowledges ignorance about the issue: "I don't know that my position is false." (Usually in something like this wording: "You haven't proven me wrong" or "There's no evidence that I'm wrong" or "You might be wrong" or even "Nobody knows about this.") From this recognition of ignorance, the arguer draws the conclusion that their position on the issue is correct!

Example: Nobody was there when the dinosaurs roamed the earth. Scientists don't really know if there were humans around. So we should accept the view that humans and dinosaurs once co-existed.

Analysis: By this reasoning, scientists "don't really know" that viruses cause colds. Does that prove that black magic causes them? This overlooks the middle ground that one position may be a better hypothesis than another.

Viewed from a different angle, this fallacy misplaces the burden of proof. When an issue is being debated, someone who wants to defend the truth of a claim must have some evidence for it. One can't "prove" one's side by simply knocking down the other side. (Remember: you might both be wrong if there is a third alternative.) Don't confuse arguing for a claim with the legal standard of innocent until proven guilty, where the burden of proof is on the prosecution and where both ignorance and knocking down the prosecution argument are enough to render the person not guilty.

Ignorance is sometimes combined with wishful thinking, as in the popular saying, "What you don't know can't hurt you." Unfortunately, it can.

Real Example: Ignore Bleak Statistics. Remember, statistics about people in general say nothing about your odds as a unique individual. And the statistics the doctor cites could simply be wrong. So if you're ill, avoid the possibility of depressing yourself and your immune system by listening to the negatives. (New Woman Magazine, March 1990)

Analysis: The author of the article wants us to feel better about ourselves, and asks us to treat relevant data as irrelevant so that we can proceed in ignorance, believing whatever we want to believe about our illness. Unfortunately, what you don't know can hurt you.


Appeal to Pity 

An argument in which a strong emotion, in this case pity, is offered by the arguer as a reason to agree. Most often, the arguer warns us that disagreement will bring harm to the arguer. ("How can you say that I did badly on this exam, Dr. Watson? You know that if I fail this course I'll have to drop out of school!") Occasionally we are asked to agree so that harm will not befall some third party, as in a bogus charity appeal that tells us that small children will starve to death if we don't contribute (when the money is really going to line the pockets of the arguer).

Pity can be a good reason to DO something (such as giving to legitimate charity). This only works if we already see that there is a real and pre-existing relationship between our actions and harm to others. ("You shouldn't burn those leaves. Don't you know that by burning them you might give your next door neighbor an asthma attack?") But pity cannot be a reason in the sense of evidence, as a reason to agree with someone.

So we have a fallacy of appeal to pity in either of two situations: 
(a) we are expected to believe something based on pity, or 
(b) we are asked to do something where it is questionable that a real and pre-existing relationship ties our pity to the choice of what to do.

Real example: In December of 2001, Richard Reid was arrested for trying to detonate explosives on an airliner over the Atlantic Ocean. His father responded with this statement to the press:

"Please don't hate my son. Look at the terrible childhood he had and the broken home he came from. Every time he needed me I was nowhere to be found. I was locked up," said Robin Reid, who spent a total of 18 years in jail for such offenses as burglary and car theft. "With that kind of childhood, what sort of defense could he put up against lunatic religious fanatics leading him astray?"

But our pity for Richard Reid is not a good reason to excuse his behavior, which is what Robin Reid is asking us to do.

The Latin name for this fallacy is Ad Misericordiam.


Appeal to Tradition 

Any argument that defends a behavior or choice by pointing out that the behavior or choice is a longstanding practice. Unfortunately, many foolish and destructive behaviors are also very traditional, such as slavery, forced prostitution, and punishing children by hitting them with belts. 


This fallacy is very common in advertising, as in this simple example (Reader's Digest, March, 1999, p. 15): an advertisement boasting that the brand has been made the same way for over one hundred years.

The fact that this brand has been made the same way for over one hundred years is no reason to buy it. There are many brands that have been around for less time that I like much better than this one!

The Latin name for this fallacy is ad antiquitatem.


Begging the Question 

The classic definition is that the arguer uses premises that are no more plausible than the conclusion. To see that this is a problem, remember that the conclusion is being supported, and support should be more secure than whatever it supports. So if the premises are just as questionable as the conclusion, the argument is unsound.

The argument usually begs the question (literally: avoids the issue through a questionable assumption) by using premises that seem like good reasons to the arguer, but only because the arguer is sure that the conclusion is correct. The arguer therefore offers reasons that will be just as problematic to others as would be the conclusion itself. This problem is sometimes known as "preaching to the choir."

Example: "Of course John is a jerk. All men are jerks!"

Analysis: Why would anyone who questions whether John (a particular man) is a jerk be persuaded by the premise that all men are?

A special case is begging the question by synonymy, in which the arguer simply uses the conclusion as a premise, but disguises this operation by rewording it (by saying something synonymous).

Example: "Of course this drug will put you to sleep. It contains a soporific agent." (The arguer has said that it will put you to sleep because it contains something that will put you to sleep.)

Example: "Why is Zconian brand toothpaste better than the competition? Because our special team of research scientists ensures that our product is the best on the market."

Another variation is loaded question, where the arguer biases the debate by "loading" a questionable assumption into a question.

A special case is the fallacy of misuse of hypothesis or persuasive definition. In this case, the arguer gives words a definition favorable to the argument, but has no good reason for defining things in this way except to advance the argument. This is most often done by attaching a special emotional significance to the words. But it can also be a case of giving a term an unjustifiably non-standard interpretation. 

Example: "How can you say that Tom is a jerk? A jerk is a guy who gets you pregnant and then skips out on you. All Tom did was to cheat on me a few times."

Example (from the movie My Big Fat Greek Wedding): "You're a vegetarian? You don't eat meat? We'll make lamb."

Example, prominently displayed by anti-war protestors during the U.S. invasion of Iraq in 2003: "War is terrorism."

(Compare this to the "No true Scotsman fallacy," in which much the same thing is done by putting a qualifying slanter in front of a word.)



Biased Sample 

The problem of generating an unrepresentative sample by using a method of sampling that misrepresents an important subgroup of the population. Obviously, this is not an issue in a highly homogenous population. Looking at penguins in one location of Antarctica is fine if you just want to know the average height of penguins; where they live in Antarctica probably has no effect on penguin height. But choosing just one place to sample all Americans is going to be biased, because Americans include many, many different subgroups. So going to a shopping mall in Fargo, N.D., is not going to get you a representative sample when trying to determine how many Americans are of Norwegian descent (such a sample will be biased by over-representing that group).

One way to create bias is through loaded questions, which contain assumptions that influence the response.

Real example: A questionnaire sent to voters by a California congressman asks, "Do you support gun control laws that restrict the rights of law abiding citizens?" and "Should a child under 18 years of age be able to obtain an abortion without parental consent?"

If the questions are not themselves biased, most bias can be eliminated by stratifying the sample. Stratification is the process of sorting the sample into groups ("strata") that have been identified in advance as highly relevant to the issue being studied. For instance, when the topic is abortion, both a person's sex and religious affiliation will influence his or her position. If we cannot get a highly random sample, then we should stratify our sample for at least these two factors. So in addition to asking them whatever we want to know, we need to determine these additional facts. If we sample more than we really need, we can then sort the sample into the relevant subgroups, see if any are over-represented, and then randomly eliminate samples from the over-represented groups until we reach a sample in which the major sub-groups are represented according to their numbers in the general population. Notice that stratification only works when we already know which subgroups are relevant to the issue and we know their numbers in the general population. 
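The downsampling procedure just described can be sketched in a few lines of Python. Everything below is a hypothetical illustration: the over-sampled responses, the single stratum used (sex), and the population shares are all assumed figures, not real survey data.

```python
import random

random.seed(0)  # reproducible illustration

# Hypothetical over-sampled survey: women happen to be over-represented.
sample = ([{"sex": "F"} for _ in range(70)] +
          [{"sex": "M"} for _ in range(30)])

# Assumed shares of each stratum in the general population.
population_shares = {"F": 0.51, "M": 0.49}

def stratify(sample, shares, target_size):
    """Randomly discard respondents from over-represented strata so that
    each stratum's quota matches its share of the target sample size."""
    balanced = []
    for stratum, share in shares.items():
        members = [r for r in sample if r["sex"] == stratum]
        quota = round(share * target_size)
        balanced.extend(random.sample(members, min(quota, len(members))))
    return balanced

balanced = stratify(sample, population_shares, target_size=60)
# 31 women and 29 men, matching their shares of the population.
```

As the passage notes, this only works when we already know which subgroups matter and what their shares of the general population are.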

Confirming evidence is a special case of biased sample. It is the special case of using a method that gets the arguer or researcher a result favorable to his or her desired conclusion. The method can create this bias either intentionally or accidentally. Either way, the fallacy consists in choosing a method that generates a large sample of one very specific, highly unrepresentative group. There are two ways to generate confirming evidence. One is to use a method that initially selects a sample from one particular group. For example, if I want to know how many Americans think that it's time for a woman to become the U.S. President, then I would generate confirming evidence if I only called women. It would be even worse if I only called women whose telephone numbers were provided by N.O.W. (the National Organization for Women). Similarly, if I'm interested in supporting the idea that video games harm children, I might generate my sample of children by choosing children at a juvenile corrections facility.

The other method to generate confirming evidence is to ask a question that automatically favors one answer over other possible answers. If I want to "discover" strong support for private schools, I might ask people, "Do you favor continuing massive subsidies for our failing public schools or do you support directing some of that money to school vouchers that give parents a choice?" This method involves the fallacy of the leading question.

If one generates a sample with confirming evidence, stratification is not going to remove the biases from the sample.

Arguments that generalize are unsound when they have a biased sample or involve confirming evidence.


Causal Fallacies 

Four fallacies pose problems for arguments that try to demonstrate a cause in a population. They are:

The fallacy of reversing cause and effect 

The fallacy of coincidental correlation 

The fallacy of overlooking a common cause 

The fallacy of Post Hoc 

Circularity (Arguing in a circle)

In a situation that calls for evidence, circularity occurs when the person who needs to supply the evidence avoids doing so and instead offers, as evidence, the very thing that needs support by evidence. To put it another way: the very claim that is being debated is introduced into the argument as a piece of evidence for itself. Its Latin name: circulus in probando.

(Technically, arguing in a circle is deductively valid. Any claim can be derived from itself. However, such arguments lack soundness, for if the truth of the conclusion is in doubt then the truth of the support will also be in doubt.)

For example, consider this argument: "Clean-O Toothpaste is the best on the market, because it outperforms all competitors." Since something is only "best" when it outperforms its competitors, the evidence is merely a paraphrase of the conclusion that it supports. This is circularity by synonymy.

Circularity is sometimes equated with begging the question. However, I think that begging the question is better understood as a broader category of fallaciously offering evidence that is no better than the conclusion it defends. Circularity is a special case.

Circularity is often disguised by introducing the repeated information, but doing so several steps removed from the conclusion. For example: "God's existence can be proven by the miracles that He provides. When a skydiver's parachute fails and yet a tree breaks the fall, allowing the skydiver to survive, that survival is miraculous. Since no one but God can do anything miraculous, such cases prove God's existence." The circle is revealed when one asks why God is needed to explain the skydiver's survival if a tree already explains it.

Fallacy of coincidence  

Coincidence is a fallacy of sampling, but it is mainly a concern when we use sampling to establish a correlation in order to establish a cause. Basically, it is the fallacy of putting too much trust in a sample that, due to no other fallacy, fails to represent the general picture. How does this happen? 

Statistical information has both a margin of error and a confidence level. The two are related. Failure to consult the margin of error can result in hasty generalization.

The fallacy of coincidence is the failure to understand the importance of the confidence level. No matter how carefully we sample a population, about 5% of the time we'll get a result that does not reflect what's really happening in the population. Sound sampling gives us a confidence level of 95% (we have a 95% likelihood that the truth falls within our margin of error). 

Example: We do a random sample of 400 people in North Dakota and ask, "Were you born in North Dakota?" 64% say yes. The sample has a margin of error of 5%. So we can soundly conclude that there is a 95% likelihood that a majority of North Dakotans were born in North Dakota.

Because we cannot know whether the truth falls outside our margin of error, it is foolish to trust studies and experiments that are not replicated by a second study or experiment. (If a separate study gets the same result and neither has any other fallacies, then there is almost no chance that both got the same erroneous result by coincidence!)

We can improve the margin of error (say, from 5% to 4%) by lowering our confidence level.

Example: The same random sample of 400 people in North Dakota can be reported as having a 4% margin of error if we reduce the confidence level to 90%.
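The figures in these two examples can be reproduced with the standard worst-case formula for a proportion, z · √(0.25/n), using the conventional z-scores of 1.96 (95% confidence) and 1.645 (90% confidence). A quick check in Python:

```python
import math

def margin_of_error(n, z):
    """Worst-case (p = 0.5) margin of error for a simple random sample of size n."""
    return z * math.sqrt(0.25 / n)

moe_95 = margin_of_error(400, 1.96)   # 95% confidence level
moe_90 = margin_of_error(400, 1.645)  # 90% confidence level

print(round(moe_95, 3), round(moe_90, 3))  # 0.049 0.041 -- about 5% and 4%
```

So for the same sample of 400 North Dakotans, lowering the confidence level from 95% to 90% shrinks the reported margin of error from roughly 5% to roughly 4%, exactly as the examples state.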

So what is the fallacy of coincidence? Basically, it is trusting statistical information that would, on further study, turn out to lack statistical significance. Either the result was one of those few cases predicted to be wrong by the confidence level, or the statistic is just a coincidence of the circumstances in which we conduct our study. 

Either way, we can only say that a statistic is subject to the fallacy of coincidence when we can point to further data that shows the sample happened to be one of the "bad" results predicted by the confidence level. 

Denying the antecedent 
The invalid argument pattern in which the second premise denies the antecedent of a conditional. This pattern is fallacious.

Argument form:

1. If A then B

2. Not A     

   Not B


1. If Sam is tired then he will fail the exam.

2. Sam is not tired.   

   Sam will not fail the exam.

It's invalid because the premises might be true while the conclusion remains false: something other than the antecedent might make the consequent true. (In this example, perhaps Sam is well rested but didn't study for the exam, and fails anyway.)
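A small truth-table checker makes the contrast vivid: denying the antecedent fails the test for validity, while the superficially similar (and valid) pattern modus tollens, which denies the consequent instead, passes it. A sketch in Python:

```python
from itertools import product

def implies(p, q):
    """Truth-functional 'if p then q'."""
    return (not p) or q

def valid(premises, conclusion):
    """A two-variable argument form is valid iff no assignment of truth
    values makes every premise true while the conclusion is false."""
    return all(
        conclusion(a, b)
        for a, b in product([True, False], repeat=2)
        if all(premise(a, b) for premise in premises)
    )

# Denying the antecedent: If A then B; not A; therefore not B.
denying_antecedent = valid([implies, lambda a, b: not a], lambda a, b: not b)

# Modus tollens: If A then B; not B; therefore not A.
modus_tollens = valid([implies, lambda a, b: not b], lambda a, b: not a)

print(denying_antecedent, modus_tollens)  # False True
```

The counterexample the checker finds for denying the antecedent is the Sam case: Sam is not tired, yet he fails the exam anyway.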


Egocentrism

An extreme form of provincialism in which someone identifies one's own needs and then settles a problem by simply determining how those needs can be met. Someone is egocentric when she or he oversimplifies a problem by refusing to consider the relevance of facts and information that conflict with one's personal interests. A display of egocentrism constitutes a fallacy when it blocks discussion of relevant information in the context of arguing.


Ethnocentrism

A common form of provincialism in which someone identifies oneself in terms of one's ethnicity and then settles a problem by simply determining how that group is benefited. Someone is ethnocentric when she or he oversimplifies a problem by refusing to consider the relevance of facts and information that conflict with the beliefs and/or values of one's ethnic group. A display of ethnocentrism constitutes a fallacy when it blocks discussion of relevant information in the context of arguing.


Equivocation

As with many of the fallacies, equivocation is only a fallacy if we first establish that it takes place within a context of reasoning! People equivocate all the time, but it doesn't have the status of a fallacy unless they are engaged in reasoning (in leading someone to a conclusion).

The fallacy requires the following:

  • A word or phrase is used that has two distinct meanings.
  • We can paraphrase both, that is, we can say how the two uses differ.
  • The word or phrase clearly means one thing the first time it is used, but means the other thing the next time that it is used. (This shift in meaning can be a shift from the meaning between the first premise and the second, or between a premise and the conclusion.)

The fallacy can be intentional or unintentional. In the first case, the person giving the argument misleads the audience by exploiting the equivocation. In the second case, the speaker does not try to mislead, but the audience draws an unsound conclusion by misinterpreting statements that can be taken two different ways.

Example: John has catholic interests: he likes sports, science, mystery novels, and silent films. Most Catholics go to mass on a regular basis, so John probably goes to mass on a regular basis.

Analysis: The first premise uses 'catholic' with a lower case 'c', meaning broad or universal. The second uses an upper case 'C', meaning the religion. Since the two premises are true according to different interpretations of the term, there has been a change of subject and the two premises do not connect to make a strong pattern. It is unsound.

Example: In this country, a suspect is innocent unless proven guilty. Since my trial has not yet reached the verdict stage, I'm innocent.

Analysis: Pretty silly! In reality, one is or isn't innocent quite apart from the jury's verdict. One is merely treated as innocent unless proven guilty (it's a statement about the burden of proof in law, not a statement of fact). One can't go from the legal principle to a claim of real innocence.

When the equivocation is created by a punctuation error or error in grammatical construction, the fallacy is technically known as the fallacy of amphiboly.

Real example: Jared Blair, manager of a Hooters Restaurant in Panama City Beach, told the waitresses at the restaurant that the company would reward the one who sold the most beer during April with a "new Toyota." At the end of April, Blair told employee Jodee Berry that she'd won. When Berry was led blindfolded to the parking lot to claim her prize, she got a new "toy-Yoda" Star Wars doll instead of the promised car, the suit contends. Berry has sued the restaurant.

Analysis: Berry alleges breach of contract. If her claims are true, it is clear that Blair is guilty of the fallacy of equivocation, for choosing wording that would mislead the waitresses. 

False or faulty analogy

When an argument by analogy overlooks significant differences, it is subject to this fallacy and is unsound. To accuse it of false or faulty analogy, one must note at least one significant difference between the things being compared, and must explain how the difference is relevant to the issue being debated.

Example: Some people argue that because alcohol is legal for adults, all other mood-altering substances should be legal for adults. But this is a false analogy, because many mood-altering substances are drugs that permanently alter the pleasure center of the brain, causing strong addictions almost immediately. Alcohol does not work in the same way, so it does not pose the addiction risk that legalizing many other substances would.

False Dilemma (Limited Options Fallacy)
The fallacy of arguing by offering someone a false or implausible set of choices. In other words, an excluding possibilities argument with a false disjunction. A false dilemma is always unsound.

We show that the fallacy has taken place by pointing out one or more plausible but overlooked options.

Example: "Either Pat should study harder or take easier classes. Pat won't study harder, so Pat should take easier classes." 

Analysis: This argument rests on a false dilemma because it ignores plausible alternatives: Pat might change majors, or perhaps leave school.

Example: My opponent wants to extend the hunting season by starting it a week earlier. But that isn't fair to the young animals, cheating them of time to develop and making them easy targets. Instead, we should extend the season by a week. 

Analysis: Fails to consider the option of "none of the above," for example, by just leaving the dates of the season alone.

In this fallacy, the argument is usually valid. So this is not a "formal fallacy" (the problem is not the form of the argument). 

One variation of this fallacy is known as the Black-and-White Fallacy. In this case, the arguer oversimplifies a complex situation by seeing the situation as "black and white," putting all cases into one of two extreme categories. The arguer ignores "shades of gray." (Shown the color yellow, the arguer classifies it as white because it's more like white than black!)

Example: He can't be a Christian. Look, he's got tattoos! 

Analysis: Implicit but false assumption: You conform to a certain "dress code" or you're not a Christian.

Example: You're either a patriot or you're one of those radicals criticizing the war! 

Analysis: Implicit but false assumption: there is no middle ground.

Another variation is the middle ground fallacy (a.k.a. the split-the-difference fallacy or the moderation fallacy). In this variation, the arguer begins the argument with a false dilemma by proposing two extreme options. Based on their extreme nature, the arguer then proposes that the reasonable option must be in the middle ground between the two, and proposes a specific third option as that middle ground. Yet this remains a false dilemma if the arguer has ignored additional options (such as rejecting all of the proposals).

Example: Smith thinks that all the faculty should use Mac computers, because that's what they use in her department. Jones thinks that all the faculty should use PCs, because they're more common in industry. But neither of those plans can work. PCs lack the capacity for graphics needed by the art department, and Macs don't run all the accounting software needed in the business school. So the university should adopt the compromise: half of future computers will be Macs, and half will be PCs.

The National Academy of Science recommends the teaching of evolution in all high schools. However, most Americans do not believe in evolution and do not want it taught. What we should do is balance the teaching of evolution with equal time given to teaching the Bible's book of Genesis.

Hasty Generalization

The fallacy of generalizing from a small sample and failing to take this into account in the conclusion of the argument. "Small" is a relative term here. A sample that is adequate in size when asking people whom they plan to vote for will be too small for medical research that looks for low levels of allergic response to a new medication. Basically, the fallacy occurs when we present a conclusion that is more precise than the sample warrants.

To be more precise, every sample size creates a margin of error for the resulting generalization. These margins specify the range within which we can expect to find the correct answer. A 10% margin of error means that the truth is somewhere within a range of 10% on either side of the reported number. A 3% margin of error (commonly found in professional polling) means that the truth is somewhere within a range of 3% on either side of the reported number.

For example, suppose an unbiased sample of 100 Americans gets 55 positive responses to the question "Do you like ice cream with apple pie?" There is no fallacy of hasty generalization if the conclusion is reported as "Approximately 55% of Americans like ice cream with apple pie." There is a fallacy if we report it as "A clear majority of Americans like ice cream with apple pie." This is misleading, because our sample is consistent with the result that as few as 45% agree.
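The arithmetic behind these margins can be sketched with the standard formula for a proportion's margin of error, z·√(p(1−p)/n). A minimal illustration (the function name is invented for this sketch, and the 95% z-score of 1.96 yields roughly the 10% margin the text cites for a sample of 100):

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """Approximate margin of error for a sample proportion.

    p_hat: observed proportion (e.g. 0.55 for 55% "yes" answers)
    n:     sample size
    z:     z-score for the confidence level (1.96 is roughly 95%)
    """
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# The ice-cream poll: 55 of 100 respondents say yes.
moe = margin_of_error(0.55, 100)
low, high = 0.55 - moe, 0.55 + moe
print(f"margin of error: {moe:.1%}")            # close to 10%
print(f"plausible range: {low:.1%} to {high:.1%}")  # roughly 45% to 65%
```

Since the low end of the range falls below 50%, the sample cannot support the "clear majority" wording; quadrupling the sample size would halve the margin of error.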

Inconsistency (Self-Contradiction)

The fallacy of giving evidence against the position being defended, either directly, by undercutting the conclusion, or indirectly, by undercutting evidence being presented for the conclusion. 

It is not a fallacy to state objections to one's argument if one then goes on to answer those objections. But to acknowledge serious problems and still insist, without answering them, that one is correct would be a case of inconsistency.

Real Example: A contest promotion that advertised "50 random entries automatically win this one-of-a-kind T-Shirt!"

If 50 people get the same shirt, how is it one-of-a-kind?

Real Example: On May 16, 2006, Kentucky Fried Chicken announced a new product with a press release. The announcement includes this paragraph: 
KFC's new Famous Bowls, a departure from the restaurant's popular family style bucket, provides lunch-starved Americans with the perfect all-in-one, "made for one" remedy to their usual rushed and unsatisfying lunchtime routine. The new KFC Famous Bowls™ offer a hearty meal "just like mom used to make," with layers of mashed potatoes, sweet corn and bite sized crispy chicken, drizzled with signature home style gravy and topped off with a three-cheese blend - in one convenient bowl.

How can it be famous if it's new?

Real Example: Cameron Helder, father of domestic terrorist Lucas Helder made this statement about his son: "I really want you to know that Luke is not a dangerous person." But Cameron Helder also publicly pleaded with his son to surrender to the FBI: "Luke, you need to talk to someone. Please don't hurt anyone else. It's time to talk. You have the attention you wanted." There is an obvious inconsistency in claiming that Luke is not dangerous while also asking him not to hurt anyone else (at this point, six people had been injured by bombs). (To read the article and get more details, click here.)

Cameron Helder's defense of his son appears to be a case of egocentrism.

Loaded Question 

The fallacy of biasing an exchange by asking a question that has an unjustified assumption built right into the question, influencing the answer given to it.

Real Example: A newspaper asks the following question of its readers:

Do you think the Fargo City Commission should meddle in private property matters?

  • Yes

  • No

  • I'm not sure

The Forum - 05/01/2002 Daily Poll



Another version of this problem is known as complex question, where two unrelated topics are combined in a single question, so that answering one part will seem to answer the other, as well. 

Example: "Do you support our constitutional freedoms, such as the right of individuals to possess handguns?"

A loaded question often works by introducing a false dilemma: 

Example: "Are you going to support me on this in the meeting today, or are you siding with those spineless worms on the other side?" 


Misuse of Authority / False Authority 

This fallacy is often called Appeal to Authority, but I prefer a title that doesn't suggest that there's a problem with appealing to authorities! It is often appropriate to defend the truth of a claim, or a recommendation of behavior, by citing the testimony or advice of some authority. The fallacy consists in citing someone who is not really an authority on the subject at issue.

The Latin name for this fallacy is ad verecundiam.

The fallacy takes many forms:

  1. The person cited is not an authority in any way. (Example: An advertisement shows a famous movie star endorsing an automobile.) 
  2. The person cited is an authority, but not about the subject at issue. 
    (Example: An advertisement shows a famous scientist endorsing a computer company, as when Apple uses Albert Einstein's photo in an ad for its computers. But he died before there was such a product! A real authority on this issue would be a product testing lab, such as Consumer Reports.) 
    (Another example: "There must be life on other planets, because my history professor says so.")
  3. The subject matter does not call for authority, but one is cited. (Example: An advertisement shows a famous actor endorsing the taste of fried chicken. We can all judge for ourselves which fried chicken is tasty.)
  4. The person cited has expertise, but there is no consensus on the particular question that is at issue. (Example: We cite Dr. X's views on the origins of the HIV virus, but since there is no consensus about this topic, Dr. X's authority settles nothing.)
  5. The source is out of date but we pretend it is still reliable. (Example: Citing the encyclopedia figure of one million people as the population of Minnesota, without noting that this edition of the encyclopedia is 40 years old.)
  6. There is not enough detail about the supposed authority to be able to specifically identify the source of the information. This is an appeal to anonymous authority. (Example: Weight loss advertisements that say that "A double-blind study showed the product to be 98% effective," without telling us who did the study, etc.)


Some authors (such as V. R. Ruggiero) distinguish between misuse of authority (the person cited doesn't count as an authority) and irrational appeal to authority (versions 4, 5, and 6 in the list just above).

Compare these two paragraphs: 

  • I saw that my English professor has a yard sign for Nicholson for city council, so that's who I'm going to vote for. 

  • No, I'm not going to go see the theater production of "Twelfth Night." My English professor says that Shakespeare's comedies are worthless.

The first is misuse of authority (the professor is no authority on local politics), the second is irrational appeal to authority (the authority that comes from being an English professor has to be considered in light of the professor's clear rejection of standard expert opinion on this topic).

Variations of this fallacy appeal to other kinds of bogus authority.


No True Scotsman Fallacy 

This fallacy is sometimes treated as a type of ambiguity or begging the question. However, I regard it as primarily a slanter fallacy: the arguer adds the qualifying term "true" or "real" in order to secure their questionable assumption.

The fallacy involves making a claim and then, in the face of counterexamples, protecting the claim by adding the qualifying term "true" or "real." Antony Flew gives this example:

Imagine Hamish McDonald, a Scotsman, sitting down with his Press and Journal and seeing an article about how the 'Brighton Sex Maniac Strikes Again'. Hamish is shocked and declares that "No Scotsman would do such a thing". The next day he sits down to read his Press and Journal again and this time finds an article about an Aberdeen man whose brutal actions make the Brighton sex maniac seem almost gentlemanly. This fact shows that Hamish was wrong in his opinion but is he going to admit this? Not likely. This time he says, "No true Scotsman would do such a thing".

We see this fallacy in operation whenever someone opines that "real Christians" behave a certain way, or argues that "true Americans" support the President, no matter what. 

Overlooking common cause 

This fallacy is restricted to arguments to establish a cause. It is the mistake of finding a correlation between two things, then drawing a conclusion without checking for other variables that are also correlated with those two. This problem does not occur in a controlled experiment, but it is a common problem in a study of existing behaviors and events.

Let's suppose we correlate two things, A and B. But perhaps A keeps turning up with B because some previous thing, X, is independently causing A and independently causing B. Here, X is the common cause of A and B. 

Failure to screen for such things is the fallacy of overlooking common cause.  

Example: I notice that when I get a sore throat, it will not be long before I get a runny nose. I conclude that sore throats are a cause of runny noses. 

But this overlooks the common cause: I get a sore throat and then a runny nose because I first get a viral infection (a cold). The virus attacks my throat, giving me a sore throat, and it attacks my nasal passages, which respond defensively with mucus. The two things (sore throat and runny nose) are each caused by the virus, not one by the other. 


Post Hoc 

The full title is Post Hoc, Ergo Propter Hoc. This Latin phrase means "After it, therefore because of it." This fallacy is restricted to arguments to establish a cause. This problem does not occur in a controlled experiment, but it is a common problem in a study of existing behaviors and events.

It is the fallacy of thinking that if one thing happens and then another thing happens, the first thing was the cause of the second. But time order alone cannot show that something is a cause. At the very least, we also need a control group! The Post Hoc Fallacy occurs when someone does not understand the need for a control group and draws a conclusion based on nothing but the time relationship. 

Many superstitions are based on post hoc reasoning.

Example: I wore my blue sweater when I took that biology test, and I got a 95%! From now on, I'll always be sure to wear my lucky blue sweater when I take an exam.

Post Hoc is merely one of four common fallacies associated with causal reasoning.


Poverty of Aspect 

Any fallacy of oversimplifying a complex issue due to a limited perspective on it. Some common sub-categories are egocentrism, ethnocentrism, and patriotism/nationalism. In most cases, it arises from over-generalizing, from looking at a problem only from a narrow professional perspective, or from assuming that one can identify "normal" patterns of behavior within human cultural patterns. See also provincialism (group identification that assumes group superiority).


Prejudicial Language 

A huge category of fallacies, covering all strategies for substituting argument wording for legitimate reasons for a position. Basically, we have a fallacy of prejudicial language whenever an arguer's choice of words is used to hide the arguer's introduction of a false or questionable assumption.

Major types include provincialism and slanters, discussed in the entries that follow.

Provincialism

Any argument that attempts to settle a complex issue by appeal to group membership and group loyalty. Such arguments normally postulate an "us" versus "them" distinction and improperly assume that "our side" is automatically better than (or more trustworthy than, or more important than) theirs. Such reasoning is unsound because we often find ourselves to be members of groups who, as a group, turn out to be wrong. (Some social scientists call this inability to look beyond our group "poverty of aspect.")

A common variation is the appeal to loyalty. In this variant, the arguer says that we must act as a group and therefore cannot permit disagreement, or that we should not listen to informed, internal criticism. This form of provincialism is common in arguments against minority or dissenting views. 

Example: This country was founded on Christian values! Christianity is as American as apple pie! So how can you think that the huge display of the 10 commandments at the county court house is a violation of the separation of church and state?

A common variation is the "it-couldn't-happen-here" assumption.

Real example: A lesbian couple in Missoula, Montana, awoke one night to find that their house was on fire. The fire, which authorities quickly announced was set deliberately and was investigated as an attempted murder, shocked Missoula. The fire appears to have been set in retaliation for their filing a discrimination lawsuit against the University of Montana. "My thought was it had to come from outside our community," says Christopher Peterson, then a senior at Montana and the openly gay president of the student senate. "It just wouldn't happen here." (The Chronicle of Higher Education, 6 Dec. 2002)


A common variation is nationalism: expecting others to agree on the basis of national identity. For example, advertisements aimed at the American market sometimes feature an American flag or red-white-and-blue imagery for no apparent reason. This is also known as flag-waving.


Red Herring (Changing the Subject / Lack of Relevance)

The fallacy of introducing, as reasons for one's position, a topic that is not of genuine relevance to the issue originally being debated. In effect, the arguer starts on one topic, changes the subject, and then proceeds as if there has been no change in subject.

Supposedly, the fallacy is known as "red herring" on an analogy with escaped convicts who might smear herring (a smelly fish) on themselves to throw bloodhounds off their trail.

Real Example: "No, no, he's a friend of mine. He's not a moron at all -- he's a friend. I had a good time with him today." Canadian Prime Minister Jean Chretien, responding to reports that his chief spokesman had called President George W. Bush a moron. (Newsweek, 2 Dec. 2002)

Analysis: Perhaps it's true that Bush is his friend. Perhaps it's true that they had a good time together. What has that information got to do with answering the charge that Bush is a moron? Chretien is changing the subject instead of talking about Bush's intelligence.

Since a great many fallacies involve giving reasons that lack relevance to the issue under debate, we reserve the accusation of "red herring" to those cases where the argument fallacy is a simple change of subject, apart from one of the specialized tactics (e.g., Appeal to Pity, Appeal to Force, Straw Man, etc.)


Reversing cause and effect 

This fallacy is restricted to arguments to establish a cause. This problem does not occur in a controlled experiment, but it is a common problem in a study of existing behaviors and events.

The fallacy occurs when we have a genuine correlation, but we have not clearly established which of the two things really comes first. We simply assume we know which comes first, but in reality it is the other way around. If it is plausible that we have turned the time order around, then the argument is unsound due to this fallacy.

Example: Suppose there is a strong correlation between drinking a lot of coffee and being a type A personality. (In the story of the ant and the grasshopper, ants are Type A. Grasshoppers are Type B. Type As work hard to meet goals, are self-critical, have a chronic sense of time urgency and are often impatient, often display  hostility, and usually display fast movements and rapid speech.) We conclude that drinking a lot of coffee is a cause of being a type A. But did we firmly establish that the people in the study were Type A before they started to drink coffee? Perhaps they had this personality type in childhood, long before they started to drink coffee. Perhaps their sense of time urgency and need to meet goals makes them more interested in using stimulants, which attracts them to coffee. So using the correlation to argue that coffee is a cause of personality would be the fallacy of reversing cause and effect.


Slanters

As with many of the fallacies, slanters are only a fallacy if we first establish that they occur within a context of reasoning! People use slanters all the time, but a slanter doesn't have the status of a fallacy unless the speaker is engaged in reasoning (in leading someone to a conclusion).

This fallacy is a type of prejudicial language in which the wording of one's argument is designed to influence the audience into accepting unstated but questionable assumptions. (To put it another way, the arguer "hides" assumptions through word choice.) Slanters are regarded as a fallacy because a good argument should make its assumptions clear to the audience.

Real Example: How can one attempt to rationalize strapping on explosives with the intent of murdering innocent men, women, and children? If the civilized world does not defeat the deadly scourge of suicide bombers, no city in the world will be safe from any group with a grievance. (Letter to Time Magazine, April 29, 2002) 

Analysis: Here, the phrase "murdering innocent men, women, and children" simply assumes that none of the targets have military status. The phrase "civilized world" imports the questionable assumption that we are civilized and they are not. The arguer may have a good point, but it is obscured by the presence of slanters.

One variation is the No True Scotsman Fallacy.


Slippery Slope 

There are two versions of this fallacy. They are both given this name because they share a common idea: that taking a first step will lead us to something we don't want. It is the unjustified assumption of this idea that is the fallacy. (When the assumption is justified, there's no fallacy, even if the argument otherwise looks like any other slippery slope argument.)

The assumption in question is that choosing one thing leads to, or is equivalent to, choosing a second thing. But the move from the first to the second is not immediate: one leads to the other (or is shown equivalent) by a series of small, plausible steps. The result is then noted to be undesirable, and therefore (by the valid move of modus tollens), we are advised to avoid the first.

Hence, the name "slippery slope," which conjures up a picture of sloped ground that is slippery. If you take even one step onto the slope, you will find yourself down on the bottom, where you may not want to be!


The first type (where the first step leads to the second by causing it) is the causal slope. The second type (that one leads by equivalences to the second) is the semantic slope. 

When there is no justification (no good reason to believe) that the first step must cause the second, or that the first is really equivalent to the second, the argument contains a fallacy. 

Example of causal type:

You shouldn't see any "R" rated action movies. If you do, you will expose yourself to situations that glamorize violence, and exposure will lead to your accepting it, which will lead to your own willingness to act out your violent impulses. Before you know it, you'll find yourself in court for some senseless violence that you've committed.

Example of semantic type:

You should always be generous when tipping in a restaurant. If you skimp at all, you may fail to adequately compensate your server, and to fail to compensate someone for their time is the same as stealing their time, and stealing someone's time is no different than forced servitude, which is just another name for slavery. If you're not generous when tipping, it's morally equivalent to participating in the slave trade.

Why are these fallacies? Because in each case we can point to a "break" in the chain, a place where one step really does not take us to the next. In the first example, acceptance of fake violence does not necessitate acceptance of real violence, at least for those of us who distinguish fantasy from reality. In the second example, even if we "steal time," that's not equivalent to forced servitude. The server's job is to serve even if the customer does not tip, and the server knows that when taking the job. The customer doesn't force the server to be a server.

Slippery slope is closely related to scare tactics, the difference being that the slope argument tries to hide the threat by following a series of steps before arriving at the "scary" result.


Straw Man 

This fallacy consists of misrepresenting an opponent's position in order to make your own position look more reasonable. To be blunt, it involves putting words into your opponent's mouth that your opponent would not recognize as theirs. By substituting a different position (usually much weaker than their real position), one fails to genuinely engage the opponent, and thus one hasn't really done anything to support one's own position. 

The name comes from the practice of stuffing dummies and scarecrows with straw. When one attacks an opponent by putting words into the opponent's mouth, one makes up a "dummy" position. But just as beating up a scarecrow doesn't demonstrate any athletic accomplishment, beating up a "straw man" in an argument doesn't demonstrate anything.

Given this definition, the fallacy of straw man is restricted to cases where there is an opponent and where one is responding to that opponent.

The difficulty with spotting the fallacy is that one must know enough about the issue to know when an opponent is being misrepresented. 

Example: If you vote for Smith, you're voting for a candidate who supports continuing government waste and inefficiency, continuing decline in our nation's schools, and a weakened military.

Real example: "As North Dakotans, we are incensed by researchers at Rutgers University proposing that our state be turned into a part of "Buffalo Commons." Now our state legislature is in the process of placing an official seal of approval on this concept by voting to maintain a 1000-strong buffalo herd along I-94 in the hope of creating a tourist attraction. Surely, a legislature that is intent on cutting education and social services has more worthy projects for our tax dollars." (Opening of a letter to the Fargo Forum)

Analysis: Straw man, twice! The Rutgers researchers did not PROPOSE a Buffalo Commons (where large stretches of the plains would become nature preserves, virtually devoid of human occupation). They PREDICTED it as a trend in the region. Here, a small change completely misrepresents the opponent. Then the arguer misrepresents the legislature by claiming their action will be "an official seal of approval on this concept." The real issue appears to be impending cuts in education and social services, but the straw men have no clear relationship to that issue (red herring!).

Second real example: "'[Senator John] Kerry adds something else that annoys me very much,' [Christopher] Hitchens told Tim Russert in a September 2004 interview in which he endorsed Bush for re-election. 'He gives the impression, sometimes overtly, that our policy has maddened people against us and … incited hatred in the Muslim world and so on, in which, again, there is an element of truth.' Kerry, of course, was overtly right; but when Hitchens finished twisting the senator’s words, he was objectively on the side of evil: 'If people say, ‘Let’s have a foreign policy that does not anger the bin Ladenists’ … what are they asking for?'" (Harold Meyerson, "Their War, Too," The American Spectator, August 2005)

Analysis: What Hitchens first says about Kerry is correct. The straw man occurs when Hitchens then suggests that Kerry wants a foreign policy that "does not anger bin Ladenists," as if Kerry will routinely avoid any foreign policy decision that upsets Bin Laden and his followers. There is no reason to think Kerry approaches the situation in this manner.


Wishful Thinking 

An argument in which a strong emotional investment persuades the arguer to advance a completely implausible reason, where it's clear that the arguer merely advances that reason to feel better about himself or herself (rather than because it's the truth).

The arguer adopts the (usually implicit) assumption: "I should believe whatever makes me feel better about myself." An example would be: "I'm not really hurting the environment by driving this gas-guzzling SUV. After all, it lets me go out into rugged places where I can be at peace with nature."

Real example: Singer Whitney Houston discusses her drug addiction: "I like to think . . . I had a bad habit. . . . I don't like to think of myself as addicted." (Newsweek, 16 Dec. 2002)

Real example: Cameron Helder, father of domestic terrorist Lucas Helder, made this statement about his son: "I really want you to know that Luke is not a dangerous person," Cameron Helder said, choking back tears. "I think he's just trying to make a statement about the way our government is run. I think Luke wants people to listen to his ideas, and not enough people are hearing him, and he thinks this may help."

Analysis: But the father had previously arranged for the son's arrest, fearing that the son would hurt more innocent people. The father's claim that the son is not dangerous is a case of wishful thinking (the father will feel bad about himself if he admits the truth about the son).






Explanations and examples (but not the graphics) on this page © 2002, 2006 Theodore Gracyk

Last updated March 6, 2011