Logic, Fallacies & Argument
Bad Moves is a fortnightly series by philosopher Julian Baggini detailing the various ways in which arguments or points are made badly, but often persuasively.
The following are suggested as tools for testing arguments and detecting fallacious or fraudulent reasoning.
According to cognitive dissonance theory, there is a tendency for individuals to seek consistency among their cognitions (i.e., beliefs, opinions). When there is an inconsistency between attitudes or behaviors (dissonance), something must change to eliminate the dissonance. In the case of a discrepancy between attitudes and behavior, it is most likely that the attitude will change to accommodate the behavior.
Scientific information abounds. New findings emerge daily. Imagine a study linking vaccinations to child autism: Should you believe it? Some government leader downplays the effects of global warming. Another claims that some hazardous waste material has been safely processed. The media report a study that cell phones may damage the brain. Science permeates choices in our lives, both public and private. No one can be expert in everything. The challenge, then—especially important for educators to appreciate—is learning how to deal with the information. Basic scientific concepts provide a framework. But one must also know about science -- how research is pursued, how conclusions are justified, even how scientists may sometimes err or be shaped by cultural biases. This deeper understanding of the nature of science may help us assess the reliability of claims.
I began collecting and studying logical fallacies about twenty years ago, when I first became interested in logic. This collection took two forms:
1. A collection of named fallacies -- such as "ad hominem" -- that is, types of bad reasoning which someone has thought distinctive and interesting enough to name and describe.
2. A collection of fallacious, or otherwise bad, arguments, that is, examples of reasoning which may commit one or more of the named fallacies under 1, or are bad in some way yet to be classified.
We use causal graphs and a partly hypothetical example from the Physicians' Health Study to explain why a common standard method for quantifying direct effects (i.e. stratifying on the intermediate variable) may be flawed. Estimating direct effects without bias requires that two assumptions hold, namely the absence of unmeasured confounding for (1) exposure and outcome, and (2) the intermediate variable and outcome. Recommendations include collecting and incorporating potential confounders for the causal effect of the mediator on the outcome, as well as the causal effect of the exposure on the outcome, and clearly stating the additional assumption that there is no unmeasured confounding for the causal effect of the mediator on the outcome.
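The mechanism described in the abstract above can be illustrated with a small simulation. The sketch below is hypothetical (all variable names and coefficients are invented for illustration, not drawn from the Physicians' Health Study): an unmeasured variable U confounds the mediator-outcome relation, the exposure A has no direct effect on the outcome Y by construction, and yet comparing exposed to unexposed within a stratum of the mediator M produces a nonzero "direct effect".

```python
import numpy as np

# Hypothetical illustration: stratifying on a mediator M biases the
# "direct effect" of exposure A on outcome Y when an unmeasured
# variable U confounds the M-Y relation.

rng = np.random.default_rng(0)
n = 200_000

U = rng.binomial(1, 0.5, n)              # unmeasured confounder of M and Y
A = rng.binomial(1, 0.5, n)              # exposure, independent of U
p_m = 0.2 + 0.4 * A + 0.3 * U            # mediator depends on A and U
M = rng.binomial(1, p_m)
Y = 0.5 * M + 1.0 * U + rng.normal(0, 0.1, n)   # Y has NO direct A term

# Naive "direct effect": compare exposed vs unexposed within a stratum of M.
stratum = M == 1
naive_direct = Y[stratum & (A == 1)].mean() - Y[stratum & (A == 0)].mean()
print(f"naive direct effect within M=1: {naive_direct:.3f} (truth: 0)")
```

Within the stratum M=1, exposed and unexposed subjects differ systematically in U (M acts as a collider between A and U), so the estimate is biased away from the true direct effect of zero, exactly the flaw the paper describes in the standard stratification method.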
General Shared Characteristics: a priori (knowable independent of experience); conspiracy; jargon; ad hominem; false analogy; post hoc, ergo propter hoc (after this, therefore because of this); appeal to authority.
Thanks to some helpful readers for pointing out that I missed a few "Generally Shared Characteristics" of pseudoscience and quackery. Here they are, along with some others I found.
Logical Arguments, Uses of Language, Definition and Meaning, Fallacies of Relevance, Presumption, and Ambiguity, Categorical Propositions and Immediate Inferences, Categorical Syllogisms and Their Validity, Syllogisms in Ordinary Language,
Logical Symbols expressing Argument Form and Statement Form, Rules of Inference and Replacement to prove Validity or Invalidity,
Basics of Quantification Theory, Analogical Inferences, Causal Reasoning, Scientific Explanation, and Probability Theory.
Fallacies are fun. It is true that they are a source of enduring fascination; men have studied them for at least two and a half millennia. It is also true that a knowledge of them is useful, both to avoid those tossed in your direction by others, and to lob over a few yourself. The fascination and the usefulness which they impart, however, should not be allowed to conceal the pleasure they can give.
Debate is, fortunately or not, an exercise in persuasion, wit, and rhetoric, not just logic. In a debate format that limits each debater's speaking time, it is simply not reasonable to expect every proposition or conclusion to follow precisely and rigorously from a clear set of premises stated at the outset. Instead, debaters have to bring together various facts, insights, and values that others share or can be persuaded to accept, and then show that those ideas lead more or less plausibly to a conclusion. Logic is a useful tool in this process, but it is not the only tool -- after all, "plausibility" is a fairly subjective matter that does not follow strict logical rules. Ultimately, the judge in a debate round has to decide which side's position is more plausible in light of the arguments given -- and the judge is required to pick one of those sides, even if logic alone dictates that "we do not know" is the answer to the question at hand.
The fallacies are ad hominem, affirming the consequent, appeal to ignorance (ad ignorantiam), argument to logic (argumentum ad logicam), begging the question (petitio principii), composition fallacy, denying the antecedent, disjunctive fallacy, division fallacy, false analogy, false dilemma, golden mean fallacy, mistaking deductive validity for truth, naturalistic fallacy, post hoc ergo propter hoc (after this, therefore because of this), red herring, straw person, and you too (tu quoque).
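Two of the formal fallacies listed above, affirming the consequent and denying the antecedent, can be checked mechanically: a propositional argument form is valid exactly when no assignment of truth values makes every premise true and the conclusion false. The brute-force sketch below (function names are my own, purely illustrative) contrasts valid modus ponens with the fallacy of affirming the consequent.

```python
from itertools import product

def valid(premises, conclusion):
    """An argument form is valid iff no truth assignment makes all
    premises true while the conclusion is false."""
    for p, q in product([True, False], repeat=2):
        if all(f(p, q) for f in premises) and not conclusion(p, q):
            return False   # found a counterexample
    return True

implies = lambda a, b: (not a) or b

# Modus ponens: P -> Q, P, therefore Q  (valid)
mp = valid([lambda p, q: implies(p, q), lambda p, q: p],
           lambda p, q: q)

# Affirming the consequent: P -> Q, Q, therefore P  (invalid)
ac = valid([lambda p, q: implies(p, q), lambda p, q: q],
           lambda p, q: p)

print("modus ponens valid:", mp)               # True
print("affirming the consequent valid:", ac)   # False
```

The counterexample the checker finds for affirming the consequent is P false, Q true: the premises "P implies Q" and "Q" both hold, yet the conclusion "P" fails.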
The focus of this paper is on logical errors in scientific writings. These logical errors are known as fallacies. If an argument contains a fallacy, then the conclusion will not necessarily be proven. Some fallacies are just accidental, but they can also be used to trap an unwary listener or reader into believing faulty conclusions. First, this paper will identify and describe logical fallacies, then a selection of scientific writing will be analyzed in light of those fallacies. Most logical fallacies can be grouped into three general categories. These are material fallacies, fallacies of relevance, and verbal fallacies.(3) These three categories will be explained in detail in the sections to follow.
Nonsense: A Handbook of Logical Fallacies is the best study of verbal logical fallacies that is available anywhere. It is a handbook of the ways that people deceive themselves and others, and offers a complete course in everyday logic, a form of thinking that most human beings value but often horrendously abuse.
From the perspective of one of the "entrenched old fossils", a lot of the "thinking outside the box" looks like "not understanding the underlying principles of the field". This sort of error is not limited to the scientific novice, either. Linus Pauling, Nobel Laureate in Chemistry (and recipient of the Nobel Peace Prize), made almost laughable mistakes when he strayed into biology and medicine. In his article "Orthomolecular Psychiatry" and the book of the same name, he showed clearly that he hadn't a grasp of the probabilistic nature of biology when he dismissed concerns that his results were not statistically significant.
If a winner of the Nobel Prize in chemistry could be so wrong when he strayed "across the hall" into a somewhat related field, what are we to make of the lesser mortals who disregard the principles of science because they are "thinking outside the box"?
With so much political and journalistic confusion it is useful to remember that academia has produced a long list of useful tools and techniques to evaluate the logical and conceptual validity of any argument regardless of political content or viewpoint. Useful rational standards by which to judge the merits of any statement or theory are easily found in textbooks on debate, rhetoric, argument, and logic. These books discuss which techniques of argumentation are not valid because they fail to follow the rules of logic. There are many common fallacious techniques or inadequate proofs:
The First and Father-cause of common Error, is, The common infirmity of Human Nature; of whose deceptible condition, although perhaps there should not need any other eviction, than the frequent Errors we shall our selves commit, even in the expresse declarement hereof: yet shall we illustrate the same from more infallible constitutions, and persons presumed as far from us in condition, as time, that is, our first and ingenerated forefathers. From whom as we derive our Being, and the several wounds of constitution; so, may we in some manner excuse our infirmities in the depravity of those parts, whose Traductions were pure in them, and their Originals but once removed from God. Who notwithstanding (if posterity may take leave to judg of the fact, as they are assured to suffer in the punishment) were grosly deceived, in their perfection; and so weakly deluded in the clarity of their understanding, that it hath left no small obscurity in ours, How error should gain upon them.
Newcomers to the autism world might be astounded to hear warnings of the vitriol of certain arguments. A friend in a related field once asked me what could inspire such strength of opinion; autism is not, after all, politics or religion. Sadly, my friend was incorrect, and I am prepared to argue that aspects of the autism debate are indeed politics and religion. This is relevant because it provides some context as to why we see such depth and diversity in the logical fallacies in autism.
Argumentum Ad Hominem; Tu Quoque: "You also"; Strawman; Wishful thinking; Inflation of Conflict; Non Sequitur; False Equation; False Dilemma; Fallacy of Origin; Begging the Question; Circular Reasoning; Argument to the Future; Post Hoc, Ergo Propter Hoc; Argumentum Ad Absurdum; Argumentum Ad Nauseam; Inductive Fallacy; Deductive Fallacy; Two Wrongs Make a Right Fallacy; Error of Fact; Dishonesty; Argument from the Unknowable Fact; Red Herring Fallacy; Fallacy from Popularity.
Anticipated Strawman; Invoking the Bandwagon; The fallacy of Samaritan Intent; Fallacy of the Assumed but Hidden Truth; Fallacy of the Proven Hypothesis; Ignoring Regression Towards the Mean; Common Sense Fallacy; Fallacy of Intuition; Magical Thinking.
Fallacy of Inconsistent Application; Needling; Slippery Slope; Argument by Rhetorical Question; Psychogenetic Fallacy; Argument from the Beard; Argument by Tradition; Not Invented Here; The Galileo Gambit; Argument ad Baculum; Argument from Elitism; False Authority/Anonymous Authority; Argument from Conversion; Confusing Correlation and Causation; Plural of Anecdote Fallacy; Texas Sharpshooter Fallacy; Error of Reification; Affirming the Consequent; Moving Goalposts.
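One entry in the lists above, "Ignoring Regression Towards the Mean", is easy to demonstrate with a toy simulation (all numbers here are hypothetical, chosen only to make the effect visible): select the worst scorers on a noisy test, retest them with no intervention at all, and their average "improves" anyway. This is the statistical trap behind many anecdotal treatment success stories.

```python
import numpy as np

# Regression toward the mean: the bottom 10% on a noisy test score
# better on retest even with no treatment, because part of their low
# score was bad luck that does not repeat.

rng = np.random.default_rng(1)
n = 100_000
ability = rng.normal(0, 1, n)              # stable underlying trait
test1 = ability + rng.normal(0, 1, n)      # two noisy measurements
test2 = ability + rng.normal(0, 1, n)

worst = test1 < np.quantile(test1, 0.1)    # select bottom 10% on test 1
improvement = test2[worst].mean() - test1[worst].mean()
print(f"mean 'improvement' with no treatment: {improvement:.2f}")
```

Anyone who administers a "remedy" to the selected group and measures again will observe a substantial gain that is entirely an artifact of selecting on a noisy extreme.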
"Alt-med" believers usually feel that they are in a privileged position when it comes to debate. Being in possession of a
"revealed truth", they tend to feel that anyone who does not share their beliefs is obliged to defend their skepticism. This
is expressed as a set of (until now) unwritten rules of debate that place the skeptic and rationalist in an inferior position.
1. The discoverer pitches the claim directly to the media.
2. The discoverer says that a powerful establishment is trying to suppress his or her work.
3. The scientific effect involved is always at the very limit of detection.
4. Evidence for a discovery is anecdotal.
5. The discoverer says a belief is credible because it has endured for centuries.
6. The discoverer has worked in isolation.
7. The discoverer must propose new laws of nature to explain an observation.
Logical fallacies are errors of reasoning, errors which may be recognized and corrected by prudent thinkers. This site indexes and describes all known logical fallacies. The purpose of this site is to ensure that information about logical fallacies is freely available. Many sites fulfill this objective by linking to or mirroring the Guide.
By showing the lack of connection between the "facts" used in this data-chain, it becomes apparent that what we actually have is a series of separate "facts" with an implied connection, something that another writer called a "string of pearls". A "string of pearls" is a beautiful group of data without any visible connection.
We all know someone who's intelligent, but who occasionally defends obviously bad ideas. Why does this happen? How can smart people take up positions that defy any reasonable logic? Having spent many years working with smart people, I've catalogued many of the ways this happens, and I have advice on what to do about it. I feel qualified to write this essay as I'm a recovering smart person myself and I've defended several very bad ideas. So if nothing else, this essay serves as a kind of personal therapy session.
Always claim that the other guy is "closed-minded" and that you're as free-thinking as a newborn baby. Other woo-woos love the concept of "open-mindedness" and will take you into their inner circle without question. They have no tolerance for those "mean old nasty" types who demand evidence for everything.
Opinions expressed by the authors of pages to which this site links do not necessarily reflect this site developer's opinions.
In other words: Sublime or ridiculous? You decide!
Copyright © 2004-2008, Kathleen Seidel. All rights reserved.
This page was last updated on 5 November 2008, 3:48 pm