notes-cog-rationality-miscFallacyNotes

i have a somewhat jaundiced view of "fallacies" and i don't know exactly what i want to say here, so i'll ramble.

the infeasibility of conclusive traditional rational debate

In order to have a truly rational debate under the traditional model, you have to be willing to define every term, look at every piece of evidence, follow every point, and explain every step of your reasoning, including metareasoning. This just takes too long. Many times when i've tried to have a debate under this model, the debate has stretched to hours and we've never even gotten through defining terms. Often the evidence presented would have demanded years of study to really understand and potentially refute.

It seems to me that in order to finish debates of this sort, the participants would have to be employed full-time in the activity for a long period of time.

There are various things that are frowned upon in traditional fully rational debate, but breaking these rules seems to be the only path to feasibility.

argument by authority and its (sorta) opposite, ad hominem

One way to save time is to delegate your thinking to others instead of thinking for yourself. The problem with this is that it can lead you to adopt false beliefs and leave your opponent with no way to dissuade you from them.

faith / irrefutable beliefs / prior biases

One way to save time is to make up your mind about something in a way such that no feasible amount of evidence will persuade you.

For example, in a debate about the reality of psychic phenomena, what do you do if someone comes up and does something that you can't explain, such as bending a spoon?

You could decide that you don't believe in psychic stuff, and say that bending the spoon is probably just a magic trick, even though you aren't clever enough to figure out how they pulled off the trick.

On the other hand, if a psychic were to fail a test under controlled conditions, you could say that the skeptical beliefs of the audience by themselves had a psychic effect on the results of the test, damping out the energy of the psychic and causing the test to fail.

There is no logical flaw in either of these beliefs (first, that there might be magic tricks that are still tricks even if you can't figure them out; and second, that the skepticism of experimenters might alter the results of experiments). They are both a priori possible. Yet holding either belief exposes you to the criticism that you are impervious to evidence.
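The "impervious to evidence" criticism can be put in Bayesian terms: a sufficiently extreme prior means that no feasible amount of evidence shifts the posterior. A minimal sketch, with made-up numbers (the one-in-a-trillion prior and the likelihood ratio of 10 are mine, chosen purely for illustration):

```python
# Bayesian update in odds form: posterior odds = prior odds * likelihood ratio.
# Suppose each unexplained spoon-bending is 10x more likely if psychic powers
# are real than if it's a trick (likelihood ratio 10), and your prior
# probability that psychic powers are real is one in a trillion (1e-12).

def posterior(prior, likelihood_ratio, n_observations):
    """Posterior probability after n independent observations (odds form of Bayes' rule)."""
    prior_odds = prior / (1 - prior)
    post_odds = prior_odds * likelihood_ratio ** n_observations
    return post_odds / (1 + post_odds)

for n in (1, 6, 12):
    print(n, posterior(1e-12, 10, n))
# Even 12 unexplained demonstrations in a row only bring the posterior to
# about 0.5; and a prior of exactly 0 (or 1) never moves at all, which is
# the limiting case of being impervious to evidence.
```

In this framing, "faith" is a prior of exactly 0 or 1, which Bayes' rule can never update no matter how much evidence arrives.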

What the traditional view of rationality would demand is that you learn enough about magic tricks to have a reasonably informed opinion of whether the spoon bending is a trick; and that you devise a complex controlled experiment in which the skepticism of the experimenters is itself controlled. Either of these would probably take more time and money than you have.

injunction against meta-argument

Another thing you can do to save time is to refuse to engage in debate about the definition of terms or of the rules of debate. This opens the door, however, to conducting the debate itself in an irrational fashion, negating its value.

guessing

To save time, instead of actually exploring each proposition relevant to this debate, you could simply say, "I don't want to discuss this and i'm not sure, but my guess is that Proposition A, which you think is true, is actually false". Formally, this is sort of like appeal to authority, where the authority is yourself. Again, the problem with this is that if your guess is wrong, and if the proposition is material to the debate, your opponent may be left with no way to persuade you of the truth.

argument by neutrality

One technique is to 'ground' your beliefs (weight prejudices / generate Bayesian priors) by seeking out supposedly neutral sources of opinion and letting them guide your prejudices about which propositions are controversial/deserving of skepticism. For example, you could ask an expert on the issue, or consult a textbook.

But this is just a form of appeal to authority.

argument by popularity

Similarly to argument by neutrality, you could 'ground' beliefs by seeing which beliefs are overwhelmingly popular, and then believing these with little evidence, but demanding a higher standard of evidence for the negation of those beliefs.

But this is just a form of appeal to authority.

argument by balance

Similarly to argument by popularity, you could 'ground' beliefs by assuming that when there are large opposed factions of roughly equal size with differing beliefs on some issue, "the truth is probably somewhere in the middle".

But this is just a form of appeal to authority. In fact, the truth could be one extreme or another.

argument from convenience

How do you respond to a 'conspiracy theory' (or, to be more specific, what i call a political information-distorting conspiracy theory: the hypothesis that a powerful political conspiracy is deliberately hiding evidence and planting false evidence)? A priori, there is no reason that this can't happen; indeed, according to mainstream history it has happened at various times in the past.

One response is to dig into it properly. But this can take years of study.

Another response is to say, 'Maybe it's true, but it would be expensive for me to investigate it, so i'm not going to bother. In lieu of looking at the evidence, i'm going to assume it's not true, on the basis of one or more of (a) that's my (uninformed) guess, (b) it's a simpler theory, (c) argument by popularity: the majority seems to not believe in it, or (d) believing in this and acting accordingly would be inconvenient for me (e.g. maybe i'd have to wear a tinfoil hat; maybe i'd have to avoid eating or drinking certain common things; maybe i'd have to evangelize the conspiracy to others, threatening my social standing).'

the way forward

i'm not exactly sure what i want. What i really want is to have a superpowerful mind that can think and read so fast that i actually have time to confront evidence and think for myself on important issues. But i don't have one (and it seems like this lack of time to think everything through may be a fundamental reality, along the lines of NP-hard problems, that even powerful minds would face).

perhaps the things i want are these:

(a) a respect for the principle that computation (and thought) is not free, and for the fact that rationally confronting issues often requires an infeasibly large expenditure of thought (computation)

(b) acceptance of bias, guessing, appeals to authority, and the like as necessary evils

(c) a satisficing rather than a binary approach to valid argument; not 'this is disallowed in rational discussion' but rather more like 'let's acknowledge the fallacy here and then agree to permit it for now, and move on from this issue, bearing in mind that there's a potential source of error here, and reserving our right to change our minds and come back and reject this fallacy later if it becomes too pivotal'. It's hard to square this with the adversarial model, though, because why should an adversary ever agree to let you get away with a fallacy?

(d) explicitness about which fallacies are being accepted; "although i'm a journalist, it's true i didn't fact-check anything this guy said but that's because i trust him"

(e) a kind of debate in which the goal is something other than complete rationality and objectivity (one candidate: a learning experience for the participants in which they have a high probability of increasing their subjectively perceived probabilities of believing true things), and hence a retreat from the adversarial model (in which each person can demand that any irrationality that they can pinpoint be excised from the discussion) towards something else (one candidate: a subjective persuasive model, in which each person can say, "i've decided to irrationally guess on this issue; so i won't consider this proposition further unless you convince me it's worth my time").

(f) a normative/ethical softening of the idea that fallacies are impermissible/dumb/bad (e.g. if you say "There's been an injustice and i can prove it and you are the responsible party for setting it aright", and the other person says, "Perhaps, but i have no obligation to spend as much time examining your evidence as would be necessary to confirm your supposed proof, and in fact in this instance i've chosen not to spend that time, and in lieu of that, i've chosen to guess that your assertion is incorrect based on one or more fallacies", then that should be seen as upstanding behavior, not irresponsibility or stupidity).

(g) an acceptance that, due to the fact that people must frequently rely on fallacies rather than 100% valid reasoning, there will be persistently wrong beliefs, and that certain kinds of truths will be more difficult than others to realize. For example, because of the inconvenience of investigating conspiracies, if there were a conspiracy, it is not unlikely that, even if it were uncovered, the majority would refuse to pay attention to the evidence of it. Therefore, depending on your biases, it may make sense to guess that there may be a bunch of conspiracy theories that are not generally thought to be true but are, and to assign to the most popular conspiracy theories a rather higher degree of belief than is given to them by respectable sources, even as you also say that you aren't going to bother to investigate further.


this one seems pretty good and not as overly broad as usual, although i haven't yet read all the text: https://bookofbadarguments.com/?view=allpages