While examining some other ideas in another context, I noticed Richard Carrier’s use of the word “delusional”. He always provides definitions if you search back through his links; in this case, he gave one just a couple of weeks ago. The key points:
The definition is that you have examined the evidence and understand it, but still hold the wrong conclusion. (If you reach the right conclusion through bad logic, you are lucky, but not delusional.)
In any case, the key is knowing how to recognize fallacious reasoning, i.e. using scientific methods of reasoning.
Important point: someone who holds a delusional belief on one topic is not thereby a “delusional” person. It is common for people to hold delusional conclusions but otherwise function normally in most situations. Even if they are delusional on many points, it is still neither helpful nor logical to claim they have a condition called “delusional”.
Delusion is a psychological trap that anyone can fall into. Intelligence is not a determining factor; it can even make it easier to get stuck in the trap, because intelligent people can be good at making an argument while still missing or ignoring evidence.
From the blog:
First, some definitions. A delusion is any continued belief in something against which the evidence is overwhelming (psychiatry exempts majoritarian cults, i.e. culturally acceptable delusions, but that has no epistemic relevance). An irrational belief is any belief held for fallacious reasons, which can include reasons of false evidence or false attendance to evidence, since those are often maintained fallaciously. Reliably nonfallacious reasoning will immunize you from most false facts; while the remainder you will be able to escape once the error or deception misleading you comes to light—which you can maximize the probability of by being constantly on the lookout for exactly that, which is The Scary Truth about Critical Thinking. This is why science is so successful.
This means the irrational party to a debate is the one whose conclusions depend on fallacies. Not the one who merely deploys a fallacy; since, if you removed all fallacious lines of reasoning and the conclusion still held, their conclusion is still not fallacious, and so their belief remains rational (this is why learning how to steel-man a position is so important to successful critical thinking). Whereas if you removed all fallacies and there is nothing left to keep the conclusion probable, you are then looking at an irrational conclusion. Then, if after being shown this you don’t change your belief, you are delusional. You are then probably trapped by motivated reasoning or emotion-driven blindness to disquieting facts.
This happens a lot if your very sense of identity (who you are, personally or socially, or the meaning you assign to life itself, what your purpose is, even your faith in the reliability of your judgment or competence) is threatened if your beliefs are false. As a result, your brain will protect those beliefs against all refutation. And you might not even be able to notice this is happening, because your brain will tell you to deny it, or not see it, or distract you with something else. Delusion is literally a psychological trap. And by immunizing you against all evidence of itself, it becomes a hall of mirrors from which you might never escape. For an example of how trap beliefs can lock you into a delusion, see my Vital Primer on Media Literacy (and then study how a mass media delusion-engine caused the Rwandan genocide).
Finally, it’s important to distinguish between delusional disorder and mental illnesses. One can be technically crazy, but not a babbling lunatic eating grass. In fact almost all mental disorders leave the subject entirely competent in every other aspect of their lives and reasoning; they misfire only in the one very narrow domain that demarcates their disorder. So, for example, someone can have a pathologically paralyzing fear of worms—and thus be officially insane—but unless worms are around, they are entirely normal, capable, reasonable people. And even when worms are around, their irrationality will only be triggered in respect to the worms—or any attempt to push back against their irrational reaction to the worms, because one of the defining features of delusions is that they defend themselves against attack (that is what makes them a delusion, and not just an erroneous belief).
So we should be wary of Problems with the Mental Illness Model of Religion. But we still need to acknowledge that we are not dealing with rational argument when engaging delusional people; we are dealing with a mental disorder that cripples their cognitive ability in precisely the one domain being argued over (whether it’s worms, ghosts, elections, or gods). We can be sympathetic to that, particularly when they are making every sincere effort to be reasonable. But you’re probably not ever going to cure them. In general, people escape delusionality only when self-motivated to do so. They have to get themselves out. Which requires their own motivated self-exploration.
Moreover, that someone is delusional can be explanatory, but it is never useful as a criticism in and of itself. If you are using claims of their delusionality alone to dismiss what they are saying, you are probably the one who is delusional. Only after establishing a belief they maintain is false, and on abundant evidence, can you explain its persistence with a hypothesis of delusionality. But you cannot refute their belief on the mere grounds that it is a delusion. You also cannot use the persistence of a delusion to claim the deluded are incompetent in any other aspect of life, thought, or character.
That said, let’s proceed.