Sir-Sister-of-Suck wrote: ↑Thu Aug 24, 2017 9:43 pm
I have often wondered just how powerful language alone can actually be. Is it possible to convince anyone of anything by just saying the right thing?...
If the answer is no, how do you think this affects philosophical arguments? Are some debates simply meaningless to get into with the intent of convincing the other person?
Psychologist Jonathan Haidt has done some interesting research into this. One of the things he points out is that people "reason" in two different ways. One is to "reason" for the purpose of finding truth. The other is not so much to "reason" as merely to "rationalize": to find some line of defence for an action or preference one already holds.
In the first case, one lets facts, data and evidence (and, of course, the basic laws of logic) modify one's mind, because truth is the goal and learning is the necessary process.
In the second case, one has limited interest in facts, data, evidence, and even the basic principles of logic: what one is after is just enough of these to allow one to close the question off, and to continue to believe what one has always believed. Once that goal is reached, learning stops, and one is no longer moved by better arguments or new data.
Many arguments are type 2. Very few are type 1. We'd maybe like to think that human beings are truth-seeking machines, but they're just not. For the most part, they tend to defend what they already think rather than change their minds...even when something comes along that puts considerable strain on their previous beliefs, and makes sticking to them a bit of an effort of the will and a bit of a project of denial.
Now, some people are perhaps more motivated by type 1 reasoning than by type 2: but psychologically, people are disposed away from type 1, because it's hard and costly.
What Haidt says makes the difference is whether failure to have the truth (rather than a mere rationalization defending a falsehood) is going to be sufficiently costly. Will the penalty for believing something less than the whole truth outweigh the cognitive upset involved in changing one's mind? Will one be called to significant account for what one believes? Is anyone watching? Will one be significantly disadvantaged by persisting as one is? If so, then you'll need the truth -- it's your only chance of defence, security, or avoidance of pain or some other unpleasant situation. But if not, or if one at least thinks not, or if the penalty for being wrong is not very big, then the upset involved in having to change one's mind (and maybe alter one's behaviour, relationships or lifestyle as well) will likely seem too great to warrant a rethinking. In that case, people are inclined simply to keep believing whatever has previously worked for them, since doing so saves any upset. Defending even a bad or weak position seems better than accepting the cost of having to reorganize one's life.
So maybe the question becomes a new one: what's the penalty for failing to grasp a particular truth? And do people believe there is anything significantly costly about preferring less-than-the-truth about whatever question it is?