Bad Thinkers

Is the way people think, to use the computer metaphor, always amenable to a software update? Or are there people whose hardware just can’t be updated?

I suspect we all know of people (hopefully not too many) who believe in wild conspiracy theories. They tend to reject contrary evidence, solid science, and sound arguments. So a fair question arises: Are these people misinformed or just badly educated? Or is their bad thinking something innate? Is this way of thinking just part of their character?

Quassim Cassam, a professor of philosophy at the University of Warwick in Coventry, discusses this idea in an Aeon piece. Cassam illustrates his argument through the fictitious story of Oliver, a guy who believes 9/11 was a massive conspiracy. Despite the overwhelming evidence against his conspiracy theory, Oliver dismisses it all and holds fast to his belief. He cannot be reasoned out of it. You cannot, and will not, get through the thicket of nonsense that’s taken hold of him.

The typical argument is that people like Oliver have an information problem: they either lack enough information or cannot process it correctly. But Cassam sees the problem as more fundamental. Oliver’s wild conspiracy beliefs aren’t the result of a lack of relevant information; they have more to do with Oliver’s intellectual character:

I want to argue for something which is controversial, although I believe that it is also intuitive and commonsensical. My claim is this: Oliver believes what he does because that is the kind of thinker he is or, to put it more bluntly, because there is something wrong with how he thinks. The problem with conspiracy theorists is not, as the US legal scholar Cass Sunstein argues, that they have little relevant information. The key to what they end up believing is how they interpret and respond to the vast quantities of relevant information at their disposal. I want to suggest that this is fundamentally a question of the way they are. Oliver isn’t mad (or at least, he needn’t be). Nevertheless, his beliefs about 9/11 are the result of the peculiarities of his intellectual constitution – in a word, of his intellectual character.

This is a controversial idea, but my own experience leads me to believe Cassam may be right. It makes sense intuitively, but it would take experimental research to confirm it. The difficulty I see in conducting that research is separating innate, structural deficiencies in thinking from information-processing problems and willful ignorance.

I suspect some bad thinkers (I realize I’m being optimistic) can be improved if they have the right attitude and receive the right kind of education. But “attitude” is critical. I say this because without some admission, or at least realization, that you’re wrong or not applying good thinking skills, no amount of education is likely to work. As the saying goes, “There are none so blind as those who will not see.” So I’ll admit attitude may be the very thing you can’t turn in a bad thinker. Cassam is not as optimistic:

It is in the nature of many intellectual character traits that you don’t realize you have them, and so aren’t aware of the true extent to which your thinking is influenced by them. The gullible rarely believe they are gullible and the closed-minded don’t believe they are closed-minded. The only hope of overcoming self-ignorance in such cases is to accept that other people – your co-workers, your spouse, your friends – probably know your intellectual character better than you do. But even that won’t necessarily help. After all, it might be that refusing to listen to what other people say about you is one of your intellectual character traits. Some defects are incurable.

For the most part, a good education should lead to a high degree of epistemological humility. The more we learn, the more we realize just how much we don’t know. Barring strong evidence or sound logic, we should approach ideas cautiously. Our path to knowledge should be inferential, moving from one valid point to the next, stepping from stone to stone across the river. But it may be that the ability to do this, at least in some of us, depends more on something in our personal constitution than on our cognitive tool kit.

David Hume and Moral Judgements

David Hume (photo: Wiki)

Morals excite passions, and produce or prevent actions. Reason of itself is utterly impotent in this particular. The rules of morality, therefore, are not conclusions of our reason. — David Hume

Today is David Hume’s birthday. Hume was an 18th-century Scottish philosopher whose writings have gotten more attention lately because modern psychologists, notably Jonathan Haidt, have been praising parts of Hume’s moral philosophy. Haidt sees the core of Hume’s moral philosophy as anticipating Haidt’s own research conclusions about how people reach moral judgements.

In short, Hume and Haidt both argue that moral judgements aren’t something we reach through a process of reflective reasoning. That’s a rationalist delusion. Our moral judgements are primarily the product of our intuitions, or in the Humean sense, our passions. Reason is simply the ex post facto lawyer whose primary job is to defend your moral intuitions. Haidt’s social intuitionist model says, “Moral intuitions come first, strategic reasoning comes second.”

As David Hume wrote: “Reason is, and ought only to be the slave of the passions, and can never pretend to any other office than to serve and obey them.” Reason serves our passions, our emotions, our non-rational intuitions.

Of course, the downside of Hume’s and Haidt’s idea is that logic and good arguments are not likely to persuade anybody of anything once people’s beliefs or moral convictions are set. Haidt would argue, and I think rightly, that it’s not that reason can’t persuade at all; it’s just that until emotional barriers are softened or lowered, reason cannot make much headway in influencing our morals, politics, religion, or football.

As Jonathan Swift wrote: “Reasoning will never make a man correct an ill opinion, which by reasoning he never acquired…”

I think an honest observer of people and human nature would find Hume’s and Haidt’s idea a confirmation of the obvious. I mean, watch people at election time… like now in the US.

Politics, which is deeply moral, is a good example. Facts, evidence, and sound argument are not what typically guide most people’s decisions about issues and whom to vote for. Now, I don’t think that’s true of everyone. Certainly there are people who can be persuaded by logic, evidence, and good arguments. There are people who are aware of their emotional barriers and who do want to get at the truth, or at what’s best for themselves or society. But that persuadable group is relatively small, unfortunately.

The saving grace for our society is that this group of independent voters and thinkers has a big impact on the outcome of close elections. They’re willing to go either way and are weighing the reasons and arguments for each candidate—or so I’d like to believe.

To Weigh and Consider

It almost goes without saying, though human psychology seems never finished teaching us the same lessons over and over, that there is almost always more to something than we’re led to believe at first blush. We are, as Daniel Kahneman says, “a machine for jumping to conclusions.”

This is not to say we shouldn’t decide quickly when we must. When the situation demands it, we have to make a quick decision based on what we know at the time. Hopefully, we can correct mistakes as we realize them along the way.

But in many instances we don’t have to make quick decisions. We usually have time to weigh and consider, to be thoughtful and morally responsible. We have time to do the work—and it is mental work!—that someone who really cares about truth, or the pursuit of it, will do and take seriously. And this work is especially important, responsible, and noble where the lives and welfare of our fellow human beings are at stake.