The answer I gave -- 'To promote both the use of rational thought and the awareness that people are not fundamentally rational' -- is perhaps far from perfect (or even profound), but it does more-or-less express what has become a guiding principle of mine.
It has at least the benefit of being both parsimonious and symmetrical, as each side of this particular outlook has a single source:
Do we need to promote rational thought? Yes -- why, just look at all the crazy shit people do?!
Are we, deep down, fundamentally rational beings? No -- why, just look at all the crazy shit people do?!
What is more, given the sheer volume of said crazy shit, I get to see my beliefs confirmed on a daily basis.
That is nice.
It's helpful now and then, though, to discover that at least some of the views I hold have a firmer grounding in what has been succinctly and pithily described as 'earth-logic', something from which all too many people's thinking achieves escape velocity.
Last week, Elizabeth Kolbert had an interesting article in the New Yorker on a related point. In 'What was I thinking?', she looks at a couple of books on research being done on the irrational bases of behaviour. She focuses on 'behavioural economics' in the form of Dan Ariely's book Predictably Irrational: The Hidden Forces that Shape Our Decisions.
I've not read it, but I like the gist of his arguments.
He claims that his experiments, and others like them, reveal the underlying logic to our illogic. “Our irrational behaviors are neither random nor senseless—they are systematic,” he writes. “We all make the same types of mistakes over and over.” So attached are we to certain kinds of errors, he contends, that we are incapable even of recognizing them as errors.
With regard to a somewhat different sphere, 'The Moral Instinct', by Steven Pinker, appeared in the New York Times, summarising a variety of work on the question of where people's moral beliefs come from. Confronting the assumption that morality is (simply) imposed through learning and imitation, he points to research suggesting a far more intuitive understanding of moral concepts that underlies a large part of how we evaluate right and wrong.
At least a certain portion of culture, then, appears to be the result of our efforts to find post-hoc rationalisations for what we think anyway. (That certainly explains a lot of blogging...)
Like a lot of Pinker's writing, the article is a mixture of effective summary, brilliant insight and sometimes careless quips. (For instance: I'm not convinced that arguments about the environmental impact of, say, S.U.V.s are necessarily based on personal moral abhorrence about 'over-indulgence'. One does not need moral priggishness to critique personal wastefulness, merely an understanding that individual behaviour multiplied by hundreds of millions of individuals can have an enormous impact.)
Also like a lot of Pinker's writing, it is enormously compelling. He observes:
The science of the moral sense also alerts us to ways in which our psychological makeup can get in the way of our arriving at the most defensible moral conclusions. The moral sense, we are learning, is as vulnerable to illusions as the other senses. It is apt to confuse morality per se with purity, status and conformity. It tends to reframe practical problems as moral crusades and thus see their solution in punitive aggression. It imposes taboos that make certain ideas indiscussible. And it has the nasty habit of always putting the self on the side of the angels.
One of the researchers mentioned by Pinker is Jonathan Haidt. Haidt has a curious article at Edge: 'Moral Psychology and the Misunderstanding of Religion'. I say 'curious' because its first part (up to about page six in the printed version) is a fascinating and convincing look at the intuitive nature of moral judgements and the unconscious causation of most behaviour, whereas its second half is a much less convincing critique of the 'New Atheism'.
This is one of the best bits, at least with regard to the topic I'm discussing here:
Our brains, like other animal brains, are constantly trying to fine tune and speed up the central decision of all action: approach or avoid. You can't understand the river of fMRI studies on neuroeconomics and decision making without embracing this principle. We have affectively-valenced intuitive reactions to almost everything, particularly to morally relevant stimuli such as gossip or the evening news. Reasoning by its very nature is slow, playing out in seconds.

And there's another conclusion that is also important:
Studies of everyday reasoning show that we usually use reason to search for evidence to support our initial judgment, which was made in milliseconds. But I do agree with Josh Greene that sometimes we can use controlled processes such as reasoning to override our initial intuitions. I just think this happens rarely, maybe in one or two percent of the hundreds of judgments we make each week. And I do agree with Marc Hauser that these moral intuitions require a lot of computation, which he is unpacking.
Hauser and I mostly disagree on a definitional question: whether this means that "cognition" precedes "emotion." I try never to contrast those terms, because it's all cognition. I think the crucial contrast is between two kinds of cognition: intuitions (which are fast and usually affectively laden) and reasoning (which is slow, cool, and less motivating).
The basic idea is that we did not evolve language and reasoning because they helped us to find truth; we evolved these skills because they were useful to their bearers, and among their greatest benefits were reputation management and manipulation. (Emphasis added)
And this, I think, is a key point: our psychologies are about use, not truth.
Haidt's article is fairly lengthy, as are the responses to it by David Sloan Wilson, Michael Shermer, Sam Harris, PZ Myers and Marc Hauser. So, this posting is an entirely too brief summary of what that discussion is all about.
Much of that discussion focuses on the much weaker part of Haidt's paper, where he tries to apply his empirical conclusions to the 'New Atheism'.
I'll simply direct you to the responses by Myers and Harris on that topic.
But I also think that Haidt's efforts to link religiosity to the current effort by some people to rehabilitate 'group selection' are unconvincing.
Hauser makes a very good point about this:
This is bad evolutionary reasoning, and the kind of speculation that ultimately led Gould and Lewontin to have a field day with loose just-so stories. But there is more. Just because there is variation doesn’t mean it will be selected. It has to be heritable variation. One has to show that the belief systems are genetically passed on in some way, or one has to argue for cultural selection, which is an entirely different affair, at least at the level of mechanism and timing of change. I don’t see any evidence that the observed variation in beliefs is heritable in a genetic sense. (Emphasis added)
Neither do I, and there are other problems with group selection in the sense that Wilson and others seem to be trying to revive. This is not a new spat: Geoff made some good observations about another Dawkins-Wilson tiff on a similar topic last year.
(Hauser's point about heritability is also germane, of course, to Gregory Clark's recent speculations about the genetic basis of capitalism. I commented here, here, and here.)
It seems clear to me that while Haidt is right to point out the benefits that might accrue to those who are well integrated into their communities, he mistakes those benefits for something specifically religious in nature. (This is partly what Myers rebukes him for.)
Moreover (and this comes out in Sam Harris's response), Haidt seems to be basing his view of religion largely (or maybe even exclusively, as far as his empirical evidence goes) on the relatively contained, civilised, reformed -- in short, tamed -- version that you find in some parts of the modern world and not the less cheerful versions of it so common in much of the past and present.
Finally, I think the issue of 'benefit' (are religious people 'happier') is a different -- and far less interesting -- one than that of 'truth' (do gods exist). It's mainly the latter question that the recent best-selling atheist authors have confronted; however, even on the issue of the former one, Haidt's view of religion seems oddly one-sided.
Anyway, the topic of intuitive judgements seems difficult to escape these days.
Just this morning I ran across Momus using Malcolm Gladwell's book Blink to think about the great deal of information we can gain from the briefest of impressions:
Gladwell calls this "thin-slicing" and explains that "as human beings we are capable of making sense of situations based on the thinnest slice of experience". This might sound lazy, but there's something rather elegant -- and sometimes startlingly acute -- about it. "In a psychological experiment, normal people given fifteen minutes to examine a student's college dormitory can describe the subject's personality more accurately than his or her own friends." It's why I always scribble down my first impressions of a new city within minutes of arriving. It's not just that first impressions are lasting, they're also some of the most penetrating thin-slices you'll ever get. "Reality", said Willem de Kooning, "is a slipping glimpse".
And our minds present us a view of that reality (in most cases) that is useful rather than truthful.
Finally: Some digging around has brought up a couple of very interesting-looking articles on this topic by John Bargh -- whom Haidt mentions -- that I've not managed to read yet: 'The Unbearable Automaticity of Being' (pdf) and 'What Have We Been Priming All these Years?' (pdf).
What all of these insights mean for topics of interest at this blog -- namely, the study of history and literature -- is a challenging question.
Last year, in 'The Limits of Culture?', I at least tried to make a start on thinking about how evolutionary psychology might be integrated into historical studies with regard to the topic of violence. (The article, by the way, is FREE for download. I mention this only because articles in most academic journals are not...and also because, as Elizabeth Kolbert points out in the opening paragraphs of her New Yorker article, the word 'free' has a profound effect on the human psyche. I'm trying to start a stampede... Also note: occasionally, the IngentaConnect site seems to pitch a fit and either never load or tell you that the content is not there. It is. Just keep trying. Or get in touch if you can't.)
Responses to the article can be found here and here; my response to the responses here. Access to these latter bits, however, will require either that you be affiliated with some kind of institution that subscribes to such online content or that you cough up some bucks first. Sorry. As in so many things, as the man said, TANSTAAFL.
Ain't it the truth.
1 comment:
Thanks for the plug, but more importantly, thanks for this post. I look forward to exploring all the references. You mentioned Dan Ariely - and this prompts me to mention this interview: http://www.bloggingheads.tv/diavlogs/9058?in=00:25:11&out=00:25:46 - which I haven't yet watched, but which is on my to-do list.