Jul 24 2008


Published under Talks



NOTE: What follows is Margaret Feist’s notes for speaking to the Dunedin Sea of Faith group on 24 July 2008. It is partly a summary of Steven Pinker’s paper, and partly notes for generating discussion.
The paper it refers to is available on-line at the New York Times website. One easy way to find it is to enter “New York Times” in a Google search, and then to enter “The Moral Instinct” into the New York Times search box. You will then have to register in order to access the article, but that is free, and doesn’t take long.
Another interesting paper that takes a similar line about morality is by Jonathan Haidt, and is available at: Edge: The Third Culture

“Morality,” writes Pinker, “is not just any old topic in psychology; moral goodness is what gives us the sense that we are worthy human beings. We seek it in our friends and mates, nurture it in our children, advance it in our politics and justify it with our religions.”

A Question: Who is the most admirable: Mother Teresa, Bill Gates, or Norman Borlaug?

Pinker speaks of a psychological state which can be turned on and off like a switch; he calls it the moralisation switch and says that, when it is on, it commandeers our thinking. It is the mindset which tells us something is immoral – wrong.

When an action turns our switch on,
we feel that our response is universal – the rules apply to everyone,
we feel that people who break them deserve to be punished,
we feel righteous, angry, want to make converts.

However, in our lifetimes some things that used to be seen as individual issues now trigger a moralistic response: smoking and domestic violence, for example [e.g. huge punitive damages against tobacco companies]. And some that used to trigger strong moralistic responses are now seen as individual issues: mixed flatting, for instance; abortion; same-sex relationships.
However, we do have double standards, and we tend to align our moralisation with our own lifestyles.

*****Summary: The CONTENT of our moral judgment is questionable because:
*our moral sense can sometimes deceive us
*what triggers our moralisation switch may change over time

So far, what I’ve said has depended on argument, but some scientists have been using experiment and observation to study the moral sense. Pinker introduces us to some of their material.

First, I’m going to tell you a story. I want you to make an instant mental response to it.
Family’s dog killed by car; dog meat delicious; cut it up, eat it for dinner.

And now I’ll tell you another. Again, notice your own response to the story.
Julie and Mark are brother and sister – travelling together in a foreign country – decide to make love – she’s on the pill – he uses a condom – they both enjoy the sex – they decide not to do it again – they keep it secret – they feel closer to each other.

You know what your own response is; now, still in your head, can you justify it? It’s very hard to do. Most people immediately judge the actions to be wrong but Mark and Julie don’t risk having a child; they aren’t hurt by the experience; they don’t offend the community. In the end, people usually say, I don’t know; I can’t explain it; I just know it’s wrong.


It looks as though we make an emotional response and then try to find a justification.

Now, the scientists who have been studying the moral sense want more than this before they come to any conclusion, and Pinker describes a thought experiment worked out by two philosophers.

Imagine you’re standing beside a railway incline.
A bit further below you, five men are working on the line.
Suddenly you see a runaway wagon careering down the line towards the men.
They will be killed if it reaches them.
Beside you is a lever that will divert the wagon on to a siding.
But on the siding, one man is working. If you divert the wagon, he will be killed.
Would you pull the lever?

You’re standing on a bridge over a railway incline. A bit beyond the bridge, five men are working on the line. Suddenly you see a runaway wagon careering down the line towards the men. They will be killed if it reaches them. The only thing that might stop the wagon is a very large object thrown on to the line from the bridge. And standing beside you on the bridge is a very fat man, big enough to stop the wagon. Would you throw the fat man off the bridge?


This experiment was carried out on the web with 200,000 people from 100 countries. Participants were black and white, young and old, male and female, from a wide educational range, and Muslims, Christians, Hindus, Buddhists, Jews and atheists. Even though the choice in each case was to kill one person in order to save five, most people would pull the lever in the first case, but not heave the man over the bridge in the second. What’s interesting is that they couldn’t justify the difference.

Joshua Greene, a philosopher and neuroscientist, suggested evolution might have given us an instinct of revulsion against man-handling an innocent person, which could be right but doesn’t prove anything.

However, Dr Greene teamed up with another neuroscientist and some colleagues from Princeton to scan some people’s brains while they were working on the problem you’ve just done. They wanted to see if there was a conflict between areas of the brain associated with emotion and areas dedicated to rational analysis. An MRI scan lights up areas of the brain when they become active. What they found was that when all you had to do was pull a lever, which would kill one man and save five, only the area concerned with calculation and rational decision lit up.

But when people faced the dilemma of killing someone with their bare hands, three areas of the brain lit up, showing that they had become active: one concerned with emotions and relationships with other people, one concerned with calculation (both in the frontal lobes), and a very ancient strip at the base of the brain which warns of conflict between messages coming from different parts of the brain. This experiment showed emotion defeating calculation when the more ancient part of the brain was activated. The result is supported by other studies which show that damage to the frontal lobes may blunt people’s emotions, so that throwing the man off the bridge seems to them to make perfect sense; for instance, children who suffer severe frontal lobe damage may grow up, despite normal intelligence, to be callous, irresponsible and unable to think through even simple moral dilemmas.

And thirdly, this idea, that there is a physical base for the human moral sense also seems to be borne out by a study of twins separated at birth, which shows that they are more like each other in characteristics called conscientiousness and agreeableness than they are like other members of the family in which they have been raised.


So it looks as though Dr Greene was right; our moral decisions are often emotional and irrational, and this balance is decided in a very ancient part of our brain.

THE NEXT QUESTION
Pinker then goes on to cite some studies which show that “the idea that the moral sense is part of human nature is not far-fetched.” He quotes an anthropologist who compiled a list of “human universals” including many moral concepts and emotions. He also cites evidence that the stirrings of morality emerge early in childhood.

But if evidence suggests the moral sense is rooted in the human brain,
how come different societies can apparently hold such different values?
How come crowds in the Sudan demanded public flogging and the death penalty for a teacher who allowed her class to name a teddy bear Mohammed, after the most popular boy in the class, when we see the situation so differently?

The answer seems to be that there is a limited number of moral spheres which are universal. They are: harm [do no harm], fairness, community, authority and purity. They appear to have deep evolutionary roots; that is, they are found not only in humans but in other advanced, non-human species, where they confer an advantage in survival. For example:
The theme of avoiding harm to others is seen in rhesus monkeys, who will go hungry, rather than pull a chain which delivers food to them and an electric shock to another.
Respect for authority is shown in the pecking order of birds, the dominance and appeasement patterns widespread in animals.
Purity, the avoidance of defilement, relates to the disgust triggered by potential sources of disease – body fluids, excreta, decaying flesh, incest.
Fairness is close to what scientists call reciprocal altruism; at its crudest this says if you’re good to other people, they’ll be good to you. Sympathy and anger are the emotions which trigger it.
Community – which prompts us to share or sacrifice without any expectation of payback – relates to what scientists call nepotistic altruism; in the evolutionary urge for survival, if you favour your own relatives, the family genes are more likely to survive even if you don’t (and this applies to lions, dogs and other pack animals as well as humans).

If we grant for a moment that these five spheres are common to the moral instinct of all humans, we have to ask how it is that we differ so much over what’s right and what’s wrong. The answer that Pinker offers is that how they are ranked in importance, and how they affect social attitudes to sex, government, commerce, religion and diet, depends on culture. “Many of the flabbergasting practices in faraway places become more intelligible when you see the moralising sphere behind them.” For example:
Japanese fear of non-conformity is in the sphere of community, group loyalty. The washing and diet restrictions of Jews, Hindus (and Maori?) come from purity.
Muslim outrage at any insult to the Prophet, comes from respect for authority.
Or look at it from another point of view.
In the West, we think fairness in government and business is more important than nepotism and cronyism. But if you value community more than fairness, your reaction would be, “What heartless creep would favour a perfect stranger over his own brother?”
The ranking and weighting of one sphere against another explains a lot of the differences between East and West, conservative and liberal, [and possibly, in New Zealand, Pakeha and Maori.] Liberals put most weight on harm and fairness, much less on group loyalty, authority and purity. Conservatives value all five about equally.

Any attempt to reassign a behaviour from one sphere to another can create outrage. You should be able to think of recent examples in our own society: from purity to fairness, for example, same-sex relationships; from authority to fairness, domestic violence towards wives or children. Pinker’s examples of questioning the way activities are assigned to particular moral spheres are: market economies putting everything up for sale; science seeking to understand rather than judge; secular philosophy scrutinising all beliefs. It’s not surprising, he says, that they are often seen to be morally corrosive.

*****Summary:
*Human moral instinct has five common spheres.
*Cultures differ in how they rank the spheres; this creates different values.
*Attempts to shift a behaviour from one sphere to another create outrage.


IS THE SCIENCE OF THE MORAL SENSE CORROSIVE?

Now Pinker turns to this question. This scientific exploration of the moral sense, he says, does not lead to the conclusion that all morality is relative and that there is no right and wrong. Nor does it show that our noblest motives are mere self-interest. When a mother stays up all night with a sick child she is not being selfish; when tourists tip a waitress in a country to which they will never return, they are not expecting repayment; nor is a soldier who falls on a grenade to save his mates. A biological understanding of the moral sense does not entail that people are calculating maximisers of their self-interest.

But where, he asks, does it leave the concept of morality itself? We know that some parts of our subjective experience are products of our biological make-up and have no objective counterpart in the real world. The difference between red and green, the foulness of carrion, the scariness of heights are design features of our nervous system. If the difference between right and wrong is also a product of our brain wiring, why should we believe it is any more real than the difference between red and green?

One answer is, “God ordains it.” If God commanded us to torture a child, would that make it right? Are there any reasons to resist such a command? They don’t exist in the physical world, like wavelengths or mass. Are they just figments of our brains?

Pinker suggests that moral truths might be like mathematical truths. We are born with a rudimentary concept of number and as we build on it with mathematical reasoning, we discover that it is in the nature of things that some things are true and some aren’t. If we understand 2 and 4 and the concept of addition, we must conclude that 2 + 2 always = 4.
Suppose we are born with a rudimentary moral sense and as soon as we build on it with moral reasoning, moral reality forces us to some conclusions and not others.

He says that “two features of reality point any rational, self-preserving social agent in the same direction”. The first: Two parties are objectively better off if they both act in an unselfish way than if they act selfishly. The second: if I ask you to do something that affects me, I can’t act as if my interests are special and yours are not. I have to state my interests in a way that would force me to apply them to you.
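Pinker’s first feature of reality – that two parties are objectively better off acting unselfishly than selfishly – is essentially the logic of the prisoner’s dilemma from game theory. A minimal sketch, using illustrative payoff numbers of my own (not from Pinker’s paper), makes the arithmetic concrete:

```python
# Hypothetical payoff table: each entry is (my payoff, your payoff)
# for a pair of choices. The numbers are illustrative only.
payoffs = {
    ("unselfish", "unselfish"): (3, 3),  # both cooperate: best joint outcome
    ("unselfish", "selfish"):   (0, 5),  # I am exploited
    ("selfish",   "unselfish"): (5, 0),  # I exploit you
    ("selfish",   "selfish"):   (1, 1),  # both defect: worst joint outcome
}

def joint_payoff(mine, yours):
    """Total well-being of the pair for a given pair of choices."""
    a, b = payoffs[(mine, yours)]
    return a + b

# Mutual unselfishness (3 + 3 = 6) leaves the pair objectively better
# off than mutual selfishness (1 + 1 = 2).
assert joint_payoff("unselfish", "unselfish") > joint_payoff("selfish", "selfish")
```

The tension, of course, is that each individual is tempted to defect (5 beats 3); Pinker’s point is that rational agents who see the whole table have a reason to prefer the cooperative outcome.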

The core of this second idea, says Pinker, is at the heart of history’s best thought-out moral philosophies, including the Golden Rule itself, and he goes on to mention “The Expanding Circle”, a theory by Peter Singer, which he explains is “the optimistic proposal that our moral sense, though shaped by evolution to overvalue self, kin and clan, can propel us on a path of moral progress as our reasoning forces us to generalise it to larger and larger circles of sentient beings”.

*****TO SUM UP, Pinker says:

Morality is still something larger than our inherited moral sense. The new science of the moral sense does not make moral reasoning and conviction obsolete but its implications for our moral universe are profound. At least it tells us that someone who sees things differently may have a moral mindset that seems as universal and mandatory to them as ours does to us. Our moral sense is as vulnerable to illusion as our other senses.
It confuses morality with purity, status, conformity.
It reframes practical problems as moral crusades.
It imposes taboos that make some things undiscussable.
It always puts the self on the side of the angels.
We hit the moralisation button too quickly and look for villains rather than solutions, e.g. climate change [we frame the cause as over-indulgence and defilement; we moralise the problem and let feeling righteous get in the way of doing the right thing.]


Far from debunking morality, the science of the moral sense can allow us to see through the illusions we inherit from evolution and culture to focus on views we can share and defend.


[Medial (inward-facing part of the frontal lobes) – deals with emotions about other people.

Dorsolateral (upper, outer-facing area of the frontal lobes) – deals with mental computation, e.g. whether to go somewhere by train or car.

Anterior cingulate cortex – an evolutionarily ancient strip at the base of the inner surface of each cerebral hemisphere.]
