Archive for the ‘humanism’ Category
bonobos and us – lessons to be learnt

Let’s be sexy about this
Bonobos separated from chimps maybe less than a million years ago, according to some pundits. We haven’t yet been able to determine a more precise date for the split. So which species has changed more? Have chimps become more aggressive or have bonobos become more caring? Is there any way of finding out?
It’s not just about genes, it’s about their expression. It will take some time to work all that out. Brain studies too will help, as we move towards scanning and exploring brains more effectively and less invasively.
But surely we seek not just to understand the bonobo world but to change our own. Who wouldn’t want a world that was less violent, less exclusionary in terms of sex, more caring and sharing, without any loss of the dynamism and questing that has taken us to the very brink of the iPhone 7?
That last remark will date very quickly… Nah, I’ll leave it in.
So we can learn lessons, and of course we’re already on that path. Advanced societies, if that’s not too presumptuous a term, are less patriarchal than they’ve ever been, without losing any of their dynamism. On the contrary, it can easily be seen that the most male-supremacist societies in the world are also the most violent, the most repressive and the most backward. Some of those societies, as we know, have their backwardness masked by the fact that they have a commodity, oil, that the world is still addicted to, which has made the society so rich that their citizens don’t even have to pay tax. The rest of the world is supporting tyrannical regimes, which won’t change as long as they feel well-fed and secure. Not that I’d wish starvation and insecurity on anyone, but as Roland Barthes once said at one of his packed lectures, the people standing at the back who can’t hear properly and have sore feet must be wondering why they’re here.
Maybe a bit of discomfort, in the form of completely shifting away from fossil fuels for our energy needs haha, might bring certain Middle Eastern countries to a more serious questioning of their patriarchal delusions? Without their currently-valuable resource, they might wake to the fact that they need to become smarter. The women in those countries, so effective on occasion in forming coalitions to defend their inferior place in society, might be encouraged to use their collective power in more diverse ways. That could be how things socially evolve there.
Meanwhile in the west, the lesson of the bonobos would seem to be coalitions and sex. We’ve certainly arrived at an era where sexual dimorphism is irrelevant, except where women are isolated, for example in domestic situations. The same isolation also poses a threat to children. The bonobo example of coalitions and togetherness and sharing of responsibilities, and sexual favours (something we’re a long way from emulating, with our jealousies and petty rivalries) should be the way forward for us. Hopefully the future will see a further erosion of the nuclear family and a greater diversity of child-rearing environments, where single-parent families are far less isolated than they are today, and males want to help and support and teach children because they are children, not because they are their children…
was the invasion of Iraq justified?
“What difference does it make to the dead, the orphans and the homeless, whether the mad destruction is wrought under the name of totalitarianism or in the holy name of liberty or democracy?”
― Mahatma Gandhi
In 2003 I protested against the impending attack on Iraq, along with so many others, though I don’t like being involved in mass protests, because they tend to over-simplify the response. A lot of the protesters were saying things I didn’t agree with, as is often the case. For example, some were using the national sovereignty argument, which I have little time for. Others were saying that war is always wrong, but I think war can be justified if it results in less harm than non-intervention, though this isn’t always easy to determine. As a humanist, I don’t think national or cultural boundaries should interfere with what we owe, ethically, to others, though I recognise as a pragmatic fact that they often do.
To me, the Iraq invasion has always been a clear-cut case of a criminal act, resulting in a loss of life – hardly unforeseeable – far greater than that suffered by the USA on September 11, 2001. Furthermore, the September 11 atrocities, without which the invasion clearly would never have occurred, were in no way connected to the Iraqi regime. In the lead-up to the invasion, at the time of the protests, I was incensed, like others, at the Bush regime’s bullying treatment of the weapons inspectors in Iraq, and of Hans Blix in particular, because their findings didn’t fit the story Washington was trying to sell. This bullying extended, of course, to the leaders of major European nations such as France and Germany. The response of the French government to the possibility of war still seems to me the most sensible and prescient one. In January 2003 the French foreign minister, Dominique de Villepin, said ‘We think that military intervention would be the worst possible solution’, even though the French government felt at the time that Iraq wasn’t being truthful about WMD. In an impassioned speech to the Security Council only a few weeks later, Villepin spoke of the “incalculable consequences for the stability of this scarred and fragile region”, whose overwhelmingly Muslim inhabitants had sound historical reasons for suspecting and wanting to resist western interventions. He said that “the option of war might seem a priori to be the swiftest, but let us not forget that having won the war, one has to build peace”. He also reported on the intelligence of France and its allies, which failed comprehensively to support links between al-Qaeda and Hussein’s regime. Of course, Villepin’s speech was roundly rejected and disparaged by the US and UK leadership, and the rest is the history we’re making and trying to make sense of today.
I’m returning to the subject for two reasons – a philosophical summary of pacifism and just war theory in a recent issue of Philosophy Now magazine (issue 102), and the views of British leftist but pro-Iraq war writers such as Nick Cohen.
In 2006, a document called the Euston Manifesto was produced in Britain. A leftist document, it was designed to draw the line against what its authors and signatories claimed to be an overly-indulgent, cultural relativist tendency in a large sector of the leftist commentariat. The document focused largely on the positives – upholding human rights, freedom of expression, pluralism, liberalism, historical truth, the heritage of democracy, internationalism and equality. It expressed opposition to tyranny and terrorism, racism, misogyny and censorship. In more specific terms, it supported a two-state solution to the Palestinian conflict and opposed anti-Americanism – though in a somewhat backhanded way:
That US foreign policy has often opposed progressive movements and governments and supported regressive and authoritarian ones does not justify generalized prejudice against either the country or its people.
This is all outlined in the manifesto’s ‘statement of principles’ (section B), none of which I have any issue with. Section C, ‘elaborations’, addresses the Iraq war, inter alia, and is a little more problematic. Just before the Iraq campaign is dealt with there’s a paragraph on the September 11 attacks, which is uncompromisingly hostile to the view that they could be in any way justified as payback for US policy in the Middle East. Again I completely agree.
The paragraph that follows is interesting, and I will quote it in full, always remembering that it was written in 2006, before the execution of Saddam Hussein, and not long after the first parliamentary elections. Much has changed since then, with Iraqi governments becoming less democratic, and the contours of instability constantly changing.
The founding supporters of this statement took different views on the military intervention in Iraq, both for and against. We recognize that it was possible reasonably to disagree about the justification for the intervention, the manner in which it was carried through, the planning (or lack of it) for the aftermath, and the prospects for the successful implementation of democratic change. We are, however, united in our view about the reactionary, semi-fascist and murderous character of the Baathist regime in Iraq, and we recognize its overthrow as a liberation of the Iraqi people. We are also united in the view that, since the day on which this occurred, the proper concern of genuine liberals and members of the Left should have been the battle to put in place in Iraq a democratic political order and to rebuild the country’s infrastructure, to create after decades of the most brutal oppression a life for Iraqis which those living in democratic countries take for granted — rather than picking through the rubble of the arguments over intervention.
Since this post is precisely about the arguments over intervention, I should say something in justification of my writing it. While we can’t predict precisely the outcome of an intervention or invasion or liberation (words are so important here), there are often broad and quite obvious signs to indicate whether such an event will advantage or disadvantage the targeted population. In analysing these signs we utilise history (or we should do) – that’s to say, we pick through the rubble of previous experiences of intervention. The question of whether the invasion (or whatever you choose to call it) of Iraq was justified is therefore a question about the future as well as the past. How, in the future, and in the present, should we, as humanists, deal with oppressive, reactionary, murderous regimes, such as exist today in North Korea, in Myanmar, and in the wannabe state of ‘the caliphate’? Not to mention so many other dictatorial regimes whose likely ‘murderousness’ is hard to get data on, such as China, Russia, Saudi Arabia and other Asian and African tyrannies large and small.
I also have a quibble with the view that all good liberal leftists, regardless of their position before the war, should jump on board with the invaders to ‘remake’ Iraq into a democracy. The obvious problem with this view is that many of the anti-war protesters were concerned, and deeply so, that the reason for the invasion wasn’t democracy-building. The stated reason for the invasion, after all, was a defensive one: getting rid of WMDs to make the world a safer place. Other reasons were suspected, including simple restoration of US pride, and economic exploitation. The bullishness of the invasion rhetoric didn’t sound much like an attempt at democracy-building.
But I think the overwhelming reason for this deep concern – it was certainly my concern – was the suffering and harm that the invasion and aftermath would inflict on the people of Iraq. Nations invaded by foreigners tend to fight back, regardless of how much of a basket case the invaders think the nation is. This is even more the case when the ‘liberators’ are seen as having values antithetical to the target nation. Think of the consternation caused by the threatened invasion of England by the Spanish in the 1580s, or the French in the early 1800s, surely mild compared to that felt by the overwhelmingly Muslim Iraqis, fed for decades on tales of western decadence and double-dealing. An invasion would be fought bitterly, Hussein or no Hussein, and democracy isn’t the sort of thing to be imposed from above. So it’s understandable that those opposed to the invasion, and crushed by their failure to stop it, didn’t rush to join hands with those whose motives they so distrusted in an enthusiastic experiment in nation-restructuring.
I’m no pacifist, and I’m concerned and demoralised by brutal dictatorships everywhere – many of which we know little about. I would like to see interventions wherever murder and oppression are the weapons of state control, but that’s a big ask, and where do we start, and how do we do it? Warfare is one of the most problematic options, but will a siege of sanctions be effective? A united, internationalist front which will offer credible threats – desist and democratise or else? And should we start with the tinpot dictatorships and work our way up to the giants? Which leads back to the question, why Iraq in the first place?
Muddled motives and intentions lead inevitably to muddled and contradictory outcomes. Indeed the stated motive for the intervention, dismantling WMDs and making the rest of the world a safer place, didn’t consider the Iraqi people directly at all. On that basis alone, the war could hardly be justified, because it was clear that even if Hussein’s weapons existed, they were not an imminent threat, with the dictator doing everything in his power to placate the west. Hussein was brutal and nasty, but his instinct for self-preservation was paramount, and it was clear in the last days of his regime that he was saving his sabre-rattling for his domestic audience while bending over backwards to comply with international demands.
One argument being put at the time was that anything was better than Saddam. But is this really the case? Consider two polar scenarios: a failed state in which there are no government regulations, and no police or legal institutions, an anarchic free-for-all; or a rigid dictatorship in which freedom is highly circumscribed and much that we value in life is sacrificed just for survival. Which is better? Well, with that very slight sketch it’s impossible to judge, but neither is very palatable. In the case of Iraq it would be comparing a ‘known’ with an ‘unknown’. The result of deposing Saddam was unknown and poorly planned for, but clearly it would unleash violent forces, and we knew from organisations such as Human Rights Watch that the day-to-day dictatorship, though repressive, wasn’t murderous at the time of the invasion.
My concern then, was saving lives, or more broadly, minimising harm. One thing I’ve always loathed is the ‘big picture’ politics of certain world leaders who like to redraw maps and bring down regimes with grand strategies, with very little thought to the ordinary struggles for survival, the lives and loves of people who suffer the consequences of those grand plans – including death and destruction. Of course, harm minimisation is fiendishly difficult to quantify when you’re talking about such variables as freedom and opportunity, but at least we can try. Just war theory might help us with some guidelines.
Duane Cady, Professor of Philosophy, Emeritus at Hamline University, Minnesota, provides a two-part outline of just war theory as currently understood. I’ll focus only on the first part, which seeks to answer the question – When is it justified to go to war?
Going to war justly requires meeting 6 conditions:
1. The war must be made on behalf of a just cause
2. The decision to go to war must be made by proper authorities
3. Participants must have a good intention rather than revenge or greed as their goal
4. It must be likely that peace will emerge after the war
5. Going to war must be a last resort
6. The total amount of evil resulting from making war must be outweighed by the good likely to come of it.
I hardly need to go into detail to show that a number of these conditions were not met in the case of the Iraq venture, but I’ll briefly discuss each one.
For condition 1, if WMDs were the cause, then it wasn’t just, as there weren’t any, and the best intelligence showed this. Other causes, such as getting rid of a despot, bringing about democracy, lead to the question – why Iraq? Why not Syria, or Saudi Arabia? Why pick on any Middle Eastern country where western interference would be fiercely combatted?
For condition 2, there are supposed to be strict rules regarding such decisions, though of course they’re unenforceable. In September 2004, the then UN Secretary General, Kofi Annan, declared the Iraq invasion illegal from the point of view of the UN’s charter, presumably because of insufficient numbers in the Security Council agreeing to it. If you consider the UN the proper authority to make such final decisions – and if not, what would be? – then condition 2 hasn’t been met.
Condition 3 goes to intentions, which might be muddled or concealed. My view is that revenge, or wounded pride, had much to do with it on the US side. People may disagree, but nobody can seriously argue that the Bush administration’s intentions were clear and humane.
Condition 4 gives no timeline. ‘After’ is a long time, and peace might be achieved at the cost of maximal loss of life. The condition is a little too vague to be useful. Certainly, a quick peace looked highly unlikely, and I think that was a major concern of protesters worldwide.
Condition 5 clearly wasn’t met. The term ‘last resort’ implies something else – a last resort before x occurs, with x being something catastrophic and to be avoided at all costs. Whether there was an x in Iraq’s case is highly questionable.
In the long view, I think, or fervently hope, condition 6 will be met, but that’s only because I’m a ‘better angels of our nature’ advocate, and anyway the lack of a time-frame attached to the condition renders it essentially meaningless. Is Europe now more humane and peaceful as a result of the Thirty Years’ War? To what degree is our greater tolerance of diversity a direct result of the Nazis’ homogenising race policies? There’s no doubt that the most horrible wars can result in massive lessons learnt, leading to accelerated positive outcomes, but that in no way justifies them.
So, okay, the Iraq war was a disaster. However, I thoroughly agree with Alex Garland, the writer and film-maker, who referred briefly to the war in a recent Point of Inquiry interview. It’s too late to wonder about whether the invasion of Iraq was a good idea, and it was essentially too late even when the protests began in 2003, as it had a horrible inevitability about it. Trying to work out the consequences, to minimise the negatives and maximise the positives, and to take responsibility for those consequences, is much more important. Particular nations, including Australia, imposed this invasion on the Iraqi people. Those nations, above all, should take most of the responsibility for the consequences. I don’t think that’s really happening at the moment.
some thoughts on humanism and activism
I’ve been a little more involved in ‘movements’ in recent years, though I’m not usually much of a joiner, and I’ve always been wary of ‘activism’, which is often associated with protesting, personning the barricades (doesn’t have quite the aggressive ring to it, does it?), even a bit of biffo – if largely verbal, by preference. I’ve just been hungry for a bit of stimulus – salon culture, witty and cultured and informative exchanges with people cleverer than myself. But since I’ve been occasionally asked to engage on a higher, or deeper level, in ‘the culture wars’, on the side of reason, atheism, secularism, humanism, whatever, my thoughts on the matter have started to crystallise, and they’re hopefully in evidence in my blog writing.
I don’t mind calling myself an activist for humanism, or for other isms, but I think we should be activists for rather than against. Now it might be argued that to argue for one thing is to argue against another, so it doesn’t really matter, but I think it matters a great deal. It’s a matter of trying to be positive and influencing others with your positivity. Secular humanism has a great case to promote, as do reason, self-awareness and ‘skepticism with sympathy’.
I’ve learned from years of teaching students from scores of different countries and cultures that we all can be excited by learning new stuff, that we’re amused by similar things, that we all want to improve and to be loved and appreciated. The ties that bind us as humans are far greater than those that divide us culturally or in other ways. I’ve also learned that the first principle of good teaching is to engage your students, rather than haranguing or badgering them. This may not seem easy when you’re teaching something as apparently dry and contentless as language and grammar, but language is essentially a technology for communicating content, and if we didn’t have anything meaningful or important to communicate, we’d never have developed it. So the key is to engage students with content that’s relevant to them, and stimulating and thought-provoking enough that they’ll want to communicate those thoughts.
I suppose I’m talking about constructive engagement, and this is the best form of activism. Of course, like everyone, I don’t always ‘constructively engage’. I get mad and frustrated, I dismiss with contempt, I feel offended or vengeful, yet the best antidote to those negative feelings is simple, and that is to throw yourself into the lives, the culture, the background of your ‘enemy’, or the ‘other’, which requires imagination as well as knowledge. I mis-spent a lot of my youth reading fiction from non-English backgrounds – from France and Germany, from Russia and eastern Europe, from Africa and Asia. It was a lot cheaper than travelling, especially as I avoided a lot of paid work in order to indulge my reading. Of course I read other stuff too, history, philosophy, psychology, new-wave feminism, but fiction – good fiction, of course – situated all these subjects and issues within conflicted, emotional, culturally-shaped and striving individuals, and provided me with a sense of the almost unfathomable complexity of human endeavour. The understanding of multiple backgrounds and contexts, especially when recognising that your own background is a product of so much chance, creates multiple sympathies, and that’s essential to humanism, to my mind.
However, there are limits to such identifications. Steven Pinker discusses this in The better angels of our nature (the best advertisement for humanism I’ve ever read) by criticising the overuse, or abuse, of the term ’empathy’ and expressing his preference for ‘sympathy’. Empathy is an impossible ideal, and it can involve losing your own bearings in identifying with another. There are always broader considerations.
Take the case of the vaccination debate. While there are definitely charlatans out there directly benefitting from the spread of misinformation, most of the people we meet who are opposed to vaccination aren’t of that kind, usually they have personal stories or information from people they trust that has caused them to think the way they do. We can surely feel sympathy with such people – after all, we also have had personal experiences that have massively influenced how we think, and we get much of our info from people we trust. But we also have evidence, or know how to get it. We owe it to ourselves and others to be educated on these matters. How many of us who advocate vaccination know how a vaccine actually works? If we wish to enter that particular debate, a working knowledge of the science is an essential prerequisite (and it’s not so difficult, there’s a lot of reliable explanatory material online, including videos), together with a historical knowledge of the benefits of vaccination in virtually eradicating various diseases. To arm yourself with and disseminate such knowledge is, to me, the best form of humanist activism.
I’ll choose a couple more topical issues, to look at how we could and should be positively active, IMHO. The first, current in Australia, is chaplaincy in schools. The second, a pressing issue right now for Australians but of universal import, is capital punishment.
The rather odd idea of chaplaincy in schools was first mooted by Federal Minister Greg Hunt in 2006 after lobbying from a church leader and was acted upon by the Howard government in 2007. It was odd for a number of reasons. First, education is generally held to be a state rather than a federal responsibility, and second, our public education system has no provision in it for religious instruction or religious proselytising. The term ‘chaplain’ has a clear religious, or to be more precise Christian, association, so why, in the 21st century, in an increasingly multicultural society in which Christianity was clearly on the decline according to decades of census figures, and more obviously evidenced by scores of empty churches in each state, was the federal government introducing these Christian reps into our schools via taxpayer funds? It was an issue tailor-made for humanist organisations, humanism being dedicated – and I trust my view on this is uncontroversial – to emphasising what unites us, in terms of human rights and responsibilities, rather than what divides us (religion, nationality, gender, sexual orientation etc). To introduce these specifically Christian workers, out of the blue, into an increasingly non-Christian arena, seemed almost deliberately divisive.
Currently the National School Chaplaincy Program is in recess, having been stymied by two effective High Court challenges brought by a private citizen, Ron Williams, of the Humanist Society of Queensland. As far as I’m aware, Williams’ challenge was largely self-funded, but assisted by a donation from at least one of the state humanist societies. This was a cause that could and should have been financed and driven by humanists in a nationally co-ordinated campaign, which would have enabled humanists to have a voice on the issue, and to make a positive contribution to the debate.
What would have been that contribution? Above all, to provide evidence for the growing secularism and multiculturalism of the nation, and therefore for the clearly anachronistic and potentially divisive nature of the government’s policy. Identification with every Christian denomination is dropping as a percentage of the national population, and the drop is accelerating. This is nobody’s opinion; it’s simply a fact. Church attendance is at the lowest it’s ever been in our Christian history – another fact. Humanists could have gone on the front foot in questioning the role of these chaplains. In the legislation they’re expected to provide “support and guidance about ethics, values, relationships and spirituality”, but there’s an insistence that they shouldn’t replace school counsellors, for counselling isn’t their role. Apparently they’re to provide support without counselling, just by ‘being there’. Wouldn’t it be cheaper to just have their photos on the school walls? The ‘spirituality’ role is one that humanists could have a lot of fun with. I’ve heard the argument that people are just as religious as ever, but that they’ve rejected the established churches, and are developing their own spirituality, their own relationship to their god, so I suppose it would follow that their spirituality needs to be nourished at school. But the government has made a clear requirement that chaplains need to be members of an established religion (and obviously of a Christian denomination), so how exactly is that going to work?
While humour, along with High Court challenges and pointed questions about commitment to real education and student welfare, would be the way to ‘get active’ with the school chaplaincy fiasco, the capital punishment issue is rather more serious.
The Indonesian decision to execute convicted drug pedlars of various nationalities has attracted a lot of unwanted publicity, from an Indonesian perspective, but a lot of the response, including some from our government, has been lecturing and hectoring. People almost gleefully describe the Indonesians as barbarians and delight in the term ‘state-sanctioned murder’, mostly unaware of the vast changes in our society that have made capital punishment, which ended here in the sixties, seem like something positively medieval. These changes have not occurred to the same degree in other parts of the world, and as humanists, with a hopefully international perspective, we should be cognisant of this, aware of the diversity, and sympathetic to the issues faced by other nations with serious drug and crime problems. But above all we should look to offer humane solutions.
By far the best contribution to this issue I’ve heard so far has come from Richard Branson, representing the Global Commission on Drug Policy (GCDP), who spoke of his and other commissioners’ interest in speaking to the Indonesians about solutions to their drug problems, not to lecture or to threaten, but to advise on drug policies that work. No mention was made of capital punishment, which I think was a good thing, for what has rendered capital punishment obsolete more than anything else has been the development of societies that see their members as flawed but capable, mostly, of development for the better. Solutions to crime, drug use and many other issues – including, for that matter, joining terrorist organisations – are rarely punitive. They involve support, communication and connection. Branson, interviewed on the ABC’s morning news program, pointed to the evidence showing that harsh penalties had no effect on the drug trade, and that the most effective policy by far was legalisation. It’s probably not a story that our government would be sympathetic to, and it takes us deeply into the politics of drug law reform, but it is in fact a science-based approach to the issue that humanists should be active in supporting and promulgating. Branson pointed to the example of Portugal, which had, he claimed, drug problems as serious as Indonesia’s, which have since been greatly alleviated through a decriminalisation and harm-reduction approach.
I hope to write more about the GCDP’s interesting and productive-looking take on drug policy on my Solutions OK website in the future. Meanwhile, this is just the sort of helpful initiative that humanists should be active in getting behind. Indonesians are arguing that the damage being done by drug pushers requires harshly punitive measures, but the GCDP’s approach, which bypasses the tricky issue of national sovereignty, and capital punishment itself, is offered in a spirit of co-operation that is perfectly in line with an active, positive humanism.
So humanism should be as active as possible, in my view, and humanists should strive to get themselves heard on such broad issues as education, crime, equity and the environment, but they should enter the fray armed with solutions that are thoughtful, practicable and humane. Hopefully, we’re here to help.
disassembling Kevin Vandergriff’s gish gallop, part 2
Feeling almost apologetic for dwelling on this for too long, with so many more important themes to tackle. Of course some out there, especially in those most heatedly devout parts of the USA, might consider that no more essential topic exists than giving proper due to the supernatural creator of the universe, but I would disagree, and I suppose here’s where I get to say why.
I was discussing Vandergriff’s third contention, that ‘Christian theism has significantly more explanatory power and scope than specified naturalism’. Here is his second argument for this:
God is the best explanation for why space-time and all its contents exist, rather than nothing.
Of course space-time has only existed as a familiar concept for about a century. It may well be replaced, or amended, by another concept, and I’m sure Christian theists will find their god to be the best explanation for that too. He’s amazingly flexible that way. Vandergriff here talks of a proof of supernatural causation under the presupposition that the universe is eternal but necessarily caused. It’s rather an unsurprising one drawn from a famous conundrum of quantum mechanics, that quantum indeterminacy can only be resolved through observation. The observation ‘collapses the wave function’. Vandergriff, or the person who posits this ‘proof’, then leaps from quantum states to the state of the universe. ‘What, or who, collapses its wave function?’ Vandergriff asks. This doesn’t strike me as a particularly valid leap. It seems more a desperate grab for an analogy. I’m not that boned up on my fallacies, but this might be the fallacy of the excluded middle, inter alia. I mean, ‘quantum/universal indeterminacy, therefore god’ does seem to take for granted an awful lot of in-between stuff. The supposed essential recourse to a disembodied mind, again suggested here, fails because Vandergriff has presented no argument to show that this ‘disembodied mind’ is anything more than an abstract object. The play of such words as ‘necessary’ and ‘contingent’ really gets us nowhere in providing answers to the very interesting questions around the beginnings of our universe and the well-established weirdness of quantum mechanics, regardless of whether the two are related.
The third argument is taken directly from William Lane Craig:
God is the best explanation of the applicability of mathematics to the physical world.
I’ve answered this claim from Craig here, though I’m amused at Vandergriff’s gloss, in that we’re still not sure that the Higgs boson has been discovered, as the data could well fit other scenarios. In any case, the main point about mathematics is clear. Mathematics seems highly abstract nowadays because over time, and through painstaking human effort, it has moved a long way from its beginnings. Mathematics developed as a tool to describe particular objects in general terms that could be manipulated and developed: number, for example, leading to multiplication, division, functions and the various forms of calculus. All of these developments, and later ones, make use of regularities, or explore regularities (some of which as yet have no known applicability). It’s hard to conceive of a physical world that has no regularity. All elements are describable, mathematically, in terms of their properties, which are regular, i.e. describable. Try to describe something that has no regularity at all. It would have no shape, no boundary between it and not-it. If this convinces you that a creator god exists, it’s likely that you were already convinced. As to a super-rational creator, which Vandergriff tries to point to, that would hardly be the brutal monster of the Old Testament who slaughters children and babies in a flood and supports the massacres of whole populations in favour of his ‘chosen people’.
Argument 4: God is the best explanation of the discoverability of the universe.
This is really just a repetition of the previous argument. The universe, to be physical (and therefore discoverable in terms of its properties) has to be regular. However, human development ‘at just the right time’ to discover the universe’s properties and origins supposedly supports a fine-tuning argument, as developed by Hugh Ross, a Christian astrophysicist who put forward this argument in the early nineties. The late Victor Stenger, among many others, has put these arguments to the sword. There’s also a problem with this and with other ‘best explanation’ arguments in that they are essentially self-refuting ‘first cause’ arguments. David Hume was one of the first to point out the deficiencies of such arguments centuries ago. Attempts to improve on them are well summarised and dealt with by the philosopher Theodore Schick here. To me, one of the best arguments against fine-tuning relating us humans to the supernatural creator is its grotesquely overwhelming wastefulness. Why create a universe so enormously inhospitable to intelligent life throughout almost the entirety of its vast expanse in order to permit us humans to finally thrive on our small planet through a history of great suffering? A super-rational being could surely do better, and chance seems a much more coherent explanation.
Argument 5: God is the best explanation of why there are embodied morally responsible agents.
I presume Vandergriff is talking here about cetaceans. Or maybe not. In any case, the existence of such agents, he claims, is more probable under theism. Presumably his claim is based here on the idea that it would be more fun to create a universe with moral agents in it than, say, living beings who are little more than scuttling stomachs. Yet considering how enormously complex and diverse these scuttling stomachs are, it seems clear that, if Vandergriff’s god created them, he seems to have found them great fun. You can hardly argue with J B S Haldane’s remark that the guy has an inordinate fondness for beetles.
Vandergriff talks about the unique human ability for self-control and control over our environment because ‘our brains are the most complex things in the universe’. How does he know this? Well, he doesn’t. This line has often been used, by Richard Dawkins amongst many other scientists, but always, as far as I’m aware, with the cautionary addendum ‘according to our current knowledge’. And our current knowledge of the universe, I and many others would argue, is minuscule, in spite of the great strides we’ve made. Vandergriff is concerned here to emphasise human specialness. He describes, without providing any names, how various physical scientists have been ‘stunned’ to discover that the universe must have been fine-tuned to extraordinary precision to provide for this embodied moral agency. Yet this moral agency appears to exist, to varying degrees, in a number of social species on our planet (which Vandergriff doesn’t acknowledge). In any case, I’m sure plenty of other prominent physical scientists could be found who are considerably less ‘stunned’.
Argument 6: God is the best explanation of moral agents who apprehend necessary moral truths.
I don’t believe there are ‘necessary moral truths’, and I don’t find this a particularly interesting philosophical theme, though it obviously strongly exercises some philosophers.
In giving his example taken from Darwin and the behaviour of hive bees, however, Vandergriff completely misrepresents natural selection, comparing what natural selection ‘happens upon’ with the rational choices of human beings. I would strongly argue that there is more to natural selection than just ‘happening upon’ or ‘chance’ as theists like to describe it. Most theists like to think we’re rational moral agents guided by, or able to be guided by, their god; though how the god does the guiding can never be properly answered. Vandergriff cites the prohibition against rape as a necessary moral truth, but Christians have raped women throughout history, in times of warfare, just as readily as have members of other religions. Rape statistics are notoriously difficult to compare from nation to nation, because states have different laws, definitions, reporting methods and resources. It’s clear from even the most casual examination that cultural attitudes to rape vary widely. We don’t find a consistent or clear-cut prohibition against rape in the Bible. However in modern western countries, especially with the advent of feminism, rape has been raised to a higher level of seriousness as a crime. This hasn’t been driven by organised religion, so it just seems absurd to assert, or even to intimate, that the prohibition against rape is a necessary truth derived from a supernatural being.
Vandergriff talks about natural selection or evolution as being only conducive to our survival, and seems to find it unlikely that our ‘necessary moral truths’ or our aesthetic tastes or even such traits as benevolence or kindness could have been selected for, claiming that these qualities are unlikely under naturalism but highly likely under theism. Yet it’s abundantly clear that reducing the incidence of rape, developing better medicines, resolving conflicts by peaceful means, promoting sympathy for others, including those of other species, and exercising restraint and thoughtfulness in our personal lives is conducive, not only to our survival, but to our success and our enrichment. We’ve learned this, not through communication with spirits, but through honest examination of our own past behaviour as a species. It seems to me that it’s through these painstaking examinations that we’re learning to reduce our common misery and to promote our well-being. We’re learning from our mistakes, even if it’s a ‘two steps forward, one step back’ process. A thorough-going education system is essential in disseminating what we’ve learned from the past and carrying those gleanings into the future. It’s precisely because there are no necessary truths, because we could always go back to achieving our ends through brutality, dishonesty and blinkered self-promotion, that we need to maintain awareness of past errors, and of the complex needs of those around us and to whom we’re attached, including humans and non-humans.
Vandergriff has more ‘arguments’, which I’ll deal with next time, though I’m looking for ways to cut this short!
1914 – 2014: celebrating a loss of appetite
I’ve read at least enough about WW1 to be aware that its causes, and the steps made towards war, were very complex and contestable. There are plenty of historians, professional and amateur, who’ve suggested that, if not for x, or y, war may have been avoided. However, I don’t think there’s any doubt that a ‘force’, one which barely exists today, a force felt by all sides in the potential conflict of the time, made war very difficult to avoid. I’ll call this force the appetite for war, but it needs to be understood more deeply, to divest it of its vagueness. We know that, in 1914, lads as young as 14 sneaked their way into the militaries of their respective countries to experience the irresistible thrill of warfare. A great many of them paid the ultimate price. Few of these lambs to the slaughter were discouraged from their actions – on the contrary. Yet 100 years on, this attitude seems bizarre, disgusting and obscene. And we don’t even seem to realise how extraordinarily complete this transformation has been.
Let’s attempt to go back to those days. They were the days when the size of your empire was the measure of your manliness. The Brits had a nice big fat one, and the Germans were sorely annoyed, having come late to nationhood and united military might, but with few foreign territories left to conquer and dominate. They continued to build up their arsenal while fuming with frustration. Expansionism was the goal of all the powerful nations, as it always had been, and in earlier centuries, as I’ve already outlined, it was at the heart of scores of bloody European conflicts. In fact, it’s probably fair to say that the years of uneasy peace before 1914 contributed to the inevitability of the conflict. Peace was considered an almost ‘unnatural’ state, leading to lily-livered namby-pambiness in the youth of Europe. Another character-building, manly war was long overdue.
Of course, all these expansionist wars of the past led mostly to stalemates and backwards and forwards exchanges of territory, not to mention mountains of dead bodies and lakes of blood, but they made numerous heroic reputations – Holy Roman Emperor Charles V and his son Philip II of Spain, Gustavus Adolphus of Sweden, Frederick the Great of Prussia, Peter the Great of Russia, Louis XIV of France and of course Napoleon Bonaparte. These ‘greats’ of the past have always evoked mixed reactions in me, and the feelings are well summed up by Pinker in The Better Angels of our Nature:
The historic figures who earned the honorific ‘So-and-So the Great’ were not great artists, scholars, doctors or inventors, people who enhanced human happiness or wisdom. They were dictators who conquered large swaths of territory and the people in them. If Hitler’s luck had held out a bit longer, he probably would have gone down in history as Adolf the Great.
While I’m not entirely sure about that last sentence, these reflections are themselves an indication of how far we’ve come, and how far we’ve been affected by the wholesale slaughter of two world wars and the madness of the ‘mutually assured destruction’ era that followed them. The fact that we’ve now achieved a military might far beyond the average person’s ability to comprehend, rendering obsolete the old world of battlefields and physical heroics, has definitely removed much of the thrill of combat, now more safely satisfied in computer games. But let’s return again to that other country, the past.
In the same month that the war began, August 1914, the Order of the White Feather was founded, with the support of a number of prominent women of the time, including the author and anti-suffragette Mrs Humphry Ward (whom we might now call Mary) and the suffragette leaders Emmeline and Christabel Pankhurst. It was extremely popular, so much so that it interfered with government objectives – white feathers were sent even to those convalescing from the horrors of the front lines, and to those dedicated to arms manufacturing in their home countries. Any male of a certain age who wasn’t in uniform or ‘over there’ was fair game. Not that the white feather idea was new with WWI – it had been made popular by the novel The Four Feathers (1902), set in the First War of Sudan in 1882, and the idea had been used in the British Empire since the eighteenth century – but it reached a crescendo of popularity, a last explosive gasp. Or not quite, for it was revived briefly during WWII. Since then, though, partly as a result of the greater awareness of the carnage of WWI, the white feather has been used more as a symbol of peace and pacifism. The Quakers in particular took it to heart as a badge of honour, and it became a symbol for the British Peace Pledge Union (PPU) in the thirties, a pacifist organisation with a number of distinguished writers and intellectuals, such as Aldous Huxley, Bertrand Russell and Storm Jameson.
There was no PPU or anything like it, however, in the years before WWI. Yet the enthusiasm for war of 1914 soon met with harsh reality in the form of Ypres and the Somme. By the end of 1915 the British Army was ‘depleted’ to the tune of over half a million men, and conscription was introduced, for the first time ever in Britain, in 1916. It had been mooted for some time, for of course the war had been catastrophic for ordinary soldiers from the start, and it quickly became clear that more bodies were needed. Not surprisingly, though, resistance to the carnage had begun to grow. An organisation called the No-Conscription Fellowship (NCF), consisting mainly of socialists and Quakers, was established, and it campaigned successfully to have a ‘conscience clause’ inserted in the 1916 Military Service (conscription) Act. The clause allowed people to refuse military service if it conflicted with their beliefs, but they had to argue their case before a tribunal. Of course ‘conshies’ were treated with some disdain, and were less tolerated by the British government as the war proceeded, during which time the Military Service Act was expanded, first to include married men up to 41 years of age (the original Act had become known as the Bachelor’s Bill) and later to include men up to 51 years of age. But the British government’s attitude didn’t necessarily represent that of the British people, and the NCF and related organisations grew in numbers as the war progressed, in spite of government and jingoist media campaigns to suppress them.
In Australia, two conscription plebiscites, in 1916 and 1917, failed by a slim margin. In New Zealand, the government simply imposed the Military Service Act on its people without bothering to ask them. Those who resisted were often treated brutally, but their numbers increased as the war progressed. However, at no time, in any of the warring nations, did the anti-warriors have the numbers to be a threat to their governments’ ‘sunken assets’ policies.
So why was there such an appetite then and why is the return of such an appetite unthinkable today? Can we just put it down to progress? Many skeptics are rightly suspicious of ‘progress’ as a term that breeds complacency and even an undeserved sense of superiority over the primitives of the past, but Pinker and others have argued cogently for a civilising process that has operated, albeit partially and at varying rates in various states, since well before WWI, indeed since the emergence of governments of all stripes. The cost, in human suffering, of WWI and WWII, and the increasingly sophisticated killing technology that has recently made warfare as unimaginable and remote as quantum mechanics, have led to a ‘long peace’ in the heart of Europe at least – a region which, as my previous posts have shown, experienced almost perpetual warfare for centuries. We shouldn’t, of course, assume that the present stability will be the future norm, but there are reasons for optimism (as far as warfare and violence are concerned – the dangers for humanity lie elsewhere).
Firstly, the human rights movement, in the form of an international movement dedicated to peace and stability between nations for the sake of their citizens, was born out of WWI in the form of the League of Nations, which, while not strong enough to resist the Nazi impetus toward war in the thirties, formed the structural foundation for the later United Nations. The UN is, IMHO, a deeply flawed organisation, based as it is on the false premise of national sovereignty and the inward thinking thus entailed, but as an interim institution for settling disputes and at least trying to keep the peace, it’s far better than nothing. For example, towards the end of the 20th century, the concepts of crimes against humanity and genocide were given more legal bite, and heads of state began, for the first time in history, to be held accountable for their actions in international criminal courts run by the UN. Obviously, considering the invasion of Iraq and other atrocities, we have a long way to go, but hopefully one day even the most powerful and, ipso facto, most bullying nations will be forced to submit to international law.
Secondly, a more universal and comprehensive education system in the west, which over the past century and particularly in recent decades, has emphasised critical thinking and individual autonomy, has been a major factor in the questioning of warfare and conscription, and in recognising the value of children and youth, and loosening the grip of authority figures. People are far less easily conned into going into war than ever before, and are generally more sceptical of their governments.
Thirdly, globalism and the internationalism of our economy, our science, our communications systems, and the problems we face, such as energy, food production and climate change, have meant that international co-operation is far more important to us than empire-building. Science, for those literate enough to understand it, has all but destroyed the notion of race and all the baggage attendant upon it. There are fewer barriers to empathy – to attack other nations is tantamount to attacking ourselves. The United Nations, ironic though that title often appears to be, has spawned or inspired many other organisations of international co-operation, from the ICC to the Intergovernmental Panel on Climate Change.
There are many other related developments which have moved us towards co-operation and away from belligerence, among them being the greater democratisation of nations – the enlargement of the franchise in existing democracies or proto-democracies, and the democratisation of former Warsaw Pact and ‘Soviet Socialist’ nations – and the growing similarity of national interests, leading to more information and trade exchanges.
So there’s no sense that the ‘long peace’ in Europe, so often discussed and analysed, is going to be broken in the foreseeable future. To be sure, it hasn’t been perfect, with the invasions of Hungary in 1956 and Czechoslovakia in 1968, and the not-so-minor Balkans War of the 90s, and I’m not sure if the Ukraine is a European country (and neither are many Ukrainians it seems), but the broad movements are definitely towards co-operation in Europe, movements that we can only hope will continue to spread worldwide.
the rise of the nones, or, reasons to be cheerful (within limits)
This is a presentation based on a couple of graphs.
The rise of the nones, that is, those who answer ‘none’ when asked about their religious affiliation in surveys and censuses, has been one of the most spectacular, and often unheralded, developments of the last century in the west. It has been most spectacular in the past 50 years, and it appears to be accelerating.
The rise of the nones in Australia
This graph tells a fascinating story about the rise of the nones in Australia. It’s a story that would, I think, share many features with other western countries, such as New Zealand and Canada, but also the UK and most Western European nations, though there would be obvious differences in their Christian make-up.
The graph comes from the Australian Bureau of Statistics, and it presents the answers given by Australians to the religious question in each census from 1901 to 2011. The blue bar represents Anglicans. In the early 20th century, Anglicanism was the dominant religion, peaking in 1921 at about 43% of the population. Its decline in recent years has been rapid. English immigration has obviously slowed in recent decades, and Anglicanism is on the nose now even in England. In 2011, only 17% of Australians identified as Anglicans. The decline is unlikely to reverse itself, obviously.
The red striped bar represents Catholics – I’ll come to them in a moment. The grey hatched bar represents devotees of other Christian denominations. In the last census, just under 19% of Australians were in that category, and the percentage is declining. The category is internally dynamic, however, with Uniting Church, Presbyterian and Lutheran believers dropping rapidly and Pentecostals very much on the rise.
The green hatched bar represents the nones, first represented in 1971, when the option of saying ‘none’ was first introduced. This came about as a result of pressure from the censuses of the sixties – that seminal decade – when people were declaring that they had no religion even though there was no provision in the census to do so. Immediately, as you can see, a substantial number of nones ‘came out’ in the 71 census, and the percentage of ‘refuseniks’ (the purple bar) was almost halved. But then in the 76 census, the percentage of refuseniks doubled again, while the percentage of nones increased. The Christians were the ones losing out, a trend that has continued to the present. Between 1996 and 2006 the percentage of self-identifying Christians dropped from 71% to 64% – a staggering drop in 10 years. The figure now, after the 2011 census, is down to 61%. If this trend continues, the percentage of Christians will drop below 50% by the time of the 2031 census. Of course predictions are always difficult, especially about the future.
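As a back-of-envelope illustration only (my own rough sketch, not anything official), a simple linear fit to the three figures quoted above – 71% in 1996, 64% in 2006 and 61% in 2011 – is consistent with that projection:

```python
# Percentages of self-identified Christians quoted in the text above.
years = [1996, 2006, 2011]
christian_pct = [71.0, 64.0, 61.0]

# Least-squares slope, in percentage points per year.
n = len(years)
mean_y = sum(years) / n
mean_p = sum(christian_pct) / n
slope = sum((y - mean_y) * (p - mean_p)
            for y, p in zip(years, christian_pct)) \
        / sum((y - mean_y) ** 2 for y in years)

def projected(year):
    """Project forward linearly from the 2011 figure."""
    return christian_pct[-1] + slope * (year - 2011)

print(round(slope, 2))            # -0.67 points per year
print(round(projected(2031), 1))  # 47.6, i.e. below 50% by the 2031 census
```

A crude model, of course – census trends rarely stay linear – but it shows that the quoted figures do point below 50% by 2031.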
One thing is surely certain, though. Whether or not the decline in Christianity accelerates, it isn’t going to be reversed. As Heinrich von Kleist put it, ‘When once we’ve eaten of the tree of knowledge, we can never return to the state of innocence’.
The situation after the 2011 census is that 22.3% of Australia’s population are nones, the second biggest category in the census. Catholics are the biggest with 25.3%, down from 26% in 2006 (and about 26.5% in 2001). The nones are on track to be the biggest category after the next census, or the one after that. Arguably, though, it’s already the biggest category. The refusenik category in the last census comprised 9.4%, of which at least half could fairly be counted as nones, given that the religious tend to want to be counted as such. That would take the nones up to around 27%. An extraordinary result for a category first included only 40 years ago.
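The arithmetic behind that ‘around 27%’ is simple enough to spell out; the figures are those quoted above, and halving the refuseniks is the rough assumption the text itself makes:

```python
# 2011 census percentages quoted in the text.
nones = 22.3        # answered 'no religion'
refuseniks = 9.4    # declined to answer the question
# Count half the non-answers as de facto nones (a rough assumption).
adjusted = nones + refuseniks / 2
print(round(adjusted, 1))  # 27.0
```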
Let me dwell briefly on this extraordinariness. As you can see, in the first three censuses presented in this graph, the percentage of professed Christians was in the high nineties. That’s to say, in the first two decades of the twentieth century, virtually everyone identified as Christian. This represents the arse-end of a scenario that persisted for a thousand years, dating back to the 9th and 10th centuries when the Vikings and the last northern tribes were converted from paganism. We are witnessing nothing less than the death throes of Christianity in the west. Of course, we’re only at the beginning, and it will be, I’m sure, a long long death agony. Catholicism still has an iron grip in South America, in spite of the scandals it’s failing to deal with, and it’s making headway in Africa. But in its heartland, in its own backyard, its power is greatly diminished, and there’s no turning back.
The rise of the nones worldwide
But there’s an even more exciting story to tell here. The rise of the nones isn’t simply a rejection of Christianity, it’s a rejection of religion. And with that I’ll go to my second graph. This shows that the nones, at 750 million, have risen quickly to be the fourth largest religious category after Christians, 2.2 billion, Muslims, 1.6 billion, and Hindus, 900 million. These numbers represent substantial proportions of the populations of Australia and New Zealand, Canada, the USA and western Europe, as well as nations outside the Christian tradition, such as China and Japan. Never before in human history has this been the case.
One thing we know about the early civilisations is that they were profoundly religious. The Sumerians of the third millennium BCE, the earliest of whom we have records, worshipped at least four principal gods, Anu, Enlil, Ninhursag and Enki. These, as well as the Egyptian god Amon Ra, are among the oldest gods we can be certain about, but it’s likely that some of the figurines and statues recovered by archaeologists, such as the 23,000-year-old Venus of Willendorf, represented deities.
Why was religion so universal in earlier times?
We don’t know if the ancient Sumerians and Egyptians and Indus Valley civilisations were universally religious, but it’s likely that they were – because supernatural agency offered the best explanation for events that couldn’t be explained otherwise. And there were an awful lot of such events. Why did the crop fail this time? Why has the weather changed so much? Why did my child sicken and die? Why has this plague been visited upon our people? Why did that nearby mountain blow its top and rain fire and burning rocks down on us?
Even today, in our insurance policies, ‘acts of god’ – a most revealing phrase – are mentioned as those unforeseen events that insurers are reluctant to provide cover for. Nowadays, when some fundie describes the Haitian earthquake or Hurricane Katrina as a deliberate act of a punishing god, we laugh or feel disgusted, but this was a standard response to disasters in earlier civilisations. Given our default tendency to attribute agency when in doubt – a very useful evolutionary trait – and our ancestors’ lack of knowledge about human origins, disease, climate, natural disasters, etc, it’s hardly surprising that they would assume that non-material paternal/maternal figures, resembling the all-powerful and often capricious beings who surrounded us in our young years, and whose ways are ever mysterious, would be the cause of so many of our unlooked-for joys and miseries.
Why has that universality flown out the window?
It’s hardly surprising then that the rise of the nones in the west coincides with the rising success and the growing explanatory power of science. For the nones, creation myths have been replaced by evolution, geology and cosmology, sin has been replaced by psychology, and a judging god has been replaced by the constabulary and the judiciary. I don’t personally believe that non-believers are morally superior to believers because we ‘know how to be good without god’. We’ve just transferred our fear of god to our fear of the CCTV cameras – as well as fear for our reputations in the new ultra-connected ‘social hub’.
It’s obvious though that the scientific challenge to ye olde Acts of God is very uneven worldwide. In the more impoverished and heavily tribalised parts of Africa, India, China and the Middle East, the challenge is virtually non-existent. Furthermore, it’s a very new challenge even in the west. To take one example, our understanding of earthquakes, tsunamis and volcanic activity has greatly increased in recent times through advances in technology and also in theory, most notably tectonic plate theory. This theory was first advanced in the early 20th century by Alfred Wegener amongst others, but it didn’t gain general scientific acceptance until the sixties and didn’t penetrate to the general public till the seventies and eighties. Even today in many western countries if you ask people about plate tectonics they’ll shrug or give vague accounts. And if you think plate tectonics is simple, have a look at any scientific paper about it and you’ll soon realise otherwise. Of course the same goes for just about any scientific theory. Science is a hard slog, while the idea of acts of god comes to us almost as naturally as breathing.
In spite of this, science is beginning to win the challenge, due to a couple of factors. First and foremost, the scientific approach, and the technology that has emerged from it, has been enormously successful in transforming our world. Second, our western education system, increasingly based on critical thinking and questioning, has undermined religious concepts and has given us the self-confidence to back our own judgments and to emerge from the master-slave relationships religion engenders. The old god of the gaps is finding those gaps narrowing, though of course the gaps in many people’s minds are plenty big enough for him to hold court there for the term of their natural lives.
The future for the nones
While there’s little doubt that polities such as Australia, New Zealand, Canada and the European Union will become increasingly less religious, and that other major polities such as China and Japan are unlikely to ‘find’ religion in the future, we shouldn’t kid ourselves that any of the major religions are going to disappear in our lifetimes or those of our grandchildren. Africa and some parts of Asia will continue to be fertile hunting grounds for the two major proselytising religions, and Islam has as firm a hold on the Middle East as Catholicism has on Latin America. If you’re looking at it in terms of numbers, clearly the fastest growing parts of the world are also the most religious. But of course it’s not just a numbers game, it’s also about power and influence. In all of the secularising countries, including the USA, it’s the educated elites that are the most secular. These are the people who will be developing the technologies of the future, and making decisions about the future directions of our culture and our education. So, yes, reasons to be cheerful for future generations. I look forward to witnessing the changing scene for as long as I can.
what can we learn from religion?
Those are not at all to be tolerated who deny the Being of a God. Promises, Covenants and Oaths, which are the Bonds of Humane Society, can have no hold upon an Atheist.
John Locke, ‘A letter concerning toleration’, 1689
In my last post I referred to some aspects of religious belief that I think are worth focusing on if we want to get past the rational/irrational, or even the true/false debates. Alain de Botton created quite a stir recently when he claimed that arguments about the truth/falsity of religion were boring and without much value – or something like that. Typically, I both agree and disagree. There are essential empirical questions at stake, as I argued in my critique of Stephen Jay Gould here, but they’re hardly key to getting a handle on religion’s enormous popularity and endurance. That requires a deeper understanding of the psychological underpinnings of religious belief.
First, I’ve already written of the fact that, for all very young children, adults are supernatural beings. They’ve yet to learn about human mortality and limitations. They certainly learn quickly about their own pain and discomfort, but it comes as a shock when they first observe that all these competent, powerful, protective giants can be hurt, angry and frustrated just like them. These findings should hardly surprise us – children at this stage are entirely dependent on adults for their survival. These adults, they observe, can throw them up in the air and hopefully catch them, they can walk across a room in three seconds flat, they can transport them by car or plane to a completely different world, they’re not afraid of anything, and they miraculously provide all sustenance and succour.
While non-believers mostly understand such basic childhood beliefs, many are highly impatient of those who haven’t, at an appropriate age, abandoned this ‘theory of mind’ and replaced it with a more rational or sophisticated scientific worldview. The response of many psychologists in the field would be that, yes, we do change, but the idea of the supernatural, of transcending the usual limitations, has a long, lingering effect. The popularity of fairies, Harry Potter and Spiderman, which take us through early childhood into adolescence and beyond, attests to this. It’s worth noting that the nerdiest atheists are avid Trekkies and Whovians.
But none of this is really disturbing or unhealthy in the way that religious belief seems to be in the eyes of many non-believers – such as myself. The world’s most secular polities – in Australia, New Zealand, Canada, Japan, and many European countries – are also the most law-abiding, secure and contented, as countless surveys show. As a regular dipper into history, I can’t help but note that social life in god-obsessed pre-Enlightenment Europe was far more volatile, cruel and corrupt than it is today in the era of democracy, human rights and secularism. Locke’s remarks above have been thoroughly refuted by modern experience – though I suspect this is due more to our having a regularised legal framework and a functioning police force than to the greater moral virtue of non-believers.
So for many of us, the point is not to understand religion, but to change it. Or rather, to neutralise it by understanding it and then applying that understanding within a more secular framework. For example, one of the themes of the religious is that you can’t be good without god x, y or z. Atheists rarely concede that theists might have a point here. The stock response is a personal one: ‘I don’t need a supernatural fantasy-figure to frighten me into being good; I’m good because I have respect for others and for my environment’, etc. Psychological study, however, tells us a different story.
The Lebanese-born social psychologist Ara Norenzayan, at the University of British Columbia, points out that many of the gods of small societies have little interest in morality. Instead, ‘being good’ in these small societies is enforced by their very size, and their inescapability. Kin altruism and reciprocity, being the subject of gossip, the fear of ostracism, these are what keep society members on the right track. As numbers increase, though, a sense of anonymity engenders a greater tendency towards cheating and self-serving behaviours. Studies show that even wearing dark glasses, like the Tontons Macoutes, makes it easier to engage in anti-social behaviour. People behave much better when watched, by an audience, by a camera, and even by a large drawing of an eye in the corner of a shop.
The idea that non-believers can be ‘tricked’ into behaving better by the picture of an eye watching them should make us think again, not about gods, but about being watched. And about how we still over-determine for agency in our thinking. Civil libertarians get their backs up about CCTV cameras on every street corner, but there’s no doubt they’ve been a success in catching robbers and muggers and king-hitters in the act, or just before or after. Even those of us who have no urge to steal, or who, like me, left that urge behind long ago, tend to notice when a shop does or doesn’t feature an electronic scanning device, and if they’re like me they’ll wonder about the shop’s vulnerability or otherwise, and the trustworthiness or desperation of the customers around them. As to the painted eye, I presume it doesn’t have the deterrent effect of cameras and scanners, but the fact that it works at all should make us think again about our basic beliefs. Or does it only work on the religious?
That was a joke.
So how do more secular societies utilise the idea that someone knowing if you’ve been bad or good makes for a more moral, or at least more law-abiding, society? Well, it appears from the statistics that either they’ve already done so, or they’ve found other ways of being good. I suspect it’s been a complex mix of substitute gods, comprehensive education and community expectations. Large-scale society has naturally subdivided into smaller groups based on family, business, sport, academic or professional interest and so on, so the age-old stabilisers of kinship, reciprocity and reputation within the group are still there, and these are bolstered by a greater set of ‘watched’ networks. Trade and travel, international relations, the internet, all of these things are always in the process of being regulated to reflect community concepts of fairness. We are our own Big Brother (another supernatural agent). Modern liberal education teaches kids from an early age about human rights and environmental responsibility, so much so that they’re often happy to lecture their parents about it. The Freudian concept of the superego is a kind of internalised supernatural parental figure, finger-wagging at us during our weaker moments. The declaration of human rights, accepted by most countries today, though criticised as artificial and without teeth, surely presents a better framework for moral behaviour in the modern world than the often obscure and contradictory stories and proverbs found in the Bible and other religious texts. In short, there are many ways we’ve worked out for behaving well and generally flourishing in a secular society.
So I’m basically saying there isn’t much we can learn from religion, with respect to moral policing, that we haven’t learned already. But what about community and social bonding? In the USA and in other highly religious societies, the populace seems to be very united in its religion – especially against the irreligious. Some non-believers are concerned to replicate religion’s success in this area, and I’ve heard that there’s an atheist church, or I think they call it an atheist assembly – meeting on Sunday – somewhere in my area. I’m not particularly inclined to attend. Non-believers don’t necessarily have much in common apart from a lack of interest in religion, and I’m wary of in-group thinking anyway. I’m wary of just the kind of bonding mentioned above, a bonding that might depend upon mutual congratulation and the mocking, belittling or despising of believers.
Non-believers are of course no less community-minded than the religious. Business, sporting, scientific and small-town communities, these attract us as social animals regardless of our views on the supernatural, and I don’t think we need a top-down ‘alternative’ to religious congregations or community spirit as advocated by de Botton.
Many of the religious point out that they’re more involved in charitable works than selfish unbelievers. Where are the atheist alternatives to Centacare and Anglicare, the welfare and social services arms of the Catholic and Anglican denominations? But these organisations have built up their considerable infrastructure and expertise under extremely favourable tax circumstances which have been a part of Australia’s religious history for a couple of centuries, so they’re always more favourably placed to win government and other contracts for social and educational services. I’ve experienced personally the frustrations of humanist organisations trying to attain the same tax-exempt status for charitable purposes. They’re not given a look-in. Nevertheless there are many powerful and effective NGOs such as Oxfam and MSF, and important human rights bodies like Amnesty International and Human Rights Watch, whose impetus comes directly from the secular human rights movement.
I would also argue, as a former employee of Centacare (as an educator) and of Anglicare (as a foster-carer) that one result of their having cornered so much of the education and social services market is that they’ve become more secularised. They no longer require their workers to share their supernatural beliefs, and this has enabled them to reach a wider market which they’ve been able to expand largely by downplaying or eliminating the proselytising. I’ve never heard any god-talk from Centacare or Anglicare employers, and this would surely not have been the case fifty years ago. It’s the same in Catholic schools I suspect, with so many non-Catholics sending their kids there due to doubts about under-funded state schools.
This is all to the good, as too-exclusive Christian or religious communities – as well as non-religious communities – lead to us-them problems. We need to be secure in our position on the supernatural without being dismissive.
So, what in the end do we have to learn from religion? My answer, frankly, is nothing much. We have far more to learn from history and from clear-minded examination of the evidence we uncover about ourselves and our fellow organisms in this shared biosphere.
spirituality issues, encore
To me – and I’ve written about this before – the invocation of the supernatural, the ‘call’ of the supernatural, if you will, is something deeply psychological, and so not to be sniffed at, though sniff at it I often do.
I’m prompted to write about this because of a program I saw recently on Heath Ledger (Australia’s own), an understandably romantic, mildly hagiographic presentation, in which a few film directors and friends fondly remembered him as wise beyond his years, with hidden depths, a kind of inner force, a certain je ne sais quoi, that sort of thing. As both a romantic and a skeptic, I was torn as usual. The word ‘spiritual’ was given an airing, unsurprisingly, though mercifully it wasn’t dwelt on. I once came up with my own definition of spirituality: ‘To be spiritual is to believe there’s more to this world than this world, and to know that by believing this you’re a better person than those who don’t believe it’. This might sound a mite cynical but I didn’t mean it to be, or maybe I did.
Anyway one of Ledger’s associates, a film director I think, told this story of the young Heath. A number of friends were partying in his apartment when he, the director, picked up a didgeridoo, which obviously Ledger had brought with him from Australia, and attempted to play it, but not knowing much about the instrument, held it upside-down. Heath gently took it from him and corrected him, saying ‘no, no, if you hold it that way it will lose its power, the power of the instrument and its maker,’ or some such thing. And the seriousness and respectfulness with which this young actor spoke of his didge impressed the director, who considered this a favourite memory, something which caught an ‘essence’ of Ledger that he wanted to preserve.
I’ve been bothered by this tale, and by my ambivalent response to it, ever since. It would be superfluous, I suppose, to say that I don’t believe that briefly holding a didge upside-down has any permanent effect on its musical power.
It’s quite likely that Ledger didn’t believe this either, though you never know. What I’m fairly sure of, though, was that his respectfulness was genuine, and that there was something very likeable, to me at least, in this.
All of this takes me back to a piece I wrote some years ago, since lost, about big and small religions. I was contrasting the ‘big’ religions, like Catholicism and the two main strands of Islam, with their political power in the big world, often horrific in its impact, with the ‘small’ religions or spiritual belief systems, such as those found among Australian Aboriginal or some African societies, which have no political power in the big world but provide their adherents with identity and a kind of social energy that’s marvellous to contemplate. My piece focused on the art work of Emily Kame Kngwarreye, whose prolific and astonishing oeuvre, with its characteristic energy and vitality, clearly owed so much to the beliefs and practices of her ‘mob’, the so-called Utopian Community in Central Australia, between Alice Springs and Tennant Creek to the north.
Those beliefs and practices include dreaming stories and totemic identifications that many western skeptics, such as myself, might find difficult to swallow, in spite of a certain romantic appeal. The fact is, though, that the Utopian Community has been remarkably successful, in terms of the usual measures of well-being, and particularly in the area of health and mortality, compared to other Aboriginal groups, and its success has been put down to tighter community living, an outdoor outstation life, the use of traditional foods and medicines, and a greater resistance to the more destructive western products, such as alcohol.
This might put a red-blooded but reflective skeptic in something of a quandary, and the response might be something like – ‘well, the downside of their vitality and health, derived from spiritual beliefs which have served them well for thousands of years, is that, in order to preserve it, they must live in this bubble of tribal thinking, unpierced by modern evolutionary or cosmological knowledge, and this bubble must inevitably burst.’ Must it? Is there a pathway from tribalism to modern globalism that isn’t entirely destructive? Is the preservation of tribal spiritual beliefs a good thing in itself? Can we take the statement, that holding a didgeridoo upside-down affects its spirit, as a truth over and above, or alongside, the contrasting truths of physical laws?
I don’t know the answer to these questions, of course. Groping my way through these issues, I would say that we should respect and acknowledge those beliefs that give a people their dignity, and which have served them for so long, but perhaps that’s because we’re feeling the generosity of someone outside that system who’s unlikely to be affected or to feel diminished by it. These are, after all, small religions, from our perspective, not the big, profoundly ambitious religions intent on global domination, with their missionaries and their jihadists and their historical trampling of other belief systems, as in Mexico and South America and Africa and here in Australia.
Of course there’s the question – what if those small religions grew bigger and more ambitious? Highly unlikely – but what if?
Some thoughts on morality and its origins
I remember, quite a few Christmases ago now, a slightly acrimonious discussion breaking out about religion and morality. I simply observed – it wasn’t my family. It never is.
A born-again religious woman asked her sister – ‘where do you get your morality from if not from religion?’ She responded tartly, ‘From my mum’. This response pleased one of those present, at least! But as to the implicit claim that we get our morality from religion, my silent response was ‘how does that happen?’
Religion, at least in its monotheistic versions, implies a supernatural being, from whom all morality flows. But if you ask believers whether their cherished supernatural entity talks to them and advises them regularly about the moral decisions they face in their daily lives, you would get, well, a variety of responses, from ‘yes, he does actually’, to something like ‘you miss the point completely’. The second response might lead on to – well, theology. We were given free will, the deity’s ways are mysterious but Good, he communicates with us indirectly, you need to read the signs etc etc. But you’ll be relieved I hope to hear that this won’t be an essay on religion, which you should realise by now I find interminably boring when it tries to connect itself with morality – which is most of the time.
I’m more interested here in trying, inter alia, to define human morality, to determine whether it’s objective, or universal, and if those two terms are synonymous. And as I generally do, I’ll start with a rough and ready, semi-ignorant or uninformed definition, and then try to smarten it up – possibly overturning the original definition in the process.
So, roughly, I consider human morality to be an emergent property of our socially wired brains, something which is, therefore, evolving. I don’t consider it to be objective, because that suggests something outside ourselves, like objective reality. We can talk about it being ‘universal’, as in ‘universal human rights’, which may be agreed upon by consensus, but that’s a convenient fiction, as there’s no true consensus, as, for example, the Cairo Declaration (on human rights in Islam) reveals. Not that we shouldn’t strive for consensus, based on our current understanding of human interests and human thriving. I’m a strong believer in human rights. I suppose what I’m saying here is that my ‘universality’, far from being a metaphysical construction, is a pragmatic term about what we can generally agree on as being what we need in terms of basic liberties, and limitations to those liberties, in order to best thrive, as a thoroughly social species (deeply connected with other species).
So with this rough and ready definition, I want to look at some controversial contributions to the debate, and to add my reflections on them. I read The Moral Landscape, by Sam Harris, a while back, and found it generally agreeable, and was surprised at the apparent backlash against it, though I didn’t try to follow the controversy. However, when philosophers like Patricia Churchland and Simon Blackburn get up and respectfully disagree, finding Harris ‘naive’ and misguided and so forth, I feel it’s probably long overdue for me to get my own views clear.
The difficulty that many see with Harris’s view is encapsulated in the subtitle of his book, ‘How science can determine human values’. I recognised that this claim was asking for trouble, being ‘scientistic’ and all, but I felt sympathetic in that it seemed to me that our increasing knowledge of the world has deeply informed our values. We don’t call Australian Aboriginals or Tierra del Fuegans or Native Americans savages anymore, and we don’t describe women as infantile or prone to hysteria, or homosexuals as insane or unnatural, or children as spoilt by the sparing of the rod, because our knowledge of the human species has greatly advanced, to the point where we feel embarrassed by quite recent history in terms of its ethics. But there’s a big difference between science informing human values, and enriching them, and science being the determinant of human values. Or is there?
What Harris is saying is, forget consensus, forget agreements, morality is about facts, arrived at by reason. He brings this up early on in The Moral Landscape:
… truth has nothing, in principle, to do with consensus: one person can be right, and everyone else can be wrong. Consensus is a guide to discovering what is going on in the world, but that is all that it is. Its presence or absence in no way constrains what may or may not be true.
Clearly one of Harris’s targets, in taking such an uncompromising stance on morality being about truth or facts rather than values, is moral relativism, which he regularly attacks. Yet the most cogent critics of his views aren’t moral relativists, they’re people, like Blackburn, who question whether the moral realm can ever be seen as a branch of science, however broadly defined (and Harris defines it very broadly for his purposes). One of the points of dispute – but there are many others – is the claim that you can’t derive values from facts. For example, no amount of information about genetic variation within human groups can actually determine what you ought to do in terms of discrimination based on perceived racial differences. Such information can and should inform decisions, but they can’t determine them, because they are facts, while values – what you should do with those facts – are categorically different.
It seems to me that Harris often chooses clear-cut issues to highlight morality-as-fact, such as that a secure, healthful, well-educated life is better than one in which you get beaten up on a daily basis. Presumably he imagines that all the gradations in between can be measured precisely as to their truth-value in contributing to well-being. But surely it’s in these difficult areas that questions of value seem to be most ‘subjective’. Can we make an objective moral claim, say, about vegetarianism, true for all people everywhere? What about veganism? I very much doubt it. Yet we also need to look skeptically at those values he sees as clear-cut. Take this example from The Moral Landscape:
In his wonderful book The Blank Slate, Steven Pinker includes a quotation from the anthropologist Donald Symons that captures the problem of multiculturalism very well:
If only one person in the world held down a terrified, struggling, screaming little girl, cut off her genitals with a septic blade, and sewed her back up, leaving only a tiny hole for urine and menstrual flow, the only question would be how severely that person should be punished, and whether the death penalty would be a sufficiently severe sanction. But when millions of people do this, instead of the enormity being magnified millions-fold, suddenly it becomes “culture”, and thereby magically becomes less, rather than more, horrible, and is even defended by some Western “moral thinkers”, including feminists.
Now, as a card-carrying humanist, and someone generally quite comfortable with the values that, over time, have emerged in my part of the western world, namely Australia, I’m implacably opposed to the practice described here by Symons. But even so, I see a number of problems with this description. And ‘description’ is an important term to think about here, because the way we describe things is an essential indicator of our understanding of the world. The description here is of a ‘procedure’, and it is brief and clinical, leaving aside the depiction of the ‘terrified struggling screaming little girl’. It isn’t a description likely to have much resonance for those who subject their daughters and nieces to this practice. After all, this is a traditional cultural practice, however horrific. It is still practiced regularly in many African countries, and in proximate countries such as Yemen. Clearly the practice aligns with rigid attitudes about the role and place of women in those cultures, attitudes that go back a long way – the first reference to female circumcision, on an Egyptian sarcophagus, dates back almost 4000 years, but it’s likely that it goes back a lot further than that. As Wikipedia puts it, ‘Practitioners see the circumcision rituals as joyful occasions that reinforce community values and ethnic boundaries, and the procedure as an essential element in raising a girl.’
Now, Symons (and presumably Pinker, and Harris) take the view that this is clearly a criminal practice, and that culture should not be used as an excuse. It’s a view backed up by most of the nations in which it occurs, which have instituted laws against it, and in 2012 the UN General Assembly unanimously voted to take all necessary steps to end it, but these national and international good intentions face a long, uphill battle. However, if you look at some of the first descriptions of this practice, by outsiders such as Strabo or Philo of Alexandria, both writing in the time of Christ, you won’t find any censoriousness, nor would you expect to. It was well accepted in the Graeco-Roman world that customs varied widely, and that many foreign customs were weird, wild and wonderful. It’s likely that observers from the dominant culture felt morally superior, as is always the case, but there was no attempt to suppress other cultural practices – any more than there was only 200 years ago, in Australia, with respect to the native inhabitants. The ‘mother country’ sent out clear and regular messages at the time about treating the natives with respect, and non-interference with their cultural practices (though it would no doubt have considered them barbaric and savage as a matter of course). It’s really only in recent times that, as a result of our growing confidence in a universal approach to morality or ‘well-being’, we (the dominant culture) have spoken out against what we now unabashedly call female genital mutilation, as well as other practices such as purdah and witch-hunting.
From all this, you might guess that I’m ambivalent about Harris’s confident approach to moral value. Well, yes and no, he said ambivalently. I can’t tell you how mightily glad I am that I live in a part of the world in which purdah and infibulation aren’t prevalent. However, I can’t step outside of my space and time, and I don’t know what it would be like to live in a world where these practices were standard. And living in such a world doesn’t mean being transported to it ‘suddenly’, it means being steeped in its values. After all, my own Anglo-Australian culture was one that, less than 200 years ago, transported homeless boys, in danger of ‘going to the bad’, to Australia where they often ended up being worked to death on chain gangs, and this was considered perfectly normal. I would have considered it perfectly normal, for I’m not so arrogant as to imagine I could transcend the moral values of my culture as it was in the 1830s.
So, to return to the passage from The Moral Landscape quoted above. It isn’t a factual passage, it’s a description, with interpretive and speculative features. It describes, first the actions of ‘one person’, engaged in what seems to us an insane surgical procedure, then we’re asked to multiply this act by millions, and ‘suddenly’ consider it culture. But this strikes me as a deliberately manipulative putting of the cart before the horse. The real motive seems to be to ask us to dismiss culture altogether. After all, any human product that can be called into being ‘suddenly’, and which ‘magically’ blights our moral understanding of the world cannot surely be taken seriously. Harris, as I recall, used similar arguments against religion, perhaps in The End of Faith (which I haven’t read), but certainly in some of his talks on the subject. A practice or belief which we might lock someone up for, ‘suddenly’ becomes acceptable when engaged in by millions and called ‘religion’.
This strikes me as a glib and naive argument, which could only appeal to historically uninformed (or indifferent) ‘rationalists’. Cultural and religious beliefs and practices, weird, wild, wonderful and occasionally horrifying though they might be, are far too widespread, and too deeply woven into the identity of individuals and social groups, to be set aside in this way.
This is a very very complex issue, one that, dare I say, middle-class intellectuals like Harris and Pinker tend to skate over, even with a degree of contempt. For myself, I deal with these cultural issues with a mixture of fear – ‘don’t provoke the culturally wounded, they’ll just get angry and dangerous’ – and concern – ‘if you take away these people’s cultural/religious identity, how will they cope?’. Perhaps I’m being arrogant about the power of western secular values, but it seems to me that much of the world’s turmoil comes from resentment at old cultural and religious certainties being undermined.
So I believe in cultural sensitivity, for strategic purposes but also because we are all culturally embedded, no matter how scientifically enlightened we claim to be. However, I don’t think all cultures are, or all culture is, equally valuable or equally healthy. How I measure that, though, is a big question since I can’t step outside of my own culture. Perhaps therein lies the difficulty about getting all ‘scientific’ about morality. Science itself is hardly culture-free – a dangerous point to make in some circles.
So I don’t think I’ve gotten much further as to where morality comes from. To say that it comes from culture requires a thorough definition and understanding of that concept, otherwise we’re just deferring any real explanation, but clearly that is the way to go. But I prefer to look at this connection with culture, and with other more fundamental aspects of our social nature, from a humanist perspective. Western secular humanism tends to wear its culture lightly, and to value skepticism, reflection and analysis as – possibly cultural – tools for dismantling or at least loosening the overly heavy and oppressive armour that cultural beliefs and practices can become.
on transcendental constructions: a critique of Scott Atran
Some years ago, when watching some of the talks and debates in the first ‘Beyond Belief’ conference at the Salk Institute, I noted some tension between Sam Harris, with his critique of religion generally and Islam in particular, and Scott Atran, an anthropologist who appeared to be quite contemptuous of Harris’s views. Beyond noting the tension, I didn’t pay too much attention to it at the time, but I’ve decided now to look at this issue more closely because I’ve just read Ayaan Hirsi Ali’s powerful book Infidel, which gives an insider’s informed and critical view of Islam, particularly from a woman’s perspective, and I’ve also listened to Chris Mooney’s Point of Inquiry interview with Atran back in April, shortly after the Boston marathon bombing.
The interview, called ‘What makes a terrorist?’ was mainly about the psychology of the more recent batch of terrorists, but in the latter half, Atran responded to a question about the role of Islam specifically in recent terrorist behaviour. It’s this response I want to examine, not so much in the light of Sam Harris’s contrasting views, but in comparison to those of Hirsi Ali.
In bringing up the role of Islam in terrorism, Chris Mooney cites Sam Harris as pointing out that ‘there’s something about Islam today that is more violent’. Atran’s immediate response is that ‘this is such a complex and confused issue’, then he says that ‘religions are fairly neutral vessels’. This idea that religions, especially those that survive over time, have a degree of neutrality to them, has some truth, and in fact it served as the basis for my critique of Melvyn Bragg’s absurd claims that Christianity and the KJV Bible were largely responsible for feminism, democracy and the anti-slavery movement. But there is a limit to this ‘neutrality’. Religions are clearly not so ‘neutral’, morally or culturally, that they’re interchangeable with each other. Fundamentalist, or ultra-orthodox, or ultra-conservative Judaism is not the same as its Islamic or Christian counterparts. In fact, far from it. And yet these three religions ostensibly share the same deity.
The interaction between religion and culture is almost impenetrably complex. I wrote about this years ago in an essay about traditional Australian Aboriginal religion/culture, in which it’s reasonable to say that religion is culture and culture is religion. In such a setting, apostasy would be meaningless or impossible – essentially a denial of one’s own identity. Having said that, if your religion, via one of its principal texts, tells you that apostasy is punishable by death, you’ve already got a yawning separation between religion and cultural identity – the very reason for the excessive threat of punishment is to desperately try to plug that gap. It’s like the desperate cry of a father – ‘you’ll never amount to anything without me!’ – as the son walks out the door for the last time.
These major religions – Judaism, Islam and Christianity – are embedded in texts that are embedded in culture. Different, varied texts interacting complexly – reinforcing, challenging, altering the culture from whence they sprung. Differently. Judaism’s major text, always arguably, is the Torah. Christianity’s is the New Testament, or is it the gospels? Islamic scholars – but also those believers who rarely ever read the sacred texts – will argue about which texts are most important and why. Nevertheless, Judaism, Christianity and Islam all have a different feel to them from each other, even given the enormous variation within each religion. Judaism is profoundly insular, with its chosen people uniquely flayed by their demanding, unforgiving god. Christianity is profoundly other-worldly with its obsession with the saviour, the saved, the end of days, the kingdom to come, the soul struggling for release, not to mention sin sin sin. Islam, a harsh, desert religion, somehow even more than the other two, is about denial, control, submission, and jihad in all its complex and contradictory manifestations and interpretations. The status of women in each religion, in a general sense, is different. Christianity gives women the most ‘wriggle-room’ from the start, but its interaction with the different cultures captured by the religion can sometimes open up that space, or close it down. The New Testament presents a patriarchal culture of course, but in the gospels women aren’t given too bad a rap. Paul of Tarsus notoriously displays some misogyny elsewhere in the NT, but it isn’t particularly specific and no detailed restrictions on women’s freedom are presented. More importantly, the dynamism of western culture has blown away many attempts to maintain the restrictions on women’s freedom dictated by Christian dogma – pace the Catholic Church. 
In any case, Christianity has no equivalent to Sharia Law, with its deity-given restrictions and overall fearfulness of the freedom and power of women. And neither Christianity nor Islam has the obsession with ritual and with interpretation of the deity’s very peculiar requirements that orthodox Judaism has.
To return, though, to Atran. He argues that the big religions survive and thrive precisely because of their lack of fixed propositions – which is why, he says, we need sermons to continually update and modernise the interpretations of texts, parables, suras and the like. I’m not sure whether the Khutbas of Moslem Imams serve the same purpose as priests’ sermons, but I generally agree with Atran here. The point, of course, is that though there is much leeway for interpretation, there are still boundaries, and the boundaries are different for Islam compared to Christianity, etc.
What follows is my analysis of what Atran has to say about what are, in fact, very complex and contentious matters relating to religion and social existence. Whole books could be, and of course are, devoted to this, so I’ll try not to get too bogged down. I’m using my own transcript of Atran’s interview with Mooney, slightly edited. Occasionally I can’t quite make out what Atran is saying, as he sometimes talks softly and rapidly, but I’ll do my best.
So, after his slightly over-simplified claim that these big religions are ‘neutral vessels’, Atran goes on with his definition. These religions are:
… moral frameworks that provide a transcendental moral foundation for large groups coalescing – for how else do you get genetic strangers to form large co-operative groups? They don’t have to be necessarily religious today, but it involves transcendental ideas. Take human rights, for example, that’s a crazy idea. Two hundred and fifty years ago a bunch of intellectuals in Europe decided that providence or nature made all human beings equal, endowed by their creator with rights to liberty and happiness, when the history of 200,000 years of human life had been mostly cannibalism, infanticide, murder, the suppression of minorities and women, and so [through the wars?] and social engineering, they took this crackpot idea and made it real.
I have a few not so minor quibbles to make here. Presumably Atran is using the term ‘transcendental’ in the way that I would use the term ‘over-arching’ – a much more neutral, and if you like, secular term. The trouble is – and he uses this term often throughout the interview – Atran uses ‘transcendental’ with deliberate rhetorical intent, taking advantage of its massive semantic load to undercut various secular concepts, in this case the ‘crackpot’ concept of human rights.
This isn’t to say that Atran objects to human rights. My guess is that he regards it as a somewhat arbitrary and unlikely concept, invented by a bunch of European intellectuals in the Enlightenment era, that just happened to catch on, and a good thing too. That’s not how I see it. It’s just much much more complex than that. So much so that I hesitate to even begin to explore it here. The germ of the concept goes back at least as far as Aristotle, and it involves the increasingly systematic study of human history, and human psychology. It involves the science of evolution, and it involves pragmatic global developments in commerce and diplomacy. Eighteenth century Enlightenment ideas had a catalytic effect, as did many developments of the scientific enlightenment of the previous century, as did the growth of democratic ideas and the concept of systematic universal education and health-care in the nineteenth century, in the west.
My point is that, though I have no problems with calling human rights a convenient fiction – nobody ‘really’ has rights as such – it’s based on a this-worldly (i.e. non-transcendental) understanding of how both individuals and societies flourish and thrive, in terms of the contract or compromise between them.
Atran goes on:
But, in general, societies that have unfalsifiable and unverifiable transcendental constructions win out over those that don’t – I mean, Darwin talked about it as moral virtue, and said that this is responsible for the kind of patriotism, sympathy and loyalty that makes certain tribes win out over other tribes in […] competition for dominance and survival, and again, without these transcendental ideas people can’t really be blinded to [exit strategies], I mean, societies that are based on social contracts, no matter how good they are, the idea that there’s always a better deal down the line makes them liable to collapse, while these societies are much less prone to that. And there are all sorts of other things associated with these sorts of unverifiable propositions.
Presumably these ‘unfalsifiable and unverifiable transcendental constructions’ are religions, and I’ve no great objection to that characterisation, but I’m not so convinced about the positive value for ‘dominance and survival’ of these constructions. One could argue that my kind of scepticism can only flourish in a secure environment such as we have in the west, where such ‘undermining’ values as anti-nationalism and atheism can’t threaten the social cohesion of our collective prosperity and sense of superiority to non-western notions. There are just no ‘better deals down the line’, except maybe more health, wealth and happiness, commitment to which requires the very opposite of an ‘exit strategy’. In other words, western ‘social contract’ societies, in which religious belief is rapidly diminishing (outside the US), are showing no sign of collapsing, because there is no meaningful exit strategy, except perhaps a delusional one. There is no desire or motivation to exit. We’re largely facing our demons and rejecting overly ‘idealistic’ solutions.
Perhaps my meaning will be clearer when we look at more of Atran’s remarks:
So now, the propositions, these things themselves can be interpreted, however, depending on the political and social climate of the age. Islam has been interpreted in ways that were extremely progressive at one time, and at least parts of it are extremely retrogressive, especially as concerns science for example, the position of women in the world, especially parts of it in many countries it’s extremely retrograde. But, Islam itself, I mean does it have some essence that encourages this kind of crazy violence? No, not at all – that truly is absurd, and just false.
Atran’s becoming a bit incoherent here, and maybe he expresses himself better elsewhere, but his base argument is that there’s no ‘essence’ to Islam which renders it more violent than other religions, or transcendental constructions (eg communism or fascism) for that matter. He overplays his hand, I think, when he claims that this is ‘absurd’ and obviously false. We could call this ‘the argument from petulance’. Islam does have some essential differences, I think, which make it more able to act against women and against scientific ideas, though I agree that this is a matter of degree, and that it’s very complex. For example, the growth of Catholicism in Africa has combined with certain aspects of tribal culture and patriarchy to make African Catholic spokesmen very outspoken against homosexuality – while a recent local television program had a Moslem leader speaking up in favour of gay marriage. So, yes, there is nothing fixed in stone about Islam or Christianity with respect to human values.
The thing is that, for writers like Ayaan Hirsi Ali, and I suspect Sam Harris too, the question of ‘essentialism’ is largely academic, for right here and right now people are being targeted by Moslems (under the pressure of cultural connections or disconnections), because they are apostates, or critics, or women trying to get an education, or women dressing too ‘immodestly’, and this is causing great tension, even to the point of death and destruction here and there. In fact, Hirsi Ali, in calling for an enlightenment in the Moslem world, is backing a non-essentialist view. It’s the culture that has to change, but of course religion, with its transcendentalist, eternalist underpinnings, acts as a strong brake against cultural transformation. To engage in the battle for moderation is to battle for this-worldly, evidence-based thinking on human flourishing, against transcendentalist ideas of all kinds.
Atran, I think, relies too heavily on his notion of ‘transcendental constructions’, which he uses too widely and sweepingly, even with a degree of smugness. Let me provide one more quote from his interview, with some final comments.
But again, I don’t see anything about Islam itself… you need some kind of transcendental ideal to get people to sacrifice for genetic strangers, for these large groups. Religion is the best thing that human history has come up with, but there are other competing transcendental notions of which democratic liberalism, human rights, communism, fascism, are others, and right now the democratic-liberal-human rights thing is predominant in a large part of the world and it’s a salvation [……..] and people don’t want that or feel left in the driftwood of globalisation, they are looking for something else to give them equal power and significance.
Methinks Atran might’ve been spending too much time in the study of religious/transcendental ideas – he’s seeing everything through that perspective. I myself have written about democracy, in its various manifestations, from a sceptical perspective many times, and I’ve been critical of the over-use of the concept of rights, and so forth. It’s true enough that people can take these concepts, along with fascism or communism, to a transcendental level, making of them an unquestionable given for ‘right living’ or ‘a decent society’, but they can also be taken pragmatically and realistically, reasonably, as the most serviceable approaches to a well-functioning social order. Social evolution is moving quickly, and we can make sacrifices for genetic strangers, based on our growing understanding, as humans, of our common genetic inheritance. We’re not so much genetic strangers, perhaps, as we once thought ourselves to be. Indeed, it’s this growing understanding, a product of science, that is expanding our circle of connection beyond even the human. We need to promote this understanding as much as we can, in the teeth of transcendentalist, eternalist, other-worldly ideas about submission to deities, heavenly rewards and spiritual superiority.