In Praise of Unthinking National Religion
Image: Easter on Santorini: Georgios Michos, CC BY-SA 4.0 <https://creativecommons.org/licenses/by-sa/4.0>, via Wikimedia Commons
I spent Orthodox Easter in Greece. Then, and for the week afterwards, the neon displays over the main roads announced ‘Christ is Risen’, and the shopkeepers wished me a ‘Good Resurrection’.
This piety isn’t reserved for Easter. Almost everyone wears a cross around their neck. Drivers, without interrupting the high-volume argument with their passengers, cross themselves when they pass a church.
‘Superstition, not true religion’, sneers the ardent Protestant – for whom, drawing on a Puritan tradition, diligent examination of conscience and the deliberate orientation of the will towards God are the only completely acceptable mental states. The professional philosopher typically agrees: what is philosophy, these days, other than the disciplined examination of propositions and reasons? And disciplined examination, of course, demands strenuous, conscious attention.
But I’m not so sure. Religion is part of the web and weave of these Greeks: a way primarily of being, and only secondarily of doing, and often not at all of thinking, in the sense that philosophers typically mean by ‘thinking’. It’s a reflex – or at the root of a reflex – which has ethical consequences. If one sees the right result (rather than the means to that result) as the most important thing about ethics, a reflex which produces the right result fast, invariably, and unconsciously might be preferable to a process of highly cognitive deliberation which could be derailed before it produces the ethically appropriate end. And if what matters is general moral character, who is more praiseworthy: someone who is constitutionally altruistic (for instance), or someone who decides on a case-by-case basis whether or not to be altruistic?
The Authentic Liar
Written by Muriel Leuenberger
A modified version of this post is forthcoming in Think, edited by Stephen Law.
Authenticity is a popular ideal. Particularly in the Western world, it has developed into a prevailing ideal since its rise in Modernity.[1] The search for authenticity is a common trope in film and literature, countless self-help books advise us how to become more authentic, and marketing and politics have long since discovered authenticity as a useful label for selling goods and candidates.
Boris Johnson and Donald Trump are recent examples of politicians who presented themselves, and were perceived by many, as particularly authentic. At the same time, both are known for not taking the truth too seriously, if not for being notorious liars. This seems like a contradiction. Can you be an authentic liar? Figures like Johnson and Trump can prompt us to reconsider and clarify what we mean by a concept like authenticity, as well as how we should relate to ourselves and express ourselves to others.
Mummification and Moral Blindness
Image: The Great Sphinx and Pyramids of Gizeh (Giza), 17 July 1839, by David Roberts: Public Domain, via Wikimedia Commons
Words are powerful. When a word is outlawed, the prohibition tends to chill or shut down debate in a wide area surrounding that word. That tendency is much discussed, but it’s not my concern here. It’s one thing to declare a no-go area: it’s another when the mere use or non-use of a word is so potent that it makes it impossible to see something that’s utterly obvious.
There has recently been an excellent and troubling example. Some museums have started to change their labels. They consider that the use of the word ‘mummy’ demeans the dead, and are using instead the adjective ‘mummified’: thus, for instance, ‘mummified person’ or ‘mummified remains’. Fair enough. I approve. Too little consideration is given to the enormous constituency of the dead. But using an adjective instead of a noun doesn’t do much moral work.
Consider this: the Great North Museum: Hancock has on display a mummified Egyptian woman, known as Irtyru. Visitor research showed that many visitors did not recognise her as a real person. The museum was rightly troubled by that. It sought to display her ‘more sensitively’. It’s not clear from the report what that means, but it seems to include a change in the labelling. She will no longer be a ‘mummy’, but will be ‘mummified’. She is a ‘mummified person’. But she’ll still remain in a case, gawped at by mawkish visitors.
What is the Most Important Question in Ethics?
by Roger Crisp
It’s often been said (including by Socrates) that the most important, ultimate, or fundamental question in ethics is: ‘How should one live?’
Hang Onto Your Soul
Image: https://the-conscious-mind.com
I can’t avoid Steven Pinker at the moment. He seems to be on every page I read. I hear him all the time, insisting that I’m cosmically insignificant; that my delusional thoughts, my loves, my aspirations, and the B Minor Mass’s effect on me are merely chemical events. I used to have his declaration stuck up above my desk (on the principle that you should know your enemy) – a declaration as stridently irrational as the sermon of a Kentucky Young Earth Creationist: ‘A major breakthrough of the Scientific Revolution – perhaps its greatest breakthrough – was to refute the intuition that the Universe is saturated with purpose.’[1]
He tells me that everything is getting better. Has been getting better since the first eruption of humans into the world.[2] That there’s demonstrable progress (towards what, one might ask, if the universe has no purpose? – but I’ll leave that for the moment). That there’s less violence; there are fewer mutilated bodies per capita. He celebrates his enlightenment by mocking my atavism: he notes that the Enlightenment came after the Upper Palaeolithic and (for the law of progress admits no exceptions) concludes that our Enlightenment age is better than what went before.
How we got into this mess, and the way out
By Charles Foster
This week I went to the launch of the latest book by Iain McGilchrist, currently best known for his account of the cultural effects of brain lateralisation, The Master and His Emissary: The Divided Brain and the Making of the Western World. The new book, The Matter with Things: Our Brains, Our Delusions, and the Unmaking of the World, is, whatever you think of the argument, an extraordinary phenomenon. It is enormously long – over 600,000 words packed into two substantial volumes. To publish such a thing denotes colossal confidence: to write it denotes great ambition.
It was commissioned by mainstream publishers, who took fright when they saw its size. There is eloquent irony in the rejection, on the grounds of its length and depth, of a book whose main thesis is that reductionism is killing us. It was picked up by Perspectiva Press. That was brave. But I’m predicting that Perspectiva’s nerve will be vindicated. It was suggested at the launch that the book might rival or outshine Kant or Hegel. That sounds hysterical. It is a huge claim, but this is a huge book, and the claim might just be right.
Nobody can doubt that we’re in a terrible mess. The planet is on fire; we’re racked with neuroses and governed by charlatans, and we have no idea what sort of creatures we are. We tend to intuit that we are significant animals, but have no language in which to articulate that significance, and the main output of the Academy is to scoff at the intuition.
What If Stones Have Souls?
By Charles Foster
Over the 40,000 years or so of the history of behaviourally modern humans, the overwhelming majority of generations have been, so far as we can see, animist. They have, that is, believed that all or most things, human and otherwise, have some sort of soul.
We can argue about the meaning of ‘soul’, and about the relationship of ‘soul’ to consciousness, but most would agree that whatever ‘soul’ and ‘consciousness’ mean, and however they are related, there is some intimate and necessary connection between them – even if they are not identical.
Consciousness is plainly not a characteristic unique to humans. Indeed, the better we get at looking for consciousness, the more we find it. The universe seems to be a garden in which consciousness springs up very readily.
We’re All Vitalists Now
By Charles Foster
It has been a terrible few months for moral philosophers – and for utilitarians in particular. Their relevance to public discourse has never been greater, but never have their analyses been so humiliatingly sidelined by policy makers across the world. The world’s governments are all, it seems, ruled by a rather crude vitalism. Livelihoods and freedoms give way easily to a statistically small risk of individual death.
That might or might not be the morally right result. I’m not considering here the appropriateness of any government measures, and simply note that whatever one says about the UK Government’s response, it has been supremely successful in generating fear. Presumably that was its intention. The fear in the eyes above the masks is mainly an atavistic terror of personal extinction – a fear unmitigated by rational risk assessment. There is also a genuine fear for others (and the crisis has shown humans at their most splendidly altruistic and communitarian as well). But we really don’t have much ballast.
The fear is likely to endure long after the virus itself has receded. Even if we eventually pluck up the courage to hug our friends or go to the theatre, the fear has shown us what we’re really like, and the unflattering picture will be hard to forget.
I wonder what this new view of ourselves will mean for some of the big debates in ethics and law. The obvious examples are euthanasia and assisted suicide.
Coronavirus: Dark Clouds, But Some Silver Linings?
By Charles Foster
Cross-posted from The Conversation
To be clear, and in the hope of heading off some trolls, two observations. First: of course I don’t welcome the epidemic. It will cause death, worry, inconvenience and great physical and economic suffering. Lives and livelihoods will be destroyed. The burden will fall disproportionately on the old, the weak and the poor.
And second: these suggestions are rather trite. They should be obvious to reasonably reflective people of average moral sensibility.
That said, here goes:
1. It will make us realise that national boundaries are artificial
The virus doesn’t carry a passport or recognise frontiers. The only way of stopping its spread would be to shut borders wholly, and not even the most rabid nationalists advocate that. It would mean declaring that nations were prisons, with no one coming in or out – or at least not coming back once they’d left. In a world where we too casually assume that frontiers are significant, it doesn’t do any harm to be reminded of the basic fact that humans occupy an indivisible world.
Cooperation between nations is essential to combating the epidemic. That cooperation is likely to undermine nationalist rhetoric.
2. It will make us realise that people are not islands
The atomistic billiard-ball model of the person – a model that dominates political and ethical thinking in the West – is biologically ludicrous and sociologically unsustainable. Our individual boundaries are porous. We bleed into one another and infect one another with both ills and joys. Infectious disease is a salutary reminder of our interconnectedness. It might help us to recover a sense of society.
3. It may encourage a proper sort of localism
Internationalism may be boosted. I hope so. But if we’re all locked up with one another in local quarantine, we might get to know the neighbours and the family members we’ve always ignored. We might distribute ourselves less widely, and so be more present to the people around us.
We might even find out that our local woods are more beautiful than foreign beaches, and that local farmers grow better and cheaper food than that which is shipped (with the associated harm to the climate) across the globe.
4. It may encourage altruism
Exigencies tend to bring out the best and the worst in us. An epidemic may engender and foster altruistic heroes.
5. It may remind us of some neglected constituencies
Mortality and serious illness are far higher among the old, the very young, and those suffering from other diseases. We tend to think about – and legislate for – the healthy and robust. The epidemic should remind us that they are not the only stakeholders.
6. It may make future epidemics less likely
The lessons learned from the coronavirus epidemic will pay dividends in the future. We will be more realistic about the dangers of viruses crossing the barriers between species. The whole notion of public health (a Cinderella speciality in medicine in most jurisdictions) has been rehabilitated. It is plain that private healthcare can’t be the whole answer. Much has been learned about the containment and mitigation of infectious disease. There are strenuous competitive and cooperative efforts afoot to develop a vaccine, and vaccines against future viral challenges are likely to be developed faster as a result.
7. It might make us more realistic about medicine
Medicine is not omnipotent. Recognising this might make us more aware of our vulnerabilities. The consequences of that are difficult to predict, but living in the world as it really is, rather than in an illusory world, is probably a good thing. And recognising our own vulnerability might make us more humble and less presumptuous.
8. Wildlife may benefit
China has announced a permanent ban on trade in and consumption of wildlife. That in itself is hugely significant from a conservation, an animal welfare, and a human health perspective. Hopefully other nations will follow suit.