I have talked a lot about a generation born around 1900, who came to maturity in the inter-war years and were elevated to ‘public figures’ (like Snow) or ‘gurus’ (like Aldous Huxley, posthumously) in the 1960s. What of the present, now being shaped by those born since 1945? There is one point of similarity with the early 1930s (and I don’t mean declining respect for Parliament!). We are once again in a golden age of popular scientific writing. On any station bookstall you’ll find books by Stephen Jay Gould, Steven Weinberg, Steven Rose, Stephen Hawking, Steve Jones (why are these men all called ‘Stephen’?). Their business is to explain complicated ideas in an accessible form to non-specialist readers, especially those on the other side of the two cultures divide. When they succeed they are the true torch-bearers for Bronowski’s ideal of a ‘democracy of the intellect’. Though I’m sure they have an eye to their bank balances as well, they are to be welcomed as part of the ‘public understanding of science’ movement which has mushroomed in the last ten years. Richard Dawkins, Professor of the ‘Public Understanding of Science’ at Oxford, sets out an agenda for the ‘appreciation’ of science by non-scientists, using an analogy with music (the best art form to choose, since often scientists with no other artistic interests will be music-lovers). It is indisputable, says Dawkins, that someone can enjoy the Mozart Clarinet Concerto without being able to play the clarinet:
Couldn’t we learn to think of science in the same way? It is certainly important that some people, indeed some of our brightest and best, should learn to do science as a practical subject. But couldn’t we also teach science as something to read and rejoice in, like learning how to listen to music rather than slaving over five-finger exercises in order to play it?
In the remainder of this talk I want to consider the implications of that suggestion. At one level it might be a call for a ‘holistic’ world view. This has been advanced by another of the new popularisers, Edward O Wilson, a Harvard biologist, in his book Consilience: The Unity of Knowledge. As he puts it:
The human condition is the most important frontier of the natural sciences. Conversely, the material world exposed by the natural sciences is the most important frontier of the social sciences and humanities [...] The two frontiers are the same.
According to Wilson, there should be no ‘Two Cultures’ divide, for sciences and arts have a common goal:
The central idea of the consilience world view (the word means ‘jumping together’) is that all tangible phenomena, from the birth of stars to the workings of social institutions, are based on material processes that are ultimately reducible, however long and tortuous the sequences, to the laws of physics (p. 297).
My hackles rise when I hear that word ‘reducible’. Reductivism is what troubled Blake and Keats about Newton, what troubled Tippett about science. And sure enough, when we turn to what Wilson says of the arts we find him espousing the new fad of biopoetics or bioaesthetics, which holds that all innovation (including artistic) is a concrete biological process founded on nerve circuitry and neurotransmitter release (pp. 240-1). Up to a point this approach is helpful. In literary criticism it may help us to understand the ubiquity of archetypal mythic configurations - why do the same stories, the same figures recur in Babylonian epic, medieval romance, James Joyce’s Ulysses? It may help to understand the paradox of musical modernism. Why, ninety years after Schoenberg’s push into atonality, has the public not ‘caught up’ with him? Why is it that the only use for atonal music that has been found in popular culture is in horror movie scores? A friend of mine is writing a book called Reading Dissonance - as he describes it, ‘a subversive history of modern music which deals with the impossibility of discarding the basic human psycho-acoustic reactions to unpleasant sounds which modernism presumed could be simply suppressed’. The fact that the only living ‘classical’ composers who can fill a concert hall are those who have variously distanced themselves from modernist orthodoxies - Arvo Pärt, John Tavener, James MacMillan, Philip Glass - lends validity to such musical ‘bioaesthetics’. However, Wilson’s reductive bioaesthetics is not enough. It does not account for that residue, which to science is merely the ‘not yet known’ but which in art may be the defining characteristic of an artwork. For critical appreciation of the arts we need a much wider range of styles and methods of interpretation than Wilson’s scheme permits. And even if we could stomach the reductivism of Wilson’s approach, there is still the arrogance.
Arrogance, I fear, is the other hubristic sin with which scientists stand charged. That damned impudence, to suppose that everything is knowable, had we but world enough and computing power!
I propose instead that the two cultures should meet in a spirit of humility. No one, not even Edward O Wilson, knows why Mozart’s piano concertos are immensely more satisfying and life-enhancing than Schoenberg’s Piano Concerto. In a properly ‘bi-lingual’ culture, those with an arts background would have an insight into scientific method which would better equip them to understand the calculation of risk and ‘safety’, the need for hypothesis, experiment and innovation. They would stop delivering Luddite tirades against technology into their mobile phones while fingering the upholstery of their BMWs. Then, newly humbled, sciences and arts might discover the wisdom of Wordsworth’s consilience:
We have no knowledge, that is, no general principles drawn from the contemplation of particular facts, but what has been built up by pleasure, and exists in us by pleasure alone [...] The knowledge both of the poet and the man of science is pleasure.
Knowledge should be pleasurable, by whatever cultural conduit it reaches us. Even as an outsider, I recognise how scientific research could be pleasurable. I see also how it might inspire awe. You remember that Keats believed his ‘awful rainbow’ to have been unwoven by science. This is a misprision. Science, approached with a certain humility, could actually enhance our sense of awe, as the late Carl Sagan observed:
How is it that hardly any major religion has looked at science and concluded, ‘This is better than we thought! The Universe is much bigger than our prophets said, grander, more subtle, more elegant’? Instead they say, ‘No, no, no! My god is a little god, and I want him to stay that way’. A religion, old or new, that stressed the magnificence of the Universe as revealed by modern science might be able to draw forth reserves of reverence and awe hardly tapped by the conventional faiths.
I want to commend to the attention of the arts community two sciences which might restore a taste for wonder to palates jaded by a twenty-year diet of ‘postmodernism’ and ‘deconstruction’. One is Sagan’s own speciality - astronomy; the other, the science that probably touches the lives of every one of us - computing and information technology.

First, astronomy. Growing up in the ’seventies, I like many others was disillusioned by the anti-climax of the moon landings. ‘We went to the moon, and all we did was pee on it’ (as someone said - I forget who). The reorientation of NASA’s priorities after that - and the shrinking budget - has, I think, inaugurated a much more exciting phase in space exploration: unmanned probes visiting the outer planets, the Hubble telescope, the SETI project (scanning the skies for radio signals which might indicate intelligent life). Take the images from the Voyager missions and the Galileo orbiter of the moons of Jupiter - sights that no human eye had ever seen before. Io, a world covered in fresh layers of white, red, yellow and black patches, all different kinds of rock and sulphur compounds. Io is volcanically active, far more so than the earth. Europa, coated with a shell of ice which is thought to overlie a near-global ocean that in spots may be no more than 10 km from the surface. On its surface crisscrossing ridges may be the pathways for watery or slushy eruptions from below. The newest frontier in astronomy tells of planets we cannot yet see, but whose existence has been verified: planets orbiting other stars. Over thirty have now been discovered. Some of you may have seen the Horizon programme on TV which well conveyed the excitement of the British team who discovered a planet circling Tau Boo, 55 light years from Earth. The world they have discovered is ‘aweful’ enough to wake any Romantic poet from his opium-induced reveries. Tau Boo’s planet orbits 20 times closer to the star than Earth orbits the sun.
Blasted by radiation, the atmospheric temperature is about 1,700 degrees C. The reflected starlight has a blue-green hue, caused possibly by sodium vapour above clouds of magnesium silicate, a chemical which forms solid rock on Earth but is vaporised by the temperatures on this alien world.
My second example is the computer revolution. We may not realise it but we are probably living through another Industrial Revolution, with implications as momentous as the first. I would like to think that even now some latter-day Humphrey Jennings is compiling a new ‘Pandaemonium’, documenting the ‘coming of the computer as seen by contemporary observers’. Such a documentary would reveal that the ‘literary intellectuals’ greeted this second revolution with a good deal more enthusiasm than they did the first. Hey, computers are just tools, aren’t they? We all use them. How can they inspire awe? We all recognise that our home computer is obsolescent practically before we’ve unpacked it. Computers double in power roughly every 18 months, a statistic that has held remarkably steady for the last 30 years. You don’t have to be a statistician to realise that eventually they will match the computing power of the human brain - and then surpass it. This was predicted in the 1950s by Bronowski’s friend John von Neumann, who commented on the ‘ever-accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue’. ‘Singularity’ is a term from mathematics describing (correct me if I’m wrong) a point at which rules fail, quantities become infinite, the curve rips through the graph paper. We were all struck by the news of the maverick millionaire Craig Venter, who expects to have finished mapping the entire Human Genome by this summer, several years ahead of the publicly funded project directed to the same end. How has he done it? Money, of course, and ‘the most powerful supercomputer outside the Pentagon’. Von Neumann’s ‘singularity’ should be precisely the subject matter of intelligent fiction for some new-fledged Aldous Huxley.
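The doubling arithmetic is easy to check for yourself. As a small sketch of my own (the 18-month figure is from the talk; the rest is illustration), power relative to today compounds as two to the power of t divided by 1.5 over t years, so thirty years gives twenty doublings - roughly a million-fold:

```python
# A sketch (not part of the original talk) of the 18-month doubling
# described above: computing power after `years` years, relative to
# a baseline of 1 today.

def growth_factor(years, doubling_period=1.5):
    """Relative increase in computing power, assuming power
    doubles every `doubling_period` years (18 months by default)."""
    return 2 ** (years / doubling_period)

# Thirty years of doubling every 18 months is 20 doublings:
print(growth_factor(30))  # 2**20 = 1,048,576 - about a million-fold
```

On those assumptions the numbers alone suggest why von Neumann reached for the word ‘singularity’: the curve is exponential, and any fixed benchmark - including the human brain - is eventually passed.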
Soon the rate of change will become so rapid that we will be unable to make any confident predictions about the day after tomorrow. We shall feel, in Lady Macbeth’s words, ‘the future in the instant’. The present will be squeezed ever tighter as the future impacts against us like a speeding train. And what, then, of the past? The book culture on which Snow and Huxley, Bronowski and Tippett were nurtured, will end up in the museum. By 2020, according to Microsoft (who else?), 90% of everything we read will be delivered in an electronic form. (As librarians will tell you, some Government departments have already stopped publishing in hard copy.) In its highest potential the internet is ‘democracy of the intellect’ in action - millions of voices, millions of opinions. At its worst, well, I have some sympathy (I blush to admit) with a not-so-old fogey writing in the Daily Telegraph:
One thing of which I have become completely sure is that, as an aid to teaching children, the internet is worse than useless. It is an obstacle to education. I know this as the father of young children, and I am astonished that Mr Blair cannot see it too. My teenage son spends hours on the computer, surfing the internet and sending pointless e-mail messages to his friends [...] Every minute that he spends tapping away at the keyboard, and clicking through that colossal database of unreliable information, would be much more profitably spent reading a good book. When my children do use the internet for their homework, they use it to cheat. Told to write an essay on Henry VIII’s wives, they will go to a search engine, tap in ‘Henry VIII’ and ‘wives’ - and then copy out great undigested chunks of whatever rubbish may come up.
In short, triggered by the IT Revolution, the cultural transformations of the next fifty years will be so massive as to render the ‘Two Cultures’ debate, like so much of the intellectual baggage of the last fifty years, redundant. What relationship will then exist between science and the arts?
I offer a modest proposal. Earlier in this talk I looked back to a period around 1800, when industrialisation and the advance of technical mastery threatened to alienate men and women from their connectedness to the natural world. Their reaction found literary expression in Romanticism. I pointed to immediacy of utterance, an attitude of wonder, a pleasure in knowledge as characteristic of that movement. Shelley puts it beautifully in his poem ‘The Daemon of the World’, when he projects a possible future in which humankind gives up its separation from other forms of life and thereby discovers the fullest powers of mind:
All things are void of terror: man has lost
His desolating privilege, and stands
An equal amidst equals: happiness
And science dawn though late upon the earth.
‘Desolating privilege’ - marvellous phrase - is the desire to set ourselves apart from all other phenomena of the material world, to claim special status and to exercise control through knowledge. Shelley spotted the fault line developing between arts and sciences in the nineteenth century and looked forward to its healing. Perhaps that is the obligation we should take upon ourselves, two centuries later: to reinstate such Romantic notions as intuition, imagination and inspiration, and thereby to salvage our humanity in the face of what allegedly lies ahead of us - von Neumann’s ‘singularity’, a future world where exponential progress is driven by greater-than-human artificial intelligence, truly a world fit for Mary Shelley’s Frankenstein.
© Philip M Ward 2000