‘These great concurrences of things’


One of the main ideas I pursue here is that the invention of writing has radically altered the way we think – not immediately, but eventually – through its impact on speech, which it transforms from one mode of expression among many into our main instrument of thought: Language, in which the spoken form is dominated by the written, and meaning is no longer seen as embedded in human activity but as a property of words, which appear to have an independent, objective existence. (This notion is examined in the form of a fable here.)

This means in effect that the Modern world begins in Classical Greece, about two and a half thousand years ago, and is built on foundations laid by Socrates, Plato and Aristotle; though much that we think of as marking modernity is a lot more recent (some would choose the Industrial Revolution, some the Enlightenment, some the Renaissance), the precondition for all of these – the way of seeing ourselves in the world which they imply – is, I would argue, the change in our thinking outlined above.

This naturally gives rise to the question of how we thought before, which is not a matter of merely historical interest, since we are not talking here about one way of thinking replacing another, but rather a new mode displacing and dominating the existing one, which nevertheless continues alongside, albeit in a low estate, a situation closely analogous to an independent nation that is invaded and colonised by an imperial power.

What interests me particularly is that this ancient mode of thought, being ancient – indeed, primeval – is instinctive and ‘natural’ in the way that speech is (and Language, as defined above, is not). Unlike modern ‘intellectual’ thought, which marks us off from the rest of the animal kingdom (something on which we have always rather plumed ourselves, perhaps mistakenly, as I suggested recently) this instinctive mode occupies much the same ground, and reminds us that what we achieve by great ingenuity and contrivance (remarkable feats of construction, heroic feats of navigation over great distances, to name but two) is done naturally and instinctively by ants, bees, wasps, spiders, swifts, salmon, whales and many others, as a matter of course.

So how does this supposed ‘ancient mode’ of thought work? I am pretty sure that metaphor is at the heart of it. Metaphor consists in seeing one thing in terms of another, or, if you like, in seeing something in the world as expressing or embodying your thought; as such, it is the basic mechanism of most of what we term Art: poetry, storytelling, painting, sculpture, dance, music, all have this transformative quality in which different things are united and seen as aspects of one another, or one is seen as the expression of the other – they become effectively interchangeable.

(a key difference between metaphorical thinking and analytic thinking – our modern mode – is that it unites and identifies where the other separates and makes distinctions – which is why metaphor always appears illogical or paradoxical when described analytically: ‘seeing the similarity in dissimilars’ as Aristotle puts it, or ‘saying that one thing is another’)

This long preamble was prompted by an odd insight I gained the other day when, by a curious concatenation of circumstances, I found myself rereading, for the first time in many years, John Buchan’s The Island of Sheep.

Now Buchan is easy to mock – the values and attitudes of many of his characters are very much ‘of their time’ and may strike us as preposterous, if not worse – but he knows how to spin a yarn, and there are few writers better at evoking the feelings aroused by nature and landscape at various times and seasons. He was also widely and deeply read, a classical scholar, and his popular fiction (which never pretended to be more than entertainment and generally succeeded) has a depth and subtlety not found in his contemporaries.

What struck me in The Island of Sheep were two incidents, both involving the younger Haraldsen. Haraldsen is a Dane from the ‘Norlands’ – Buchan’s name for the Faeroes. He is a gentle, scholarly recluse who has been raised by his father – a world-bestriding colossus of a man, a great adventurer – to play some leading part in an envisaged great revival of the ‘Northern Race’, a role for which he is entirely unfitted. He inherits from his father an immense fortune, in which he is not interested, and a vendetta or blood-feud which brings him into conflict with some ruthless and unscrupulous men.

Early in the book, before we know who he is, he encounters Richard Hannay and his son Peter John (another pair of opposites). They are out wildfowling and Peter John flies his falcon at an incoming skein of geese; it separates a goose from the flight and pursues it in a thrilling high-speed chase, but the goose escapes by flying low and eventually gaining the safety of a wood. ‘Smith’ (as Haraldsen is then known) is moved to tears, and exclaims
‘It is safe because it was humble. It flew near the ground. It was humble and lowly, as I am. It is a message from Heaven.’
He sees this as an endorsement of the course he has chosen to evade his enemies, by lying low and disguising himself.

Later, however, he takes refuge on Lord Clanroyden’s estate, along with Richard Hannay and his friends, who in their youth in Africa, when they were in a tight spot, had sworn an oath to old Haraldsen to look after his son. They attend a shepherd’s wedding and after the festivities there is a great set-to among the various sheepdogs, with the young pretenders ganging up to overthrow the old top-dog, Yarrow, who rather lords it over them. The old dog fights his corner manfully but is hopelessly outnumbered; then, just as all seems lost, he turns from defence to attack and sallies out against his opponents with great suddenness and ferocity, scattering them and winning the day.


Again, Haraldsen is deeply moved:

‘It is a message to me,’ he croaked. ‘That dog is like Samr, who died with Gunnar of Lithend. He reminds me of what I had forgotten.’

He abandons his scheme of running and hiding and resolves to return to his home, the eponymous Island of Sheep, and face down his enemies, thus setting up the climax of the book (it’s not giving too much away to reveal that good triumphs in the end, though of course it’s ‘a dam’ close-run thing’).

Both these incidents have for me an authentic ring: I can well believe that just such ‘seeing as’ played a key role in the way our ancestors thought about the world and their place in it.

It is, of course, just the kind of thing that modern thinking labels ‘mere superstition’ but I think it should not be dismissed so lightly.

The modern objection might be phrased like this: ‘the primitive mind posits a ruling intelligence, an invisible force that controls the world and communicates through signs – bolts of lightning, volcanic eruptions, comets and other lesser but in some way striking events. The coincidence of some unusual or striking occurrence in nature with a human crisis is seen as a comment on it, and may be viewed (if preceded by imploration) as the answer to prayer. We know better: these are natural events with no connection to human action beyond fortuitous coincidence.’

The way I have chosen to phrase this illustrates a classic problem that arises when modern thinking seeks to give an account of ancient or traditional thinking – ‘primitive’ thinking, if you like, since I see nothing pejorative in being first and original. The notion of cause and effect is key to any modern explanation, so we often find that ‘primitive’ thinking is characterised by erroneous notions of causality – basically, a causal connection is supposed where there is none.

For instance, in a talk I heard by the philosopher John Haldane, he cited a particular behaviour known as ‘tree binding’ in which trees were wounded and bound as a way of treating human wounds – a form of what is called ‘sympathetic magic’, where another object acts as a surrogate for the person or thing we wish to affect (or, to be more precise, ‘wish to be affected’). An account of such behaviour in causal terms will always show it to be mistaken and apparently foolish – typical ‘primitive superstition’: ‘They suppose a causal connection between binding the tree’s wound and binding the man’s, and that by healing the one, they will somehow heal the other (which we know cannot work).’

But I would suggest that the tree-binding is not a mistaken scientific process, based on inadequate knowledge – it is not a scientific process at all, and it is an error to describe it in those terms. It is, I would suggest, much more akin to both prayer and poetry. The ritual element – the play-acting – is of central importance.

The tree-binders, I would suggest, are well aware of their ignorance in matters of medicine: they do not know how to heal wounds, but they know that wounds do heal; and they consider that the same power (call it what you will) that heals the wound in a tree also heals the wound in a man’s body. They fear that the man may die but hope that he will live, and they know that only time will reveal the outcome.

Wounding then binding the tree seems to me a ritual akin to prayer rather than a misguided attempt at medicine. First and foremost, it is an expression of hope, like the words of reassurance we utter in such cases – ‘I’m sure he’ll get better’. The tree’s wound will heal (observation tells them this) – so, too, might the man’s.

But the real power of the ritual, for me, lies in its flexibility, its openness to interpretation. It is a very pragmatic approach, one that can be tailored to suit any outcome. If the man lives, well and good; that is what everyone hoped would happen. Should the man die, the tree (now identified with him in some sense) remains (with its scar, which does heal). The tree helps reconcile them to the man’s death by showing it in a new perspective: though all they have now is his corpse, the tree is a reminder that this man was more than he seems now: he had a life, spread over time. Also, the continued survival of the tree suggests that in some sense the man, too, or something of him that they cannot see (the life or soul which the tree embodies) may survive the death of his body. The tree can also be seen as saying something about the man’s family (we have the same image ourselves in ‘family tree’, though buried some layers deeper) and how it survives without him, scarred but continuing; and by extension, the same applies to the tribe, which will continue to flourish as the tree does, despite the loss of an individual member.

And the tree ‘says’ all these things because we give it tongue – we make it tell a story, or rather we weave it into one that is ongoing (there are some parallels here to the notion of ‘Elective Causality’ that I discuss elsewhere). As I have argued elsewhere [‘For us, there is only the trying’] we can only find a sign, or see something as a sign, if we are already looking for one and already think in those terms. Haraldsen, in The Island of Sheep, is troubled about whether he has chosen the right course, and finds justification for it in the stirring sight of the goose evading the falcon; later, still troubled about the rightness of his course, he opts to change it, stirred by the sight of the dog Yarrow turning the tables on his opponents.

His being stirred, I think, is actually the key here. It would be an error to suppose that he is stirred because he sees the goose’s flight and the dog’s bold sally as ‘messages from heaven’; the reverse is actually the case – he calls these ‘messages from heaven’ to express the way in which they stir him. There is a moment when he identifies, first with the fleeing goose, then with the bold dog. What unites him with them in each case is what he feels. But this is not cause and effect, which is always a sequence; rather, this is parallel or simultaneous – the inner feeling and the outward action are counterparts, aspects of the same thing. A much closer analogy is resonance, where a plucked string or a struck bell sets up sympathetic vibration in another.

This is why I prefer Vita Sackville-West’s definition of metaphor to Aristotle’s: for him, metaphor is the ability to see the similarity in dissimilar things; for her (the quote is from her book on Marvell):

‘The metaphysical poets were intoxicated—if one may apply so excitable a word to writers so severely and deliberately intellectual—by the potentialities of metaphor. They saw in it an opportunity for expressing their intimations of the unknown and the dimly suspected Absolute in terms of the known concrete, whether those intimations related to philosophic, mystical, or intellectual experience, to religion, or to love. They were ‘struck with these great concurrences of things’’

A subject to which I shall return.

In the beginning was the word… or was it?


Reflecting on the origin of words leads us into interesting territory. I do not mean the origin of particular words, though that can be interesting too; I mean the notion of words as units, as building blocks into which sentences can be divided.

How long have we had words? The temptation is to say ‘as long as we have had speech’ but when you dig a bit deeper, you strike an interesting vein of thought.

As I have remarked elsewhere [see ‘The Muybridge Moment’] it seems unlikely that there was any systematic analysis of speech till we were able to write it down, and perhaps there was no need of such analysis. Certainly a great many of the things that we now associate with language only become necessary as a result of its having a written form: formal grammar, punctuation, spelling – the three things that probably generate the most unnecessary heat – are all by-products of the introduction of writing.

The same could be said of words. Till we have to write them down, we have no need to decide where one word ends and another begins: the spaces between words on a page do not reflect anything that is found in speech, where typically words flow together except where we hesitate or pause for effect. We are reminded of this in learning a foreign language, where we soon realise that listening out for individual words is a mistaken technique; the ear needs to attune itself to rhythms and patterns and characteristic constructions.
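The point can be made concrete with a small computational doodle: strip the spaces out, and recovering ‘the words’ means testing substrings against a known vocabulary – and even then the answer need not be unique. (A toy sketch in Python; the mini-lexicon and the example are invented for illustration.)

```python
# Toy word-segmentation: given 'speech' with no word boundaries,
# finding the words means checking substrings against a lexicon.
LEXICON = {"god", "is", "now", "here", "nowhere"}

def segmentations(text, lexicon):
    """Return every way of splitting `text` into known words."""
    if not text:
        return [[]]  # one way to segment nothing: the empty segmentation
    results = []
    for i in range(1, len(text) + 1):
        head = text[:i]
        if head in lexicon:
            for rest in segmentations(text[i:], lexicon):
                results.append([head] + rest)
    return results

for reading in segmentations("godisnowhere", LEXICON):
    print(" ".join(reading))
# prints:
#   god is now here
#   god is nowhere
```

The boundaries, in other words, are not in the stream of sound; they are supplied by the hearer, who must already know the words in order to find them.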

So were words there all along just waiting to be discovered? That is an interesting question. Though ‘discovery’ and ‘invention’ effectively mean the same etymologically (both have the sense of ‘coming upon’ or ‘uncovering’), we customarily make a useful distinction between them – ‘discovery’ implies pre-existence – so we discover buried treasure, ancient ruins, lost cities – whereas ‘invention’ is reserved for things we have brought into being, that did not previously exist, like bicycles and steam engines. (An idea also explored in Three Misleading Oppositions, Three Useful Axioms.)

So are words a discovery or an invention?

People of my generation were taught that Columbus ‘discovered’ America, though even in my childhood the theory that the Vikings got there earlier had some currency; but of course in each case they found a land already occupied, by people who (probably) had arrived there via a land-bridge from Asia, or possibly by island-hopping, some time between 42,000 and 17,000 years ago. In the same way, Dutch navigators ‘discovered’ Australia in the early 17th century, though in British schools the credit is given to Captain Cook in the late 18th century, who actually only laid formal claim in the name of the British Crown to a territory that Europeans had known about for nearly two centuries – and its indigenous inhabitants had lived in for around five hundred centuries.

In terms of discovery, the land-masses involved predate all human existence, so they were there to be ‘discovered’ by whoever first set foot on them, but these later rediscoveries and colonisations throw a different light on the matter. The people of the Old World were well used to imperial conquest as a way of life, but that was a matter of the same territory changing hands under different rulers; the business of treating something as ‘virgin territory’ – though it quite plainly was not, since they found people well-established there – is unusual, and I think it is striking where it comes in human, and particularly European, history. It implies an unusual degree of arrogance and self-regard on the part of the colonists, and it is interesting to ask where that came from.

Since immigration has become such a hot topic, there have been various witty maps circulating on social media, such as this one showing ‘North America prior to illegal immigration’.

The divisions, of course, show the territories of the various peoples who lived there before the Europeans arrived, though there is an ironic tinge lent by the names by which they are designated, which for the most part are anglicised. Here we touch on something I have discussed before [in Imaginary lines: bounded by consent] – the fact that any political map is a work of the imagination, denoting all manner of territories and divisions that have no existence outside human convention.

Convention could be described as our ability to project or impose our imagination on reality; as I have said elsewhere [The Lords of Convention] it strikes me as a version of the game we play in childhood, ‘let’s pretend’ or ‘make-believe’ – which is not to trivialise it, but rather to indicate the profound importance of the things we do in childhood, by natural inclination, as it were.

Are words conventions, a form we have imposed on speech much as we impose a complex conventional structure on a land-mass by drawing it on a map? The problem is that the notion of words is so fundamental to our whole way of thinking – may, indeed, be what makes it possible – that it is difficult to set them aside.

That is what I meant by my comment about the arrogance and self-regard implied in treating America and Australia as ‘virgin territory’ – it seems to me to stem from a particular way of thinking, and that way of thinking, I suggest, is bound up with the emergence of words into our consciousness, which I think begins about two and a half thousand years ago, and (for Europeans at least) with the Greeks.

I would like to offer a model of it which is not intended to be historical (though I believe it expresses an underlying truth) but is more a convenient way of looking at it. The years from around 470 to 322 BC span the lives of three men: the first, Socrates, famously wrote nothing, but spoke in the market place to whoever would listen; we know of him largely through his pupil, Plato. It was on Plato’s pupil, Aristotle, that Dante bestowed the title ‘maestro di color che sanno’ – master of those that know.

This transition, from the talking philosopher to the one who laid the foundations of all European thought, is deeply symbolic: it represents the transition from the old way of thought and understanding, which was inseparable from human activity – conversation, or ‘language games’ and ‘forms of life’ as Wittgenstein would say – to the new, which is characteristically separate and objective, existing in its own right, on the written page.

The pivotal figure is the one in the middle, Plato, who very much has a foot in both camps, or perhaps more accurately, is standing on the boundary of one world looking over into another newly-discovered. The undoubted power of his writing is derived from the old ways – he uses poetic imagery and storytelling (the simile of the cave, the myth of Er) to express an entirely new way of looking at things, one that will eventually subjugate the old way entirely; and at the heart of his vision is the notion of the word.

Briefly, Plato’s Theory of Forms or Ideas can be expressed like this: the world has two aspects, Appearance and Reality; Appearance is what is made known to us by the senses, the world we see when we look out the window or go for a walk. It is characterised by change and impermanence – nothing holds fast, everything is always in the process of changing into something else, a notion of which the Greeks seemed to have a peculiar horror; in the words of the hymn, ‘change and decay in all around I see’.

Reality surely cannot be like that: Truth must be absolute, immutable (it is important to see the part played in this by desire and disgust: the true state of the world surely could not be this degrading chaos and disorder where nothing lasts). So Plato says this: Reality is not something we can apprehend by the senses, but only by the intellect. And what the intellect grasps is that beyond Appearance, transcending it, is a timeless and immutable world of Forms or Ideas. Our senses make us aware of many tables, cats, trees; but our intellect sees that these are but instances of a single Idea or Form – Table, Cat, Tree – which somehow imparts to them the quality that makes them what they are, imbues them with ‘tableness’, ‘catness’ and ‘treeness’.

This notion beguiled me when I first came across it, aged fourteen. It has taken me rather longer to appreciate the real nature of Plato’s ‘discovery’, which is perhaps more prosaic (literally) but no less potent. Briefly, I think that Plato has discovered the power of general terms, and he has glimpsed in them – as an epiphany, a sudden revelation – a whole new way of looking at the world; and it starts with being able to write a word on a page.

Writing makes possible the relocation of meaning: from being the property of a situation, something embedded in human activity (‘the meaning of a word is its use in the language’) meaning becomes the property of words, these new things that we can write down and look at. The icon of a cat or a tree resembles to some extent an actual cat or tree but the word ‘cat’ looks nothing like a cat, nor ‘tree’ like a tree; in order to understand it, you must learn what it means – an intellectual act. And what you learn is more than just the meaning of a particular word – it is the whole idea of how words work, that they stand for things and can, in many respects, be used in their stead, just as the beads on an abacus can be made to stand for various quantities. What you learn is a new way of seeing the world, one where its apparently chaotic mutability can be reduced to order.

Whole classes of things that seem immensely varied can now be subsumed under a single term: there is a multiplicity of trees and cats, but the one word ‘tree’ or ‘cat’ can be used to stand for all or any of them indifferently. Modelled on that, abstract ideas such as ‘Justice’, ‘Truth’ and ‘The Good’ can be seen as standing for some immutable, transcendent form that imbues all just acts with justice, and so on. Plato’s pupil Aristotle discarded the poetic clothing of his teacher’s thought, but developed the idea of generalisation to the full: it is to him that we owe the system of classification by genus and species and the invention of formal logic, which could be described as the system of general relations; and these are the very foundation of all our thinking.
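So durable is that invention that we still build with it: the modern programmer’s type hierarchy is genus and species in all but name. Here is a playful sketch (my own toy example in Python, of course, not anything Aristotle wrote):

```python
# Genus and species as a type hierarchy, and the syllogism as
# class membership - a toy illustration only.

class Animal:          # the genus
    pass

class Cat(Animal):     # a species differentiated within the genus
    pass

class Tree:            # a different genus altogether
    pass

tabby = Cat()          # a particular - one instance among many

# 'All cats are animals; this is a cat; therefore it is an animal.'
print(isinstance(tabby, Cat))     # True
print(isinstance(tabby, Animal))  # True - the inference comes free
print(isinstance(tabby, Tree))    # False
```

The point of the sketch is that classification by similarity and difference, once set up, yields inferences mechanically – which is just what Aristotle’s logic systematised.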

In many respects, the foundations of the modern world are laid here, so naturally these developments are usually presented as one of mankind’s greatest advances. However, I would like to draw attention to some detrimental aspects. The first is that this new way of looking at the world, which apprehends it through the intellect, must be learned. Thus, at a stroke, we render natural man stupid (and ‘primitive’ man, to look ahead to those European colonisations, inferior, somewhat less than human). We also establish a self-perpetuating intellectual elite – those who have a vested interest in maintaining the power that arises from a command of the written word – and simultaneously exclude and devalue those who struggle to acquire that command.

The pernicious division into ‘Appearance’ and ‘Reality’ denigrates the senses and all natural instincts, subjugating them to the vaunted intellect; and along with that goes the false dichotomy of Heart and Head, in which the Head is seen as the Seat of Reason – calm, objective, detached – which should properly rule the emotional, subjective, passionate and too-easily-engaged Heart.

This, in effect, is the marginalising of the old way of doing things that served us well till about two and a half thousand years ago, which gave a central place to those forms of expression and understanding which we now divide and rule as the various arts, each in its own well-designed box: poetry, art, music, etc. (a matter discussed in fable form in Plucked from the Chorus Line)

So what am I advocating? That we undo all this? No, rather that we take a step to one side and view it from a slightly different angle. Plato could only express his new vision of things in the old way, so he presents it as an alternative world somewhere out there beyond the one we see, a world of Ideas or Forms, which he sees as the things words stand for, what they point to – and in so doing, takes the fatal step of discarding the world we live in for an intellectual construct; but the truth of the matter is that words do not point to anything beyond themselves; they are the Platonic Forms or Ideas: the Platonic Idea of ‘Horse’ is the word ‘Horse’. What Plato has invented is an Operating System; his mistake is in thinking he has discovered the hidden nature of Reality.

What he glimpsed, and Aristotle developed, and we have been using ever since, is a way of thinking about the world that is useful for certain purposes, but one that has its limitations. We need to take it down a peg or two, and put it alongside those other, older operating systems that we are all born with, which we developed over millions of years. After all, the rest of the world – animal and vegetable – seems to have the knack of living harmoniously; we are the ones who have lost it, and now threaten everyone’s existence, including our own; perhaps it is time to take a fresh look.

The Disintegration of Expression


The week when a group of scientists has decided to hold the ‘Doomsday Clock’ at three minutes to midnight (though I cannot help feeling that the notion of a clock that can always be reset undermines the idea of time running out) is an apt one to consider the diagram above, which also deals with time, though the message it has to convey concerns not how little time might be left to us but rather how much has gone before.

The diagram is drawn to different scales and has two related parts. The strip along the bottom with the grey wavy lines represents the last 200,000 years, which is the period our particular species of human, Homo sapiens, has been around (though that is still a small fraction of the human timeline, which stretches back some 6.5 million years). The upper part of the diagram represents the last quarter of that time, with today (2016) at the right-hand edge, and the jagged left-hand edge being 50,000 years ago.

The area to the right of the blue line marked E is the last 5,500 years; it is represented on the bottom strip by the coloured portion to the extreme right of the grey strip.

Five and a half thousand years ago saw two significant events, the invention of metalworking and the invention of writing. It therefore marks an important boundary, or rather two: everything to the left of the line marked E (shown at greater length by the wavy line below) is the Stone Age; it is also conventionally regarded as Prehistoric Times, since History is deemed to start with the invention of writing and the possibility of contemporary records.

(It is worth pausing a moment to consider our immediate reaction to the terms ‘Stone Age’ and ‘Prehistoric’ – both are widely used pejoratively, to denote whatever is hopelessly primitive, barbarous and old fashioned, with no place in the modern age)

The red line marked with a star is more recent – 2,500 years ago – and takes us back to the beginning of the Classical period in Greece, the age that saw that most significant generation of teachers and pupils, Socrates, Plato and Aristotle.
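It is worth pausing to do the arithmetic those two lines represent, since the diagram’s scale is easy to read past (a quick sketch in Python, using the figures given above):

```python
# Figures as given in the text above.
species_age = 200_000  # years Homo sapiens has existed
line_E      = 5_500    # years since writing and metalworking
red_line    = 2_500    # years since Classical Greece

print(f"time with writing:   {line_E / species_age:.2%} of our species' span")
print(f"time since Greece:   {red_line / species_age:.2%}")
# time with writing:   2.75% of our species' span
# time since Greece:   1.25%
```

In other words, more than ninety-seven per cent of our species’ existence lies to the left of line E.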

Somewhere in that time occurs what I have called the Muybridge Moment, by analogy with Eadweard Muybridge’s invention of stop-motion photography, which enabled him effectively to freeze time and analyse the motion of a galloping horse. In the same way, somewhere between Socrates (who wrote nothing) and Aristotle, whose writings arguably provide the foundation of Western thinking and the modern world, the full potential of writing is realised for the first time: it can freeze the flow of speech, giving it an objective form which can be analysed and codified.

That, for me, is a more significant moment than the invention of writing some three millennia earlier, which though a necessary condition for the development of the modern world was not yet a sufficient one, as its potential had yet to be recognised.

One inference that can be drawn from the diagram is that the farther we go to the left, the likelier it is that any human practice we find will by now have become so ingrained that we regard it as coming naturally to us; it is congenital, something we are born with, or born with an aptitude for (to use a very recent metaphor, we are programmed to do it). The prime case, of course, is speech, which we have presumably engaged in from time immemorial, and which we learn (and teach) without need for any formal training.

The naturalness of speech, however, is disguised to a large extent by the advent of literacy: reading and writing, though immensely advantageous (and a key measure of ‘development’ that we use to judge nations and societies), are by no means natural to us: considerable effort and training is required to master them (and to teach them) and not everyone succeeds in acquiring them; but to be without them – in a literate society – is to be disabled. When it comes to human expression, we are not content to rely on nature: it must be augmented, even supplanted, by formal instruction.

That point is worth bearing in mind: it is quite likely that others of our natural aptitudes have been overlooked and effectively hidden by the way our system of education has developed.

Let us now consider what the first three lines on the diagram represent: none marks an event or a first beginning; rather they are records of activity that must already have been going on for some time – for thousands, even tens of thousands of years – but of which we have some tangible, dateable evidence at these points.


A, some 42,000 years ago, is the date of some bone flutes that have been found in the Swabian Alb region of Germany. Music, of course, must be older than that: it is probably primeval – the voice is the oldest instrument, though percussion – drumming and rhythmic clapping and stamping – must be a close second. And if we mention rhythm, it is natural to think of dance, and to suppose that it, too, is very ancient, though it leaves little in the way of direct evidence.

(However, there is possible evidence of the controlled use of fire by our ancestor, Homo erectus, dating from 1.5 million years ago, and demonstrable evidence from 0.79 million years ago (790,000 years). Is it at all unreasonable to suppose that dancing around fires, singing and drumming, is equally ancient? Or, for that matter, telling tales around the fire?)

B, 40,000 years ago, is the date of certain carved figures found in the same region of Germany as the flutes, though these again are not a start point but rather an indication of an established human activity; and there are some who find evidence for sculpture much older still (the ‘Venus of Tan-Tan’ is dated around 300,000 years ago).

C, 30,000 years ago, is the date of some cave-painting found in France, Spain and Indonesia; again, not a start point, but evidence of an already highly developed and skilled human activity.

(I might have included a line a shade to the right of C, around 29,000 years ago, to mark the oldest known ceramics, i.e. fired clay. The striking thing is that its first use is aesthetic, the making of figurines or statuettes. In terms of practical application, the oldest pottery vessels we know about are some 9,000 years younger, from around 20,000 years ago.)

D, 10,000 years ago, differs from the others in marking a start point – that of civilisation, the habit of living in settled communities supported by agriculture, as opposed to our previous nomadic hunter-gatherer way of life. Jericho claims to be the oldest continuously inhabited human settlement, with beginnings dated some 11,000 years ago.

‘Civilisation’ is another word loaded with overtones, though unlike ‘Stone Age’ and ‘Prehistoric’ they are not pejorative: ‘civilised’ is the opposite of ‘barbaric’ – it denotes having all the cultural adjuncts that we esteem highly – education, art, music, literature, and a certain level of human behaviour implying decent treatment, hospitality and respect for others. Which should give us pause, since as our diagram shows, ‘civilisation’ is very much a Stone Age, Prehistoric invention.


‘Civilisation’ (in the strict sense of living in settlements supported by agriculture) is one of the earliest examples of what I have called ‘elective indispensables’ – things we manage perfectly well without till we invent them, then adapt our way of living to them so they seem indispensable. A look at surviving nomadic cultures – the Mongols, for example, or our own (sadly beleaguered) travelling folk – soon gives the lie to the notion that hospitality, decency and a good standard of life are the preserve of dwellers in cities; and where has there ever been squalor, degradation and dehumanisation on a par with that found in great cities down the ages and still today?

As for the notion that ‘Civilisation’ is interchangeable with ‘Culture’ in its narrow sense of ‘those human achievements we value highly, such as art, music, poetry’ – the diagram gives the lie to that, too – it is evident that all these things have their origin tens of thousands, probably hundreds of thousands, of years before civilisation came along.

But surely literature – as its very name suggests – belongs to the age of writing (and so of (later) civilisation)?

It is a point worth examining. While the discovery of ceramics was turned first to creative or aesthetic use, and only some thousands of years later to practical applications, the case of Writing is the opposite. It would appear to have come in as an adjunct of number, to enable lists to be made of the things that could be counted – such as the contents of warehouses and treasuries. It was also used for records, of reigns, battles etc., and the promulgation of laws. It took a thousand years for anyone to use it for something we might call literature.

Although there is a case to be made that the invention of writing marks the start of History, that is to suppose that History is merely record-keeping; however, it has a much wider sense, ‘the account that people give of who they are and where they came from’ and here it overlaps to a large extent with ‘Culture’, not in the narrow sense of ‘desirable attainments’ but the broader one of ‘the customs and traditions – the way of life – transmitted from one generation to the next.’

Look again at our diagram. The inference to be drawn from it is not that the people who lived in the time up to the blue line marked E had no sense of who they were or where they came from, but rather that they had a means of transmitting their Culture which had no need of Writing.

Which brings me at last to my somewhat controversial claim that the period up to the red line should be thought of as the Age of Integrated Expression, in contrast to what I have called the Age of Language.

My case is this: what we think of as ‘Language’ is not a continuum with its origin in the very beginnings of human time but actually a radical departure from that continuum, dating back some two and a half thousand years. The major obstacle to our seeing this is that ‘Language’ is, as it were, the lens through which we view the past: it colours how we think of it. (And small evidence of this is seen in the effect of the words ‘Prehistoric’, ‘Stone Age’ and ‘Civilisation’ noted above.)

What characterises ‘Language’ and marks it off from what went before is its narrowness of focus: it is concerned exclusively with its written and spoken form, which interact yet are to some extent opposite (a point examined here). Although Speech is far older and comes naturally to us, the dominant partner in this relationship is Writing, as can be seen from the great importance that we attach to formal grammar, standardised spelling and punctuation, all necessary adjuncts of writing (in fact, remedies for its inherent weaknesses) for which Speech has no need at all, though it now strives to conform to them – consider the notions of ‘Standard English’ and ‘Received Pronunciation’.

(these are points I have discussed elsewhere: here, here and here)

I would argue that the natural mode of human expression makes use indifferently of all the means we use to express ourselves – speech, certainly, but also facial expression, gesture, bodily posture, movement, rhythm, music, art, sculpture – a range that extends from our immediate selves out into our surroundings. I see no reason to suppose that Speech in particular was deemed any more important than the others: I think that is an illusion fostered by the disintegration which has taken place with the emergence of ‘Language’, which has seen Speech separated and simultaneously elevated in importance but subjugated to Writing, while the other modes are effectively conquered by division, being turned from natural human activities into areas of specialist skill: music, painting, sculpture, dance (and indeed literature).

This ‘Integrated Expression’, I would argue, is the natural vehicle of human culture, the means by which we transmitted our ideas of who we were and where we came from for tens and hundreds of thousands of years. If we were looking back at it through our ‘Language’ shaped lens, we would distinguish Dance and Ritual and Music and Storytelling and Poetry and Art, and doubtless see that they were associated with particular times of year (Solstices and Equinoxes, for instance) and certain places (painted caves, perhaps, or megalithic monuments – to say nothing of campfires). However, the key to grasping it is not separation and distinction but combination and likeness, synthesis rather than analysis – which is also the mechanism of metaphor, the key tool of this older way of thinking as reason and logic (both children of ‘Language’) are of the new.

And the hopeful conclusion is that, although the old way may have been superseded, it still goes on, albeit cloaked and disguised – indeed, a case might be made that all that is vigorous in our present culture stems from these ‘natural’ elements in their various guises – Music, Art, Poetry, Storytelling etc. – rather than from our present education system, chiefly designed as a means of transmitting literacy (and maintaining the ascendancy of the literate).

Literally Seismic


Pedantic old gurnard* that I am, I still experience a frisson of annoyance when people (journalists, mostly) say things like ‘the very epicentre of the fighting’ or ‘the epicentre of world trade’. That is because ‘epicentre’ has a precise meaning, which in these cases is ignored: it is properly used of earthquakes, to denote the point on the earth’s surface directly above the seismic event, which actually occurs deep down: that is what the prefix ‘epi’ denotes – it is from the Greek, and means ‘upon’ or ‘over’. We encounter it in epitaph, which literally means ‘above tomb’, hence a headstone or grave-monument, though it has come to mean the writing on such a stone, which might more properly be termed an epigraph, something written above or over, such as the inscription at the head of a longer text – a poem or a novel – or over the entrance to a building.

Irritation that a word is used imprecisely might be compared to the pain that some might feel on seeing (say) a vernier caliper (or even a good screwdriver) used to lever the lid off a tin of paint: here is something designed for a precise purpose – a precision tool, indeed – used in ignorance to perform a basic operation requiring only something crude and simple. Part of the pain in those cases is the fear that the instrument may be damaged – which can hardly be said of a word – but a considerable part is aesthetic: those who know what the proper function of the tool or word is wince to find it used so ignorantly.


These days, however, my irritation (a product of my upbringing and early education, which put great weight on the accurate use of words) is liable to be displaced by delight in this evidence that language is a natural thing that belongs to us all equally and to no-one in particular, despite the efforts of those (myself among them) who have claimed authority over it and seen fit to prescribe how it should be used.

If we look closely at what is happening with this use of ‘epicentre’ we find, once more, a species of metaphor. Metaphor is, in my view, the key mechanism in how language works and develops; it consists in transferring a word or term from its original context to a new one, in order to invite a comparison of the two, and from that spark a new meaning or sense in which the word can be used (and we should not overlook the playful element in this, a point I shall return to: there is a close affinity between how metaphors work and how jokes do)

Just as we will use ‘seismic’ to express the importance or impact of some event, likening it to an earthquake, so we borrow ‘epicentre’ to add a touch of the same flavour – this is not just any old centre, it’s the centre of something important, it’s where it all happens, the point from which it all flows outward. (It is worth pausing to consider the importance we attach to centres, as if there were something intrinsically important about them, but surely it is just a customary usage, rather than reflecting any real significance – why should the centre be more important than the periphery, or any other point?)

I have the impression that we meet this use of ‘epicentre’ more often in radio or TV journalism than in print, and that points to another factor that is often overlooked, namely how the sound of a word can affect its use and so its meaning. ‘Epicentre’, opening as it does with that plosive ‘p’ sound in the first syllable, is easier to pronounce emphatically than ‘centre’ with its soft sibilant opening – so it can be made to sound important as well as borrowing importance from its seismic origin.

We see a similar thing in that other bugbear of the self-proclaimed pedant, the use that is made of the word ‘literally’. Sports commentators in particular are apt to say things like ‘the ball literally exploded behind the keeper’ or ‘the home support literally raised the roof’ and this is apt to evoke superior scorn from those who think they know better: ‘O, did it really explode? was there much damage?’ or ‘they actually raised the roof? by how many feet?’


But look again: surely if this use of ‘literally’ is to be objected to, it is on the grounds of redundancy rather than anything else – what does ‘they literally raised the roof’ say that is not already expressed by ‘they raised the roof’? For all the pedants might wish to say, the addition of ‘literally’ does not turn ‘raised the roof’ or ‘the ball exploded’ from a metaphor to a piece of reporting; it merely serves to reinforce the metaphor – not very well, since both metaphors are rather stale, which is what the commentator senses in his desperate attempt to refresh them, when he might be better advised to drop them altogether.

The ‘shock value’ of a metaphor is its assertion that one thing is another, which evokes the response ‘how can that be?’; then we think about it, and light dawns – ‘I see it now’. Using ‘literally’ in this way is an attempt to renew the ‘shock value’ that ‘raised the roof’ and ‘the ball exploded’ (might have) had when first used.

Again, how it sounds plays a part here: it is a characteristic of English that the insertion of an adverb before a verb has a climactic effect, rather like a drum-roll which prepares us for the delivery of the telling word or phrase by warning of its imminent arrival. We see something similar with the insertion of an adverb such as ‘wholly’, ‘totally’ or ‘utterly’ before what is technically called a complement – thus ‘this is unacceptable’, ‘that is untrue’ or ‘this is deplorable’ become ‘this is wholly unacceptable’, ‘that is totally untrue’, ‘this is utterly deplorable’ – all expressions (try saying them aloud) where our vehemence can be concentrated in the adverb, which is almost spat out. Indeed, in these cases, the force of the utterance is largely transferred to the adverb, which conveys the speaker’s attitude before the final word is even spoken.

A final example is furnished by the unusual word ‘careen’, which is another technical term, though its use must now be rare – it means to tilt on one side, and is a nautical term – ‘to turn a vessel over on its side, especially for repairing or cleaning’ – its derivation is from the Latin carina, keel. I have lately heard it used (in one case by no less a person than Mr Stephen Fry, who occasionally makes wireless programmes about English usage) as a synonym for the verb ‘career’, meaning ‘to rush headlong’. The etymology of ‘career’ is from carriere, the old French word for a racecourse, derived ultimately from the Latin for wagon, carrus. (cp. curriculum vitae, which literally means ‘little chariot of life’)

The two words are unconnected, but their similarity in appearance and sound makes them easily associated and confused, and it is likely that the dominance of ‘career’ as a noun meaning profession or occupation has made people wary of using the same word as a verb with what seems a quite different meaning (our careers are meant to be carefully plotted and managed, not a blind rush forwards) – hence the adoption of ‘careen’ for that purpose.

So the next time you find yourself on the verge of apoplexy at some linguistic usage, calm down and take a step back: language belongs to everyone, and anyone can change it – and they will, whether you like it or not.

*a gurnard of course is a kind of fish, pictured at the top of the page, but its name (apparently derived from the French grogner, to grunt, which it does when caught) evokes to my ear ‘gurn hard’ suggesting tetchy complaint, which allied to its rather grumpy looks makes it seem a fitting term for a pedantic whinger, even though it has no etymological connection – how language evolves, in action.

The Muybridge Moment


The memorable Eadweard Muybridge invented a number of things, including his own name – he was born Edward Muggeridge in London in 1830. He literally got away with murder in 1872 when he travelled some seventy-five miles to shoot dead his wife’s lover (prefacing the act with ‘here’s the answer to the letter you sent my wife’) but was acquitted by the jury (against the judge’s direction) on the grounds of ‘justifiable homicide’. He is best known for the sequence of pictures of a galloping horse shot in 1878 at the behest of Leland Stanford, former Governor of California, to resolve the question of whether the horse ever has all four feet off the ground (it does, though not at the point people imagined). To capture the sequence, Muybridge used multiple cameras and devised a means of showing the results which he called a zoopraxiscope, thereby inventing stop-motion photography and the cinema projector, laying the foundations of the motion-picture industry.


(“The Horse in Motion-anim” by Eadweard Muybridge, Animation: Nevit Dilmen – Library of Congress Prints and Photographs Division; http://hdl.loc.gov/loc.pnp/cph.3a45870. Licensed under Public Domain via Commons – https://commons.wikimedia.org/wiki/File:The_Horse_in_Motion-anim.gif#/media/File:The_Horse_in_Motion-anim.gif)

Muybridge’s achievement was to take a movement that was too fast for the human eye to comprehend and freeze it so that each phase of motion could be analysed. It was something that he set out to do – as deliberately and methodically as he set out to shoot Major Harry Larkyns, his wife’s lover.

It is interesting to consider that something similar to Muybridge’s achievement happened a few thousand years ago, entirely by accident and over a longer span of time, but with consequences so far-reaching that they could be said to have shaped the modern world.

We do not know what prompted the invention of writing between five and six thousand years ago, but it was not a desire to transcribe speech and give it a permanent form; most likely it began, alongside numbering, as a means of listing things, such as the contents of storehouses – making records for tax purposes, perhaps, or of the ruler’s wealth – and from there it might have developed as a means of recording notable achievements in battle and setting down laws.

We can be confident that transcribing speech was not the primary aim because that is not something anyone would have felt the need to do. For us, that may take some effort of the imagination to realise, not least because we live in an age obsessed with making permanent records of almost anything and everything, perhaps because it is so easy to do so – it is a commonplace to observe that people now seem to go on holiday not to enjoy seeing new places at first hand, but in order to make a record of them that they can look at once they return home.

And long before that, we had sayings like

vox audita perit, littera scripta manet
(the voice heard is lost; the written word remains)

to serve as propaganda for the written word and emphasise how vital it is to write things down. One of the tricks of propaganda is to take your major weakness and brazenly pass it off as a strength (‘we care what you think’, ‘your opinion matters to us’, ‘we’re listening!’ as banks and politicians say), and that is certainly the case with this particular Latin tag: it is simply not true that the spoken word is lost – people have been remembering speech from time immemorial (think of traditional stories and songs passed from one generation to the next); it is reasonable to suppose that retaining speech is as natural to us as speaking.

If anything, writing was devised to record what was not memorable. Its potential beyond that was only slowly realised: it took around a thousand years for anyone to use it for something we might call ‘literature’. It is not till the classical Greek period – a mere two and a half millennia ago (Homo sapiens is reckoned at 200,000 years old, the genus Homo at 2.8 million) – that the ‘Muybridge moment’ arrives, with the realisation that writing allows us to ‘freeze’ speech just as his pictures ‘froze’ movement, and so, crucially, to analyse it.

When you consider all that stems from this, a considerable degree of ‘unthinking’ is required to imagine how things must have been before writing came along. I think the most notable thing would have been that speech was not seen as a separate element but rather as part of a spectrum of expression, nigh-inseparable from gesture and facial expression. A great many of the features of language which we think fundamental would have been unknown: spelling and punctuation – to which some people attach so much importance – belong exclusively to writing and would not have been thought of at all; even the idea of words as the basic units of language, the building blocks of sentences, is a notion that only arises once you can ‘freeze’ the flow of speech like Muybridge’s galloping horse and study each phase of its movement; before then, the ‘building blocks’ would have been complete utterances, a string of sounds that belonged together, rather like a phrase in music, and these would invariably have been integrated not only with gestures and facial expressions but with some wider activity of which they formed part (and possibly not the most significant part).

As for grammar, the rules by which language operates and to which huge importance is attached by some, it is likely that no-one had the least idea of it; after all, speech is even now something we learn (and teach) by instinct, though that process is heavily influenced and overlaid by all the ideas that stem from the invention of writing; but then we have only been able to analyse language in that way for a couple of thousand years; we have been expressing ourselves in a range of ways, including speech, since the dawn of humanity.

When I learned grammar in primary school – some fifty years ago – we did it by parsing and analysis. Parsing was taking a sentence and identifying the ‘parts of speech’ of which it was composed – not just words, but types or categories of word, defined by their function: Noun, Verb, Adjective, Adverb, Pronoun, Preposition, Conjunction, Article.

Analysis established the grammatical relations within the sentence, in terms of the Subject and Predicate. The Subject, confusingly, was not what the sentence was about – which puzzled me at first – but rather ‘the person or thing that performs the action described by the verb’ (though we used the rough-and-ready method of asking ‘who or what before the verb?’). The Predicate was the remainder of the sentence, what was said about (‘predicated of’) the Subject, and could generally be divided into Verb and Object (‘who or what after the verb’ was the rough-and-ready method for finding that).
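The rough-and-ready method is so mechanical that it can even be mechanised – which is rather the point. A toy sketch (the verb list and example sentence are my own; real grammar is, of course, far messier):

```python
# The schoolroom 'who or what before the verb?' method, mechanised.
VERBS = {"chased", "bit", "saw", "raised"}

def analyse(sentence):
    """Split a simple sentence into Subject, Verb and Object."""
    words = sentence.rstrip(".").split()
    for i, word in enumerate(words):
        if word in VERBS:
            return {
                "Subject": " ".join(words[:i]),    # who or what before the verb
                "Verb": word,                      # with Object, the Predicate
                "Object": " ".join(words[i + 1:]), # who or what after the verb
            }
    raise ValueError("no verb found: the rough-and-ready method breaks down")

print(analyse("The old dog chased the young pretenders."))
# {'Subject': 'The old dog', 'Verb': 'chased',
#  'Object': 'the young pretenders'}
```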

It was not till I went to university that I realised that these terms – in particular, Subject and Predicate – derived from mediaeval Logic, which in turn traced its origin back to Aristotle (whom Dante called maestro di color che sanno – master of those that know) in the days of Classical Greece.


Aristotle is the third of the trio of great teachers who were pupils of their predecessors: he was a student of Plato, who was a student of Socrates. It is fitting that Aristotle’s most notable pupil was not a philosopher but a King: Alexander the Great, who had conquered much of the known world and created an empire that stretched from Macedonia to India by the time he was 30.

That transition (in a period of no more than 150 years) from Socrates to the conquest of the world, neatly embodies the impact of Classical Greek thought, which I would argue stems from the ‘Muybridge Moment’ when people began to realise the full potential of the idea of writing down speech. Socrates, notably, wrote nothing: his method was to hang around the market place and engage in conversation with whoever would listen; we know him largely through the writings of Plato, who uses him as a mouthpiece for his own ideas. Aristotle wrote a great deal, and what he wrote conquered the subsequent world of thought to an extent and for a length of time that puts Alexander in eclipse.

In the Middle Ages – a millennium and a half after his death – he was known simply as ‘The Philosopher’ and quoting his opinion sufficed to close any argument. Although the Renaissance was to a large extent a rejection of Aristotelian teaching as it had developed (and ossified) in the teachings of the Schoolmen, the ideas of Aristotle remain profoundly influential, and not just in the way I was taught grammar as a boy – the whole notion of taxonomy, classification by similarity and difference, genus and species – we owe to Aristotle, to say nothing of Logic itself, from which not only my grammar lessons but rational thought were derived.

I would argue strongly that the foundations of modern thought – generalisation, taxonomy, logic, reason itself – are all products of that ‘Muybridge Moment’ and are only made possible by the ability to ‘freeze’ language, then analyse it, that writing makes possible.

It is only when you begin to think of language as composed of individual words (itself a process of abstraction) and how those words relate to the world and to each other, that these foundations are laid. Though Aristotle makes most use of it, the discovery of the power of generalisation should really be credited to his teacher, Plato: for what else are Plato’s Ideas or Forms but general ideas, and indeed (though Plato did not see this) those ideas as embodied in words? Thus, the Platonic idea or form of ‘table’ is the word ‘table’ – effectively indestructible and eternal, since it is immaterial, apprehended by the intellect rather than the senses, standing indifferently for any or all particular instances of a table – it fulfils all the criteria*.

Which brings us to Socrates: what was his contribution? He taught Plato, of course; but I think there is also a neat symbolism in his famous response to being told that the Oracle at Delphi had declared him ‘the wisest man in Greece’ – ‘my only wisdom is that while these others (the Sophists) claim to know something, I know that I know nothing.’ As the herald of Plato and Aristotle, Socrates establishes the baseline, clears the ground, as it were: at this point, no-one knows anything; but the construction of the great edifice of modern knowledge in which we still live today was just about to begin.

However, what interests me most of all is what constituted ‘thinking’ before the ‘Muybridge Moment’, before the advent of writing – not least because, whatever it was, we had been doing it for very much longer than the mere two and a half millennia that we have made use of generalisation, taxonomy, logic and reason as the basis of our thought.

How did we manage without them? And might we learn something useful from that?

I think so.

*seeing that ideas are actually words also solves the problem Hume had concerning general ideas: if ideas are derived from impressions, then is the general idea ‘triangle’ isosceles, equilateral, or scalene, or some impossible combination of them all? – no, it is just the word ‘triangle’. Hume’s mistake was in supposing that an Idea was a ‘(faint) copy’ of an impression; actually, it stands for it, but does not resemble it.

The Mechanism of Meaning (it’s all in the mind)

Meaning matters. It is bound up with so many things: understanding and misunderstanding, doubt and certainty, to say nothing of philosophy, poetry, music and art; so it is worth considering the mechanism by which it operates. ‘Mechanism’ is a useful image here: when mechanisms are hidden – as they generally are – their effects can seem mysterious, even magical (as in the marvels of the watchmaker or the stage magician); yet when they are revealed, they offer reassurance: the point of a mechanism is that, unless it is impaired or interfered with, it will go on working in the same way.

The problem with the mechanism of meaning is that the popular notion of it is misleading: we speak of meaning as something conveyed, like a passenger in a car, or transmitted, like a radio message; we also speak of it as being embodied or contained in things that have it, whether they are sentences, poems, works of art or the like. These two usages combine to suggest that meaning exists independently in some form, and that the business of ‘meaning’ and ‘understanding’ consists of inserting it into and extracting it from whatever is said to have it. That seems like common sense, but as we shall see, when scrutinised it proves problematic.

Wittgenstein points us in another direction with his observation that ‘the meaning of a word is its use in the language’, made alongside the terms ‘language game’ and ‘form of life’, which he uses when discussing meaning to denote the (wider) activity of which language forms a part and from which it derives its meaning.

It strikes me that the basic mechanism of meaning lies in connection: meaning is only found where a connection is made, and that connection is made in the mind of an observer, the one who ascribes meaning. In other words, meaning is not a fixed property of things: a thing in itself, on its own, does not have meaning. But we must be careful here: this is a stick that some will readily grasp the wrong end of – to suggest that a tree or a person (say) ‘has no meaning’ is liable to provoke outrage and earnest outpourings about the inestimable value of trees and people. That is because ‘meaningless’ is a pejorative term, properly used in cases where we expect meaning but do not find it; it might be compared to our use of ‘flightless’, which we apply to certain birds that are exceptions to the general rule; we would not apply it to pigs or gorillas.

We can gain some insight into how meaning works – its mechanism – by considering an allied concept, purpose. Let us suppose, at some remote future time, an interplanetary traveller of a species quite different from our own. Somewhere on his travels he comes upon this relic of a long-lost civilisation: a rectangular case constructed of semi-rigid, possibly organic material, which opens to disclose a cellular array – there are some twenty-four rectangular cells of the same organic material, each containing an identical object, rounded, hard and smooth to the touch. He is quite excited by the find as it reminds him of another he has come across – again a cellular array in a case of semi-rigid, possibly organic material, and again each cell containing a smooth, hard, rounded object, though there are differences of detail both in the shape of the cells and the objects. He may submit a learned paper to the Intergalactic Open University speculating on the purpose of these strikingly similar discoveries; he is in no doubt that they are variants of the same thing, and share a common purpose, on account of the numerous points of resemblance.

Were we at his side we might smile, since one is a packet of lightbulbs and the other a box of eggs; it is likely that the resemblances that strike him as the best clues to their purpose might elude us altogether, since we would dismiss them as irrelevant – ‘that is just how they happen to be packaged, for ease of transport or storage: it has no bearing on what they are for. As to the slight similarities of shape and texture, that is mere coincidence. These objects are entirely unrelated, and could not be more unlike.’

It is worth considering the key difference between us and the interplanetary traveller that allows us to smile at his ill-founded speculation. These are familiar objects to us, and we can connect them at once to a context or situation in which they belong, where they fit in and have purpose; our ‘reading’ of them is entirely different from the alien traveller’s – we disregard all that seems to him most striking, because we know it is of no significance. We see that the apparent similarity has nothing to do with the objects themselves, but the fact that they are both in storage, awaiting use; neither is ‘active’, i.e. in the situation or context where they are used and have purpose.

How far an examination of the objects in detail might allow our traveller to deduce, on the one hand, a national grid for distributing electricity from power stations to homes and workplaces rigged with lighting circuits, and, on the other, the delights of omelettes and fried, poached and scrambled eggs, depends on quite how alien he is – if he is a gaseous life-form sustained by starlight, he is unlikely to penetrate far into their mystery. On the other hand, if his own existence has ‘forms of life’ or activities similar to ours, he might make much better and even surprisingly accurate guesses.

That, after all, is how we ourselves proceed if we come across artefacts or objects that are unfamiliar: we guess at their purpose by thinking of the kind of thing they might be, the sort of use they might have, by analogy with our own activities or ‘forms of life’ (and it is no accident that truly mystifying objects are often tentatively described as having ‘possible religious or ritual significance’, since in our own experience this is where many things are found whose use could not easily be guessed; and in this connection consider the use made of everyday objects in burial rites – offerings of food put alongside the dead, or cooking or eating utensils for use on the onward journey).

I would suggest that, as far as the mechanism by which they operate goes, ‘purpose’ and ‘meaning’ are the same, since both are defined in the same way, viz. by placing the thing in question in relation to some context, situation or larger activity where it has a place, where it ‘makes sense’, if you like (imagine our alien traveller’s reaction to being shown the circumstances in which a lightbulb is used – the mystery of the object disappears once the connections are made – literally, in this case).

This brings out important aspects of meaning that are often overlooked, not least because – as I observed at the outset – they are contradicted by most popular accounts of what meaning is. The first aspect is that meaning is not inherent: no amount of studying or dissecting the object in isolation will discover it – I emphasise ‘in isolation’ because discovering, say, the filament in the light bulb and how it is connected to the fitting at the base will advance our understanding only if we can relate them to other things: if we have no notion of electricity, or that it will make a wire filament glow brightly, then they will tell us nothing.

The second aspect is slightly trickier to explain but of greater significance. If we agree that meaning is not inherent, not something that can be found simply by examining the object no matter how minutely, then we can reasonably ask where it is located. One answer, from what we have said, is that it lies in the relation or connection to the context, situation or ‘form of life’; but I think that is not quite right.

Rather, it consists in being related to, or being connected with – in other words, it exists as the result of an action by the onlooker, and where it exists – where it means – is in that onlooker’s mind. This is not the usual account that is given of meaning, which is generally more like this, from Wikipedia:

‘meaning is what the source or sender expresses, communicates, or conveys in their message to the observer or receiver, and what the receiver infers from the current context.’

At first sight, this might not seem significantly different – we have relation to context, we have a process of inference; the main addition appears to be that the source or sender is taken into account, as well as the receiver. However, there is one slight-seeming but important difference, which is the notion of the meaning as something which retains its identity throughout, and which exists prior to the communication taking place and survives after it – the model that springs readily to mind is the letter, which the sender puts in the envelope, which is then conveyed to the recipient who takes it out and reads it.

The analogy with the letter is probably what makes this seem a ‘common sense’ account that most people would agree with, but the logic of it gives rise to problems. If we picture the process of sending a letter, we might start with the sender at her desk, pen in hand, poised to write; she puts the message on the page, folds the page and seals it in the envelope then sends it off; at the other end the recipient takes it out, reads it, and ‘gets the message’. What is the difficulty there, you might ask?
It begins to emerge if you try to make the analogy consistent. At first glance, it seems that

meaning = letter
message (that which conveys the meaning) = envelope

But there is a problem here: the message is in the letter, rather than the envelope; in actual fact, the envelope is superfluous – the message could be sent without it, by hand, say, simply as a folded note. Still, that seems trivial – the sender puts her ideas into words, the receiver reads the words and gets her ideas: isn’t that just the same?

Not quite. The question is whether the meaning (or the message) exists before it is put into words; and if so, in what form? Again, this may seem unproblematic: of course the message exists before she writes it down; and indeed she might change her mind and, instead of writing, make a phone call and say what she means directly, as it were.

But we must be careful here: the question is not whether the message exists before she writes it down – or even before she speaks it – but before she puts it into words. This is where the image of the letter in the envelope is at its most misleading: isn’t the meaning just something we put into words, in the same way we put the letter in the envelope?

That is the notion Wittgenstein attacks in his ‘private language’ argument – the idea that my thoughts are in my head in some form to which I have privileged access, and which I could choose to give public form if I wished, thereby making them accessible to others. Though this again seems like common sense, when examined closely it is problematic. It forms the basis of popular notions of telepathy, but trying to imagine what such ‘direct transmission’ would actually consist of highlights the difficulty.

If you convey your thoughts to me, what do I experience? Do I hear your voice speaking in my head? If so, we are back to ‘putting things in words’ and no nearer any prior form our thoughts might take. The temptation is to fall back on images, as if these were somehow more immediate (a picture is worth a thousand words, after all), but what would they be images of? And how, having received them, would I be able to infer your thought from them? A more illuminating (but no less problematic) possibility is that I might hear your thoughts as a musical phrase which I intuitively understand.

This works to some extent because we are accustomed to the idea that music can consistently evoke definite feelings in us – ‘that passage always makes me feel this way, invariably calls this to mind’ – though we have no idea how: ‘it just does’; so that seems consistent with our finding in it something that someone else has put there; but it still leaves the question of what happens at the other end – how would such a musical message originate?

The options here would seem to be either that the message is originally in some other form which I then embody in the music – which takes us back to where we started: if it’s comprehensible to me in that form, why can’t I convey that directly instead of ‘translating’ it into music? – or else we have to accept that only in expressing it do I find what I am thinking; what we experience prior to that is an urge, a sort of pressure which can only be relieved by giving it expression in some way – whether it is an inarticulate cry of rage, a musical phrase (the terrifying opening of the Dies Irae in Verdi’s Requiem, for instance), an image (Munch’s The Scream, maybe) or the words ‘I am very angry about this!’

This brings us by a roundabout route to something I have been trying to articulate for a while – the key distinction between language as the instrument of thought and as one means of expressing experience; but that is a subject for another article. In the meantime, I would conclude by saying that, if this account of the mechanism of meaning is accurate, then it has some interesting implications. It suggests, for instance, that meaning (like beauty) is in the eye (or mind) of the beholder; that it is not fixed, but variable; that it is impermanent; and – perhaps most importantly – it is inseparable from its context, the ‘form of life’ or wider activity of which it forms a part and on which it depends.

No abiding city

Things take odd turns sometimes. After my Byzantine Epiphany I felt sure I was on the track of something, yet it proved elusive: after a lot of writing I felt I was still circling round it, unable to pin it down.

Then this morning I woke to the news that (with the General Election just over a week away) David Cameron was pledging, if re-elected, to pass a law that would prevent his government from raising the level of a range of taxes for the duration of the next parliament.

I have to say that this struck me at once as absurd, the notion of a government passing a law to prevent itself doing something: why go to all that trouble? why not just say, ‘we won’t do that’?

There’s the rub, of course – election promises are famously falser than dicers’ oaths; against that background, Mr Cameron feels the need to offer something stronger – no mere manifesto promise, but an actual law! – what could be a stronger guarantee than that?

There’s a paradox here, of course – because politicians’ promises are notoriously unreliable, Mr Cameron says he will pass a law to ensure that he will not go back on his word – and that’s a promise. The whole elaborate structure is built on the same uncertain foundation.

I am reminded of advice from a more reputable source, the Sermon on the Mount:

‘Again, you have heard how it was said to our ancestors, you must not break your oath…
But I say this to you, do not swear at all… all you need say is “Yes” if you mean yes, “No” if you mean no; anything more than this comes from the Evil One.’

You are no better than your word: if that is worth nothing, no amount of shoring-up will rectify the matter; and if it is good, what more do you need?

But there is something deeper here: the key, I think, to the very matter I had been trying to resolve.

Let us start with Mr Cameron’s utterance: it is perhaps best understood as a theatrical gesture. The actor on stage, conscious of the audience’s attention (and also of his distance from them, compared, say, to the huge close-up of the cinema screen) may feel the need to make a gesture which in everyday life would strike us as exaggerated and – well – theatrical. So Mr Cameron, in the feverish atmosphere of an election campaign, feels the need to outbid his opponents – ‘they say they’ll do something? well, I’ll pass a law that will make me do as I say!’

Even in context it sounds rather silly, but it would be even sillier outside it – so that is the first point, the importance of context to understanding.

The second is this business of making a law and the appearance it offers of transferring the responsibility from the person to something independent and objective – ‘don’t just take my word for it – it’ll be the law!’ It overlooks the fact that legislation is a convention that requires our consent to operate: the laws of the land are not like the laws of physics – they do not compel us in any way; we obey them through choice, not necessity.

(And of course the existence of a range of penalties and agencies of enforcement, like the police and the courts, is proof of this – you do not need any of that to make things obey the Law of Gravity; you only need threat and compulsion where there is the possibility that people might do otherwise)

These two things – the importance of context to meaning and the attempt to transfer responsibility from the person to something apparently objective and independent – chimed with what I had been struggling to express before.
I had been focusing on the effect that the introduction of writing has on language, and through that, on our whole way of seeing the world.

The gist of my argument was this: from time immemorial, we have had Speech, which is our version of something we observe throughout the animal kingdom – bird song, whale song, the noises of beasts. Then, relatively recently – between five and six thousand years ago – we invent something unique: Writing.

At first it is used for relatively low-grade, menial (indeed, prosaic) tasks, such as making lists and records; it is a good thousand years before anyone thinks to employ it for anything we might call ‘literature’. That should be no surprise: where Speech is natural and instinctive, the product of millions of years’ development, Writing is awkward and cumbersome, a skill (along with reading) that must be learned, and one not everyone can master.

Speech has all the advantages that go with sound: it has rhythm, rhyme, musicality, pattern; Writing has none of these. But it does have one thing: where Speech exists in time and is fleeting, ephemeral, Writing exists in space and has duration; it is objective; it exists in its own right, apart from any context or speaker.

My speech dies with me: when my voice is stilled, it is gone (though it may linger in the memory of others); but my written words will outlast not only me but a hundred generations – they could be around long after any trace or memory of their author is wholly erased.

Thus, from Speech we move to Language – by which I mean the complex thing that arises after Writing is invented. The important thing about Language is its dual nature, and the interaction and tension between its two forms, the written and the spoken. These are (as I discussed before) in many respects antithetical – where Speech is necessarily bound up with a speaker and so with a context – it is always part of some larger human activity – Writing stands on its own, apart from any context, independent of its author, with its own (apparently) objective existence.

(and the differences go deeper – where speech draws on a rich range of devices to overcome its ephemeral character and make itself memorable – rhyme, rhythm, vivid imagery etc – writing (though it can borrow all of them) has no need of any of these, having permanence; the problem it must overcome is lack of context – it cannot rely on what is going on round about to clarify its meaning; it must stand on its own two feet, and aim to be clear, concise, unambiguous, logical.)

What Mr Cameron’s absurd utterance brought home to me was the deceptive nature of Writing’s independence and objectivity, which is more apparent than real. Just as the law he holds out as having some objective, compelling force that is greater than his word is only so because we (as a society) agree to assign that power to it (in this connection, see my earlier post, ‘bounded by consent’) – and ultimately has no greater strength than the original word that promises it – so the objectivity and independence of the written word are not inherent properties but rather qualities we have conferred on it.

The independence and objectivity we assign to language is a kind of trick we play on ourselves, and it is bound up with the matter I discussed in my earlier posts (here, here and here) concerning the ‘carapace’ that we erect between ourselves and Reality – a carapace of ideas on which we confer the title ‘reality’ even though it is a construct of our own.

(It was interesting to realise that my philosophical hero Ludwig Wittgenstein had made this journey before me: in his early work, e.g. the Tractatus, he is much concerned with his ‘picture theory’ of language, in which a proposition is seen as picturing reality, by having its elements related to one another in a way that corresponds to how the elements of the reality it pictures are related:
‘2.12 A picture is a model of reality.
2.13 In a picture objects have the elements of the picture corresponding to them.
2.14 In a picture the elements of the picture are the representatives of objects.

2.1511 That is how a picture is attached to reality; it reaches right out to it.

2.223 In order to tell whether a picture is true or false we must compare it with reality.’

This model takes for granted the objective nature of language: it is the words, the proposition, that is true or false, and that is established by comparison with the world; we do not seem to play much part.

However, in his later work, Wittgenstein moves to a different position: he now speaks of ‘language games’ and ‘forms of life’; it is only as part of a language game or a form of life – i.e. some human activity – that words have meaning; and indeed, as a general rule, the meaning of a word is its use in the language. He emphatically rejects the idea of a ‘private language’ in which our thinking is done before being translated into words: all that is available to us is the unwieldy, untidy agglomeration that is Language, a public thing that everyone shares and shapes but no-one controls or commands – despite the best efforts of organisations such as l’Académie française)

As is typical of Wittgenstein, this modest-seeming manoeuvre effectively demolishes an edifice of thought that has stood for millennia: its implications are profounder than might at first appear.

If we go back to Plato and his fellow Greeks, we find a horror of mutability (‘change and decay in all around I see’, as the hymn has it) and a yearning for Truth to be something fixed and immutable – hence Plato’s world of Ideas, the unchanging reality that can be apprehended only by the intellect and lies beyond the veil of Appearance which so beguiles our poor, deluded senses.

Language – the complex thing that arises after the invention of the written form – is central to establishing this Platonic world, whose influence has lasted down to the present day, in particular its elevation of the intellect over the senses and its separation of Appearance and Reality.

The quality of Language on which all this hinges is the illusion it gives of being something that exists in its own right: words have meanings and can be used to describe the world; if only we tidied up language, rid it of its anomalies, used it more carefully and logically – freed it from the abusage of everyday speech – made it, in a word, more literate, truer to its written form – then we would be able to express the Truth accurately and without ambiguity, and permanently.

This is the edifice that Wittgenstein shows to be no more than a castle in the air: if meaning exists only in context, as part of some human activity, then all meaning is provisional; nothing is fixed (an idea I have discussed before). Language can never be tidied up and purified, cleansed of its faults, because language is ultimately derived from Speech, which is a living, dynamic thing, constantly changing with the forms of life of those who speak it, and the new ‘language games’ they invent.

The truth of what I have just said is by no means universally accepted; indeed, we have made some pretty determined attempts to contradict it: the first was the use of Latin as a scholarly language after it had ceased to be a living tongue (having transmuted, in the course of time, into the various Romance languages – Italian, French, Spanish, Portuguese, Romanian). Latin was the vehicle of academic discourse from the foundation of the first European universities in the eleventh century down to the time of Newton and beyond, a span of some five centuries; it remains the official language of the Roman Catholic church (although mass in the vernacular was introduced with the reforms of Vatican II in the early sixties, the Latin mass was not ‘banned’ as popularly supposed – only a specific form, the Tridentine rite, was discontinued; mass is still said in Latin to this day in various places).

It is no surprise to find that the Church – very much bound to the notion of an unchanging Truth – should be one of the last bastions of a purely literate language. In the academic and particularly the scientific world, the role formerly played by Latin has to a large extent been taken over by English, and ‘Academic English’ as a form is diverging from the living language, which in turn is diversifying (with the disappearance of the British Empire and the emergence of former colonies as countries in their own right) in much the same way as Latin transformed into various tongues after Rome fell.

I am sure that there are many today who will view my assertion that all meaning is necessarily provisional with the same horror with which the Greeks contemplated the mutability of things, but I think if you consider it steadily, you will see that it is both liberating and refreshing.

In my previous piece I began by talking about the perils of building in stone – namely, that what you make will outlive its capacity to be understood, because although it does not change, the people considering it do. I think this happens all the time with ideas, and especially the ‘big’ ideas, about ‘Life, the Universe and Everything’ – because they are important, we try to fix them for all time, but we overlook the fact that they are the product of a particular time, expressed in the language of that time, and that succeeding generations will see and understand things differently.

Of course the change of outlook and the decay of understanding is never sudden and can be delayed, and that is exactly what written texts do: they give a particular version of something an authority and a form that can last for generations, and which may block any development for a long time.

(That, broadly, is what happened with Scholasticism: the influx (via the Islamic world) of ancient Greek learning – chiefly Aristotle – into mediaeval Europe provided a huge intellectual stimulus initially, as great minds like Thomas Aquinas came to terms with it and assimilated it into the thinking of the day; but so comprehensive did it seem that there was no impulse to move beyond it, so that it began to ossify – the object of university study became to master Aristotle’s works, and the ‘argument from authority’ came into vogue – to settle any dispute it sufficed to quote what Aristotle (often called simply ‘The Philosopher’) said on the matter – there was no going beyond that. This situation lasted till the Renaissance shook things up once more.)

So am I, then, making a straightforward pitch for Relativism and denying the possibility of an Absolute Truth?

Not quite. Rather, this is an argument for ineffability, the idea that ‘Great Truths’ cannot be expressed in words. It is not so much that language is not equal to the job (but might be improved till it was), rather that the greatness of these ‘Great Truths’ (that label is of course inadequate) is such that it necessarily exceeds our ability to comprehend them, so limiting our capacity to express them; though poetry can get closer than prose:

‘Ah, but a man’s reach should exceed his grasp, or what’s a heaven for?’

and Art in general – music, painting, sculpture, dance, poetry – offers a more fruitful approach than philosophy – not to success, but a more rewarding kind of failure; or, as Mr Eliot so aptly expresses it,

‘but there is no competition—
There is only the fight to recover what has been lost
And found and lost again and again: and now, under conditions
That seem unpropitious. But perhaps neither gain nor loss.
For us, there is only the trying. The rest is not our business.’

For us, there is only the trying

One thing that being a writer brings home to you is the tentative nature of all writing: it is always an attempt to say something – one that can be more or less successful – and it is always a struggle. And the more difficult the matter, the greater the struggle, because we are conscious of how imperfect our expression is, how far short it falls of what we are trying to say. And what is it that we are trying to express? That is a form of every author’s favourite question, the one that is sure to be asked: ‘where do you get your ideas from?’

The best answer is a vague one: our ideas, our Art – by which I mean stories, music, poetry, painting, dance, whatever we use as modes of expression – are our response to being human, to finding ourselves here and wondering at it. Art arises from what I think of as an ‘internal pressure’: from time to time there is something ‘inside’ that we want ‘to get out there’ in the sense of giving it a public form that we and others can consider.

But we should not be misled into thinking that we have privileged or prior access to what we express; that is a version of the ‘private language’ picture that Wittgenstein argues against, in which we suppose that we know what we mean ‘in our heads’ and then translate it into words, as if it existed in two forms, a private internal one to which we alone have access, and a public form that we give it. What Wittgenstein contends is that there is only public language, an unruly body of material that we hold in common (and master only in part), which is the only available stuff we have for verbal expression; we have to make the best of it, hence the tentative nature of all utterance and the struggle it involves.

This notion of the struggle to express is a central theme of TS Eliot’s East Coker, the second of his Four Quartets.

Eliot speaks of ‘the intolerable wrestle with words and meanings’ and observes that
‘every attempt
Is a wholly new start, and a different kind of failure’
and that
‘each venture
Is a new beginning, a raid on the inarticulate
With shabby equipment always deteriorating’
Furthermore,
‘what there is to conquer
By strength and submission, has already been discovered
Once or twice, or several times, by men whom one cannot hope
To emulate’
and he concludes,
‘For us, there is only the trying. The rest is not our business.’
– which should, I think, be every writer’s (and artist’s) motto.

Eliot’s words connect in my mind with something I heard the estimable David Almond say recently on the radio: ‘Every time a story’s told, it’s for the first time; every time that Orpheus goes down into the Underworld, it’s the first time’. (Almond’s latest book, ‘A Song for Ella Grey’, is inspired by the Orpheus myth (the original title, I believe, was ‘Eurydice Grey’) and of course Orpheus’ descent to the underworld is a potent image of the artistic enterprise, a dangerous delving into the dark mine of the imagination – cf. the ‘Door into the Dark’ in Heaney’s poem ‘The Forge‘)

For me, this notion of the tentative nature of all writing and the perennial nature of storytelling combine to shed light on an area where there is much misunderstanding today: the idea of the sacred text.

To say that all writing is tentative is to assert that there are no privileged texts: none is exempt from this character of being a struggle to say something. So what of texts that are said to be ‘the word of God’ or to have been ‘dictated by angels’? Such expressions must be seen as part of that struggle: they are attempts to express the sacredness of the text, to convey its importance in the scheme of things. One way of putting this is to say that we do not call a text sacred because it is the word of God or was spoken by angels, we call it the word of God (or say it was spoken by angels) because we consider it sacred.

This is a point worth untangling because it can help dispel a great deal of misunderstanding and arid controversy in the matter of religion and belief.

To avoid controversy, let us take a remark that is variously attributed to the theologian Karl Barth and the musicologist and Mozart scholar Alfred Einstein (not to be confused with Albert): ‘In Heaven, when the angels play for God, they play Bach; when they play for themselves, it is Mozart.’

Now, we might imagine a would-be plain-speaking, blunt common-sense fellow in the style of the Today programme’s John Humphrys butting in at this point to demand, ‘And was this man ever in Heaven? Has he heard the angels playing for God? Was he there when they played for themselves?’ In saying this, he might fancy that he is demolishing the credibility of the statement, but a more reflective listener would incline to think he was missing the point.

For of course this is not a statement about heaven, the angels or God, and does not require a belief in those things for its understanding; it is a statement about the music of Bach and Mozart, and how they stand to one another and to all other music (it is saying that both are paramount, but that while Bach is the more glorious, Mozart is more joyous – or something like that – for of course that is just my own attempt, my own struggle to convey what is meant here). You cannot controvert it by saying ‘But there is no God! there is no Heaven! There is no such thing as angels!’ but you might challenge it by pressing the claims of some other composer, such as Arvo Pärt, Josquin des Prez or Hildegard of Bingen.

Sacredness is not an intrinsic quality of anything, be it object or text; rather it is a status we confer on it, a place we give it in a ‘form of life’. (‘Form of life’ is one of the terms that Wittgenstein uses in his discussion of meaning, in particular the meaning of words – the other is ‘language game’. A ‘form of life’ is the context or activity in which a word or expression is used, the place where it has meaning. Religious worship is one instance of a ‘form of life’ – the words and gestures of the Mass, for instance, have a meaning there which they would not have in other circumstances)

By way of illustration, imagine that some explorers come upon a curious stone deep in the forest. Subsequent examination shows it to be of extra-terrestrial origin, the remains of a meteorite. A great deal might be determined about its chemical composition and even its place of origin, but you could discover nothing that showed it to be sacred.

Then, some time later, the site where it was found is cleared and the remains of ancient buildings discovered. These resemble other buildings known to be associated with religious ceremonies and this is borne out by the discovery of wall-paintings and scrolls which depict an object much like the meteorite at the centre of a cult: it is carried in procession, elevated on a pillar, enclosed in a special building, has sacrifices offered to it and so on.

At this point you might feel confident in asserting that the meteorite was a sacred object, and indeed this could be corroborated by natives of the country, who produce a traditional tale that speaks of a time when the people were in great trouble and saw a brilliant light fall to earth from heaven and so discovered the sacred stone, which then became the object of veneration and the centre of a religious cult.

Some people might conclude that this offers a paradigm for our religious belief: that although we couch it in terms of the sacred and supernatural, it can be shown to have its origin in natural phenomena. ‘These primitive folk had no understanding of what a meteorite was and were profoundly impressed and frightened by it, so they thought it was a sign from God. Of course we know better now.’

But do we? I think conclusions of that sort are flawed and arise from a misplaced application of causality: ‘the spectacle of the meteorite and the awe it induces are the cause; their subsequent religious practice can be seen as the effect.’

To reason thus is to overlook the fact that the story does not start with the meteorite: it starts with the people’s being ‘in great trouble.’ Of course I have just invented that by way of illustration, but the point is valid: we can imagine that plenty of meteorites had shot across the skies before this, but this one came at an opportune time. In other words, it came into a story that was already going on; it was incorporated into a pre-existing ‘form of life’, to use Wittgenstein’s term: what made it a sign was the fact that the people were looking for one; they felt the need of it.

In other words, unlike the mammoths (say) which we can imagine grazing placidly, oblivious, as meteorites blaze across the sky, these people already had the habit of storytelling, of making things up to explain their situation to themselves. It is important to see that, fundamentally, they are in control: it is the people who choose to make the object sacred, to see it as a sign – they confer its status on it by incorporating it in a story. There is no necessity of the kind we normally look for in cause and effect, like the explosion that follows the lighting of a match in a gas-filled room; this is more an instance of what I have elsewhere called ‘elective causality’ where we choose to make something the ground or cause of our subsequent actions.

So am I saying that religion (of whatever kind) is ‘just a story we made up’?

Well, yes and no. When that assertion is made nowadays – as it often is – it is generally by people who mean to dismiss religion as something unnecessary, that has no place in modern society; something we have grown out of. And when that assertion is vehemently denied (as it also is), it is by people who insist on the central importance and continuing relevance of religious belief and practice. Yet in this particular argument both are mistaken, I think.

Let us start by dispensing with that word ‘just’: to say that something is ‘just a story’ or ‘just made up’ is to prejudge the issue; you are signalling from the outset that you consider stories and making things up to be trivial activities, unworthy of serious consideration. That is not the case.

The next thing to consider is whether by saying that something is a story or is made up we devalue it or detract from its credibility. I would say, emphatically, that we do not. Storytelling, and making things up generally – which I take to encompass everything we call Art – is an important human activity, perhaps the most important; and certainly the most characteristic.

Yet it is the case that the same terms we use for these praiseworthy and admirable activities – ‘telling stories’ ‘making things up’ and indeed the whole vocabulary of fabrication – are also used in a pejorative sense to mean ‘telling lies’, a confusing ambivalence I have remarked on before, here.

The fact that it is possible to make false allegations or give a false account of something – to represent the facts as being other than they are – should not mislead us into supposing that the paradigm for storytelling is the news report, the veracity of which is judged by measuring it against external circumstances – if its content corresponds to those circumstances, then it is true and accurate.

Far from being a paradigm, the news report is a special case, a relatively recent development in which the age-old techniques of storytelling – which are as old as humankind – are applied to the particular (and peculiarly modern) activity of news-gathering and journalism (which is why news-editors always want to know ‘what is the story?’ )

The majority of stories are not of this sort. Though the temptation is to suppose that they are stories ‘about something’ (or paintings and photographs ‘of something’) and so must be judged in relation to that ‘something’, they should in fact be judged on their own merits: it is what is in them that makes them good, not how they stand in relation to something else. (We find this easier to grasp in relation to music, which we do not expect to be ‘about something’: the form of stories and pictures misleads us into looking for correspondence with external circumstances).

‘Truth’, when we apply it to art, is something that we ‘get’ and we respond by drawing others’ attention to it: ‘read this, look at that, listen to this’, we say, because we expect them to ‘get it’ too; and when they do, they smile and nod in agreement. No words need be spoken; explanation is superfluous, and indeed largely impossible: if the person does not ‘get it’ then you will not persuade him by reason: the best you can do is ask him to look or listen or read again.

(And of course this ‘truth’ can be faked, too, as happens when someone copies what someone else does, usually for gain (though we can also copy in order to learn). In this case the story (or painting, or piece of music) is ‘unoriginal’ in a very precise sense: it does not originate, or have its source, in the person who created it: it is not the expression of what they think or feel; it did not result from the ‘internal pressure’ I spoke about above; the ‘struggle’ that we started out discussing is absent.

Of course we all copy, and quite legitimately, when we are learning – ‘playing the sedulous ape’, as R L Stevenson called it – but we hope to arrive at a point where our own voice emerges, and our work ceases to be purely derivative and has something of ourselves in it, bears our stamp, has its own character, not someone else’s.)

So when I say that religion is a story, something we have made up, I do not mean to demean or disparage it, but rather to say: this is how it works (and how we, as human beings, work); if you want to understand it better, you need to think about stories and storytelling, how they work, how they express meaning. Read the stories; don’t go looking for the remains of the Ark (or indeed of the True Cross). These are not ‘proof’ or ‘evidence’ any more than a photo of the baby Jesus in the manger would be evidence of the Incarnation. If you want to understand the Incarnation, you have to ask, ‘what on earth could someone mean by that, ‘God became Man’? What were they trying to say?’

The tentative nature of every utterance must always be the starting point: ‘this was written (or painted, or composed) by someone like me, another human being, so I should be able to arrive (though not without effort) at some understanding of what it was they were trying to express, what internal pressure caused this outpouring.’

That is why, as we grow older and our life experience – of both good and ill – becomes richer and more varied, we find ourselves understanding what eluded us before; why we can suddenly say ‘now I see it!’ with absolute conviction; it is also why some things that impressed us in our salad days, when we were green in judgement, no longer satisfy – we see through them; they no longer ring true. And the big, mysterious things – the ineffable – if we engage with them honestly (and don’t start by thinking we already know), then we will be drawn to what has been said and done by those who have engaged in the same struggle – and may find comfort there.

Force of Habit

‘Mind-forged manacles’, as well as being one of Blake’s most resonant phrases, shows how well (and succinctly) poetry (and art in general) can capture a complex idea that is difficult to express by standard reasoning.

At the heart of Blake’s phrase is a contradiction, something that is anathema to conventional reason: ‘forging’ is the working of metal by force and, generally, heat; ‘manacles’ are metal shackles used for physical restraint; yet ‘mind’ is immaterial; mental, not physical.

It is precisely in that contradiction that the power of Blake’s metaphor resides: he wants to emphasise the simultaneous strength and weakness of convention, man-made rules, which can bind us as strongly as steel shackles yet are self-imposed and entirely insubstantial – they are discarded not by physical strength but an effort of will, through recognising them for what they are (though that recognition is not enough in itself: it takes a conscious effort of will to break conventions).

(It is important to understand that by ‘convention’ I mean rather more than the trivial, like dressing or speaking in a particular way; I mean the whole vast hinterland of ‘agreed ways of thinking about things’)

The power of the mind is, I think, generally underestimated and misunderstood, largely because we equate ‘power’ with physical force – so that proof of ‘mental power’ would be something like telekinesis, moving objects at a distance simply by thinking about it. That in turn stems from a narrow view of the world itself, which supposes it to consist only of what is physical: that is the ‘real world’ in which we are so often told we must live – yet the reality is quite the opposite.

Our world consists, to a very great degree, of mental constructs – it is, in other words, mind-forged. Our way of perceiving the world is an ingrained habit of thought more than anything: it is not simply a matter of opening our senses and letting the outside world flood in; our interaction with our surroundings is a continuous act of interpretation, along lines that have largely become instinctive; but as various ingenious experiments show, our minds can be deceived.

For instance, if we watch a mouth making a ‘b’ sound (technically termed a bilabial plosive: the lips are pressed together then blown apart) but then see instead the mouth making a ‘v’ sound (a labio-dental fricative, where the lower lip is first caught behind the top teeth), we will hear a ‘v’ sound, even though the actual sound remains a ‘b’; and if the image switches back to the appropriate lip formation, we will hear the sound as a ‘b’ again; this will happen invariably – as long as we attend to the visual cue, it will override and alter the information our ears give us. This is called the McGurk effect – you can try it yourself here.

This also shows how important faces are to us, and how minutely we examine them for information – so it is no surprise that we have the knack of seeing them in chance arrangements (pareidolia is the term, I believe) and also that we interpret things such as cars (with headlights like eyes and radiator-grilles like mouths) as having ‘faces’ (some amusing instances here). We can play with this tendency too: if we take the inverse mould of a face (such as the inside of a mask) we tend to see it as a positive face, which leads to weird effects if it is rotated – we continue to interpret the image as positive, and this causes us to see the rotation happening in the opposite direction to the actual movement (illustrated here).

This is something that has long troubled philosophers – in essence, it is the same thing as the bent stick in water that exercised Plato: what we see (that the stick appears bent where it enters the water) is contradicted by what we know (that the stick is actually straight). This led Plato – and all who followed him – down the path of distrusting the senses, and maintaining a distinction between Appearance (deemed to be deceptive) and Reality (capable of being apprehended only by the intellect). Having spent much of my adult life in thrall to Plato, I now consider that a wrong direction, as I discussed elsewhere.

To be fair, my repudiation of Plato is as much about a change in my own temperament as anything else: when we are young, we are eager that mysteries should be solved; where there is doubt, we yearn for certainty. Now, in age, I find the mystery itself exciting and engagement with it more satisfying than any solution; the one thing sure about certainty is that it is not to be relied on – the best it can offer is a temporary reassurance which allows us to turn to other things. But the real pleasure lies in engagement: as Eliot puts it,

‘For us, there is only the trying. The rest is not our business.’

I used to find it distressing that, shortly before his death, Thomas Aquinas – one of the most formidable intellects of his (or any other) age, a man who thought hard about God all his life and wrote deeply on the matter – had a vision that prompted him to say “All that I have written appears to be as so much straw after the things that have been revealed to me.” Now, I find that reassuring, and it reminds me of Wittgenstein, at the end of the Tractatus:

‘6.54 My propositions are elucidatory in this way: he who understands
me finally recognizes them as senseless, when he has climbed out
through them, on them, over them. (He must so to speak throw
away the ladder, after he has climbed up on it.)
He must surmount these propositions; then he sees the world
rightly.

7 Whereof one cannot speak, thereof one must be silent.’

Wittgenstein got that right, I think: language is not the best medium for engaging with the mystery, at least not the language of philosophy and rational discourse; poetry will get you closer – like the Blake quotation we started with, or Eliot’s Four Quartets – though my own instinct is that music or art are better tools of expression; but ultimately, perhaps, it is silent contemplation that will bring us closest to understanding.

‘Is there light in Gorias?’ – reflections on metaphor and truth

‘Metaphor: a figure of speech by which a thing is spoken of as being that which it only resembles, as when a ferocious person is called a tiger‘

Chambers Dictionary

Saddling metaphor with a definition like that (which is typical, even down to the threadbare example) is akin to giving it a criminal record – wherever it goes, it will never be trusted: what this definition says is that metaphor is essentially dishonest, at best an exaggeration, at worst a downright lie.

The crucial fault of this definition is that it prejudges the issue: the writer has already decided that a ferocious person is not and cannot be a tiger – he wants to insist on a world where tigers and people are separate and distinct, where that distinction between one thing and another is crucial, a matter of logic: A = A; B = B; therefore A does not and cannot ever equal B.

What metaphor points to is a world where tigers and people can overlap and merge, a world where resemblance and connection is more important than distinction and separation, where A and B can be the same. In other words – and this is a point of fundamental importance – metaphor indicates that logic does not furnish a complete or adequate description of the world.

(This might be likened to people who live in a city and have no understanding of country ways, or people who mistake their own land and culture for the entire world and think that ‘we don’t do that’ means ‘that isn’t done.’)

What is needed is a new definition, one that is not intrinsically hostile – I would suggest

‘Metaphor: a linguistic device which invites us to consider one thing in terms of another, to clarify or deepen our understanding; one of the key instruments of thought.’
(and for ‘simile’ I would simply say ‘a variety of metaphor’ because there is no importance in the difference between them)

What this definition makes clear is that metaphor is not only an honest enterprise, but an aid and a benefit to thought, something that improves our understanding; but it does more than that – by a slight shift in perspective, this definition does away with a world of mischief.

It prevents the grave error of supposing that the terms ‘symbolic’ and ‘metaphorical’ are opposed to ‘literal’, to their detriment – in other words, that only what is literal is true, and anything else is not – it is mere symbol, just metaphor. My definition does away with the fear that by describing something as ‘metaphorical’ or ‘symbolic’ we are denying that it is true – as such, it should be of great service to theologians.

There is such a thing as literal truth, but like logic, it deals with only one aspect of the world, and quite a small part of it. To have literal truth you must first have letters. By that, I mean that you must have the notion of language existing independently of speech. Speech is particular: the words spoken are mine, yours, someone’s. It is only when we make the great leap of giving speech permanent form through letters that the notion of language as something independent of individual speakers arises, and from that, the concept of literal truth.

Literal truth is not a property of the world, but of words – and strictly speaking, of written words, though in any literate society the spoken language is informed and mediated by its written form. Thus a written account, or a spoken account that can be transcribed (consider why we use court reporters to transcribe all that is said in a court of law) can be literally true, if there is a correspondence between what it says and what happened. If there is a disparity between the account and the event, then the account is judged to be false or untrue.

It should be clear from this that only a limited range of things can be true in this way – descriptions of events that set out to give an accurate account of what happened, such as we might find from a witness in a courtroom, or a reporter at the scene, or a description of an experiment in chemistry. (Other accounts – such as the report of a football match – may consist of a mixture of material, only some of which can be literally true – the score, the time of the goals, the names of the scorers, the teams – while the rest is judgement and opinion. One man’s gripping contest might be another man’s dour tussle; the fact that one team enjoyed seventy percent possession does not contradict the assertion that the other team dominated the game and played the better football – the fact about percentage possession may be literally true, but that the other team dominated the game is a matter of judgement)

Where such descriptions are untrue, some might be false while others might be inaccurate or mistaken – falsehood implies deliberate intent on the part of the reporter, who knows the true state of affairs but chooses to give an inaccurate account for some reason; on the other hand, a careless, unobservant or inexperienced reporter might simply be inaccurate – he failed to see all that was happening, or misinterpreted it; there was no intention to deceive.

‘Facts are chiels that winna ding an downa be disputit’ as Burns wrote; but the point I want to make here is that the realm of the factual is only a small part of our experience – opinion, thought and feeling cover a great deal more, and in that realm ‘truth’ has a different meaning that should not be confused with the correspondence between words and facts that is the definition of ‘literal truth’. There we speak of things ‘ringing true’ and apprehend truth as a quality found in a painting, a story, a piece of music, a poem; and having found it, we do not persuade others of its truth by argument, we simply point and ask ‘do you get it?’.

Verification is fine as far as it goes, but it does not go very far: there is no process for verifying the truth of King Lear or a Beethoven quartet. These are things that are understood in a different way; ‘truth’ means something else here. That is what makes disputes between science and religion so arid and pointless.

(in writing that, and casting about for a suitable analogy, I was reminded of Alan Garner’s story ‘Elidor’, which supplies the title to this piece – in it, the sacred objects that the children bring back with them – cauldron, sacred stone and spear – assume the mundane appearance of a broken teacup, a bit of rock and an iron railing. Concepts of great importance in one realm lose their significance when transported to the other)