‘These great concurrences of things’


One of the main ideas I pursue here is that the invention of writing has radically altered the way we think – not immediately, but eventually – through its impact on speech, which it transforms from one mode of expression among many into our main instrument of thought, which we call Language. In Language the spoken form is dominated by the written, and meaning is no longer seen as embedded in human activity but rather as a property of words, which appear to have an independent, objective existence. (This notion is examined in the form of a fable here.)

This means in effect that the Modern world begins in Classical Greece, about two and a half thousand years ago, and is built on foundations laid by Socrates, Plato and Aristotle; though much that we think of as marking modernity is a lot more recent (some would choose the Industrial Revolution, some the Enlightenment, some the Renaissance) the precondition for all of these – the way of seeing ourselves in the world which they imply – is, I would argue, the change in our thinking outlined above.

This naturally gives rise to the question of how we thought before, which is not a matter of merely historical interest, since we are not talking here about one way of thinking replacing another, but rather a new mode displacing and dominating the existing one, which nevertheless continues alongside, albeit in a low estate, a situation closely analogous to an independent nation that is invaded and colonised by an imperial power.

What interests me particularly is that this ancient mode of thought, being ancient – indeed, primeval – is instinctive and ‘natural’ in the way that speech is (and Language, as defined above, is not). Unlike modern ‘intellectual’ thought, which marks us off from the rest of the animal kingdom (something on which we have always rather plumed ourselves, perhaps mistakenly, as I suggested recently) this instinctive mode occupies much the same ground, and reminds us that what we achieve by great ingenuity and contrivance (remarkable feats of construction, heroic feats of navigation over great distances, to name but two) is done naturally and instinctively by ants, bees, wasps, spiders, swifts, salmon, whales and many others, as a matter of course.

So how does this supposed ‘ancient mode’ of thought work? I am pretty sure that metaphor is at the heart of it. Metaphor consists in seeing one thing in terms of another, or, if you like, in seeing something in the world as expressing or embodying your thought; as such, it is the basic mechanism of most of what we term Art: poetry, storytelling, painting, sculpture, dance, music, all have this transformative quality in which different things are united and seen as aspects of one another, or one is seen as the expression of the other – they become effectively interchangeable.

(a key difference between metaphorical thinking and analytic thinking – our modern mode – is that it unites and identifies where the other separates and makes distinctions – which is why metaphor always appears illogical or paradoxical when described analytically: ‘seeing the similarity in dissimilars’ as Aristotle puts it, or ‘saying that one thing is another’)

This long preamble was prompted by an odd insight I gained the other day when, by a curious concatenation of circumstances, I found myself rereading, for the first time in many years, John Buchan’s The Island of Sheep.

Now Buchan is easy to mock – the values and attitudes of many of his characters are very much ‘of their time’ and may strike us as preposterous, if not worse – but he knows how to spin a yarn, and there are few writers better at evoking the feelings aroused by nature and landscape at various times and seasons. He was also widely and deeply read, a classical scholar, and his popular fiction (which never pretended to be more than entertainment and generally succeeded) has a depth and subtlety not found in his contemporaries.

What struck me in The Island of Sheep were two incidents, both involving the younger Haraldsen. Haraldsen is a Dane from the ‘Norlands’ – Buchan’s name for the Faeroes. He is a gentle, scholarly recluse who has been raised by his father – a world-bestriding colossus of a man, a great adventurer – to play some leading part in an envisaged great revival of the ‘Northern Race’, a role for which he is entirely unfitted. He inherits from his father an immense fortune, in which he is not interested, and a vendetta or blood-feud which brings him into conflict with some ruthless and unscrupulous men.

Early in the book, before we know who he is, he encounters Richard Hannay and his son Peter John (another pair of opposites). They are out wildfowling and Peter John flies his falcon at an incoming skein of geese; it separates a goose from the flight and pursues it in a thrilling high-speed chase, but the goose escapes by flying low and eventually gaining the safety of a wood. ‘Smith’ (as Haraldsen is then known) is moved to tears, and exclaims:
‘It is safe because it was humble. It flew near the ground. It was humble and lowly, as I am. It is a message from Heaven.’
He sees this as an endorsement of the course he has chosen to evade his enemies, by lying low and disguising himself.

Later, however, he takes refuge on Lord Clanroyden’s estate, along with Richard Hannay and his friends, who in their youth in Africa, when they were in a tight spot, had sworn an oath to old Haraldsen to look after his son. They attend a shepherd’s wedding, and after the festivities there is a great set-to among the various sheepdogs, with the young pretenders ganging up to overthrow the old top-dog, Yarrow, who rather lords it over them. The old dog fights his corner manfully but is hopelessly outnumbered; then, just as all seems lost, he turns from defence to attack and sallies out against his opponents with great suddenness and ferocity, scattering them and winning the day.


Again, Haraldsen is deeply moved:

‘It is a message to me,’ he croaked. ‘That dog is like Samr, who died with Gunnar of Lithend. He reminds me of what I had forgotten.’

He abandons his scheme of running and hiding and resolves to return to his home, the eponymous Island of Sheep, and face down his enemies, thus setting up the climax of the book (it’s not giving too much away to reveal that good triumphs in the end, though of course it’s ‘a dam’ close-run thing’).

Both these incidents have for me an authentic ring: I can well believe that just such ‘seeing as’ played a key role in the way our ancestors thought about the world and their place in it.

It is, of course, just the kind of thing that modern thinking labels ‘mere superstition’ but I think it should not be dismissed so lightly.

The modern objection might be phrased like this: ‘the primitive mind posits a ruling intelligence, an invisible force that controls the world and communicates through signs – bolts of lightning, volcanic eruptions, comets and other lesser but in some way striking events. The coincidence of some unusual or striking occurrence in nature with a human crisis is seen as a comment on it, and may be viewed (if preceded by imploration) as the answer to prayer. We know better: these are natural events with no connection to human action beyond fortuitous coincidence.’

The way I have chosen to phrase this illustrates a classic problem that arises when modern thinking seeks to give an account of ancient or traditional thinking – ‘primitive’ thinking, if you like, since I see nothing pejorative in being first and original. The notion of cause and effect is key to any modern explanation, so we often find that ‘primitive’ thinking is characterised by erroneous notions of causality – basically, a causal connection is supposed where there is none.

For instance, in a talk I heard by the philosopher John Haldane, he cited a particular behaviour known as ‘tree binding’ in which trees were wounded and bound as a way of treating human wounds – a form of what is called ‘sympathetic magic’, where another object acts as a surrogate for the person or thing we wish to affect (or, to be more precise, ‘wish to be affected’). An account of such behaviour in causal terms will always show it to be mistaken and apparently foolish – typical ‘primitive superstition’: ‘They suppose a causal connection between binding the tree’s wound and binding the man’s, and that by healing the one, they will somehow heal the other (which we know cannot work).’

But I would suggest that the tree-binding is not a mistaken scientific process, based on inadequate knowledge – it is not a scientific process at all, and it is an error to describe it in those terms. It is, I would suggest, much more akin to both prayer and poetry. The ritual element – the play-acting – is of central importance.

The tree-binders, I would suggest, are well aware of their ignorance in matters of medicine: they do not know how to heal wounds, but they know that wounds do heal; and they consider that the same power (call it what you will) that heals the wound in a tree also heals the wound in a man’s body. They fear that the man may die but hope that he will live, and they know that only time will reveal the outcome.

Wounding then binding the tree seems to me a ritual akin to prayer rather than a misguided attempt at medicine. First and foremost, it is an expression of hope, like the words of reassurance we utter in such cases – ‘I’m sure he’ll get better’. The tree’s wound will heal (observation tells them this) – so, too, might the man’s.

But the real power of the ritual, for me, lies in its flexibility, its openness to interpretation. It is a very pragmatic approach, one that can be tailored to suit any outcome. If the man lives, well and good; that is what everyone hoped would happen. Should the man die, the tree (now identified with him in some sense) remains (with its scar, which does heal). The tree helps reconcile them to the man’s death by showing it in a new perspective: though all they have now is his corpse, the tree is a reminder that this man was more than he seems now: he had a life, spread over time. Also, the continued survival of the tree suggests that in some sense the man, too, or something of him that they cannot see (the life or soul which the tree embodies) may survive the death of his body. The tree can also be seen as saying something about the man’s family (we have the same image ourselves in ‘family tree’, though buried some layers deeper) and how it survives without him, scarred but continuing; and by extension, the same applies to the tribe, which will continue to flourish as the tree does, despite the loss of an individual member.

And the tree ‘says’ all these things because we give it tongue – we make it tell a story, or rather we weave it into one that is ongoing (there are some parallels here to the notion of ‘Elective Causality’ that I discuss elsewhere). As I have argued elsewhere [‘For us, there is only the trying’] we can only find a sign, or see something as a sign, if we are already looking for one and already think in those terms. Haraldsen, in The Island of Sheep, is troubled about whether he has chosen the right course, and finds justification for it in the stirring sight of the goose evading the falcon; later, still troubled about the rightness of his course, he opts to change it, stirred by the sight of the dog Yarrow turning the tables on his opponents.

His being stirred, I think, is actually the key here. It would be an error to suppose that he is stirred because he sees the goose’s flight and the dog’s bold sally as ‘messages from heaven’; the reverse is actually the case – he calls these ‘messages from heaven’ to express the way in which they stir him. There is a moment when he identifies, first with the fleeing goose, then with the bold dog. What unites him with them in each case is what he feels. But this is not cause and effect, which is always a sequence; rather, this is parallel or simultaneous – the inner feeling and the outward action are counterparts, aspects of the same thing. A much closer analogy is resonance, where a plucked string or a struck bell sets up sympathetic vibration in another.

This is why I prefer Vita Sackville-West’s definition of metaphor to Aristotle’s: for him, metaphor is the ability to see the similarity in dissimilar things; for her (the quote is from her book on Marvell):

‘The metaphysical poets were intoxicated—if one may apply so excitable a word to writers so severely and deliberately intellectual—by the potentialities of metaphor. They saw in it an opportunity for expressing their intimations of the unknown and the dimly suspected Absolute in terms of the known concrete, whether those intimations related to philosophic, mystical, or intellectual experience, to religion, or to love. They were ‘struck with these great concurrences of things’’

A subject to which I shall return.

It’s not what you think

What do gorillas think about? Or hens?

‘A hen stares at nothing with one eye, then picks it up.’


(in looking up MacCaig’s line (from ‘Summer Farm’) just now I came across two curious comments on it:

‘Could refer to a weathervane as an inanimate hen only has one eye. “Nothing” refers to the wind and the weathervane is picking it up.
The one eye can also refer to one perspective.’

Hmm. Or it could be a beautifully observed and exact description of a hen, in characteristic action. Sometimes the surface is what matters)

This thought came to me when I was reflecting on something that happened yesterday. I was walking up Earl’s Dykes, a curiously-named side street in Perth, pondering the possible meanings and implications of two utterances I meant to write an article about; and it struck me that probably no other species on earth engaged in such speculations.

What do gorillas think about, if anything?


I have loved gorillas a long time – since my brother and I were small boys playing with plastic Britains models of them in our old plum tree – and my kind sister gave us a book by George Schaller, The Year of the Gorilla, about his time spent in the Virunga volcanoes observing Mountain Gorillas. It was published in 1964, so I suppose it must have been fifty years or so since that happened. Schaller was one of the first to counter the popular fictional image of the gorilla as a savage and dangerous monster with actual observation that it was gentle, shy, vegetarian and family-oriented, so his book is of great importance in establishing what has now become the mainstream opinion of these beautiful but sadly threatened creatures.

So I do not mean to be churlish in recalling a passage that has stuck with me, and I hope I am not being unfair in recollecting it from memory, since I do not have the book to hand. The gist of it was that Schaller at one point found himself in close proximity to a large group of gorillas; he and they were sheltering from a downpour (I think this is in the chapter titled ‘am I satyr or man?’). He found himself wondering much the same as my opening line: what was going on behind those watchful, somewhat wary eyes? Not much, was his conclusion, and I think there was a line that likened his companions to ‘rather dim relatives in fur coats’ (if that is not so, or my recollection is awry, I apologise).

My point in recalling this is to wonder whether we do well to plume ourselves on what we consider our unique and superior intellect; maybe we should take our singularity in this respect as a warning rather than a mark of distinction. The Hitchhiker’s Guide to the Galaxy (a work I enjoy but do not revere to the extent that some do) proposes (if I recall correctly) that humans are only third in intellectual attainment on our planet, behind mice and dolphins. This is satire, of course, but for me it does not strike quite the right note; I increasingly wonder if our reverence of intellectual attainment is not itself the problem.

Schaller’s gorillas sitting somewhat dolefully in the rain (they are prone to colds and pulmonary ailments) or dappled with sunlight as they feed at leisure may well have no mental preoccupations whatever – but is that not something to be envied rather than despised? Do they not attain effortlessly that same absorption in the moment, that pure existence in the present, that is the aim of meditation, which we humans attain only* through rigorous discipline, quieting the mind with mantras and controlling the body through physical training?

Maybe it is the surface that matters. We have much to unlearn.

*I am in error here, of course: we can attain it by various means – drawing, painting, making music or listening to it.

In the beginning was the word… or was it?


Reflecting on the origin of words leads us into interesting territory. I do not mean the origin of particular words, though that can be interesting too; I mean the notion of words as units, as building blocks into which sentences can be divided.

How long have we had words? The temptation is to say ‘as long as we have had speech’ but when you dig a bit deeper, you strike an interesting vein of thought.

As I have remarked elsewhere [see ‘The Muybridge Moment’] it seems unlikely that there was any systematic analysis of speech till we were able to write it down, and perhaps there was no need of such analysis. Certainly a great many of the things that we now associate with language only become necessary as a result of its having a written form: formal grammar, punctuation, spelling – the three things that probably generate the most unnecessary heat – are all by-products of the introduction of writing.

The same could be said of words. Till we have to write them down, we have no need to decide where one word ends and another begins: the spaces between words on a page do not reflect anything that is found in speech, where typically words flow together except where we hesitate or pause for effect. We are reminded of this in learning a foreign language, where we soon realise that listening out for individual words is a mistaken technique; the ear needs to attune itself to rhythms and patterns and characteristic constructions.

So were words there all along, just waiting to be discovered? That is an interesting question. Though ‘discovery’ and ‘invention’ effectively mean the same, etymologically (both have the sense of ‘coming upon’ or ‘uncovering’), we customarily make a useful distinction between them – ‘discovery’ implies pre-existence – so we discover buried treasure, ancient ruins, lost cities – whereas ‘invention’ is reserved for things we have brought into being, that did not previously exist, like bicycles and steam engines (an idea also explored in Three Misleading Oppositions, Three Useful Axioms).

So are words a discovery or an invention?

People of my generation were taught that Columbus ‘discovered’ America, though even in my childhood the theory that the Vikings got there earlier had some currency; but of course in each case they found a land already occupied, by people who (probably) had arrived there via a land-bridge from Asia, or possibly by island-hopping, some time between 42,000 and 17,000 years ago. In the same way, Dutch navigators ‘discovered’ Australia in the early 17th century, though in British schools the credit is given to Captain Cook in the late 18th century, who actually only laid formal claim in the name of the British Crown to a territory that Europeans had known about for nearly two centuries – and its indigenous inhabitants had lived in for around five hundred centuries.

In terms of discovery, the land-masses involved predate all human existence, so they were there to be ‘discovered’ by whoever first set foot on them, but these later rediscoveries and colonisations throw a different light on the matter. The people of the Old World were well used to imperial conquest as a way of life, but that was a matter of the same territory changing hands under different rulers; the business of treating something as ‘virgin territory’ – though it quite plainly was not, since they found people well-established there – is unusual, and I think it is striking where it comes in human, and particularly European, history. It implies an unusual degree of arrogance and self-regard on the part of the colonists, and it is interesting to ask where that came from.

Since immigration has become such a hot topic, there have been various witty maps circulating on social media, such as this one showing ‘North America prior to illegal immigration’.

The divisions, of course, show the territories of the various peoples who lived there before the Europeans arrived, though there is an ironic tinge lent by the names by which they are designated, which for the most part are anglicised. Here we touch on something I have discussed before  [in Imaginary lines: bounded by consent]  – the fact that any political map is a work of the imagination, denoting all manner of territories and divisions that have no existence outside human convention.

Convention could be described as our ability to project or impose our imagination on reality; as I have said elsewhere [The Lords of Convention] it strikes me as a version of the game we play in childhood, ‘let’s pretend’ or ‘make-believe’ – which is not to trivialise it, but rather to indicate the profound importance of the things we do in childhood, by natural inclination, as it were.

Are words conventions, a form we have imposed on speech much as we impose a complex conventional structure on a land-mass by drawing it on a map? The problem is that the notion of words is so fundamental to our whole way of thinking – may, indeed, be what makes it possible – that it is difficult to set them aside.

That is what I meant by my comment about the arrogance and self-regard implied in treating America and Australia as ‘virgin territory’ – it seems to me to stem from a particular way of thinking, and that way of thinking, I suggest, is bound up with the emergence of words into our consciousness, which I think begins about two and a half thousand years ago, and (for Europeans at least) with the Greeks.

I would like to offer a model of it which is not intended to be historical (though I believe it expresses an underlying truth) but is more a convenient way of looking at it. The years from around 470 to 322 BC span the lives of three men: the first, Socrates, famously wrote nothing, but spoke in the market place to whoever would listen; we know of him largely through his pupil, Plato. It was on Plato’s pupil, Aristotle, that Dante bestowed the title ‘maestro di color che sanno’ – master of those that know.

This transition, from the talking philosopher to the one who laid the foundations of all European thought, is deeply symbolic: it represents the transition from the old way of thought and understanding, which was inseparable from human activity – conversation, or ‘language games’ and ‘forms of life’ as Wittgenstein would say – to the new, which is characteristically separate and objective, existing in its own right, on the written page.

The pivotal figure is the one in the middle, Plato, who very much has a foot in both camps, or perhaps more accurately, is standing on the boundary of one world looking over into another, newly discovered. The undoubted power of his writing is derived from the old ways – he uses poetic imagery and storytelling (the simile of the cave, the myth of Er) to express an entirely new way of looking at things, one that will eventually subjugate the old way entirely; and at the heart of his vision is the notion of the word.

Briefly, Plato’s Theory of Forms or Ideas can be expressed like this: the world has two aspects, Appearance and Reality; Appearance is what is made known to us by the senses, the world we see when we look out the window or go for a walk. It is characterised by change and impermanence – nothing holds fast, everything is always in the process of changing into something else, a notion of which the Greeks seemed to have a peculiar horror; in the words of the hymn, ‘change and decay in all around I see’.

Reality surely cannot be like that: Truth must be absolute, immutable (it is important to see the part played in this by desire and disgust: the true state of the world surely could not be this degrading chaos and disorder where nothing lasts). So Plato says this: Reality is not something we can apprehend by the senses, but only by the intellect. And what the intellect grasps is that beyond Appearance, transcending it, is a timeless and immutable world of Forms or Ideas. Our senses make us aware of many tables, cats, trees; but our intellect sees that these are but instances of a single Idea or Form – Table, Cat, Tree – which somehow imparts to them the quality that makes them what they are, imbues them with ‘tableness’, ‘catness’ and ‘treeness’.

This notion beguiled me when I first came across it, aged fourteen. It has taken me rather longer to appreciate the real nature of Plato’s ‘discovery’, which is perhaps more prosaic (literally) but no less potent. Briefly, I think that Plato has discovered the power of general terms, and he has glimpsed in them – as an epiphany, a sudden revelation – a whole new way of looking at the world; and it starts with being able to write a word on a page.

Writing makes possible the relocation of meaning: from being the property of a situation, something embedded in human activity (‘the meaning of a word is its use in the language’) meaning becomes the property of words, these new things that we can write down and look at. The icon of a cat or a tree resembles to some extent an actual cat or tree but the word ‘cat’ looks nothing like a cat, nor ‘tree’ like a tree; in order to understand it, you must learn what it means – an intellectual act. And what you learn is more than just the meaning of a particular word – it is the whole idea of how words work, that they stand for things and can, in many respects, be used in their stead, just as the beads on an abacus can be made to stand for various quantities. What you learn is a new way of seeing the world, one where its apparently chaotic mutability can be reduced to order.

Whole classes of things that seem immensely varied can now be subsumed under a single term: there is a multiplicity of trees and cats, but the one word ‘tree’ or ‘cat’ can be used to stand for all or any of them indifferently. Modelled on that, abstract ideas such as ‘Justice’, ‘Truth’ and ‘The Good’ can be seen as standing for some immutable, transcendent form that imbues all just acts with justice, and so on. Plato’s pupil Aristotle discarded the poetic clothing of his teacher’s thought, but developed the idea of generalisation to the full: it is to him that we owe the system of classification by genus and species and the invention of formal logic, which could be described as the system of general relations; and these are the very foundation of all our thinking.

In many respects, the foundations of the modern world are laid here, so naturally these developments are usually presented as one of mankind’s greatest advances. However, I would like to draw attention to some detrimental aspects. The first is that this new way of looking at the world, which apprehends it through the intellect, must be learned. Thus, at a stroke, we render natural man stupid (and ‘primitive’ man, to look ahead to those European colonisations, inferior, somewhat less than human). We also establish a self-perpetuating intellectual elite – those who have a vested interest in maintaining the power that arises from a command of the written word – and simultaneously exclude and devalue those who struggle to acquire that command.

The pernicious division into ‘Appearance’ and ‘Reality’ denigrates the senses and all natural instincts, subjugating them to and vaunting the intellect; and along with that goes the false dichotomy of Heart and Head, where the Head is seen as being the Seat of Reason, calm, objective, detached, which should properly rule the emotional, subjective, passionate and too-easily-engaged Heart.

This, in effect, is the marginalising of the old way of doing things that served us well till about two and a half thousand years ago, which gave a central place to those forms of expression and understanding which we now divide and rule as the various arts, each in its own well-designed box: poetry, art, music, etc. (a matter discussed in fable form in Plucked from the Chorus Line)

So what am I advocating? That we undo all this? No – rather that we take a step to one side and view it from a slightly different angle. Plato could only express his new vision of things in the old way, so he presents it as an alternative world somewhere out there beyond the one we see, a world of Ideas or Forms, which he sees as the things words stand for, what they point to – and in so doing, makes the fatal step of discarding the world we live in for an intellectual construct; but the truth of the matter is that words do not point to anything beyond themselves; they are the Platonic Forms or Ideas: the Platonic Idea of ‘Horse’ is the word ‘Horse’. What Plato has invented is an Operating System; his mistake is in thinking he has discovered the hidden nature of Reality.

What he glimpsed, and Aristotle developed, and we have been using ever since, is a way of thinking about the world that is useful for certain purposes, but one that has its limitations. We need to take it down a peg or two, and put it alongside those other, older operating systems that we are all born with, which we developed over millions of years. After all, the rest of the world – animal and vegetable – seems to have the knack of living harmoniously; we are the ones who have lost it, and now threaten everyone’s existence, including our own; perhaps it is time to take a fresh look.

The Lords of Convention

‘The present king of France is bald’ seems to present a logical problem that ‘the cat is on the table’ does not – there is no present king of France, so how can we assert that he is bald? And is the sentence true or false?

But I am much more interested in the second sentence: ‘the cat is on the table’ – what does it mean?


(‘Cat on a Table’ by John Shelton, 1923-1993)

Can it mean, for instance, ‘it’s your cat, I hold you responsible for its behaviour’?

Consider:

Scene: a sunny flat. A man sprawls at ease on the sofa. To him, from the neighbouring room, a woman.

Woman: The cat is on the table.

(Man rolls his eyes, sighs, gets up reluctantly)

Should you want to grasp the difference between the philosophy of the early Wittgenstein, as expressed in the Tractatus Logico-Philosophicus, and his later philosophy, as expressed in Philosophical Investigations (and I accept that not everyone does), then this example epitomises it. It also pins down – or at least, develops further – thoughts I have been having lately about meaning, objectivity and the impact of the invention of writing on thought.

The form of the question in the second paragraph above is curious: ‘what does it mean?’ – where ‘it’ refers to the sentence. The clear implication is that meaning is a property of the sentence, of words – an assertion that may not strike us as strange, till we set it alongside another that we might ask – ‘what do you mean?’

I would suggest that the first question only becomes possible once language has a written form: before that, no-one would think to ask it, because there would be no situation in which you could come across words that were not being spoken by someone in a particular situation – such as the scene imagined above. Suppose we alter it slightly:

Woman: The cat is on the table.
Man: What do you mean?
Woman: What do you mean, what do I mean? I mean the cat is on the table.
Man: What I mean is, the cat is under the sideboard, eating a mouse – look!

The words spoken here all have their meaning within the situation, as it were (what Wittgenstein would call the Language Game or the Form of Life) and the question of their having their own, separate meaning simply does not arise; if we seek clarification, we ask the person who spoke – the meaning of the words is held to be something they intend (though it is open to interpretation, since a rich vein of language is saying one thing and meaning another, or meaning more than we say – just as in our little scene, the line about the cat is far less about description of an event, far more about an implied criticism of the owner through the behaviour of his pet – which in turn is probably just a token of some much deeper tension or quarrel between the two).

Only when you can have words written on a page, with no idea who wrote them or why, do we start to consider that the meaning might reside in the words themselves, that the sentence on the page might mean something of itself, without reference to anything (or anyone) else.

This relocation of meaning – from the situation where words are spoken, to the words themselves – is, at the very least, a necessary condition of Western philosophy, by which I mean the way of thinking about the world that effectively starts with Plato and stretches all the way to the early Wittgenstein, whose Tractatus can be viewed as a succinct summary of it, or all that matters in it; and perhaps it is more than a necessary condition – it may be the actual cause of Western philosophy.

The crucial shift, it seems to me, lies in the objectification of language, and so of meaning, which becomes a matter of how words relate to the world, with ourselves simply interested bystanders; and this objectification only becomes possible, as I have said, when speech is given an objective form, in writing.

If you were inclined to be censorious, you might view this as an abnegation of responsibility: we are the ones responsible for meaning, but we pass that off on language – ‘not us, guv, it’s them words wot done it.’ However, I would be more inclined to think of it as an instance of that most peculiar and versatile human invention, the convention. Indeed, a convention could be defined as an agreement to invest some external thing with power, or rather to treat it as if it had power – a power that properly belongs to (and remains with) us.

(The roots of convention are worth thinking about. I trace them back to childhood, and the game of ‘make-believe’ or ‘let’s pretend’ which demonstrates a natural facility for treating things as if they existed (imaginary friends) or as if they have clearly defined roles and rules they must follow (the characters in a game a child plays with dolls and other objects it invests with life and character). Is it any wonder that a natural facility we demonstrate early in childhood (cp. speech) should play an important part in adult life? In fact, should we not expect it to?)

It is convenient to act as if meaning is a property of words, and is more or less fixed (and indeed is something we can work to clarify and fix, by study). It facilitates rapid and efficient thought, because if words mean the things they denote, then we can, in a sense, manipulate the world by manipulating words; and this is especially so once we have mastered the knack of thinking in words, i.e. as a purely mental act, without having to write or read them in physical form.

We can perhaps appreciate the power of this more fully if we consider how thinking must have been done before – and though this is speculation, I think it is soundly based. I would argue that before the advent of writing no real analysis of speech was possible: we simply lacked any means of holding it still in order to look at it. An analytic approach to language sees it as something built up from various components – words of different sorts – which can be combined in a variety of ways to express meaning. It also sees it as something capable of carrying the whole burden of expression, though this is a species of circular argument – once meaning is defined as a property of words, then whatever has meaning must be capable of being expressed in words, and whatever cannot be expressed in words must be meaningless.

Without the analytic approach that comes with writing, expression is something that a person does, by a variety of means – speech, certainly, but also gesture, facial expression, bodily movement, song, music, painting, sculpture. And what do they express? in a word, experience – that is to say, the fact of being in the world; expression, in all its forms, is a response to Life (which would serve, I think, as a definition of Art).

Such expression is necessarily subjective, and apart from the cases where it involves making a physical object – a sculpture or painting, say – it is inseparable from the person and the situation that gives rise to it. Viewed from another angle, it has a directness about it: what I express is the result of direct contact with the world, through the senses – nothing mediates it (and consider here that Plato’s first step is to devalue and dismiss the senses, which he says give us only deceptive Appearance; to perceive true Reality, we must turn to the intellect).

Compare that with what becomes possible once we start thinking in words: a word is a marvel of generalisation – it can refer to something, yet has no need of any particular detail – not colour, size, shape or form: ‘cat’ and ‘tree’ can stand indifferently for any cat, any tree, and can be used in thought to represent them, without resembling them in any respect.

‘A cat sat on a table under a tree’

might be given as a brief to an art class to interpret, and might result in twenty different pictures; yet the sentence would serve as a description of any of them – it seems to capture, in a way, some common form that all the paintings share – a kind of underlying reality of which each of them is an expression; and that is not very far off what Plato means when he speaks of his ‘Forms’ or ‘Ideas’ (or Wittgenstein, when he says ‘a logical picture of facts is a thought’ (T L-P 3) ).

While this way of thinking – I mean using words as mental tokens, language as an instrument of thought – undoubtedly has its advantages (it is arguably the foundation on which the modern world is built), it has been purchased at a price: the distancing and disengagement from reality, which is mediated through language, and the exclusion of all other forms of expression as modes of thought (effectively, the redefinition of thought as ‘what we do with language in our heads’); the promotion of ‘head’ over ‘heart’ by the suppression of the subject and the denigration of subjectivity (which reflects our actual experience of the world) in favour of objectivity, which is a mere convention, an adult game of make-believe –

all this points to the intriguing possibility, as our dissatisfaction grows with the way of life we have thus devised, that we might do it differently if we chose, and abandon the tired old game for a new one.

Where to Find Talking Bears, or The Needless Suspension of Disbelief


Something I have been struggling to pin down is a clear expression of my thoughts on the oft-quoted dictum of Coleridge, shown in its original context here:

‘it was agreed, that my endeavours should be directed to persons and characters supernatural, or at least romantic, yet so as to transfer from our inward nature a human interest and a semblance of truth sufficient to procure for these shadows of imagination that willing suspension of disbelief for the moment, which constitutes poetic faith.’

This strikes me as a curious instance of something that has become a commonplace – you can almost guarantee to come across it in critical discussion of certain things, chiefly film and theatre – despite the fact that it completely fails to stand up to any rigorous scrutiny. It is, in a word, nonsense.

But there is another strand here, which may be part of my difficulty. This dictum, and its popularity, strike me as a further instance of something I have grown increasingly aware of in my recent thinking, namely the subjugation of Art to Reason. By this I mean the insistence that Art is not only capable of, but requires rational explanation – that its meaning can and should be clarified by writing and talking about it in a certain way (and note the crucial assumption that involves, namely that art has meaning).

This seems to me much like insisting that everyone say what they have to say in English, rather than accepting that there are languages other than our own which are different but equally good.

But back to Coleridge. If the ‘willing suspension of disbelief for the moment’ is what ‘constitutes poetic faith,’ then all I can say is that it must be an odd sort of faith that consists not in believing something – or indeed anything – but rather in putting aside one’s incredulity on a temporary basis: ‘when I say I believe in poetry, what I mean is that I actually find it incredible, but I am willing to pretend I don’t in order to read it.’

That is the pernicious link – that this suspension of disbelief is a necessary prerequisite of engaging with poetry, fiction or indeed Art as a whole; we see it repeated (as gospel) in these quotations, culled at random from the internet:

‘Any creative endeavor, certainly any written creative endeavor, is only successful to the extent that the audience offers this willing suspension as they read, listen, or watch. It’s part of an unspoken contract: The writer provides the reader/viewer/player with a good story, and in return, they accept the reality of the story as presented, and accept that characters in the fictional universe act on their own accord.’

(‘Any creative endeavour’? ‘is only successful’? Come on!)

‘In the world of fiction you are often required to believe a premise which you would never accept in the real world. Especially in genres such as fantasy and science fiction, things happen in the story which you would not believe if they were presented in a newspaper as fact. Even in more real-world genres such as action movies, the action routinely goes beyond the boundaries of what you think could really happen.

In order to enjoy such stories, the audience engages in a phenomenon known as “suspension of disbelief”. This is a semi-conscious decision in which you put aside your disbelief and accept the premise as being real for the duration of the story.’
(‘required to believe’? ‘in order to enjoy’? Really?)

The implication is that we spend our waking lives in some sort of active scepticism, measuring everything we encounter against certain criteria before giving it our consideration; and when we come on any work of art – or at least one that deals with ‘persons and characters supernatural, or at least romantic’ – we immediately find it wanting, measured against reality, and so must give ourselves a temporary special dispensation to look at it at all.

This is rather as if, on entering a theatre, we said to ourselves ‘these fellows are trying to convince me that I’m in Denmark, but actually it’s just a stage set and they are actors in costumes pretending to be other people – Hamlet, Claudius, Horatio, Gertrude; of course it doesn’t help that instead of Danish they speak a strange sort of English that is quite unlike the way people really talk.’

The roots of this confusion go back what seems a long way, to classical Greece (about twenty-five centuries) though in saying that we should remember that artistic expression is a great deal older (four hundred centuries at least; probably much, much more). I have quoted the contest between Zeuxis and Parrhasius before:

…when they had produced their respective pieces, the birds came to pick with the greatest avidity the grapes which Zeuxis had painted. Immediately Parrhasius exhibited his piece, and Zeuxis said, ‘Remove your curtain that we may see the painting.’ The painting was the curtain, and Zeuxis acknowledged himself conquered, by exclaiming ‘Zeuxis has deceived birds, but Parrhasius has deceived Zeuxis himself.’

– Lempriere’s Classical Dictionary

This is the epitome of the pernicious notion that art is a lie, at its most successful where it is most deceptive: thus Plato banishes it from his ideal state, because in his world it is at two removes from Reality. Plato’s Reality (which he also identifies with Truth) is the World of Forms or Ideas, apprehended by the intellect; the world apprehended by the senses is Appearance, and consists of inferior copies of Ideas; so that Art, which imitates Appearance, is but a copy of a copy, and so doubly inferior and untrustworthy.

Aristotle takes a different line on Appearance and Reality (he is willing to accept the world of the senses as Reality) but continues the same error with his theory of Mimesis, that all art is imitation – which, to use Aristotle’s own terminology, is to mistake the accident for the substance, the contingent for the necessary.

To be sure, some art does offer a representation of reality, and often with great technical skill; and indeed there are works in the tradition of Parrhasius that are expressly intended to deceive – trompe l’oeil paintings, which in the modern era can achieve astonishing effects

but far from being the pinnacle of art (though they are demonstrations of great technical skill) these are a specialist subset of it, and in truth a rather minor one, a sort of visual joke.

Insofar as any work of art resembles reality there will always be the temptation to measure it against reality and judge it accordingly, and this is particularly so of the visual arts, especially cinema, though people will apply the same criterion to fiction and poetry.

They are unlikely to do so in the case of music, however, and this exception is instructive. Even where music sets out to be specifically representative (technically what is termed ‘program(me) music’, I believe) and depict some scene or action – for instance Britten’s ‘Sea Interludes’ – it still does not look like the thing it depicts (for the simple reason that it has no visual element). Music is so far removed in character from what it depicts that we do not know where to start in making a comparison – we see at once that it is a different language, if you like.

The Sea Interludes are extraordinarily evocative, yet we would not call them ‘realistic’, something we might be tempted to say of a photo-realistic depiction of a seascape compared to one by Turner, say:

(original source here) Tom Nielsen – ‘First light surf’


(JMW Turner, ‘Seascape with storm coming on’ 1840)

Of all the different forms of Art, it is cinema that has gone furthest down this erroneous path – with the rise of CGI, almost anything can be ‘realised’ in the sense of presenting it in fully rounded, fully detailed form, and the revival of 3D imagery in its latest version and various other tricks are all geared to the same end of making it seem as if you were actually there in the action, as if that were the ultimate goal.

Yet even with the addition of scent and taste – the only senses yet to be catered for in film – the illusion is only temporary and never complete: we are always aware at some level that it is an illusion, and indeed the more it strives to be a perfect illusion the more aware we are of its illusory nature (we catch ourselves thinking ‘these special effects are amazing!’).

On the other hand, a black and white film from decades ago can so enrapture us that we are completely engaged with it to the exclusion of all else – we grip the arms of our seat and bite our lip when the hero is in peril, we shed tears at the denouement, we feel hugely uplifted at the joyous conclusion – but none of this is because we mistake what we are seeing for reality; it has to do with the engagement of our feelings.

In marked contrast to the cinema, the theatre now rarely aims at a realistic presentation; on the contrary, the wit with which a minimum of props can be used for a variety of purposes (as the excellent Blue Raincoat production of The Poor Mouth did with four chairs and some pig masks) can be part of the pleasure we experience, just as the different voices and facial expressions used by a storyteller can. It is not the main pleasure, of course, but it helps clarify the nature of the error that Coleridge makes.

How a story is told – the technique with which it is presented, whether it be on stage, screen or page – is a separate thing from the story itself. Take, for instance, these two fine books by Jackie Morris


‘East of the Sun, West of the Moon’ and ‘The Wild Swans’ are traditional tales; in retelling them, Jackie Morris puts her own stamp on them, not only with her own words and beautiful illustrations, but also with some changes of detail and action (for more about the writing of East of the Sun, see here).

The nature of these changes is interesting. It is like retuning a musical instrument: certain notes that jarred before now ring true; the tales are refreshed – their spirit is not altered but enhanced.

This ‘ringing true’ is an important concept in storytelling and in Art generally (I have discussed it before, in this fable). On the face of it, both these tales are prime candidates for Coleridge’s pusillanimous ‘suspension of disbelief’: in one, a talking bear makes a pact with a girl which she violates, thus failing to free him from the enchantment laid on him (he is actually a handsome prince); in consequence, the girl must find her way to the castle East of the Sun, West of the Moon, an enterprise in which she is aided by several wise women and the four winds; there she must outwit a troll-maiden. In the other, a sister finds her eleven brothers enchanted into swans by the malice of their stepmother, and can only free them by taking a vow of silence and knitting each of them shirts of stinging nettles.

After all, it will be said, you don’t meet with talking bears, any more than you do with boys enchanted into swans, in the Real World, do you?

Hm. I have to say that I view the expression ‘Real World’ and those who use it with deep suspicion: it is invariably employed to exclude from consideration something which the speaker does not like and fears to confront. As might be shown in a Venn diagram, what people mean by the ‘Real World’ is actually a subset of the World, one that is expressly defined to rule out the possibility of whatever its proponents wish to exclude:


In other words, all they are saying is ‘you will not find talking bears or enchanted swans if you look in a place where you don’t find such things.’
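The circularity can be made explicit in set terms. A toy sketch (the sets here are my own hypothetical illustration, not anything from the original): if the ‘Real World’ is defined as the World minus whatever one wishes to exclude, then a search of it for the excluded things is guaranteed to fail – by construction, not by discovery.

```python
# A toy illustration of the Venn-diagram point above: the 'Real World'
# is a subset of the World defined by subtracting what one wants to
# exclude, so looking there for the excluded things must come up empty.

world = {'belisha beacons', 'horse-blankets', 'talking bears', 'enchanted swans'}
excluded = {'talking bears', 'enchanted swans'}
real_world = world - excluded          # a subset defined by exclusion

print('talking bears' in real_world)   # False - true by definition, proving nothing
print('talking bears' in world)        # True - stories are part of the World
```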

Cue howls of protest: ‘you don’t meet talking bears walking down the street, do you?’ Well, it depends where you look: if you look at the start of East of the Sun, you will meet a talking bear walking through the streets of a city. Further howls: ‘But that’s just a story!’


(Some people met this bear on the London Underground, but I don’t think it spoke.)

Well, no – it isn’t just a story; it’s a story – and stories and what is in them are as much part of the world as Belisha beacons, horse-blankets and the Retail Price Index. The World, after all, must include the totality of human experience. The fact that we do not meet with talking bears in the greengrocer’s (and has anyone ever said we might?) does not preclude the possibility of meeting them in stories, which is just where you’d expect to find them (for a similar point, see Paxman and the Angels).

The Muybridge Moment


The memorable Eadweard Muybridge invented a number of things, including his own name – he was born Edward Muggeridge in London in 1830. He literally got away with murder in 1872 when he travelled some seventy-five miles to shoot dead his wife’s lover (prefacing the act with ‘here’s the answer to the letter you sent my wife’) but was acquitted by the jury (against the judge’s direction) on the grounds of ‘justifiable homicide’. He is best known for the sequence of pictures of a galloping horse shot in 1878 at the behest of Leland Stanford, Governor of California, to resolve the question of whether the horse ever has all four feet off the ground (it does, though not at the point people imagined). To capture the sequence, Muybridge used multiple cameras and devised a means of showing the results which he called a zoopraxoscope, thereby inventing stop-motion photography and the cinema projector, laying the foundations of the motion-picture industry.


(“The Horse in Motion-anim” by Eadweard Muybridge, Animation: Nevit Dilmen – Library of Congress Prints and Photographs Division; http://hdl.loc.gov/loc.pnp/cph.3a45870. Licensed under Public Domain via Commons – https://commons.wikimedia.org/wiki/File:The_Horse_in_Motion-anim.gif#/media/File:The_Horse_in_Motion-anim.gif)

Muybridge’s achievement was to take a movement that was too fast for the human eye to comprehend and freeze it so that each phase of motion could be analysed. It was something that he set out to do – as deliberately and methodically as he set out to shoot Major Harry Larkyns, his wife’s lover.

It is interesting to consider that something similar to Muybridge’s achievement happened a few thousand years ago, entirely by accident and over a longer span of time, but with consequences so far-reaching that they could be said to have shaped the modern world.

We do not know what prompted the invention of writing between five and six thousand years ago, but it was not a desire to transcribe speech and give it a permanent form; most likely it began, alongside numbering, as a means of listing things, such as the contents of storehouses – making records for tax purposes, perhaps, or of the ruler’s wealth – and from there it might have developed as a means of recording notable achievements in battle and setting down laws.

We can be confident that transcribing speech was not the primary aim because that is not something anyone would have felt the need to do. For us, that may take some effort of the imagination to realise, not least because we live in an age obsessed with making permanent records of almost anything and everything, perhaps because it is so easy to do so – it is a commonplace to observe that people now seem to go on holiday not to enjoy seeing new places at first hand, but in order to make a record of them that they can look at once they return home.

And long before that, we had sayings like

vox audita perit, littera scripta manet
(the voice heard is lost; the written word remains)

to serve as propaganda for the written word and emphasise how vital it is to write things down. One of the tricks of propaganda is to take your major weakness and brazenly pass it off as a strength (‘we care what you think’, ‘your opinion matters to us’, ‘we’re listening!’, as banks and politicians say) and that is certainly the case with this particular Latin tag: it is simply not true that the spoken word is lost – people have been remembering speech from time immemorial (think of traditional stories and songs passed from one generation to the next); it is reasonable to suppose that retaining speech is as natural to us as speaking.

If anything, writing was devised to record what was not memorable. Its potential beyond that was only slowly realised: it took around a thousand years for anyone to use it for something we might call ‘literature’. It is not till the classical Greek period – a mere two and a half millennia ago (Homo sapiens is reckoned at 200,000 years old, the genus Homo at 2.8 million) – that the ‘Muybridge moment’ arrives, with the realisation that writing allows us to ‘freeze’ speech just as his pictures ‘froze’ movement, and so, crucially, to analyse it.

When you consider all that stems from this, a considerable degree of ‘unthinking’ is required to imagine how things must have been before writing came along. I think the most notable thing would have been that speech was not seen as a separate element but rather as part of a spectrum of expression, nigh-inseparable from gesture and facial expression.  A great many of the features of language which we think fundamental would have been unknown: spelling and punctuation – to which some people attach so much importance – belong exclusively to writing and would not have been thought of at all; even the idea of words as a basic unit of language, the building blocks of sentences, is a notion that only arises once you can ‘freeze’ the flow of speech like Muybridge’s galloping horse and study each phase of its movement; before then, the ‘building blocks’ would have been complete utterances, a string of sounds that belonged together, rather like a phrase in music, and these would invariably have been integrated, not only with gestures and facial expressions, but some wider activity of which they formed part (and possibly not the most significant part).

As for grammar, the rules by which language operates and to which huge importance is attached by some, it is likely that no-one had the least idea of it; after all, speech is even now something we learn (and teach) by instinct, though that process is heavily influenced and overlaid by all the ideas that stem from the invention of writing; but then we have only been able to analyse language in that way for a couple of thousand years; we have been expressing ourselves in a range of ways, including speech, since the dawn of humanity.

When I learned grammar in primary school – some fifty years ago – we did it by parsing and analysis. Parsing was taking a sentence and identifying the ‘parts of speech’ of which it was composed – not just words, but types or categories of word, defined by their function: Noun, Verb, Adjective, Adverb, Pronoun, Preposition, Conjunction, Article.

Analysis established the grammatical relations within the sentence, in terms of the Subject and Predicate. The Subject, confusingly, was not what the sentence was about – which puzzled me at first – but rather ‘the person or thing that performs the action described by the verb’ (though we used the rough-and-ready method of asking ‘who or what before the verb?’). The Predicate was the remainder of the sentence, what was said about (‘predicated of’) the Subject, and could generally be divided into Verb and Object (‘who or what after the verb’ was the rough-and-ready method for finding that).
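The rough-and-ready classroom method can be sketched as a toy procedure. This is a deliberately naive illustration of that schoolroom rule, not a real parser; the function name, example sentence and the convenience of being told which word is the verb are all my own assumptions:

```python
# A toy sketch of the schoolroom method: everything before the verb is
# the Subject, the verb plus what follows is the Predicate, and what
# follows the verb is the Object. Works only for the simplest sentences.

def rough_parse(sentence, verb):
    """Split a simple sentence around a known verb."""
    words = sentence.rstrip('.').split()
    i = words.index(verb)
    subject = ' '.join(words[:i])        # 'who or what before the verb?'
    predicate = ' '.join(words[i:])      # what is said about the Subject
    obj = ' '.join(words[i + 1:])        # 'who or what after the verb?'
    return subject, predicate, obj

subject, predicate, obj = rough_parse('The cat chased the mouse.', 'chased')
# subject   -> 'The cat'
# predicate -> 'chased the mouse'
# obj       -> 'the mouse'
```

Of course the method collapses on anything more complicated than this – which is rather the point the essay goes on to make: such analysis is an artefact of frozen, written language, not of speech itself.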

It was not till I went to university that I realised that these terms – in particular, Subject and Predicate – derived from mediaeval Logic, which in turn traced its origin back to Aristotle (whom Dante called maestro di color che sanno – master of those that know) in the days of Classical Greece.

Socrates

Plato

Aristotle

Alexander the Great

Aristotle is the third of the trio of great teachers who were pupils of their predecessors: he was a student of Plato, who was a student of Socrates. It is fitting that Aristotle’s most notable pupil was not a philosopher but a King: Alexander the Great, who had conquered much of the known world and created an empire that stretched from Macedonia to India by the time he was 30.

That transition (in a period of no more than 150 years) from Socrates to the conquest of the world neatly embodies the impact of Classical Greek thought, which I would argue stems from the ‘Muybridge Moment’ when people began to realise the full potential of the idea of writing down speech. Socrates, notably, wrote nothing: his method was to hang around the market place and engage in conversation with whoever would listen; we know him largely through the writings of Plato, who uses him as a mouthpiece for his own ideas. Aristotle wrote a great deal, and what he wrote conquered the subsequent world of thought to an extent and for a length of time that puts Alexander in eclipse.

In the Middle Ages – a millennium and a half after his death – he was known simply as ‘The Philosopher’ and quoting his opinion sufficed to close any argument. Although the Renaissance was to a large extent a rejection of Aristotelian teaching as it had developed (and ossified) in the teachings of the Schoolmen, the ideas of Aristotle remain profoundly influential, and not just in the way I was taught grammar as a boy – the whole notion of taxonomy, classification by similarity and difference, genus and species – we owe to Aristotle, to say nothing of Logic itself, from which not only my grammar lessons but rational thought were derived.

I would argue strongly that the foundations of modern thought – generalisation, taxonomy, logic, reason itself – are all products of that ‘Muybridge Moment’, made possible only by the ability that writing gives us to ‘freeze’ language and then analyse it.

It is only when you begin to think of language as composed of individual words (itself a process of abstraction) and how those words relate to the world and to each other, that these foundations are laid. Though Aristotle makes most use of it, the discovery of the power of generalisation should really be credited to his teacher, Plato: for what else are Plato’s Ideas or Forms but general ideas, and indeed (though Plato did not see this) those ideas as embodied in words? Thus, the Platonic idea or form of ‘table’ is the word ‘table’ – effectively indestructible and eternal, since it is immaterial, apprehended by the intellect rather than the senses, standing indifferently for any or all particular instances of a table – it fulfils all the criteria*.

Which brings us to Socrates: what was his contribution? He taught Plato, of course; but I think there is also a neat symbolism in his famous response to being told that the Oracle at Delphi had declared him ‘the wisest man in Greece’ – ‘my only wisdom is that while these others (the Sophists) claim to know something, I know that I know nothing.’ As the herald of Plato and Aristotle, Socrates establishes the baseline, clears the ground, as it were: at this point, no-one knows anything; but the construction of the great edifice of modern knowledge in which we still live today was just about to begin.

However, what interests me most of all is what constituted ‘thinking’ before the ‘Muybridge Moment’, before the advent of writing – not least because, whatever it was, we had been doing it for very much longer than the mere two and a half millennia that we have made use of generalisation, taxonomy, logic and reason as the basis of our thought.

How did we manage without them? and might we learn something useful from that?

I think so.

*seeing that ideas are actually words also solves the problem Hume had, concerning general ideas: if ideas are derived from impressions, then is the general idea ‘triangle’ isosceles, equilateral, or scalene, or some impossible combination of them all? – no, it is just the word ‘triangle’. Hume’s mistake was in supposing that an Idea was a ‘(faint) copy’ of an impression; actually, it stands for it, but does not resemble it.

The Mechanism of Meaning (it’s all in the mind)

Meaning matters. It is bound up with so many things: understanding and misunderstanding, doubt and certainty, to say nothing of philosophy, poetry, music and art; so it is worth considering the mechanism by which it operates. ‘Mechanism’ is a useful image here: when mechanisms are hidden – as they generally are – their effects can seem mysterious, even magical (as in the marvels of the watchmaker or the stage magician); yet when they are revealed, they offer reassurance: the point of a mechanism is that, unless it is impaired or interfered with, it will go on working in the same way.


The problem with the mechanism of meaning is that the popular notion of it is misleading: we speak of meaning as something conveyed, like a passenger in a car, or transmitted, like a radio message; we also speak of it as being embodied or contained in things that have it, whether they are sentences, poems, works of art or the like. These two usages combine to suggest that meaning exists independently in some form, and that the business of ‘meaning’ and ‘understanding’ consists of inserting it into and extracting it from whatever is said to have it. That seems like common sense, but as we shall see, when scrutinised it proves problematic.

Wittgenstein points us in another direction with his observation that ‘the meaning of a word is its use in the language’, which he uses alongside ‘language game’ and ‘form of life’ when discussing meaning, to denote the (wider) activity of which language forms a part and from which it derives its meaning.

It strikes me that the basic mechanism of meaning lies in connection: meaning is only found where a connection is made, and that connection is made in the mind of an observer, the one who ascribes meaning. In other words, meaning is not a fixed property of things: a thing in itself, on its own, does not have meaning. But we must be careful here: this is a stick that some will readily grasp the wrong end of – to suggest that a tree or a person (say) ‘has no meaning’ is liable to provoke outrage and earnest outpourings about the inestimable value of trees and people. That is because ‘meaningless’ is a pejorative term, properly used in cases where we expect meaning but do not find it; it might be compared to our use of ‘flightless’, which we apply to certain birds that are exceptions to the general rule; we would not apply it to pigs or gorillas.


We can gain some insight into how meaning works – its mechanism – by considering an allied concept, purpose. Let us suppose an interplanetary traveller, at some remote time, of a quite different species from ourselves. Somewhere on his travels he comes upon this relic of a long-lost civilisation: a rectangular case constructed of semi-rigid, possibly organic material, which opens to disclose a cellular array – there are some twenty-four rectangular cells of the same organic material, each containing an identical object, rounded, hard and smooth to the touch. He is quite excited by the find as it reminds him of another he has come across – again a cellular array in a case of semi-rigid, possibly organic material, and again each cell containing a smooth, hard, rounded object, though there are differences of detail both in the shape of the cells and the objects. He may submit a learned paper to the Intergalactic Open University speculating on the purpose of these strikingly similar discoveries; he is in no doubt that they are variants of the same thing, and share a common purpose, on account of the numerous points of resemblance.

Were we at his side we might smile, since one is a packet of lightbulbs and the other a box of eggs; it is likely that the resemblances that strike him as the best clues to their purpose might elude us altogether, since we would dismiss them as irrelevant – ‘that is just how they happen to be packaged, for ease of transport or storage: it has no bearing on what they are for. As to the slight similarities of shape and texture, that is mere coincidence. These objects are entirely unrelated, and could not be more unlike.’


It is worth considering the key difference between us and the interplanetary traveller that allows us to smile at his ill-founded speculation. These are familiar objects to us, and we can connect them at once to a context or situation in which they belong, where they fit in and have purpose; our ‘reading’ of them is entirely different from the alien traveller’s – we disregard all that seems to him most striking, because we know it is of no significance. We see that the apparent similarity has nothing to do with the objects themselves, but with the fact that they are both in storage, awaiting use; neither is ‘active’, i.e. in the situation or context where they are used and have purpose.


How far an examination of the objects in detail might allow our traveller to deduce, on the one hand, a national grid for distributing electricity from power stations to homes and workplaces rigged with lighting circuits, and on the other, the delights of omelettes, fried, poached and scrambled eggs, depends on quite how alien he is – if he is a gaseous life-form sustained by starlight, he is unlikely to penetrate far into their mystery. On the other hand, if his own existence has ‘forms of life’ or activities similar to ours, he might make much better and even surprisingly accurate guesses.

That, after all, is how we ourselves proceed if we come across artefacts or objects that are unfamiliar: we guess at their purpose by thinking of the kind of the thing they might be, the sort of use they might have, by analogy with our own activities or ‘forms of life’ (and it is no accident that truly mystifying objects are often tentatively described as having ‘possible religious or ritual significance’ since in our own experience this is where many things are found whose use could not easily be guessed; and in this connection consider the use made of everyday objects in burial rites – offerings of food put alongside the dead, or cooking or eating utensils for use on the onward journey).


I would suggest that, as far as the mechanism by which they operate goes, ‘purpose’ and ‘meaning’ are the same, since both are defined in the same way, viz. by placing the thing in question in relation to some context, situation or larger activity where it has a place, where it ‘makes sense’, if you like (imagine our alien traveller’s reaction to being shown the circumstances in which a lightbulb is used – the mystery of the object disappears once the connections are made – literally, in this case).

This brings out important aspects of meaning that are often overlooked, not least because – as I observed at the outset – they are contradicted by most popular accounts of what meaning is. The first aspect is that meaning is not inherent: no amount of studying or dissecting the object in isolation will discover it – I emphasise ‘in isolation’ because discovering, say, the filament in the light bulb and how it is connected to the fitting at the base will advance our understanding only if we can relate them to other things: if we have no notion of electricity, or that it will make a wire filament glow brightly, then they will tell us nothing.

The second aspect is slightly trickier to explain but of greater significance. If we agree that meaning is not inherent, not something that can be found simply by examining the object no matter how minutely, then we can reasonably ask where it is located. One answer, from what we have said, is that it lies in the relation or connection to the context, situation or ‘form of life’; but I think that is not quite right.

Rather, it consists in being related to, or being connected with – in other words, it exists as the result of an action by the onlooker, and where it exists – where it means – is in that onlooker’s mind. This is not the usual account that is given of meaning, which is generally more like this, from Wikipedia:

‘meaning is what the source or sender expresses, communicates, or conveys in their message to the observer or receiver, and what the receiver infers from the current context.’

At first sight, this might not seem significantly different – we have relation to context, we have a process of inference; the main addition appears to be that the source or sender is taken into account, as well as the receiver. However, there is one slight-seeming but important difference, which is the notion of the meaning as something which retains its identity throughout, and which exists prior to the communication taking place and survives after it – the model that springs readily to mind is the letter, which the sender puts in the envelope, which is then conveyed to the recipient who takes it out and reads it.


The analogy with the letter is probably what makes this seem a ‘common sense’ account that most people would agree with, but the logic of it gives rise to problems. If we picture the process of sending a letter, we might start with the sender at her desk, pen in hand, poised to write; she puts the message on the page, folds the page and seals it in the envelope then sends it off; at the other end the recipient takes it out, reads it, and ‘gets the message’. What is the difficulty there, you might ask?
It begins to emerge if you try to make the analogy consistent. At first glance, it seems that

meaning = letter
message (that which conveys the meaning) = envelope

But there is a problem here: the message is in the letter, rather than the envelope; in actual fact, the envelope is superfluous – the message could be sent without it, by hand, say, simply as a folded note. Still, that seems trivial – the sender puts her ideas into words, the receiver reads the words and gets her ideas: isn’t that just the same?

Not quite. The question is whether the meaning (or the message) exists before it is put into words; and if so, in what form? Again, this may seem unproblematic: of course the message exists before she writes it down; and indeed she might change her mind and, instead of writing, make a phone call and say what she means directly, as it were.

But we must be careful here: the question is not whether the message exists before she writes it down – or even before she speaks it – but before she puts it into words. This is where the image of the letter in the envelope is at its most misleading: isn’t the meaning just something we put into words, in the same way we put the letter in the envelope?

That is the notion Wittgenstein attacks with his ‘private language’ argument – the idea that my thoughts are in my head in some form to which I have privileged access, and which I could choose to give public form if I wish, thereby making them accessible to others. Though this again seems like common sense, when examined closely, it is problematic. It forms the basis of popular notions of telepathy, but trying to imagine what such ‘direct transmission’ would actually consist of highlights the difficulty.

If you convey your thoughts to me, what do I experience? Do I hear your voice speaking in my head? if so, we are back to ‘putting things in words’ and no nearer any prior form our thoughts might take. The temptation is to fall back on images, as if these were somehow more immediate (a picture is worth a thousand words, after all) but what would they be images of? And how, having received them, would I be able to infer your thought from them? A more illuminating (but no less problematic) possibility is that I might hear your thoughts as a musical phrase which I intuitively understand.

This works to some extent because we are accustomed to the idea that music can consistently evoke definite feelings in us – ‘that passage always makes me feel this way, invariably calls this to mind’ – though we have no idea how: ‘it just does’; so that seems consistent with our finding in it something that someone else has put there; but it still leaves the question of what happens at the other end – how would such a musical message originate?

The options here would seem to be either that the message is originally in some other form which I then embody in the music – which takes us back to where we started: if it’s comprehensible to me in that form, why can’t I convey that directly instead of ‘translating’ it into music? – or else we have to accept that only in expressing it do I find what I am thinking; what we experience prior to that is an urge, a sort of pressure which can only be relieved by giving it expression in some way – whether it is an inarticulate cry of rage, a musical phrase (the terrifying opening of the Dies Irae in Verdi’s Requiem, for instance), an image (Munch’s The Scream, maybe) or the words ‘I am very angry about this!’

The Scream

This brings us by a roundabout route to something I have been trying to articulate for a while – the key distinction between language as the instrument of thought and as one means of expressing experience; but that is a subject for another article. In the meantime, I would conclude by saying that, if this account of the mechanism of meaning is accurate, then it has some interesting implications. It suggests, for instance, that meaning (like beauty) is in the eye (or mind) of the beholder; that it is not fixed, but variable; that it is impermanent; and – perhaps most importantly – it is inseparable from its context, the ‘form of life’ or wider activity of which it forms a part and on which it depends.

Storypower: Quigley’s Ineffable Escapade

under a twilight canopy

‘The solution to the problem of life is seen in the vanishing of the problem.
(Is not this the reason why those who have found after a long period of doubt that the sense of life has become clear to them have been unable to say what constitutes that sense?)’ (Wittgenstein, Tractatus, 6.521)

That remarkable book, The Third Policeman by Flann O’Brien, is to my mind a work of genius, but that is by the way. An episode from it came to mind just now when I was reflecting on the Wittgenstein quote above that I used to close my previous piece, though it resonates even more strongly with another, the one that closes the same work:

‘What we cannot speak about we must pass over in silence’ (Tractatus, 7)

Quigley’s Balloon

(to set the scene, this conversation takes place between the nameless narrator and the sergeant of police, who are inspecting the scaffold on which the narrator is to be hanged:)

Up here I felt that every day would be the same always, serene and chilly, a band of wind isolating the earth of men from the far-from-understandable enormities of the girdling universe. Here on the stormiest autumn Monday there would be no wild leaves to brush on any face, no bees in the gusty wind. I sighed sadly.

‘Strange enlightenments are vouchsafed,’ I murmured, ‘to those who seek the higher places.’

I do not know why I said this strange thing. My own words were also soft and light as if they had no breath to liven them. I heard the Sergeant working behind me with coarse ropes as if he were at the far end of a great hall instead of at my back and then I heard his voice coming back to me softly called across a fathomless valley:

‘I heard of a man once,’ he said, ‘that had himself let up into the sky in a balloon to make observations, a man of great personal charm but a divil for reading books. They played out the rope till he was disappeared completely from all appearances, telescopes or no telescopes, and then they played out another ten miles of rope to make sure of first-class observations. When the time-limit for the observations was over they pulled down the balloon again but lo and behold there was no man in the basket and his dead body was never found afterwards lying dead or alive in any parish ever afterwards.’

Here I heard myself give a hollow laugh, standing there with a high head and two hands still on the wooden rail.

‘But they were clever enough to think of sending up the balloon again a fortnight later and when they brought it down the second time lo and behold the man was sitting in the basket without a feather out of him if any of my information can be believed at all.

‘So they asked where he was and what had kept him but he gave them no satisfaction, he only let out a laugh and went home and shut himself in his house and told his mother to say he was not at home and not receiving visitors or doing any entertaining. That made the people very angry and inflamed their passions to a degree that is not recognized by the law. So they held a private meeting that was attended by every member of the general public apart from the man himself and they decided to get out their shotguns the next day and break into the man’s house and give him a severe threatening and tie him up and heat pokers in the fire to make him tell what happened in the sky the time he was up inside it.

‘But between that and the next morning there was a stormy night in between, a loud windy night that strained the trees in their deep roots and made the roads streaky with broken branches, a night that played a bad game with root-crops. When the boys reached the home of the balloonman the next morning, lo and behold the bed was empty and no trace of him was ever found afterwards dead or alive, naked or with an overcoat. And when they got back to where the balloon was, they found the wind had torn it up out of the ground with the rope spinning loosely in the windlass and it invisible to the naked eye in the middle of the clouds. They pulled in eight miles of rope before they got it down but lo and behold the basket was empty again. They all said that the man had gone up in it and stayed up but it is an insoluble conundrum, his name was Quigley and he was by all accounts a Fermanagh man.’
(The Third Policeman, pp137-9, slightly abridged)

This sent my thoughts on two different tracks: the first was an idea that I expressed in an earlier piece on the notion that we have devised a carapace that protects us from direct experience of reality:

‘The renunciation of self is central to much religious teaching, and it is interesting to consider that the price of experiencing reality (of the kind that humankind cannot bear very much) might well be a loss of identity, of our sense of who and what we are.’

The term ‘life-changing experience’ is rather bandied about these days, and can seem no more than the tag-line for a holiday advert, but if an experience is truly life-changing, then we cannot expect to return from it unscathed; and it is in the very nature of such experiences that they may be incommunicable to those who have not shared them – if your complete frame of reference is altered (or exchanged for another) then on what basis can you communicate?

The second line of thought was that the O’Brien piece is yet another demonstration of the power of story (and poetry likewise) to convey, in an easily accessible and vividly memorable form, what would be considered difficult and complex ideas if expressed in standard philosophical language. (‘Quigley’s balloon’ would make an excellent picture book, or equally – and most appropriately, given its theme of ineffability – a short and wordless animation.)

We should not wonder at that, of course: we have only been expressing ourselves in philosophical terms for some 25 centuries; 25 millennia would not be even half the time we have been using stories (and their central method, metaphor) – which, as a way of thinking about things, are probably as old as humanity itself.

No abiding city


Things take odd turns sometimes. After my Byzantine Epiphany I felt sure I was on the track of something, yet it proved elusive: after a lot of writing I felt I was still circling round it, unable to pin it down.

Then this morning I woke to the news that (with the General Election just over a week away) David Cameron was pledging, if re-elected, to pass a law that would prevent his government from raising the level of a range of taxes for the duration of the next parliament.

I have to say that this struck me at once as absurd, the notion of a government passing a law to prevent itself doing something: why go to all that trouble? why not just say, ‘we won’t do that’?

There’s the rub, of course – election promises are famously falser than dicers’ oaths; against that background, Mr Cameron feels the need to offer something stronger – no mere manifesto promise, but an actual law! – what could be a stronger guarantee than that?

There’s a paradox here, of course – because politicians’ promises are notoriously unreliable, Mr Cameron says he will pass a law to ensure that he will not go back on his word – and that’s a promise. The whole elaborate structure is built on the same uncertain foundation.

I am reminded of advice from a more reputable source, the Sermon on the Mount:

‘Again, you have heard how it was said to our ancestors, you must not break your oath…
But I say this to you, do not swear at all… all you need say is “Yes” if you mean yes, “No” if you mean no; anything more than this comes from the Evil One.’

You are no better than your word: if that is worth nothing, no amount of shoring-up will rectify the matter; and if it is good, what more do you need?

But there is something deeper here: the key, I think, to the very matter I had been trying to resolve.

Let us start with Mr Cameron’s utterance: it is perhaps best understood as a theatrical gesture. The actor on stage, conscious of the audience’s attention (and also of his distance from them, compared, say, to the huge close-up of the cinema screen) may feel the need to make a gesture which in everyday life would strike us as exaggerated and – well – theatrical. So Mr Cameron, in the feverish atmosphere of an election campaign, feels the need to outbid his opponents – ‘they say they’ll do something? well, I’ll pass a law that will make me do as I say!’

I have to say that even in context it sounds rather silly, but it would be even sillier outside it – so that is the first point, the importance of context to understanding.

The second is this business of making a law and the appearance it offers of transferring the responsibility from the person to something independent and objective – ‘don’t just take my word for it – it’ll be the law!’ It overlooks the fact that legislation is a convention that requires our consent to operate: the laws of the land are not like the laws of physics – they do not compel us in any way; we obey them through choice, not necessity.

(And of course the existence of a range of penalties and agencies of enforcement, like the police and the courts, is proof of this – you do not need any of that to make things obey the Law of Gravity; you only need threat and compulsion where there is the possibility that people might do otherwise)

These two things – the importance of context to meaning and the attempt to transfer responsibility from the person to something apparently objective and independent – chimed with what I had been struggling to express before.
I had been focusing on the effect that the introduction of writing has on language, and through that, on our whole way of seeing the world.

The gist of my argument was this: from time immemorial, we have had Speech, which is our version of something we observe throughout the animal kingdom – bird song, whale song, the noises of beasts. Then, relatively recently – between five and six thousand years ago – we invent something unique: Writing.


At first it is used for relatively low-grade menial (indeed, prosaic) tasks, such as making lists and records; it is a good thousand years before anyone thinks to employ it for anything we might call ‘literature’. That should be no surprise: where Speech is natural and instinctive, the product of millions of years’ development, writing is awkward and cumbersome, a skill (along with reading) that must be learned, and one not everyone can master.

Speech has all the advantages that go with sound: it has rhythm, rhyme, musicality, pattern; Writing has none of these. But it does have one thing: where speech exists in time and is fleeting, ephemeral, Writing exists in space and has duration; it is objective; it exists in its own right, apart from any context or speaker.

My speech dies with me: when my voice is stilled, it is gone (though it may linger in the memory of others); but my written words will outlast not only me but a hundred generations – they could be around long after any trace or memory of their author is wholly erased.

Thus, from Speech we move to Language – by which I mean the complex thing that arises after Writing is invented. The important thing about Language is its dual nature, and the interaction and tension between its two forms, the written and the spoken. These are (as I discussed before) in many respects antithetical – where Speech is necessarily bound up with a speaker and so with a context – it is always part of some larger human activity – Writing stands on its own, apart from any context, independent of its author, with its own (apparently) objective existence.

(and the differences go deeper – where speech draws on a rich range of devices to overcome its ephemeral character and make itself memorable – rhyme, rhythm, vivid imagery etc – writing (though it can borrow all of them) has no need of any of these, having permanence; the problem it must overcome is lack of context – it cannot rely on what is going on round about to clarify its meaning; it must stand on its own two feet, and aim to be clear, concise, unambiguous, logical.)

What Mr Cameron’s absurd utterance brought home to me was the deceptive nature of Writing’s independence and objectivity, which is more apparent than real. Just as the law he holds out as having some objective, compelling force that is greater than his word is only so because we (as a society) agree to assign that power to it (in this connection, see my earlier post, ‘bounded by consent’) – and ultimately has no greater strength than the original word that promises it – so the objectivity and independence of the written word are not inherent properties but rather qualities we have conferred on it.

The independence and objectivity we assign to language is a kind of trick we play on ourselves, and it is bound up with the matter I discussed in my earlier posts (here, here and here) concerning the ‘carapace’ that we erect between ourselves and Reality – a carapace of ideas on which we confer the title ‘reality’ even though it is a construct of our own.

(It was interesting to realise that my philosophical hero Ludwig Wittgenstein had made this journey before me: in his early work, e.g. the Tractatus, he is much concerned with his ‘picture theory’ of language, in which a proposition is seen as picturing reality, by having its elements related to one another in a way that corresponds to how the elements of the reality it pictures are related:
‘2.12 A picture is a model of reality.
2.13 In a picture objects have the elements of the picture corresponding to them.
2.14 In a picture the elements of the picture are the representatives of objects.

2.1511 That is how a picture is attached to reality; it reaches right out to it.

2.223 In order to tell whether a picture is true or false we must compare it with reality.’

This model takes for granted the objective nature of language: it is the words, the proposition, that is true or false, and that is established by comparison with the world; we do not seem to play much part.

However, in his later work, Wittgenstein moves to a different position: he now speaks of ‘language games’ and ‘forms of life’; it is only as part of a language game or a form of life – i.e. some human activity – that words have meaning; and indeed, as a general rule, the meaning of a word is its use in the language. He emphatically rejects the idea of a ‘private language’ in which our thinking is done before being translated into words: all that is available to us is the unwieldy, untidy agglomeration that is Language, a public thing that everyone shares and shapes but no-one controls or commands – despite the best efforts of organisations such as L’Academie Francaise)

As is typical of Wittgenstein, this modest-seeming manoeuvre effectively demolishes an edifice of thought that has stood for millennia: its implications are profounder than might at first appear.

If we go back to Plato and his fellow Greeks, we find a horror of mutability (‘change and decay in all around I see’, as the hymn has it) and a yearning for Truth to be something fixed and immutable – hence Plato’s world of Ideas, the unchanging reality that can be apprehended only by the intellect and lies beyond the veil of Appearance which so beguiles our poor, deluded senses.

Language – the complex thing that arises after the invention of the written form – is central to establishing this Platonic world, whose influence has lasted down to the present day, in particular its elevation of the intellect over the senses and its separation of Appearance and Reality.

The quality of Language on which all this hinges is the illusion it gives of being something that exists in its own right: words have meanings and can be used to describe the world; if only we tidied up language, rid it of its anomalies, used it more carefully and logically – freed it from the abusage of everyday speech – made it, in a word, more literate, truer to its written form – then we would be able to express the Truth accurately and without ambiguity, and permanently.

This is the edifice that Wittgenstein shows to be no more than a castle in the air: if meaning exists only in context, as part of some human activity, then all meaning is provisional; nothing is fixed (an idea I have discussed before). Language can never be tidied up and purified, cleansed of its faults, because language is ultimately derived from Speech, which is a living, dynamic thing, constantly changing with the forms of life of those who speak it, and the new ‘language games’ they invent.

The truth of what I have just said is by no means universally accepted; indeed, we have made some pretty determined attempts to contradict it: the first was the use of Latin as a scholarly language after it had ceased to be a living tongue (having transmuted, in the course of time, into the various romance languages – Italian, French, Spanish, Portuguese, Romanian). Latin was the vehicle of academic discourse from the foundation of the first European universities in the eleventh century down to the time of Newton and beyond, a span of some five centuries; it remains the official language of the Roman Catholic church (although mass in the vernacular was introduced with the reforms of Vatican II in the early sixties, the Latin mass was not ‘banned’ as popularly supposed – only a specific form, the Tridentine rite, was discontinued; mass is still said in Latin to this day in various places).

It is no surprise to find that the Church – very much bound to the notion of an unchanging Truth – should be one of the last bastions of a purely literate language. In the academic and particularly the scientific world, the role formerly played by Latin has to a large extent been taken over by English, and ‘Academic English’ as a form is diverging from the living language, which in turn is diversifying (with the disappearance of the British Empire and the emergence of former colonies as countries in their own right) in much the same way as Latin transformed into various tongues after Rome fell.

I am sure that there are many today who will view my assertion that all meaning is necessarily provisional with the same horror with which the Greeks contemplated the mutability of things, but I think if you consider it steadily, you will see that it is both liberating and refreshing.

In my previous piece I began by talking about the perils of building in stone – namely, that what you make will outlive its capacity to be understood, because although it does not change, the people considering it do. I think this happens all the time with ideas, and especially the ‘big’ ideas, about ‘Life, the Universe and Everything’ – because they are important, we try to fix them for all time, but we overlook the fact that they are the product of a particular time, expressed in the language of that time, and that succeeding generations will see and understand things differently.

Of course the change of outlook and the decay of understanding are never sudden, and they can be delayed – which is exactly what written texts do: they give a particular version of something an authority and a form that can last for generations, and which may block any development for a long time.

(That, broadly, is what happened with Scholasticism: the influx (via the Islamic world) of ancient Greek learning – chiefly Aristotle – into mediaeval Europe provided a huge intellectual stimulus initially, as great minds like Thomas Aquinas came to terms with it and assimilated it into the thinking of the day; but so comprehensive did it seem that there was no impulse to move beyond it, and it began to ossify – the object of university study became to master Aristotle’s works, and the ‘argument from authority’ came into vogue: to settle any dispute it sufficed to quote what Aristotle (often called simply ‘The Philosopher’) said on the matter – there was no going beyond that. This situation lasted till the Renaissance shook things up once more.)

So am I, then, making a straightforward pitch for Relativism and denying the possibility of an Absolute Truth?

Not quite. Rather, this is an argument for ineffability, the idea that ‘Great Truths’ cannot be expressed in words. It is not so much that language is not equal to the job (though it might be improved till it was); rather, the greatness of these ‘Great Truths’ (that label is of course inadequate) is such that it necessarily exceeds our ability to comprehend them, and so limits our capacity to express them; though poetry can get closer than prose, as Browning has it:

‘Ah, but a man’s reach should exceed his grasp, or what’s a heaven for?’

and Art in general – music, painting, sculpture, dance, poetry – offers a more fruitful approach than philosophy – not to success, but a more rewarding kind of failure; or, as Mr Eliot so aptly expresses it,

‘but there is no competition—
There is only the fight to recover what has been lost
And found and lost again and again: and now, under conditions
That seem unpropitious. But perhaps neither gain nor loss.
For us, there is only the trying. The rest is not our business.’

The Exploration of Inner Space III: What Plato’s got to do with it

Chimborazo, Ecuador

WHEN I was but thirteen or so
I went into a golden land,
Chimborazo, Cotopaxi
Took me by the hand.

W. J. Turner’s poem is called ‘Romance’ and it records an experience most of us have felt at some point in childhood, the enchantment that arises from the potent combination of exotic names and far-off places, usually the result of reading books. I first heard it when I was still at primary school (my father was a great reciter of verse and lodged many poems in my head long before I ever read them, though I think it may have been my brother who made me aware of this one) and I remember being perturbed by the second verse:

My father died, my brother too,
They passed like fleeting dreams,
I stood where Popocatapetl
In the sunlight gleams.

How could that happen and he not notice? I wondered. The fourth verse resonated with me: in those far-off days, we thought nothing of walking considerable distances to school, and being the youngest, I was often on my own, my brothers having moved on to the Big School, and I was certainly a dreamer:

I walked in a great golden dream
To and fro from school—
Shining Popocatapetl
The dusty streets did rule.

Shining Popocatapetl, Mexico – photo by Jakub Hejtmanek

This poem came to mind on my morning walk when I was trying to recall when I first read Plato. I was about fourteen; it was a summer holiday in Barra, with no television. We learned to play cribbage and I read Plato’s Republic. So, not the misty heights of the South American volcanoes but a golden land of a different sort, the bright morning sunshine of the Mediterranean and ancient Greece, ‘when all the world was young’ – and not Romance, but Philosophy.

I say it was not Romance, and yet I wonder. For all his stern strictures on the ‘deception’ of art and poetry (which he would banish from his ideal state, unless it could be used for propaganda purposes) Plato is at his most persuasive when he is at his most poetic: the Simile of the Cave, where the prisoner starts out shackled in darkness, watching the play of shadows on the wall, but escapes to the upper world and gazes at last on the Sun of Truth, remains one of the most potent invitations to the study of philosophy.

Central to Plato’s thought is his Theory of Forms (or Ideas). This posits a world of immutable Forms which are what really exists – that is Reality; the world we perceive with our senses is deceptive Appearance, a mere shadow, whose contents stand to the World of Forms as the copy to the original. Thus, there are many tables, but each is an instance or expression of the single Idea or Form, ‘Table’.

As a teenager I found this beguiling, but I think an ambivalence was always there: although Plato plainly states that this World of Forms can be apprehended only by the intellect and not by the senses, his own presentation of it is so vivid (the Simile of the Cave, and the Myth of Er, an account of metempsychosis which follows the journey of the soul after death into the timeless world of Forms and thence to rebirth in another body, so that its ‘acquisition’ of knowledge is actually ‘anamnesis’, a remembering of its sojourn between death and birth) that it lends it the quality of concrete reality; to my teenage mind it was super-real: it had all the vividness of the sensible world, only more so; as such it was a continuation by other means of fairyland and all the mysterious realms that had succeeded it in the stories of my childhood – it was the secret realm that lies hidden behind everyday reality, attainable only by the fortunate few.

(Another strand that was important to me was the compatibility of Plato’s thought with my religious beliefs – which should be no surprise, given that Platonism was the first big philosophical influence on Catholic thought, long before Aquinas assimilated Aristotle.)

It has taken a good four decades and more for my perspective on Plato to shift. I think his way of looking at things retains a great deal of potency but is mistaken (or rather misleading) in two key particulars. The first of these is the elevation of the intellect with a concomitant denigration of the senses. I am beginning to think that this may have been a major wrong turning in Western thought and that its effects have been almost wholly pernicious. Plato may not be the first but he is certainly the foremost in establishing the antithesis between Appearance and Reality, effectively relegating the senses (and with them the emotions) to an insignificant and untrustworthy sideshow: the senses cannot be trusted; the intellect alone apprehends Truth. That is something that has bedevilled Western thought ever since; it could be summed up as the triumph of Head over Heart.

The second fault is not in his description but his labelling of it. There is a world that is apprehended by the intellect and a world apprehended by the senses, but it is the latter that is Real and Original, the former that is artificial and derivative.

(At this point, a curious thing happened. Casting about for a suitable image to convey that Plato’s way of looking at things is a complete inversion of how things actually are, I recalled a particular optical illusion, in which a hollow mask is rotated and we see it as a positive, convex face whichever side is towards us, reversing the apparent direction of its rotation to accommodate this. I recalled that I had used it in a previous piece I had written (Force of Habit) but when I checked, the link was broken. Searching for another version I came on this but what really excited me was the note at the end:

‘this illusion often fails to work on people suffering with schizophrenia; they are able to see the hollow mask for what it is. In this case the raw visual information (bottom-up processing) is not over-ridden by higher cognitive processes (top-down processing). Some psychologists believe that this dominance of bottom-up processing over top-down processing contributes to the sense of dissociation from reality.’

‘Top-down processing suggests that we form our perceptions starting with a larger object, concept, or idea before working our way toward more detailed information. In other words, top-down processing happens when we work from the general to the specific; the big picture to the tiny details.’


This, couched in different language, is just what Plato proposes in his Theory of Forms: the Form or Idea is general, and specific instances are derived from it (interestingly, it was Plato’s pupil Aristotle who devised the system of classification using Genus and Species, where things are grouped together according to their common or general characteristics, and subdivided according to their specific or detailed differences – a way of looking at things that seems so ‘natural’ that we forget that it was an invention).

Biological taxonomy

However, what excited me even more than this unexpected sidelight on Plato was how well the idea that the ‘dominance of bottom-up processing over top-down processing contributes to the sense of dissociation from reality’ fitted with the notion I floated in my last piece, namely that some (perhaps much) ‘mental illness’ has its roots in an inability to learn the conventional way of seeing the world that most of us have adopted. Of course, from my point of view, I would insist on the inverted commas round ‘reality’ here, and I would resist the superiority implied in describing ‘top-down processing’ as ‘higher cognitive processes’.)

In other words, Plato’s Theory of Forms – or ‘top-down processing’ if you prefer – is the very ‘carapace’ that we interpose between ourselves and reality, as discussed in my previous articles [here and here]. It is worth exploring this idea further.

The first thing to say is that what we are discussing here are not actual things but ways of seeing – what Plato (and all who have followed him) offers is a way of looking at the world, a way of thinking about it – ‘seeing it as’.

In saying this, I do not mean that before Plato no-one saw it this way and that since then everyone has learned consciously to do so. What Plato has made explicit and others (principally Aristotle) have refined is a technique, a way of dealing with the world, of operating in it, that was doubtless already implicit in much of our behaviour (though it would be interesting to know to what extent).

At the heart of this technique is abstraction, or the power to generalise, the trick of ignoring (superficial) difference and homing in on (underlying) similarity. This is certainly a very powerful tool: it enables us to use general terms, to group things under the same head: ‘tree’ for all and any tree, ‘car’, ‘man’, ‘insect’ and so on. We can imagine that without it our mental processes might be very cumbersome; certainly our language would be. (I have discussed an aspect of this before, in relation to number, here).

I remember as a youth having an interesting discussion with an elderly Australian Jesuit, Fr. John Flynn, an eminent Islamic scholar among other things. His brother was a man of the same cut and had compiled one of the first dictionaries of an Australian Aboriginal language. One point that has stuck in my mind was that (apparently) its speakers had no single verb ‘to wash’ but used a different word depending on what was being washed – the feet, the hands, the head, some article*. Fr Flynn cited this as evidence of ‘primitive’ thinking and I remember arguing that it might rather have been that, for them, more significance attached to the difference between the specific acts than to the similarity of the action, so that to suppose that washing the feet was like washing the face might strike them as ludicrous or possibly indecent.

This calls to mind what is said in the excerpt quoted above about one set of cognitive processes ‘over-riding’ another and the attitude this implies. We could say that the Aboriginal Australian (in the instance cited) has not developed the ‘higher’ cognitive processes that enable him to see that all acts of washing are essentially the same, share the same general form; but equally the Aboriginal Australian could retort that our debased ‘Western’ way of looking at things is a bit like having bad eyesight – we can no longer distinguish critical details. It is we who are deficient: we have forgotten how to see.

I find that idea exciting. It resonates with other things that I feel are bound up with this whole area of discussion, the question of what constitutes Reality and how best to perceive it. One is the celebrated ability of the Aboriginal Australian to ‘read’ the landscape and navigate without any of the aids that ‘Westerners’ require; when it comes to reading our surroundings, it is we who are illiterate. (When we lost this ability is an interesting matter to consider. There has recently been something of a revival in Britain of the idea of reading a landscape in this way (as here, for instance) and it is a commonplace that those who work close to nature and depend on it for their livelihood – shepherds, farmers, fishermen, say – are much more skilled in gleaning information from their surroundings.)

Another point of resonance is the experience of learning to draw, and developing skill in art generally; one of the first things you have to be aware of is the extent to which we allow concepts to interpose between us and the thing we are looking at. The simple exercise of drawing a familiar object – a cup, say – soon brings this home. We know what a cup is – we have the idea of it ‘in our head’: it seems superfluous to provide an example; we could draw one from our imagination. And to begin with, that is what we do. We fail to see the specific cup that is in front of us and draw our idea of it instead; we need to learn various techniques for seeing past the concept to the actual object, which is a pattern of light, shade and colour. (One such is the technique of ‘negative space’ where instead of attending to the object, you look at the space round about it (see some interesting applications here)).

Here I think we are approaching the heart of the matter, which is the possibility that we have evolved a way of seeing the world that has proved so useful and beneficial in so many respects that we have become blind to its shortcomings (it is, in fact, a form of elective indispensability, an idea I discuss in an earlier piece). The consequence is that when we experience difficulty as a result of these shortcomings – as I think is increasingly the case – we fail to recognise the source. We resemble, if you like, people who have become increasingly wearied and burdened by a heavy back-pack and try every method to make it easier to carry – walking sticks, different diet, improved fitness – save the obvious one of taking it off.

*I am open to correction here, as I am recalling something from forty years ago.