Heart-thought

When I was young and studying philosophy at Edinburgh University I remember becoming excited about the figurative use of prepositions; they seemed to crop up everywhere, openly and in disguise as Latin prefixes, in uses that clearly were not literal. Reasoning from the fact that the meaning of any preposition could be demonstrated using objects and space, I concluded that a world of objects and space was implied in all our thinking, and that this might act as a limit on what and how we thought.

What strikes me about this now is not so much the idea as the assumptions on which it is based: I have made Language in its full-blown form my starting point, which is a bit like starting a history of transport with the motor-car. As I have suggested before, what we think of as ‘Language’ is a relatively recent development, arising from the invention of writing and the influence it has exerted on speech, simultaneously elevating it above all other forms of expression and subjugating it to the written form. It is the written form that gives language an objective existence, independent of human activity, and relocates ‘meaning’ from human activity (what Wittgenstein terms ‘language games’ or ‘forms of life’) to words themselves; and alongside this, it makes possible the systematic analysis of speech [as discussed in The Muybridge Moment].

In that earlier theory of mine I took for granted a number of things which I now think were mistaken. The first, as I have said, is that the milieu which gives rise to the figurative use of words is the developed form of language described above; that is to confuse the identification and definition of something with its origin, rather as if I were to suppose that a new species of monkey I had discovered had not existed before I found and named it.

Bound up with this is the model of figurative language which I assumed, namely that figurative use was derived from literal use and dependent upon it, and that literal use was prior and original – in other words, that we go about the world applying names like labels to what we see about us (the process of ‘ostensive definition’ put forward by St Augustine, and quoted by Wittgenstein at the start of his Philosophical Investigations) and only afterwards develop the trick of ‘transferring’ these labels to apply to other things (the word ‘metaphor’ in Greek is the direct equivalent of ‘transfer’ in Latin – both suggest a ‘carrying over or across’).

Points to note about this model are that it is logically derived and that it presents metaphorical thinking as an intellectual exercise – it is, as Aristotle describes it, ‘the ability to see the similarity in dissimilar things.’

The logic appears unassailable: clearly, if metaphor consists in transferring a word from its literal application and applying it elsewhere, so that the sense of the original is now understood as applying to the new thing, then the literal use must necessarily precede the metaphorical and the metaphorical be wholly dependent on and derived from it: to say of a crowd that it surged forward is to liken its action to that of a wave, but we can only understand this if we have the original sense of ‘surge’ as a starting point.

However, there is a difficulty here. It is evident that there can be no concept of literal use and literal meaning till there are letters, since the literal meaning of ‘literal’ is ‘having to do with letters’. Only when words can be written down can we have an idea of a correspondence between the words in the sentence and the state of affairs that it describes (what Wittgenstein in the Tractatus calls the ‘picture theory’ of language). If what we term metaphors were in use before writing was invented – and I am quite certain that they were – then we must find some other explanation of them than the ‘transfer model’ outlined above, with its assumption that literal use necessarily precedes metaphorical and the whole is an intellectual process of reasoned comparison.

The root of the matter lies in the fact already mentioned, that only with the invention of a written form does the systematic analysis of speech become possible, or indeed necessary. Before then (as I suggest in ‘The Disintegration of Expression’) speech was one facet or mode of expression, quite likely not the most important (I would suggest that various kinds of body language, gesture and facial expression were possibly more dominant in conveying meaning). It was something that we used by instinct and intuition rather than conscious reflection, and it would always have been bound up with some larger activity, for the simple reason that there was no means of separating it (the nearest approach would be a voice speaking in the dark, but that is still a voice, with all the aesthetic qualities that a voice brings, and also by implication a person; furthermore, it is still firmly located in time, at that moment, for those hearers, in that situation. Compare this with a written sentence, where language for the first time is able to stand on its own, independent of space and time and not associated with any speaker).

In other words, when metaphor was first defined, it was in terms of a literate language, and was seen primarily as a use we make of words. (Given the definition supplied by Belloc’s schoolboy, that ‘a metaphor is just a long Greek word for a lie’, there is an illuminating parallel to be drawn here with lying, which might be defined as ‘making a false statement, one that is not literally true’. This again puts the focus on words, and makes lying primarily a matter of how words are used and what they mean. The words or the statement are seen as what is false, but actually it is the person – hence the old expression ‘the truth is not in him’. Deceit consists in creating a false appearance, in conveying a false impression: words are merely instrumental, and though certainly useful – as a dagger is for murder – are by no means necessary. We can lie by a look or an action; we can betray with a kiss.)

There is a great liberation in freeing metaphor from the shackles that bind it to literal language (and to logic, with which it is at odds, since it breaks at least two of the so-called ‘laws of thought’ – it violates the law of identity, which insists that ‘A is A’, by asserting that A is B, and by the same token, the law of contradiction, which insists that you cannot have A and not-A, by asserting that A is not-A). It allows us to see it from a wholly new perspective, and does away with the need to see it either as an intellectual act (‘seeing the similarity in dissimilars’) or as something that necessarily has to do with words or even communication; I would suggest that metaphor is primarily a way of looking at the world, and so is first and foremost a mode of thought, but one that operates not through the intellect and reason but through intuition and feelings.

To illustrate this, I would like to take first an example I came up with when I was trying to envisage how metaphor might have evolved. Two brothers, out in the bush, come on a lion, at a safe distance, so that they can admire its noble mien and powerful grace without feeling threatened. One brother smiles and says ‘mother!’ The other, after an initial look of puzzlement, nods his head in affirmation and laughs.

The explanation I furnished to accompany this is that their mother is a formidable and beautiful woman and that the first brother, seeing the lion, is reminded of her, and by naming her, invites his brother to make the same comparison that has already occurred to him, which he does after a moment’s puzzlement, and the two take pleasure in this new and unexpected – yet apt – use of the word.

I think that the focus here is wrong: it is still concerned to make metaphor about words, and to see it primarily as a way of communicating ideas.

I would now like to alter the story slightly. A man on his own in the bush catches sight of the lion (from a safe distance, as before). On seeing it, he is moved: the sight of it stirs him, fills him with a mixture of awe and delight. And it is not what he sees, but rather what he feels, that calls his mother to mind: the feeling that the lion induces in him is the same as he has felt in the presence of his mother. That is where the identification takes place, in the feeling: the outer circumstances might differ (the lion in the bush, his mother in the village) but the inner feeling is the same. If we think of an experience as combining an external objective component with an internal subjective one (and I am carefully avoiding any notion of cause and effect here) then the origin of metaphor lies in experiences where the external objective component differs but the internal subjective component is the same.

Why am I wary of saying ‘the sight of the lion causes the same feelings that the sight of his mother does’? Because it strikes me as what I would call a ‘mixed mode’ of thinking: it imports the notion of causality, a modern and analytic way of thinking, into an account of an ancient and synthetic way of thinking, thus imposing an explanation rather than simply describing. (This is difficult territory because causality is so fundamental to all our explanations, based as they are on thinking that makes use of literate language as its main instrument.)

What I want to say is this: causal explanations impose a sequence – one thing comes first – the cause – and elicits the other, the effect. So if we stick with the man and the lion we would analyse it like this: ‘sense data arrive in the man’s brain through his eyes by the medium of light, and this engenders a physical response (spine tingling, hair standing on end, a frisson passing over the body) which the man experiences as a feeling of awe and delight.’

We can demonstrate by reason that the lion, or the sight of it, is the cause and the emotion the effect, because if we take the lion away (for instance, before the man comes on it) the man does not experience the emotion (although he may experience ‘aftershocks’ once it has gone, as he recalls the sight of it).

But there is a fault here. If we leave the lion but substitute something else for the man – an antelope, say, or a vulture – does it still have the same effect? It is impossible to say for sure, though we may infer something from how each behaves – the antelope, at the sight (and quite probably the scent) of the lion might bound away in the opposite direction, while the vulture (sensing the possibility of carrion near by or in the offing) might well move closer.

My point is that the analysis of cause and effect is rather more complex than I have presented it here – and my presentation is much as David Hume makes it out to be, with his analogy of one billiard ball striking another; as Schopenhauer points out, what causes the window to shatter is not the stone alone, but the fact of its being thrown with a certain force and direction combined with the brittleness of the glass (and if the stone is thrown by a jealous husband through his love rival’s window, then we might need to include his wife’s conduct and the construction he puts upon it in the causal mix). Change any one of these and the result is different.

My being human is as much a precondition for the feelings I experience in the presence of a lion as the lion is, and I think that this is a case where, as Wordsworth puts it, ‘we murder to dissect’ – it is much more enlightening to consider the experience as a single simultaneous event with, as I have suggested, an inner and an outer aspect that are effectively counterparts. So the lion is the embodiment of the man’s feelings but so is his mother, and the lion and his mother are identified by way of the feelings that both embody; and the feelings are in some sense the inner nature or meaning of both the lion and the mother (think here of all the songs and poetry and music that have been written where the lover tries to give expression to his feelings for his beloved). This interchangeability and the identity of different things or situations through a common feeling aroused in each case is the foundation of metaphor and, I think, the key ‘mechanism’ of Art.

(This has an interesting parallel with the philosophy of Schopenhauer, as expressed in the title of his work Die Welt als Wille und Vorstellung, variously translated as ‘The World as Will and Representation’ or ‘The World as Will and Idea’. In this he borrows from Eastern philosophy to present the world as having a dual aspect – objectively, as it appears to others, and subjectively, as it is in itself. Its objective aspect, Representation, is made known to us via our senses, and is the same world of Objects and Space with which this discussion began; we cannot by definition see what it is like in itself since it only ever appears as object, but once we realise that we ourselves are objects in the ‘World as Representation,’ we can gain a special insight by ‘turning our eyes inward’ as it were, and contemplating our own inner nature, which we know not by seeing but by being it.

And what do we find? For Schopenhauer, it is the Will; and the revelation is that this is not an individual will – my will as opposed to yours – it is the same Will that is the inner nature of everything, the blind will to exist, to come into being and to remain in being. (This bears a striking resemblance to the position advanced by evolutionary biologists such as Richard Dawkins, for whom humankind is effectively a by-product of our genetic material’s urge to perpetuate itself).)

I would diverge from Schopenhauer – and the evolutionary biologists – in their pessimistic and derogatory account of the inner nature of things, on two grounds. The first is that it makes us anomalous. Schopenhauer asserts that ‘in us alone, the Will comes to consciousness’ but is unable to explain why this should be so, while his only solution to the revelation that all things are just the urges of a blind and senseless will is effectively self-annihilation (not a course he chose to pursue himself, as it happens – he lived to be 72). There is a lack of humility here that I find suspect, a desire still to assert our uniqueness and importance in a senseless world. If the Will is indeed the inner nature of all things (and that is questionable) why should we consider ourselves the highest manifestation of it?

The second ground is the nature of the feelings that I describe, which are the opposite of pessimistic: they are uplifting, feelings of awe, elation and delight. There is a fashion nowadays for explaining everything in terms of genetic inheritance or evolutionary advantage (‘stress is a manifestation of the fight-or-flight reaction’ for instance, or any number of explanations which couch our behaviour in terms of advertising our reproductive potential) but I have yet to come across any satisfactory explanation in the same terms of why we should feel elated in the presence of beauty, whether it is a person, an animal, a landscape, the sea or (as Kant puts it) ‘the starry heavens over us’*. The characteristic feature of such experiences is ‘being taken out of yourself’ (which is what ‘ecstasy’ means), a feeling of exaltation or rapture, of temporarily losing any sense of yourself and feeling absorbed in some greater whole.

I would venture that this disinterested delight is the single most important aspect of human experience and is (in Kantian phrase) ‘worthy of all attention.’

*The full quotation is not without interest: “Two things fill the mind with ever new and increasing admiration and awe, the more often and steadily we reflect upon them: the starry heavens above me and the moral law within me. I do not seek or conjecture either of them as if they were veiled obscurities or extravagances beyond the horizon of my vision; I see them before me and connect them immediately with the consciousness of my existence.” (Critique of Practical Reason)

The Muybridge Moment

The memorable Eadweard Muybridge invented a number of things, including his own name – he was born Edward Muggeridge in London in 1830. He literally got away with murder in 1874 when he travelled some seventy-five miles to shoot dead his wife’s lover (prefacing the act with ‘here’s the answer to the letter you sent my wife’) but was acquitted by the jury (against the judge’s direction) on the grounds of ‘justifiable homicide’. He is best known for the sequence of pictures of a galloping horse shot in 1878 at the behest of Leland Stanford, former Governor of California, to resolve the question of whether the horse ever has all four feet off the ground (it does, though not at the point people imagined). To capture the sequence, Muybridge used multiple cameras and devised a means of showing the results which he called a zoopraxiscope, thereby inventing stop-motion photography and the cinema projector, laying the foundations of the motion-picture industry.

(“The Horse in Motion-anim” by Eadweard Muybridge, Animation: Nevit Dilmen – Library of Congress Prints and Photographs Division; http://hdl.loc.gov/loc.pnp/cph.3a45870. Licensed under Public Domain via Commons – https://commons.wikimedia.org/wiki/File:The_Horse_in_Motion-anim.gif#/media/File:The_Horse_in_Motion-anim.gif)

Muybridge’s achievement was to take a movement that was too fast for the human eye to comprehend and freeze it so that each phase of motion could be analysed. It was something that he set out to do – as deliberately and methodically as he set out to shoot Major Harry Larkyns, his wife’s lover.

It is interesting to consider that something similar to Muybridge’s achievement happened a few thousand years ago, entirely by accident and over a longer span of time, but with consequences so far-reaching that they could be said to have shaped the modern world.

We do not know what prompted the invention of writing between five and six thousand years ago, but it was not a desire to transcribe speech and give it a permanent form; most likely it began, alongside numbering, as a means of listing things, such as the contents of storehouses – making records for tax purposes, perhaps, or of the ruler’s wealth – and from there it might have developed as a means of recording notable achievements in battle and setting down laws.

We can be confident that transcribing speech was not the primary aim because that is not something anyone would have felt the need to do. For us, that may take some effort of the imagination to realise, not least because we live in an age obsessed with making permanent records of almost anything and everything, perhaps because it is so easy to do so – it is a commonplace to observe that people now seem to go on holiday not to enjoy seeing new places at first hand, but in order to make a record of them that they can look at once they return home.

And long before that, we had sayings like

vox audita perit, littera scripta manet
(the voice heard is lost; the written word remains)

to serve as propaganda for the written word and emphasise how vital it is to write things down. One of the tricks of propaganda is to take your major weakness and brazenly pass it off as a strength (‘we care what you think’, ‘your opinion matters to us’, ‘we’re listening!’, as banks and politicians say) and that is certainly the case with this particular Latin tag: it is simply not true that the spoken word is lost – people have been remembering speech from time immemorial (think of traditional stories and songs passed from one generation to the next); it is reasonable to suppose that retaining speech is as natural to us as speaking.

If anything, writing was devised to record what was not memorable. Its potential beyond that was only slowly realised: it took around a thousand years for anyone to use it for something we might call ‘literature’. It is not till the classical Greek period – a mere two and a half millennia ago (Homo sapiens is reckoned at 200,000 years old, the genus Homo at 2.8 million) – that the ‘Muybridge Moment’ arrives, with the realisation that writing allows us to ‘freeze’ speech just as his pictures ‘froze’ movement, and so, crucially, to analyse it.

When you consider all that stems from this, a considerable degree of ‘unthinking’ is required to imagine how things must have been before writing came along. I think the most notable thing would have been that speech was not seen as a separate element but rather as part of a spectrum of expression, nigh-inseparable from gesture and facial expression. A great many of the features of language which we think fundamental would have been unknown: spelling and punctuation – to which some people attach so much importance – belong exclusively to writing and would not have been thought of at all; even the idea of words as the basic units of language, the building blocks of sentences, is a notion that only arises once you can ‘freeze’ the flow of speech like Muybridge’s galloping horse and study each phase of its movement; before then, the ‘building blocks’ would have been complete utterances, a string of sounds that belonged together, rather like a phrase in music, and these would invariably have been integrated, not only with gestures and facial expressions, but with some wider activity of which they formed part (and possibly not the most significant part).

As for grammar, the rules by which language operates and to which huge importance is attached by some, it is likely that no-one had the least idea of it; after all, speech is even now something we learn (and teach) by instinct, though that process is heavily influenced and overlaid by all the ideas that stem from the invention of writing; but then we have only been able to analyse language in that way for a couple of thousand years; we have been expressing ourselves in a range of ways, including speech, since the dawn of humanity.

When I learned grammar in primary school – some fifty years ago – we did it by parsing and analysis. Parsing was taking a sentence and identifying the ‘parts of speech’ of which it was composed – not just words, but types or categories of word, defined by their function: Noun, Verb, Adjective, Adverb, Pronoun, Preposition, Conjunction, Article.

Analysis established the grammatical relations within the sentence, in terms of the Subject and Predicate. The Subject, confusingly, was not what the sentence was about – which puzzled me at first – but rather ‘the person or thing that performs the action described by the verb’ (though we used the rough-and-ready method of asking ‘who or what before the verb?’). The Predicate was the remainder of the sentence, what was said about (‘predicated of’) the Subject, and could generally be divided into Verb and Object (‘who or what after the verb’ was the rough-and-ready method for finding that).
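
(By way of a present-day footnote, and purely as an illustration of my own – it is no part of those schoolroom lessons: the same exercise of parsing and analysis can now be handed to software. The sketch below assumes the spaCy library and its small English model are installed, and the sentence is an invented example.)

```python
# A rough modern equivalent of 'parsing and analysis' - an illustration
# only; spaCy and its small English model are assumed to be installed.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("The old dog chased the postman down the lane.")

# 'Parsing': identify the part of speech of each word.
for token in doc:
    print(f"{token.text:<10} {token.pos_:<6} {token.dep_}")

# 'Analysis': pick out the Subject ('who or what before the verb?')
# and the main Verb; the rest of the sentence is the Predicate.
subject = [t.text for t in doc if t.dep_ == "nsubj"]
main_verb = [t.text for t in doc if t.dep_ == "ROOT"]
print("Subject:", subject)
print("Verb:", main_verb)
```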

It was not till I went to university that I realised that these terms – in particular, Subject and Predicate – derived from mediaeval Logic, which in turn traced its origin back to Aristotle (whom Dante called maestro di color che sanno – master of those that know) in the days of Classical Greece.

[Images: Socrates, Plato, Aristotle, Alexander the Great]

Aristotle is the third of the trio of great teachers who were pupils of their predecessors: he was a student of Plato, who was a student of Socrates. It is fitting that Aristotle’s most notable pupil was not a philosopher but a King: Alexander the Great, who had conquered much of the known world and created an empire that stretched from Macedonia to India by the time he was 30.

That transition (in a period of no more than 150 years) from Socrates to the conquest of the world neatly embodies the impact of Classical Greek thought, which I would argue stems from the ‘Muybridge Moment’ when people began to realise the full potential of the idea of writing down speech. Socrates, notably, wrote nothing: his method was to hang around the market place and engage in conversation with whoever would listen; we know him largely through the writings of Plato, who uses him as a mouthpiece for his own ideas. Aristotle wrote a great deal, and what he wrote conquered the subsequent world of thought to an extent and for a length of time that puts Alexander in eclipse.

In the Middle Ages – a millennium and a half after his death – he was known simply as ‘The Philosopher’ and quoting his opinion sufficed to close any argument. Although the Renaissance was to a large extent a rejection of Aristotelian teaching as it had developed (and ossified) in the teachings of the Schoolmen, the ideas of Aristotle remain profoundly influential, and not just in the way I was taught grammar as a boy – the whole notion of taxonomy, classification by similarity and difference, genus and species – we owe to Aristotle, to say nothing of Logic itself, from which not only my grammar lessons but rational thought were derived.

I would argue strongly that the foundations of modern thought – generalisation, taxonomy, logic, reason itself – are all products of that ‘Muybridge Moment’, made possible only by the ability that writing confers: to ‘freeze’ language and then analyse it.

It is only when you begin to think of language as composed of individual words (itself a process of abstraction) and how those words relate to the world and to each other, that these foundations are laid. Though Aristotle makes most use of it, the discovery of the power of generalisation should really be credited to his teacher, Plato: for what else are Plato’s Ideas or Forms but general ideas, and indeed (though Plato did not see this) those ideas as embodied in words? Thus, the Platonic idea or form of ‘table’ is the word ‘table’ – effectively indestructible and eternal, since it is immaterial, apprehended by the intellect rather than the senses, standing indifferently for any or all particular instances of a table – it fulfils all the criteria*.

Which brings us to Socrates: what was his contribution? He taught Plato, of course; but I think there is also a neat symbolism in his famous response to being told that the Oracle at Delphi had declared him ‘the wisest man in Greece’ – ‘my only wisdom is that while these others (the Sophists) claim to know something, I know that I know nothing.’ As the herald of Plato and Aristotle, Socrates establishes the baseline, clears the ground, as it were: at this point, no-one knows anything; but the construction of the great edifice of modern knowledge in which we still live today was just about to begin.

However, what interests me most of all is what constituted ‘thinking’ before the ‘Muybridge Moment’, before the advent of writing – not least because, whatever it was, we had been doing it for very much longer than the mere two and a half millennia that we have made use of generalisation, taxonomy, logic and reason as the basis of our thought.

How did we manage without them? and might we learn something useful from that?

I think so.

*seeing that ideas are actually words also solves the problem Hume had, concerning general ideas: if ideas are derived from impressions, then is the general idea ‘triangle’ isosceles, equilateral, or scalene or some impossible combination of them all? – no, it is just the word ‘triangle’. Hume’s mistake was in supposing that an Idea was a ‘(faint) copy’ of an impression; actually, it stands for it, but does not resemble it.

The Mind’s Eye: Digital Camera or Camera Obscura?

HAMLET
My father!–methinks I see my father.

HORATIO
Where, my lord?

HAMLET
In my mind’s eye, Horatio.

From this we know that the ‘inward eye which is the bliss of solitude’ was a current notion in Shakespeare’s day, and doubtless it is a great deal older than that; yet a conversation I had today reminded me of something I struggled to put into words (rather badly) a long time ago.

I was writing, I recall, about David Hume, quite possibly in an exam, maybe in an essay. Hume, at that time, was a source of great annoyance to me: I found his logic unassailable, yet I was sure he was wrong. The consequence was that I rather foolishly picked fights with him at every turn, when all that the university authorities required was that I demonstrate an adequate familiarity with his work.

[Image: Allan Ramsay’s portrait of David Hume, 1711–1776, historian and philosopher]

Anyway, what I was taking issue with on this occasion was Hume’s notion that

‘All the perceptions of the human mind resolve themselves into two distinct kinds, which I shall call IMPRESSIONS and IDEAS. The difference betwixt these consists in the force and liveliness with which they strike upon the mind, and make their way into our thought and consciousness. Those perceptions, which enter with most force and violence, we may name impressions…  By ideas I mean the faint images of these in thinking and reasoning’

(those of you familiar with Hume will recognise that as the opening sentences of his Treatise of Human Nature – I really didn’t get very far at all before I started objecting)

I remember conducting an experiment (and this makes me think it must have been an essay, though to be sure, I might have repeated the same argument in an exam): I stared at something – it was a brown Le Creuset saucepan, in my recollection, but we’ll come back to that – and then I closed my eyes and attempted to attend to the ‘faint image’ of it in my mind.

I insisted at the time that there was nothing to be seen.

There was something – I could have drawn a picture from it, or described it – a small, heavy cast-iron pan, brown on the outside, creamy off-white within, the handle tapered towards the pot, rounded at the end and pierced with a hole which gave access to the hollow interior. But an image? a picture?

It wasn’t like seeing, I insisted: my argument with Hume was that he said that ideas were faint copies of impressions – like seeing, only less vividly, more feebly – and I felt that was not the case.

(I remember rehearsing a similar argument with my late brother Brendan in a slightly different form – my case then was that we had learned to talk about dreams as if they were essentially visual, when in fact they weren’t – but we had drink taken and our faculties were impaired, not least by the fact we thought them enhanced)

When I think back on this now, I feel I was being hard on Hume, though I think I have a point worth considering. I can still call that saucepan to mind, but I am no longer sure if it was indeed what I conducted my experiment with – my recollection is that it was, but in the intervening years I have grown less willing to rely on the accuracy of memory; I accept that it invents.

But what does this ‘calling to mind’ involve? Davy Hume did not have the advantage I have of being familiar with faded images – an old photograph, a photocopy made when the toner is running low – so it is easy for me to conjure such a thing and say that an ‘idea’ is not like that – for me, at any rate. And though it is hard on Hume, I think I was right in this respect – a faded photograph, a feeble photocopy emphasises the relation with the original – it is an inferior duplicate, the same kind of thing, but not so well-realised – and that is something that Hume wants to say, that ideas are mere copies of impressions, and the reception of them by the mind is entirely passive: impressions enter via our senses with ‘force and violence’ and leave an impression behind, as a seal upon wax (though Hume expressly repudiates the image implied in the word – another point on which I took issue with him – if he did not mean ‘to express the manner, in which our lively perceptions are produced in the soul’ then why not use some other word? he could have called them ‘humes’).

Schopenhauer explodes this view (building on Kant’s insight) by pointing out that perception is not mere passive reception, but an intellectual act – we do not receive impressions ready made, like postcards through a letter box; we create them by processing the data – it is more akin to taking a photograph (whether digitally or otherwise) – light is focused by a lens to produce a pattern on a receptive surface, and that pattern is then converted into a form that we can comprehend.

But back to our experiments. If I try to attend to what I ‘see’ when I call something to mind, I am not at all sure how to describe it. Two things strike me as noteworthy: there is always visual input alongside it, whether my eyes are open or shut (if they are shut, I have the dark inside of my eyelids, sometimes marked with afterimages of any bright light that was there when my eyes were open; if they are open, my ‘idea’ is not in any way superimposed on what I see – it does not exist in the same space – though attending to one distracts from the other); and I cannot hold it still – it seems always to ‘slip away’ just when I most want to ‘look’ at it (though equally I can ‘move around’ it: it is not two-dimensional, and is more like a model than a picture). Yet in spite of these difficulties I can confidently produce verbal descriptions or drawings of what I am ‘calling to mind’ – and indeed using words to express what I have in mind strengthens and clarifies whatever it is.

It strikes me that the comparison I have made above may hold the answer: in digital photography, the image exists in another form which is clearly non-visual – it is digitized, binary code – yet it still corresponds exactly to the visual image, to which it can be converted at any time.  What I have been saying about ‘ideas’ above – that they have all the properties of a visual image (we can use them the same way, can convert them – by words or lines – into actual pictures) yet lack the obvious one – we can’t actually see them – has an exact parallel in digitized images stored on a card: our camera can read them, and convert them into a form that our eye can read; but with ideas, it is our brain (or mind) that ‘reads’ them.
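
(To make the parallel concrete, here is a small sketch of my own, no part of the original argument: the same picture existing in a form that is nothing to look at – a run of bytes – and in a form the eye can read. The filename is a stand-in, and the Pillow and NumPy libraries are assumed to be installed.)

```python
# One picture, two forms: the stored (non-visual) bytes and the decoded
# (visual) pixels. 'photo.jpg' is a hypothetical file; Pillow and NumPy
# are assumed to be installed.
from PIL import Image
import numpy as np

# The 'card' form: just bytes, with nothing visual about them.
with open("photo.jpg", "rb") as f:
    raw = f.read()
print(raw[:12])          # unreadable to the eye, e.g. b'\xff\xd8\xff\xe0...'

# The 'read' form: the same data decoded into pixel values that a
# screen can display - and the conversion can be made at any time.
pixels = np.asarray(Image.open("photo.jpg"))
print(pixels.shape)      # (height, width, 3) for an ordinary colour image
```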

In other words, we ‘store’ our ideas in a different form, but one we can comprehend directly (just as if we called something in French to mind, but gave the sense of it in English). The distinction can be made clearer if we consider an image cast by a lens: I remember as a child being delighted when our father showed us how to produce an image on the wall using a magnifying glass – there it was, exact in every detail, but miniature and inverted. The principles of the camera obscura have been known since antiquity, and it is not unlikely that Hume might have seen them demonstrated; enough was known about the physiology of the eye for him to form a notion of an image being exactly reproduced inside the head. What Hume (and my youthful self) lacked was a model of how that image might exist in a wholly different form yet remain readable.

So now I would like to say that all the data we draw on for our interior life – whether that is memory, dreams, imagining – is stored and read in a different form. It is only when we come to talk about it that we naturally fall into using the language of the senses: we speak, like Hamlet, of seeing in our mind’s eye, but it is a different kind of seeing.

And having arrived, with some labour, at the end of the journey I embarked on in that essay of thirty-odd years ago, I am unsure if it matters or not.

Vanishing Point and the Golden Rule (by way of Immanuel Kant)

I remember once becoming absurdly excited in Princes St. Gardens in Edinburgh – that was just where I chanced to be, not the cause of the excitement – when I realised that an interesting thing happens if you number the dimensions in the reverse of the conventional order.

My brother had once explained the concept of the fourth dimension to me by saying that it was ‘at right-angles to the third’ which he elucidated by explaining that, as a point drawn out in one direction gives a line, so that line, moved at right angles to itself, generates a plane surface, which in turn generates a volume when moved at right angles to itself; hence the next dimension, the fourth, must involve performing an identical operation with a cube.

This I found pleasantly bamboozling: I could follow the first three steps of the procedure perfectly, yet with the fourth there seemed nowhere left to go; but I convinced myself that dwellers in a linear or planar universe would experience the same difficulty at stages two and three. Later, when the same brother suggested that the fourth dimension was Time, I formed some notion of a cube moving through space and leaving a sort of cubic trail behind it – which was the direction I was headed in when I started thinking about Duchamp’s Nude Descending a Staircase, here. There is a similar though less marked suggestion of movement in time in this picture, where Miss Kate Ward has admirably captured the headlong speed of my Dursley-Pedersen:

John Ward at speed on his Dursley Pedersen, photographed by his daughter Kate

But in Princes Street Gardens it struck me that if we begin with volume as the first dimension – and picture a cube, say – we can abstract a second dimension from that by attending to one of its surfaces and ignoring the rest; then by the same method, we can consider the edge of that surface in isolation to give us a line – from where, if we wish, we can go to its end, and arrive at a point – but that is, indeed, the end, and there is literally no point in going beyond it.
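
(For what it is worth, the manoeuvre is easy enough to mimic in code – this is just a toy of my own, assuming the NumPy library; the sizes are arbitrary.)

```python
# Starting from a volume and abstracting downwards, as described above:
# attend to one face, then one edge of that face, then the end of the
# edge - a point, beyond which there is nowhere left to go.
import numpy as np

cube = np.zeros((4, 4, 4))    # a volume: three dimensions
face = cube[0]                # one surface of the cube: two dimensions
edge = face[0]                # one edge of that surface: one dimension
point = edge[0]               # the end of the line: a single point

for name, thing in [("cube", cube), ("face", face),
                    ("edge", edge), ("point", point)]:
    print(name, "has", np.ndim(thing), "dimension(s)")
```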

I still consider that a neat manoeuvre, though nowadays I am more interested in the point you end up with than in doing away with the need for any further dimensions.

Duchamp’s nude shows a temporal sequence as a spatial one, and that is consistent with our normal usage, which is to talk about time in spatial terms – we speak of a length of time, a short time, a space of time and so on, and we use ‘before’ and ‘after’ – which denote spatial relations – to speak of temporal ones.

It is clear that the ideas of space, time and movement are bound together: indeed we use movement through space (which takes time) as a way of thinking about time, even where no movement through space is involved – and since we developed film technology, we have reinforced this with time-lapse photography, so that we see flowers opening and closing, buildings being erected, even carcases decaying – though in fact there is movement there, without change of location; it is normally too slow for us to perceive as such, so we only notice that there has been a significant change from an earlier state to a later one, without detecting it from moment to moment.

But could you arrive at the notion of movement (and so of time) in a world of immutable Forms, such as Plato envisages in his Republic? The Forms are timeless and unchanging, but if they are distinct and separate (and extended, for how else could we picture them?) it would seem that they must occupy space and there must be distance between them; and where there is distance, there is surely the possibility of movement, and so of time, even if there is none actually?

However, we have introduced another element here, the one that Berkeley drew to our attention: a perceiver. To picture a world of timeless forms is to do so from some point of view, a particular location within that world, and it would seem to be from there that the notion of movement arises – and indeed, might we go further and suggest that it is location that gives rise to the concept of space?

Does a point imply space? in other words, is even a point – location without extension – sufficient to require the whole of infinite space for it to be located in, or is that just a product of our thinking being tied to a three-dimensional model to start with?

But there is some sleight-of-mind here: we might picture a point in space from which we look outward, and so gain some sense of depth and distance, but in what direction are we looking? more to the point, what are we looking with? It seems we have smuggled in an eye, albeit an invisible one – though you might well ask how you can have an eye without a lens, an iris, an optic nerve – and some sort of perceptive apparatus at the other end of that nerve.

This leads me to Kant, and his observation that ‘space, time and causality are the mode and manner of our perception.’ Hume – whom Kant spoke of as having roused him from his dogmatic slumber – had suggested that the only way we could arrive at an idea of causality was through observing invariable succession – we see that A is invariably followed by B and in time conclude that A causes B. This is not a very satisfactory account, and it is thoroughly demolished by Schopenhauer, who points out that the most familiar and invariable succession of night and day does not lead us to suppose that one causes the other.

It is, however, the best you can do if you want to insist – as Hume did – on the empirical principle, i.e. that all knowledge is derived from experience, via the senses. What Hume failed to grasp, as Schopenhauer pointed out, is that perception is an intellectual act, not the passive process Hume supposed it to be – the mind works on and arranges the data supplied by the senses – in fact it makes sense of them.

The stepping stone between Hume and Schopenhauer is Kant, who realised that causality was not something we derived from experience, but rather a pre-requisite for making sense of it – it is, if you like, part of our intellectual apparatus; and it is not alone – he couples it with Time and Space, grouping all three together as ‘the mode and manner of our perception’ and putting forward the notion that, far from being derived from experience, these three are the means by which we make sense of experience itself. In this, they have been likened to a set of spectacles that we cannot remove, even when we realise that they condition all that we see. We are therefore in the frustrating position of knowing that the world as we know it exists only for us (or those similarly equipped), which is the truth that Berkeley realised with his esse est percipi; what the world is actually like in itself (i.e. unconditioned by the apparatus of space, time and causality) we cannot imagine.

This is what started, like a hare from its form, the Ding an sich or thing-in-itself which Schopenhauer eventually ran to ground (with the aid of oriental philosophy). Schopenhauer’s brilliant move is to point out that, while the world as it is known to us via the senses is indeed much as Kant suggests, a world of representation, of objects-for-the-subject, conditioned by our intellectual apparatus, that is not the sole aspect we have access to – if we turn our eyes inward, as it were, we become aware that there is, in ourselves, a sort of privileged glimpse of the inner nature of the world, the very thing-in-itself – namely, the Will.

For Schopenhauer, we as individuals are self-conscious outposts of a single and otherwise blind and unconscious Will which is manifested everywhere and in everything and whose sole aim is to exist. This has echoes in our own day in the ideas of the evolutionary biologists, that we are in effect the mere by-product of our genes’ determination to reproduce ad infinitum. It is not of course original to Schopenhauer, who derived it from his reading of ancient Eastern, particularly Indian, philosophy.

Schopenhauer draws pessimistic conclusions from this: effectively, all that self-consciousness has done is make us helpless spectators, aware that we are embodiments of a will that we cannot control but which rather drives us: all we can do, at best, is to quiet the will, to turn away from existence (though Arthur himself was happy to keep going to what was then the ripe old age of 72). Nietzsche, who saw himself as a disciple of Schopenhauer, takes the idea in a more sinister direction: we can embrace our situation, and recognise that the will takes us beyond good and evil – if we are strong, we should follow its promptings wherever it leads us and glory in it, rejecting the ‘slave mentality’ of Judaeo-Christian thought, which he saw as essentially a conspiracy of the weak to keep power from the strong. You do not need to go far beyond Nietzsche (he died, insane, in 1900) to find the horrors to which that ultimately led.

Yet this interpretation seems to turn on a needlessly pessimistic view of the nature of the Will coupled to an erroneous aggrandisement of our own status – both Schopenhauer and Nietzsche assume that, in coming to consciousness in Man, the Will has attained its highest form of existence (though Nietzsche might qualify that by adding ‘so far’). However, if we look more closely at the Judaeo-Christian tradition which Nietzsche dismisses and at the Eastern thought that Schopenhauer derived his ideas from, we find a different way of looking at it – having looked into ourselves, and seen that we are embodiments of a single will (and might that will not be higher, rather than lower, and ourselves just waking into it, not yet fully comprehending it?) we can then look outwards again and see others not as different from ourselves but essentially the same – which is the foundation of compassion and the Golden Rule which characterises all the great belief systems – ‘treat others as you would have them treat you’ – or, if you prefer, love your neighbour as yourself (because that is what, in effect, he is). (For an excellent and inspiring ten-minute talk on the Golden Rule, click here.)

Elective Causality

‘Myself when young did eagerly frequent

Doctor and Saint and heard great argument

About it and about: but evermore

Came out by that same door as in I went.’

(however, let us keep Omar Khayyam for another day)

Myself when young was much annoyed by David Hume, particularly his account of causality, so I am very grateful to Schopenhauer for showing me the error of his ways; but that too is a tale for another place.

What I want to consider today is the idea that you can choose to make something a cause or ground for your actions, a notion I have labelled ‘elective causality.’

On one level, of course, this seems paradoxical – when we speak of ‘cause’ in philosophy, we presuppose necessity – the effect is that which follows necessarily from the cause, the cause that which necessarily (and invariably) produces the effect: you cannot have the one without the other; we use this as a powerful tool in all sorts of reasoning.

(Though Aristotle offers a very interesting analysis of cause which I will look at elsewhere)

But here is another kind of cause: my son dies, by his own hand – what am I to make of that? The surprising discovery is that you can make of it what you wish. You could make it a matter for shame, a family disgrace, never to be alluded to, something best forgotten – I’m sure that has happened in reality; certainly it is a commonplace in stories of a certain period.

Or you could make it a ground for savage misanthropy, for hating the world as a stupid and meaningless place and human existence itself as something equally stupid and meaningless – and people have done that too, I am sure.

Or you might say: the only thing I can do is try to live better because of him, to let him be at my side, prompting me to take the better course, to do the daring or adventurous thing, even just to make the effort, for his sake.

It seems to me that there is a causal link here, and a strong one (being forged from love in grief) and yet at the same time it is something freely chosen (which is why it must be perpetually renewed, though I suppose that habit will strengthen it).

I’m sure there must be plenty of cases of this – certainly there are in stories (Michael Henchard, in The Mayor of Casterbridge, makes his shame over the drunken sale of his wife at the hiring fair the ground for reforming his life). (I am also reminded of the Ninevites, who listened to Jonah (when he eventually mustered the courage to turn up) and repented.)

It is interesting territory: people will look at a life and say ‘that event changed him’ or ‘that was a turning point’ – and what followed could be good or bad – losing someone you love might drive you to despair and ruin (“he really went to pieces after his wife died”) or equally it could be the occasion of improvement (“he’s a changed man since that happened – hasn’t touched a drop, devotes himself to charitable causes”).

I suspect that we are more inclined to see the operation of the will in the good cases than the bad, and that is reflected in the language we use: when the outcome is a bad one, we say someone was ‘driven’ to despair, suicide or the like, which makes it seem against the will; but if the outcome is good, we speak of the person’s ‘waking up’ or ‘having his eyes opened’ – which seems to suggest a two-stage process: you come to see something, and as a result, you alter course. Though, to be sure, we do also speak of a person’s being changed – “he’s a changed man since that happened” – which does suggest an external force.

I am reminded that Shakespeare has an interesting take on the same idea, which he puts in the mouth of Edmund in King Lear:

‘This is the excellent foppery of the world, that,
when we are sick in fortune,–often the surfeit
of our own behavior,–we make guilty of our
disasters the sun, the moon, and the stars: as
if we were villains by necessity; fools by
heavenly compulsion; knaves, thieves, and
treachers, by spherical predominance; drunkards,
liars, and adulterers, by an enforced obedience of
planetary influence; and all that we are evil in,
by a divine thrusting on: an admirable evasion
of whoremaster man, to lay his goatish
disposition to the charge of a star!’

All in all, an interesting subject for reflection, to which I will return.