The Actual Colour of the Sun

‘The sun is actually white, it just appears yellow to us through the Earth’s atmosphere.’

This is a line that appeared on Facebook a while ago, courtesy of my friend Else Cederborg, who posts all sorts of curious and interesting things.

It is a common form of argument that most will readily understand and generally accept without thinking too hard about it. Yet the sun is not actually white, nor does it just appear yellow, facts you can easily check for yourself.

Depending on the time of day and the atmospheric conditions, the sun is variously deep blood red, orange, dusky pink and (behind a veil of mist or thin cloud) a sort of milky white; much of the time it has an incandescent brilliance which can hardly be called a colour since we cannot bear to look at it directly. Though it may appear yellow, the usual place to encounter a yellow sun is in a representation of it: a child’s drawing, for instance (against a blue sky above the square white house, surrounded by green grass, with its four four-paned windows and red triangular roof with single chimney belching smoke) or else in the icons used in weather forecasting.

How we come to accept an argument that runs counter to all experience, and perversely insists on a condition for actuality (being seen outside the Earth’s atmosphere) which few of us will ever experience, is a case worth examining.

At the heart of it is that curious human thing, a convention (I say ‘human’ though it might be that other creatures have conventions, but that is for another time). A convention, effectively, is an agreement to treat things as other than they are – it is a form of pretence. This definition might seem surprising, since what is usually emphasised in conventions is their arbitrary character – the typical example being which side of the road we drive on, which varies from country to country.

However, what makes the rule of the road a convention is that we endow it with force, a force that it does not actually have: we say that you must drive on the left (or the right); that you have to – even though any one of us can show this not to be the case. Of course, if you engage in such a demonstration, you might find yourself liable to a range of legal sanctions, or worse still, involved in an accident. The fact that the rule has to be backed by force proves that it has no intrinsic force; on the other hand, the risk of accident shows that the pretence is no mere whim, but sound common sense: conventions are there because they are useful.

A great deal of human behaviour is conventional if you look at it closely: things are deemed to be the case which are not, actually (international borders are a good example – again, these are arbitrary, but it is the power with which they are endowed that makes them conventions). In this regard, it is worth recalling a remark that Wittgenstein makes somewhere (Philosophical Investigations, I think) to the effect that ‘explanation has to come to an end somewhere’. In other words, despite what we tell children, ‘just because’ is an answer. Conventions are so because we say they are so.

So what colour is the sun?

That takes us to the boldest and most fundamental of conventions, the one that underpins our standard way of thinking about the world, for which we have Plato to thank. Plato insists at the outset on discrediting the senses, saying that they deceive us, giving us Appearance only, not Reality.

This is a move of breathtaking boldness, since effectively it dismisses all experience as ‘unreal’ – as a starting position for philosophy, it ought to be quite hopeless, since if we cannot draw on experience, where are we to begin? If reality is not what we actually experience, then what can it be?

However, there is a different angle we can consider this from, which makes it easier to understand how it has come to be almost universally adopted. What Plato is proposing (though he does not see this himself) is that we should view the world in general terms: that we should not allow ourselves to be distracted by the individual and particular, but should see things as multiple instances of a single Idea or Form; or, as Aristotle develops it, as members of a class which can be put under a single general heading: trees, flowers, cats, horses, at one level, plants and quadrupeds at another, and so on.

Implied in this is the elimination of the subject: the world is not as it appears to me or to you or to any particular individual (since particular individuals exist only at the specific level); the world is to be considered as objective, as it is ‘in itself’, i.e. as a general arrangement that exists independently of any observation.

This is a very useful way of looking at things even though it involves a contradiction. Its utility is demonstrated by the extraordinary progress we have made in the 2,500 years or so since we invented it. It is the basis of science and the foundation of our systems of law and education.

Effectively, it starts by accepting the necessary plurality of subjective experience: we see things differently at different times – the sun is sometimes red, sometimes pink, sometimes incandescent; different people have different experiences: one man’s meat is another man’s poison. In the event of dispute, who is to have priority? That problem looks insoluble (though the extent to which it is an actual problem might be questioned).

So, we must set subjective experience and all that goes with it to one side, and rule it out as the basis of argument. Instead, we must suppose that the world has a form that exists independently of us and is not influenced by our subjective observation: we posit a state of ‘how things actually are in themselves’ which is necessarily the same and unchanging, and so is the same for everyone regardless of how it might appear.

This is how we arrive at the position stated at the outset, that the sun is ‘actually’ white, and that its ‘actual state’ should be taken as ‘how it appears from space,’ even though we live on earth and seldom leave it.

There is a confusion here, which has surfaced from time to time in the history of philosophy since Plato’s day. The strict Platonist would dismiss the whole question of the sun’s true colour in terms akin to Dr Johnson’s, on being asked which of two inferior poets (Derrick or Smart) was the greater: ‘Sir, there is no settling the point of precedency between a louse and a flea.’ Colour belongs to the world of Appearance, not the world of Forms or Ideas.

However, as the insertion of the argument about the earth’s atmosphere shows, we feel the need of a reason to dismiss the evidence of our own eyes (and this is where that little sleight-of-mind about ‘just appearing yellow’ comes in – so that we are not tempted to look too closely at the sun as it actually appears, we are fobbed off with a conventional representation of it that we have been accepting since childhood). The argument is that the atmosphere acts as a filter, much as if we looked at a white rose through coloured glass and variously made it appear green or red or blue: so just as we agree the rose is ‘actually’ white, so too the sun’s ‘actual’ colour must be as it appears unfiltered.

This, however, is specious. Leaving aside the fact that the rose is certainly not white (as you will soon discover if you try to paint a picture of it), at least our choice of default colour can be justified by adding ‘under normal conditions of light’, which most people will accept; but ‘as it appears outside the earth’s atmosphere’ can hardly be called ‘normal conditions of light.’

What has happened here is that the subjective ‘filter’ which Plato wished to circumvent by eliminating the subject – ‘let us ignore the fact that someone is looking, and consider only what he sees’ – has reappeared as an actual filter, so that the whole appearance/reality division has been inserted into the objective world by an odd sort of reverse metaphor. The need to give priority to one of the many possible states of the sun is still felt, and it is solved by arbitrarily selecting ‘the sun as it appears from space’ as its actual state, because that seems logical (though in fact it is not).

The subjectivity of colour in particular, and the subject-dependence of all perception in general, is like a submerged reef that appears at various periods in the sea of philosophy. Locke is troubled by colour, which he wants to class as a ‘secondary quality’ since it evidently does not inhere in the object, as he supposes the primary qualities – solidity, extension, motion, number and figure – to do. Colour is ranked with taste, smell and sound as secondary, which is just a restatement of Plato’s rejection of the senses in other terms.

Berkeley, however, sees that this is a specious distinction and gets to the heart of the matter with his axiom esse est percipi – ‘to be is to be perceived’. Everything we know requires some mind to perceive it: a point of view is implicit in every account we give. There can be no objective reality independent of a subject (as the terms themselves imply, since each is defined in terms of the other).

The response to this is illuminating. Berkeley’s dictum is rightly seen as fatal to the notion of an objectively real world, but instead of accepting this and downgrading Plato’s vision to a conventional and partial account employed for practical purposes, every effort is made to preserve it as the sole account of how things are, the one true reality.

Berkeley’s own position is to suppose that only ideas in minds exist, but that everything exists as an idea in the mind of God, so there is a reality independent of our minds, though not of God’s (a neat marriage of philosophy and orthodox Christianity – Berkeley did become a bishop in later life. The Californian university town is named after him).

Kant’s solution is to assert that there is an independent reality – the ding-an-sich, or ‘thing-in-itself’ – but logically it must be inaccessible to us: we know it is there (presumably as an article of faith) but we cannot know what it is like.

Schopenhauer’s solution is the most ingenious, and to my mind the most satisfying. He agrees that we cannot know the ding-an-sich as a general rule, but there is one notable exception: ourselves. We are objects in the world and so are known like everything else, via the senses – this is Plato’s world of Appearance, which for Schopenhauer is the World as Representation (since it is re-presented to our minds via our senses). But because we are conscious and capable of reflecting, we are also aware of ourselves as subjects – not via the senses (that would make us objects) but directly, by being us. (To put it in terms of pronouns: you know me and I know you, but I am I; I am ‘me’ to you (and in the mirror) but I am ‘I’ to myself)

And what we are aware of, Schopenhauer says, is our Will; or rather, that our inner nature, as it were, is will: the will to exist, the urge to be, and that this is the inner nature of all things – the elusive ding-an-sich: they are all objective manifestations of a single all-pervasive Will (which happens in us uniquely to have come to consciousness).

This idea is not original to Schopenhauer but is borrowed from Eastern, specifically Hindu and Buddhist, philosophy, which belongs to a tradition separate from (and possibly earlier than) Platonic thought (it is interesting to note that a very similar way of looking at things has likewise been borrowed by evolutionary biologists, such as Richard Dawkins, with the notion of ‘the selfish gene’ taking the place of Schopenhauer’s ‘Will’ as the blind and indifferent driving force behind all life).

I find Schopenhauer’s account satisfactory (though not wholly so) because it is a genuine attempt to give an account of the World as we experience it, one that reconciles all its elements (chiefly us as subjects with our objective perceptions) rather than the conventional account of Plato (and all who have followed him) which proceeds by simply discounting the subject altogether, effectively dismissing personal experience and reducing us to passive observers. Although the utility of this convention cannot be denied (in certain directions, at any rate) its inherent limitations make it inadequate to consider many of the matters that trouble us most deeply, such as those that find expression in religion and art; and if, like those who wish to tell us that the sun is ‘actually’ white, we mistake the conventional account for an actual description of reality*, we end up by dismissing the things that trouble us most deeply as ‘merely subjective’ and ‘not real’ – which is deleterious, since they continue to trouble us deeply, but that trouble has now been reclassified as an imaginary ailment.

*perhaps I should say ‘the world’ or ‘what there is’. There is a difficulty with ‘reality’ and ‘real’ since they are prejudiced in favour of the convention of an objective world: this becomes clearer when you consider that they could be translated as ‘thingness’ and ‘thing-like’. The paradigm for ‘reality’ then becomes the stone that Dr Johnson kicked with such force that he rebounded from it, i.e. any object in the world, so that the subject and the subjective aspect of experience is already excluded.

Heart-thought

When I was young and studying philosophy at Edinburgh University I remember becoming excited about the figurative use of prepositions; they seemed to crop up everywhere, openly and in disguise as Latin prefixes, in uses that clearly were not literal. Reasoning from the fact that the meaning of any preposition could be demonstrated using objects and space, I concluded that a world of objects and space was implied in all our thinking, and that this might act as a limit on what and how we thought.

What strikes me about this now is not so much the idea as the assumptions on which it is based: I have made Language in its full-blown form my starting point, which is a bit like starting a history of transport with the motor-car. As I have suggested before, what we think of as ‘Language’ is a relatively recent development, arising from the invention of writing and the influence it has exerted on speech, simultaneously elevating it above all other forms of expression and subjugating it to the written form. It is the written form that gives language an objective existence, independent of human activity, and relocates ‘meaning’ from human activity (what Wittgenstein terms ‘language games’ or ‘forms of life’) to words themselves; and alongside this, it makes possible the systematic analysis of speech [as discussed in The Muybridge Moment].

In that earlier theory of mine I took for granted a number of things which I now think were mistaken. The first, as I have said, is that the milieu which gives rise to the figurative use of words is the developed form of language described above; that is to confuse the identification and definition of something with its origin, rather as if I were to suppose that a new species of monkey I had discovered had not existed before I found and named it.

Bound up with this is the model of figurative language which I assumed, namely that figurative use was derived from literal use and dependent upon it, and that literal use was prior and original – in other words, that we go about the world applying names like labels to what we see about us (the process of ‘ostensive definition’ put forward by St Augustine, and quoted by Wittgenstein at the start of his Philosophical Investigations) and only afterwards develop the trick of ‘transferring’ these labels to apply to other things (the word ‘metaphor’ in Greek is the direct equivalent of ‘transfer’ in Latin – both suggest a ‘carrying over or across’).

Points to note about this model are that it is logically derived and that it presents metaphorical thinking as an intellectual exercise – it is, as Aristotle describes it, ‘the ability to see the similarity in dissimilar things.’

The logic appears unassailable: clearly, if metaphor consists in transferring a word from its literal application and applying it elsewhere, so that the sense of the original is now understood as applying to the new thing, then the literal use must necessarily precede the metaphorical and the metaphorical be wholly dependent on and derived from it: to say of a crowd that it surged forward is to liken its action to that of a wave, but we can only understand this if we have the original sense of ‘surge’ as a starting point.

However, there is a difficulty here. It is evident that there can be no concept of literal use and literal meaning till there are letters, since the literal meaning of ‘literal’ is ‘having to do with letters’. Only when words can be written down can we have an idea of a correspondence between the words in the sentence and the state of affairs that it describes (what Wittgenstein in the Tractatus calls the ‘picture theory’ of language). If what we term metaphors were in use before writing was invented – and I am quite certain that they were – then we must find some other explanation of them than the ‘transfer model’ outlined above, with its assumption that literal use necessarily precedes metaphorical and the whole is an intellectual process of reasoned comparison.

The root of the matter lies in the fact already mentioned, that only with the invention of a written form does the systematic analysis of speech become possible, or indeed necessary. Before then (as I suggest in ‘The Disintegration of Expression’) speech was one facet or mode of expression, quite likely not the most important (I would suggest that various kinds of body language, gesture and facial expression were possibly more dominant in conveying meaning). It was something that we used by instinct and intuition rather than conscious reflection, and it would always have been bound up with some larger activity, for the simple reason that there was no means of separating it (the nearest approach would be a voice speaking in the dark, but that is still a voice, with all the aesthetic qualities that a voice brings, and also by implication a person; furthermore, it is still firmly located in time, at that moment, for those hearers, in that situation. Compare this with a written sentence, where language for the first time is able to stand on its own, independent of space and time and not associated with any speaker).

In other words, when metaphor was first defined, it was in terms of a literate language, and was seen primarily as a use we make of words. (Given the definition supplied by Belloc’s schoolboy, that ‘a metaphor is just a long Greek word for a lie’, there is an illuminating parallel to be drawn here with lying, which might be defined as ‘making a false statement, one that is not literally true’. This again puts the focus on words, and makes lying primarily a matter of how words are used and what they mean. The words or the statement are seen as what is false, but actually it is the person – hence the old expression ‘the truth is not in him’. Deceit consists in creating a false appearance, in conveying a false impression: words are merely instrumental, and though certainly useful – as a dagger is for murder – are by no means necessary. We can lie by a look or an action; we can betray with a kiss.)

There is a great liberation in freeing metaphor from the shackles that bind it to literal language (and to logic, with which it is at odds, since it breaks at least two of the so-called ‘laws of thought’ – it violates the law of identity, which insists that ‘A is A’, by asserting that A is B, and by the same token, the law of contradiction, which insists that you cannot have A and not-A, by asserting that A is not-A). It allows us to see it from a wholly new perspective, and does away with the need to see it either as an intellectual act (‘seeing the similarity in dissimilars’) or as something that necessarily has to do with words or even communication; I would suggest that metaphor is primarily a way of looking at the world, and so is first and foremost a mode of thought, but one that operates not through the intellect and reason but through intuition and feelings.

To illustrate this, I would like to take first an example I came up with when I was trying to envisage how metaphor might have evolved. Two brothers, out in the bush, come on a lion, at a safe distance, so that they can admire its noble mien and powerful grace without feeling threatened. One brother smiles and says ‘mother!’ The other, after an initial look of puzzlement, nods his head in affirmation and laughs.

The explanation I furnished to accompany this is that their mother is a formidable and beautiful woman and that the first brother, seeing the lion, is reminded of her, and by naming her, invites his brother to make the same comparison that has already occurred to him, which he does after a moment’s puzzlement, and the two take pleasure in this new and unexpected – yet apt – use of the word.

I think that the focus here is wrong: it is still concerned to make metaphor about words, and to see it primarily as a way of communicating ideas.

I would now like to alter the story slightly. A man on his own in the bush catches sight of the lion (from a safe distance, as before). On seeing it, he is moved: the sight of it stirs him, fills him with a mixture of awe and delight. And it is not what he sees, but rather what he feels, that calls his mother to mind: the feeling that the lion induces in him is the same as he has felt in the presence of his mother. That is where the identification takes place, in the feeling: the outer circumstances might differ (the lion in the bush, his mother in the village) but the inner feeling is the same. If we think of an experience as combining an external objective component with an internal subjective one (and I am carefully avoiding any notion of cause and effect here) then the origin of metaphor lies in experiences where the external objective component differs but the internal subjective component is the same.

Why am I wary of saying ‘the sight of the lion causes the same feelings that the sight of his mother does’ ? Because it strikes me as what I would call a ‘mixed mode’ of thinking: it imports the notion of causality, a modern and analytic way of thinking, into an account of an ancient and synthetic way of thinking, thus imposing an explanation rather than simply describing. (This is difficult territory because causality is so fundamental to all our explanations, based as they are on thinking that makes use of literate language as its main instrument)

What I want to say is this: causal explanations impose a sequence – one thing comes first – the cause – and elicits the other, the effect. So if we stick with the man and the lion we would analyse it like this: ‘sense data arrive in the man’s brain through his eyes by the medium of light, and this engenders a physical response (spine tingling, hair standing on end, a frisson passing over the body) which the man experiences as a feeling of awe and delight.’

We can demonstrate by reason that the lion, or the sight of it, is the cause and the emotion the effect, because if we take the lion away (for instance, before the man comes on it) the man does not experience the emotion (although he may experience ‘aftershocks’ once it has gone, as he recalls the sight of it).

But there is a fault here. If we leave the lion but substitute something else for the man – an antelope, say, or a vulture – does it still have the same effect? It is impossible to say for sure, though we may infer something from how each behaves – the antelope, at the sight (and quite probably the scent) of the lion might bound away in the opposite direction, while the vulture (sensing the possibility of carrion near by or in the offing) might well move closer.

My point is that the analysis of cause and effect is rather more complex than I have presented it here – my presentation being much as David Hume makes it out to be, with his analogy of one billiard ball striking another. As Schopenhauer points out, what causes the window to shatter is not the stone alone, but the fact of its being thrown with a certain force and direction combined with the brittleness of the glass (and if the stone is thrown by a jealous husband through his love rival’s window, then we might need to include his wife’s conduct and the construction he puts upon it in the causal mix). Change any one of these and the result is different.

My being human is as much a precondition for the feelings I experience in the presence of a lion as the lion is, and I think that this is a case where, as Wordsworth puts it, ‘we murder to dissect’ – it is much more enlightening to consider the experience as a single simultaneous event with, as I have suggested, an inner and an outer aspect that are effectively counterparts. So the lion is the embodiment of the man’s feelings but so is his mother, and the lion and his mother are identified by way of the feelings that both embody; and the feelings are in some sense the inner nature or meaning of both the lion and the mother (think here of all the songs and poetry and music that have been written where the lover tries to give expression to his feelings for his beloved). This interchangeability and the identity of different things or situations through a common feeling aroused in each case is the foundation of metaphor and, I think, the key ‘mechanism’ of Art.

(This has an interesting parallel with the philosophy of Schopenhauer, as expressed in the title of his work Die Welt als Wille und Vorstellung, variously translated as ‘The World as Will and Representation’ or ‘The World as Will and Idea’. In this he borrows from Eastern philosophy to present the world as having a dual aspect – objectively, as it appears to others, and subjectively, as it is in itself. Its objective aspect, Representation, is made known to us via our senses, and is the same world of Objects and Space with which this discussion began; we cannot by definition see what it is like in itself since it only ever appears as object, but once we realise that we ourselves are objects in the ‘World as Representation,’ we can gain a special insight by ‘turning our eyes inward’ as it were, and contemplating our own inner nature, which we know not by seeing but by being it.

And what do we find? For Schopenhauer, it is the Will; and the revelation is that this is not an individual will – my will as opposed to yours – it is the same Will that is the inner nature of everything, the blind will to exist, to come into being and to remain in being. (This bears a striking resemblance to the position advanced by evolutionary biologists such as Richard Dawkins, for whom humankind is effectively a by-product of our genetic material’s urge to perpetuate itself).)

I would diverge from Schopenhauer – and the evolutionary biologists – in their pessimistic and derogatory account of the inner nature of things, on two grounds. The first is that it makes us anomalous. Schopenhauer asserts that ‘in us alone, the Will comes to consciousness’ but is unable to explain why this should be so, while his only solution to the revelation that all things are just the urges of a blind and senseless will is effectively self-annihilation (not a course he chose to pursue himself, as it happens – he lived to be 72). There is a lack of humility here that I find suspect, a desire still to assert our uniqueness and importance in a senseless world. If the Will is indeed the inner nature of all things (and that is questionable) why should we consider ourselves the highest manifestation of it?

The second ground is the nature of the feelings that I describe, which are the opposite of pessimistic: they are uplifting, feelings of awe, elation and delight. There is a fashion nowadays for explaining everything in terms of genetic inheritance or evolutionary advantage (‘stress is a manifestation of the fight-or-flight reaction’ for instance, or any number of explanations which couch our behaviour in terms of advertising our reproductive potential) but I have yet to come across any satisfactory explanation in the same terms of why we should feel elated in the presence of beauty, whether it is a person, an animal, a landscape, the sea or (as Kant puts it) ‘the starry heavens over us*’. The characteristic feature of such experiences is ‘being taken out of yourself’ (which is what ‘ecstasy’ means) a feeling of exaltation or rapture, of temporarily losing any sense of yourself and feeling absorbed in some greater whole.

I would venture that this disinterested delight is the single most important aspect of human experience and is (in Kantian phrase) ‘worthy of all attention.’

*The full quotation is not without interest: “Two things fill the mind with ever new and increasing admiration and awe, the more often and steadily we reflect upon them: the starry heavens above me and the moral law within me. I do not seek or conjecture either of them as if they were veiled obscurities or extravagances beyond the horizon of my vision; I see them before me and connect them immediately with the consciousness of my existence.” (Critique of Practical Reason)

The Mind’s Eye: Digital Camera or Camera Obscura?

HAMLET
My father!–methinks I see my father.

HORATIO
Where, my lord?

HAMLET
In my mind’s eye, Horatio.

From this we know that the ‘inward eye which is the bliss of solitude’ was a current notion in Shakespeare’s day, and doubtless it is a great deal older than that; yet a conversation I had today reminded me of something I struggled to put into words (rather badly) a long time ago.

I was writing, I recall, about David Hume, quite possibly in an exam, maybe in an essay. Hume, at that time, was a source of great annoyance to me: I found his logic unassailable, yet I was sure he was wrong. The consequence was that I rather foolishly picked fights with him at every turn, when all that the university authorities required was that I demonstrate an adequate familiarity with his work.

[Allan Ramsay’s portrait of David Hume, 1711–1776, historian and philosopher]

Anyway, what I was taking issue with on this occasion was Hume’s notion that

‘All the perceptions of the human mind resolve themselves into two distinct kinds, which I shall call IMPRESSIONS and IDEAS. The difference betwixt these consists in the force and liveliness with which they strike upon the mind, and make their way into our thought and consciousness. Those perceptions, which enter with most force and violence, we may name impressions…  By ideas I mean the faint images of these in thinking and reasoning’

(those of you familiar with Hume will recognise that as the opening sentences of his Treatise of Human Nature – I really didn’t get very far at all before I started objecting)

I remember conducting an experiment (and this makes me think it must have been an essay, though to be sure, I might have repeated the same argument in an exam): I stared at something – it was a brown Le Creuset saucepan, in my recollection, but we’ll come back to that – and then I closed my eyes and attempted to attend to the ‘faint image’ of it in my mind.

I insisted at the time that there was nothing to be seen.

There was something – I could have drawn a picture from it, or described it – a small, heavy cast-iron pan, brown on the outside, creamy off-white within, the handle tapered towards the pot, rounded at the end and pierced with a hole which gave access to the hollow interior. But an image? a picture?

It wasn’t like seeing, I insisted: my argument with Hume was that he said that ideas were faint copies of impressions – like seeing, only less vividly, more feebly – and I felt that was not the case.

(I remember rehearsing a similar argument with my late brother Brendan in a slightly different form – my case then was that we had learned to talk about dreams as if they were essentially visual, when in fact they weren’t – but we had drink taken and our faculties were impaired, not least by the fact we thought them enhanced)

When I think back on this now, I feel I was being hard on Hume, though I think I have a point worth considering. I can still call that saucepan to mind, but I am no longer sure if it was indeed what I conducted my experiment with – my recollection is that it was, but in the intervening years I have grown less willing to rely on the accuracy of memory; I accept that it invents.

But what does this ‘calling to mind’ involve? Davy Hume did not have the advantage I have of being familiar with faded images – an old photograph, a photocopy made when the toner is running low – so it is easy for me to conjure such a thing and say that an ‘idea’ is not like that – for me, at any rate. And though it is hard on Hume, I think I was right in this respect – a faded photograph, a feeble photocopy emphasises the relation with the original – it is an inferior duplicate, the same kind of thing, but not so well-realised – and that is something that Hume wants to say, that ideas are mere copies of impressions, and the reception of them by the mind is entirely passive: impressions enter via our senses with ‘force and violence’ and leave an impression behind, as a seal upon wax (though Hume expressly repudiates the image implied in the word – another point on which I took issue with him – if he did not mean ‘to express the manner, in which our lively perceptions are produced in the soul’ then why not use some other word? he could have called them ‘humes’).


Schopenhauer explodes this view (building on Kant’s insight) by pointing out that perception is not mere passive reception, but an intellectual act – we do not receive impressions ready made, like postcards through a letter box; we create them by processing the data – it is more akin to taking a photograph (whether digitally or otherwise) – light is focused by a lens to produce a pattern on a receptive surface, and that pattern is then converted into a form that we can comprehend.

But back to our experiments. If I try to attend to what I ‘see’ when I call something to mind, I am not at all sure how to describe it. Two things strike me as noteworthy: there is always visual input alongside it, whether my eyes are open or shut (if they are shut, I have the dark inside of my eyelids, sometimes marked with afterimages of any bright light that was there when my eyes were open; if they are open, my ‘idea’ is not in any way superimposed on what I see – it does not exist in the same space – though attending to one distracts from the other); and I cannot hold it still – it seems always to ‘slip away’ just when I most want to ‘look’ at it (though equally I can ‘move around’ it: it is not two-dimensional, and is more like a model than a picture). Yet in spite of these difficulties I can confidently produce verbal descriptions or drawings of what I am ‘calling to mind’ – and indeed using words to express what I have in mind strengthens and clarifies whatever it is.

It strikes me that the comparison I have made above may hold the answer: in digital photography, the image exists in another form which is clearly non-visual – it is digitized, binary code – yet it still corresponds exactly to the visual image, to which it can be converted at any time.  What I have been saying about ‘ideas’ above – that they have all the properties of a visual image (we can use them the same way, can convert them – by words or lines – into actual pictures) yet lack the obvious one – we can’t actually see them – has an exact parallel in digitized images stored on a card: our camera can read them, and convert them into a form that our eye can read; but with ideas, it is our brain (or mind) that ‘reads’ them.
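The camera-card comparison can be made concrete with a toy sketch of my own (the four-by-four ‘picture’ below is invented purely for illustration): the image is digitized into a single number, a form with no visual properties at all, yet it corresponds exactly to the picture, to which it can be converted back at any time.

```python
# A tiny 'photograph': a 4x4 black-and-white picture.
picture = [
    "*..*",
    ".**.",
    ".**.",
    "*..*",
]

# 'Digitize' it: the image now exists as one integer,
# a form with no visual properties whatsoever.
bits = "".join("1" if c == "*" else "0" for row in picture for c in row)
stored = int(bits, 2)

# Yet it can be converted back into the visible image at any time.
recovered_bits = format(stored, "016b")  # 16 bits for the 4x4 grid
recovered = [
    "".join("*" if b == "1" else "." for b in recovered_bits[i:i + 4])
    for i in range(0, 16, 4)
]
assert recovered == picture  # nothing is lost in the non-visual form
```

The point of the sketch is only the round trip: the stored number is not a faint or feeble copy of the picture but a different kind of thing altogether, which some apparatus must ‘read’ before there is anything to see.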

In other words, we ‘store’ our ideas in a different form, but one we can comprehend directly (just as if we called something in French to mind, but gave the sense of it in English). The distinction can be made clearer if we consider an image cast by a lens: I remember as a child being delighted when our father showed us how to produce an image on the wall using a magnifying glass – there it was, exact in every detail, but miniature and inverted. The principles of the camera obscura have been known since antiquity, and it is not unlikely that Hume might have seen them demonstrated; enough was known about the physiology of the eye for him to form a notion of an image being exactly reproduced inside the head. What Hume (and my youthful self) lacked was a model of how that image might exist in a wholly different form yet remain readable.

So now I would like to say that all the data we draw on for our interior life – whether that is memory, dreams, imagining – is stored and read in a different form. It is only when we come to talk about it that we naturally fall into using the language of the senses: we speak, like Hamlet, of seeing in our mind’s eye, but it is a different kind of seeing.

And having arrived, with some labour, at the end of the journey I embarked on in that essay of thirty-odd years ago, I am unsure if it matters or not.

Vanishing Point and the Golden Rule (by way of Immanuel Kant)

I remember once becoming absurdly excited in Princes St. Gardens in Edinburgh – that was just where I chanced to be, not the cause of the excitement – when I realised that an interesting thing happens if you number the dimensions in the reverse of the conventional order.

My brother had once explained the concept of the fourth dimension to me by saying that it was ‘at right-angles to the third’ which he elucidated by explaining that, as a point drawn out in one direction gives a line, so that line, moved at right angles to itself, generates a plane surface, which in turn generates a volume when moved at right angles to itself; hence the next dimension, the fourth, must involve performing an identical operation with a cube.

This I found pleasantly bamboozling: I could follow the first three steps of the procedure perfectly, yet with the fourth there seemed nowhere left to go; yet I convinced myself that dwellers in a linear or planar universe would experience the same difficulty at stages two and three. Later, when the same brother suggested that the fourth dimension was Time, I formed some notion of a cube moving through space and leaving a sort of cubic trail behind it – which was the direction I was headed in when I started thinking about Duchamp’s Nude Descending a Staircase, here. There is a similar though less marked suggestion of movement in time in this picture, where Miss Kate Ward has admirably captured the headlong speed of my Dursley-Pedersen:


John Ward at speed on his Dursley Pedersen, photographed by his daughter Kate

But in Princes Street Gardens it struck me that if we begin with volume as the first dimension – and picture a cube, say – we can abstract a second dimension from that by attending to one of its surfaces and ignoring the rest; then by the same method, we can consider the edge of that surface in isolation to give us a line – from where, if we wish, we can go to its end, and arrive at a point – but that is, indeed, the end, and there is literally no point in going beyond it.

I still consider that a neat manoeuvre, though nowadays I am more interested in the point you end up with than in doing away with the need for any further dimensions.
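Both procedures, building up by moving at right angles and stripping down towards the point, can be put in arithmetic terms. Here is a sketch of my own (the 0-and-1 coordinates are an illustrative device, not anything in the original argument): each move at right angles doubles the corners of the figure, and each act of attending to a single face halves them.

```python
from itertools import product

def corners(n):
    """Corners of the unit n-cube: every tuple of n coordinates, each 0 or 1."""
    return set(product((0, 1), repeat=n))

# Going up: moving the figure 'at right angles to itself' appends a
# coordinate, doubling the corners: point, line, square, cube, tesseract.
assert [len(corners(n)) for n in range(5)] == [1, 2, 4, 8, 16]

# Going down, as in Princes Street Gardens: attend to one face
# (fix the last coordinate at 0) and ignore the rest.
def face(shape):
    return {c[:-1] for c in shape if c[-1] == 0}

cube = corners(3)
square = face(cube)    # one surface of the cube
edge = face(square)    # one edge of that surface
point = face(edge)     # one end of that edge, and there we stop
assert point == {()}   # literally no point in going beyond it
```

The fourth step upwards remains as unpicturable as ever, but the arithmetic goes through without complaint, which is perhaps part of what made the bamboozlement pleasant.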

Duchamp’s nude shows a temporal sequence as a spatial one, and that is consistent with our normal usage, which is to talk about time in spatial terms – we speak of a length of time, a short time, a space of time and so on, and we use ‘before’ and ‘after’ – which denote spatial relations – to speak of temporal ones.

It is clear that the ideas of space, time and movement are bound together: indeed we use movement through space (which takes time) as a way of thinking about time, even where no movement through space is involved – and since we developed film technology, we have reinforced this with time-lapse photography, so that we see flowers opening and closing, buildings being erected, even carcases decaying – though in fact there is movement there, without change of location, though it is normally too slow for us to perceive as such – we only notice that there has been a significant change from an earlier state to a later one, without detecting it from moment to moment.

But could you arrive at the notion of movement (and so of time) in a world of immutable Forms, such as Plato envisages in his Republic? The Forms are timeless and unchanging, but if they are distinct and separate (and extended, for how else could we picture them?) it would seem that they must occupy space and there must be distance between them; and where there is distance, there is surely the possibility of movement, and so of time, even if there is none actually?

However, we have introduced another element here, the one that Berkeley drew to our attention: a perceiver. To picture a world of timeless forms is to do so from some point of view, a particular location within that world, and it would seem to be from there that the notion of movement arises – and indeed, might we go further and suggest that it is location that gives rise to the concept of space?

Does a point imply space? in other words, is even a point – location without extension – sufficient to require the whole of infinite space for it to be located in, or is that just a product of our thinking being tied to a three-dimensional model to start with?

But there is some sleight-of-mind here: we might picture a point in space from which we look outward, and so gain some sense of depth and distance, but in what direction are we looking? more to the point, what are we looking with? It seems we have smuggled in an eye, albeit an invisible one – though you might well ask how you can have an eye without a lens, an iris, an optic nerve – and some sort of perceptive apparatus at the other end of that nerve.

This leads me to Kant, and his observation that ‘space, time and causality are the mode and manner of our perception.’ Hume – whom Kant spoke of as having roused him from his dogmatic slumber – had suggested that the only way we could arrive at an idea of causality was through observing invariable succession – we see that A is invariably followed by B and in time conclude that A causes B. This is not a very satisfactory account, and it is thoroughly demolished by Schopenhauer, who points out that the most familiar and invariable succession of night and day does not lead us to suppose that one causes the other.

It is, however, the best you can do if you want to insist – as Hume did – on the empirical principle, i.e. that all knowledge is derived from experience, via the senses. What Hume failed to grasp, as Schopenhauer pointed out, is that perception is an intellectual act, not the passive process Hume supposed it to be – the mind works on and arranges the data supplied by the senses – in fact it makes sense of them.

The stepping stone between Hume and Schopenhauer is Kant, who realised that causality was not something we derived from experience, but rather a pre-requisite for making sense of it – it is, if you like, part of our intellectual apparatus; and it is not alone – he couples it with Time and Space, grouping all three together as ‘the mode and manner of our perception’ and putting forward the notion that, far from being derived from experience, these three are the means by which we make sense of experience itself. In this, they have been likened to a set of spectacles that we cannot remove, even when we realise that they condition all that we see. We are therefore in the frustrating position of knowing that the world as we know it exists only for us (or those similarly equipped), which is the truth that Berkeley realised with his esse est percipi; what the world is actually like in itself (i.e. unconditioned by the apparatus of space, time and causality) we cannot imagine.

This is what started, like a hare from its form, the Ding an sich or thing-in-itself which Schopenhauer eventually ran to ground (with the aid of oriental philosophy). Schopenhauer’s brilliant move is to point out that, while the world as it is known to us via the senses is indeed much as Kant suggests, a world of representation, of objects-for-the-subject, conditioned by our intellectual apparatus, that is not the sole aspect we have access to – if we turn our eyes inward, as it were, we become aware that there is, in ourselves, a sort of privileged glimpse of the inner nature of the world, the very thing-in-itself – namely, the Will.

For Schopenhauer, we as individuals are self-conscious outposts of a single and otherwise blind and unconscious Will which is manifested everywhere and in everything and whose sole aim is to exist. This has echoes in our own day in the ideas of the evolutionary biologists, that we are in effect the mere by-product of our genes’ determination to reproduce ad infinitum. It is not of course original to Schopenhauer, who derived it from his reading of ancient Eastern, particularly Indian, philosophy.

Schopenhauer draws pessimistic conclusions from this: effectively, all that self-consciousness has done is make us helpless spectators, aware that we are embodiments of a will that we cannot control but which rather drives us: all we can do, at best, is to quiet the will, to turn away from existence (though Arthur himself was happy to keep going to what was then the ripe old age of 72). Nietzsche, who saw himself as a disciple of Schopenhauer, takes the idea in a more sinister direction: we can embrace our situation, and recognise that the will takes us beyond good and evil – if we are strong, we should follow its promptings wherever it leads us and glory in it, rejecting the ‘slave mentality’ of Judaeo-Christian thought, which he saw as essentially a conspiracy of the weak to keep power from the strong. You do not need to go far beyond Nietzsche (he died, insane, in 1900) to find the horrors to which that ultimately led.

Yet this interpretation seems to turn on a needlessly pessimistic interpretation of the nature of the Will coupled to an erroneous aggrandisement of our own status – both Schopenhauer and Nietzsche assume that, in coming to consciousness in Man, the Will has attained its highest form of existence (though Nietzsche might qualify that by adding ‘so far’). However, if we look more closely at the Judaeo-Christian tradition which Nietzsche dismisses and at the Eastern thought that Schopenhauer derived his ideas from, we find a different way of looking at it – having looked into ourselves, and seen that we are embodiments of a single will (and might that will not be higher, rather than lower, and ourselves just waking into it, not yet fully comprehending it?) we can then look outwards again and see others not as different from ourselves but essentially the same – which is the foundation of compassion and the Golden Rule which characterises all the great belief systems – ‘treat others as you would have them treat you’ – or, if you prefer, love your neighbour as yourself – (because that is what, in effect, he is). (For an excellent and inspiring ten-minute talk on the Golden Rule, click here.)

Elective Causality

‘Myself when young did eagerly frequent

Doctor and Saint and heard great argument

About it and about: but evermore

Came out by that same door as in I went.’

(however, let us keep Omar Khayyam for another day)

Myself when young was much annoyed by David Hume, particularly his account of causality, so I am very grateful to Schopenhauer for showing me the error of his ways; but that too is a tale for another place.

What I want to consider today is the idea that you can choose to make something a cause or ground for your actions, a notion I have labelled ‘elective causality.’

On one level, of course, this seems paradoxical – when we speak of ‘cause’ in philosophy, we presuppose necessity – the effect is that which follows necessarily from the cause, the cause that which necessarily (and invariably) produces the effect: you cannot have the one without the other; we use this as a powerful tool in all sorts of reasoning.

(Though Aristotle offers a very interesting analysis of cause which I will look at elsewhere)

But here is another kind of cause: my son dies, by his own hand – what am I to make of that? The surprising discovery is that you can make of it what you wish. You could make it a matter for shame, a family disgrace, never to be alluded to, something best forgotten – I’m sure that has happened in reality; certainly it is a commonplace in stories of a certain period.

Or you could make it a ground for savage misanthropy, for hating the world as a stupid and meaningless place and human existence itself as something equally stupid and meaningless – and people have done that too, I am sure.

Or you might say: the only thing I can do is try to live better because of him, to let him be at my side, prompting me to take the better course, to do the daring or adventurous thing, even just to make the effort, for his sake.

It seems to me that there is a causal link here, and a strong one (being forged from love in grief) and yet at the same time it is something freely chosen (which is why it must be perpetually renewed, though I suppose that habit will strengthen it).

I’m sure there must be plenty of cases of this – certainly there are in stories (Michael Henchard, in The Mayor of Casterbridge, makes his shame over the drunken sale of his wife at the hiring fair the ground for reforming his life). (I am also reminded of the Ninevites, who listened to Jonah (when he eventually mustered the courage to turn up) and repented.)

It is interesting territory: people will look at a life and say ‘that event changed him’ or ‘that was a turning point’ – and what followed could be good or bad – losing someone you love might drive you to despair and ruin (“he really went to pieces after his wife died”) or equally it could be the occasion of improvement (“he’s a changed man since that happened – hasn’t touched a drop, devotes himself to charitable causes”).

I suspect that we are more inclined to see the operation of the will in the good cases than the bad, and that is reflected in the language we use: when the outcome is a bad one, we say someone was ‘driven’ to despair, suicide or the like, which makes it seem against the will; but if the outcome is good, we speak of the person’s ‘waking up’ or ‘having his eyes opened’ – which seems to suggest a two-stage process: you come to see something, and as a result, you alter course. Though, to be sure, we do also speak of a person’s being changed – “he’s a changed man since that happened” – which does suggest an external force.

I am reminded that Shakespeare has an interesting take on the same idea, which he puts in the mouth of Edmund in King Lear:

‘This is the excellent foppery of the world, that,
when we are sick in fortune,–often the surfeit
of our own behavior,–we make guilty of our
disasters the sun, the moon, and the stars: as
if we were villains by necessity; fools by
heavenly compulsion; knaves, thieves, and
treachers, by spherical predominance; drunkards,
liars, and adulterers, by an enforced obedience of
planetary influence; and all that we are evil in,
by a divine thrusting on: an admirable evasion
of whoremaster man, to lay his goatish
disposition to the charge of a star!’

All in all, an interesting subject for reflection, to which I will return.