The Muybridge Moment


The remarkable Eadweard Muybridge invented a number of things, including his own name – he was born Edward Muggeridge in London in 1830. He literally got away with murder in 1874, when he travelled some seventy-five miles to shoot dead his wife’s lover (prefacing the act with ‘here’s the answer to the letter you sent my wife’) but was acquitted by the jury (against the judge’s direction) on the grounds of ‘justifiable homicide’. He is best known for the sequence of pictures of a galloping horse shot in 1878 at the behest of Leland Stanford, former Governor of California, to resolve the question of whether the horse ever has all four feet off the ground (it does, though not at the point people imagined). To capture the sequence, Muybridge used multiple cameras, and devised a means of showing the results which he called a zoopraxiscope, thereby inventing stop-motion photography and the cinema projector and laying the foundations of the motion-picture industry.


(“The Horse in Motion” by Eadweard Muybridge; animation by Nevit Dilmen. Library of Congress Prints and Photographs Division; public domain, via Wikimedia Commons.)

Muybridge’s achievement was to take a movement that was too fast for the human eye to comprehend and freeze it so that each phase of motion could be analysed. It was something that he set out to do – as deliberately and methodically as he set out to shoot Major Harry Larkyns, his wife’s lover.

It is interesting to consider that something similar to Muybridge’s achievement happened a few thousand years ago, entirely by accident and over a longer span of time, but with consequences so far-reaching that they could be said to have shaped the modern world.

We do not know what prompted the invention of writing between five and six thousand years ago, but it was not a desire to transcribe speech and give it a permanent form; most likely it began, alongside numbering, as a means of listing things, such as the contents of storehouses – making records for tax purposes, perhaps, or of the ruler’s wealth – and from there it might have developed as a means of recording notable achievements in battle and setting down laws.

We can be confident that transcribing speech was not the primary aim because that is not something anyone would have felt the need to do. For us, that may take some effort of the imagination to realise, not least because we live in an age obsessed with making permanent records of almost anything and everything, perhaps because it is so easy to do so – it is a commonplace to observe that people now seem to go on holiday not to enjoy seeing new places at first hand, but in order to make a record of them that they can look at once they return home.

And long before that, we had sayings like

vox audita perit, littera scripta manet
(the voice heard is lost; the written word remains)

to serve as propaganda for the written word and emphasise how vital it is to write things down. One of the tricks of propaganda is to take your major weakness and brazenly pass it off as a strength (‘we care what you think’, ‘your opinion matters to us’, ‘we’re listening!’, as banks and politicians say) and that is certainly the case with this particular Latin tag: it is simply not true that the spoken word is lost – people have been remembering speech from time immemorial (think of traditional stories and songs passed from one generation to the next); it is reasonable to suppose that retaining speech is as natural to us as speaking.

If anything, writing was devised to record what was not memorable. Its potential beyond that was only slowly realised: it took around a thousand years for anyone to use it for something we might call ‘literature’. It is not till the classical Greek period – a mere two and a half millennia ago (Homo sapiens is reckoned at 200,000 years old, the genus Homo at 2.8 million) – that the ‘Muybridge moment’ arrives, with the realisation that writing allows us to ‘freeze’ speech just as his pictures ‘froze’ movement, and so, crucially, to analyse it.

When you consider all that stems from this, a considerable degree of ‘unthinking’ is required to imagine how things must have been before writing came along. I think the most notable thing would have been that speech was not seen as a separate element, but rather as part of a spectrum of expression, nigh-inseparable from gesture and facial expression. A great many of the features of language which we think fundamental would have been unknown: spelling and punctuation – to which some people attach so much importance – belong exclusively to writing and would not have been thought of at all. Even the idea of words as a basic unit of language, the building blocks of sentences, is a notion that only arises once you can ‘freeze’ the flow of speech like Muybridge’s galloping horse and study each phase of its movement. Before then, the ‘building blocks’ would have been complete utterances – a string of sounds that belonged together, rather like a phrase in music – and these would invariably have been integrated, not only with gestures and facial expressions, but with some wider activity of which they formed part (and possibly not the most significant part).

As for grammar, the rules by which language operates and to which huge importance is attached by some, it is likely that no-one had the least idea of it; after all, speech is even now something we learn (and teach) by instinct, though that process is heavily influenced and overlaid by all the ideas that stem from the invention of writing; but then we have only been able to analyse language in that way for a couple of thousand years; we have been expressing ourselves in a range of ways, including speech, since the dawn of humanity.

When I learned grammar in primary school – some fifty years ago – we did it by parsing and analysis. Parsing was taking a sentence and identifying the ‘parts of speech’ of which it was composed – not just words, but types or categories of word, defined by their function: Noun, Verb, Adjective, Adverb, Pronoun, Preposition, Conjunction, Article.

Analysis established the grammatical relations within the sentence, in terms of the Subject and Predicate. The Subject, confusingly, was not what the sentence was about – which puzzled me at first – but rather ‘the person or thing that performs the action described by the verb’ (though we used the rough-and-ready method of asking ‘who or what before the verb?’). The Predicate was the remainder of the sentence, what was said about (‘predicated of’) the Subject, and could generally be divided into Verb and Object (‘who or what after the verb?’ was the rough-and-ready method for finding that).
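The schoolroom procedure described above is mechanical enough to sketch in a few lines of code. This is purely illustrative – the function name, and the shortcut of supplying the verb by hand rather than identifying it, are my own simplifications, not part of the original lesson:

```python
# A minimal sketch of the 'rough-and-ready' analysis described above:
# given a sentence and its verb, everything before the verb is the
# Subject ('who or what before the verb?'), everything after it is the
# Object ('who or what after the verb?'), and the verb plus the Object
# together form the Predicate.

def rough_and_ready_analysis(sentence: str, verb: str) -> dict:
    words = sentence.rstrip(".").split()
    i = words.index(verb)                  # locate the (known) verb
    return {
        "Subject": " ".join(words[:i]),    # who or what before the verb
        "Predicate": " ".join(words[i:]),  # what is said of the Subject
        "Verb": verb,
        "Object": " ".join(words[i + 1:])  # who or what after the verb
    }

parts = rough_and_ready_analysis("The quick brown fox jumped the fence", "jumped")
print(parts["Subject"])    # The quick brown fox
print(parts["Predicate"])  # jumped the fence
```

Of course, the method collapses the moment the sentence grows more complicated – which is rather the point of the paragraphs that follow: such rules are products of analysis after the fact, not of how language is actually acquired.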

It was not till I went to university that I realised that these terms – in particular, Subject and Predicate – derived from mediaeval Logic, which in turn traced its origin back to Aristotle (whom Dante called maestro di color che sanno – master of those that know) in the days of Classical Greece.







Alexander the Great

Aristotle is the third of a trio of great teachers, each the pupil of his predecessor: he was a student of Plato, who was a student of Socrates. It is fitting that Aristotle’s most notable pupil was not a philosopher but a King: Alexander the Great, who had conquered much of the known world and created an empire that stretched from Macedonia to India by the time he was 30.

That transition (in a period of no more than 150 years) from Socrates to the conquest of the world neatly embodies the impact of Classical Greek thought, which I would argue stems from the ‘Muybridge Moment’, when people began to realise the full potential of the idea of writing down speech. Socrates, notably, wrote nothing: his method was to hang around the marketplace and engage in conversation with whoever would listen; we know him largely through the writings of Plato, who uses him as a mouthpiece for his own ideas. Aristotle wrote a great deal, and what he wrote conquered the subsequent world of thought to an extent and for a length of time that puts Alexander in eclipse.

In the Middle Ages – a millennium and a half after his death – he was known simply as ‘The Philosopher’, and quoting his opinion sufficed to close any argument. Although the Renaissance was to a large extent a rejection of Aristotelian teaching as it had developed (and ossified) in the teachings of the Schoolmen, the ideas of Aristotle remain profoundly influential, and not just in the way I was taught grammar as a boy: the whole notion of taxonomy – classification by similarity and difference, genus and species – we owe to Aristotle, to say nothing of Logic itself, from which not only my grammar lessons but rational thought itself derived.

I would argue strongly that the foundations of modern thought – generalisation, taxonomy, logic, reason itself – are all products of that ‘Muybridge Moment’ and are only made possible by the ability to ‘freeze’ language, then analyse it, that writing makes possible.

It is only when you begin to think of language as composed of individual words (itself a process of abstraction) and how those words relate to the world and to each other, that these foundations are laid. Though Aristotle makes most use of it, the discovery of the power of generalisation should really be credited to his teacher, Plato: for what else are Plato’s Ideas or Forms but general ideas, and indeed (though Plato did not see this) those ideas as embodied in words? Thus, the Platonic idea or form of ‘table’ is the word ‘table’ – effectively indestructible and eternal, since it is immaterial, apprehended by the intellect rather than the senses, standing indifferently for any or all particular instances of a table – it fulfils all the criteria*.

Which brings us to Socrates: what was his contribution? He taught Plato, of course; but I think there is also a neat symbolism in his famous response to being told that the Oracle at Delphi had declared him ‘the wisest man in Greece’ – ‘my only wisdom is that while these others (the Sophists) claim to know something, I know that I know nothing.’ As the herald of Plato and Aristotle, Socrates establishes the baseline, clears the ground, as it were: at this point, no-one knows anything; but the construction of the great edifice of modern knowledge in which we still live today was just about to begin.

However, what interests me most of all is what constituted ‘thinking’ before the ‘Muybridge Moment’, before the advent of writing – not least because, whatever it was, we had been doing it for very much longer than the mere two and a half millennia that we have made use of generalisation, taxonomy, logic and reason as the basis of our thought.

How did we manage without them? and might we learn something useful from that?

I think so.

*seeing that ideas are actually words also solves the problem Hume had, concerning general ideas: if ideas are derived from impressions, then is the general idea ‘triangle’ isosceles, equilateral, or scalene or some impossible combination of them all? – no, it is just the word ‘triangle’. Hume’s mistake was in supposing that an Idea was a ‘(faint) copy’ of an impression; actually, it stands for it, but does not resemble it.

Head and Heart (1)

A thought about therapy in relation to art and music struck me after listening to James Rhodes in a TV programme, Notes from the Inside, in which he – a classical pianist and former psychiatric patient – takes a grand piano into a psychiatric hospital to play pieces he hopes will resonate with patients:

calling art and music ‘therapy’ gets it the wrong way round – that is medicine, psychiatry, trying to ride on the back of art and subordinate it. Art works because it is art. It would work anyway, if the person happened on it in a museum or on the radio at home or in a book. It is a way that people can break out of the toils in which they have become ensnared and glimpse (as Rhodes himself said) a way out, the possibility of going to another place. It is not part of a ‘programme’; it does not work in conjunction with drugs or some other treatment; it works because it reaches people – regardless of their state of mental health – in a way that other means cannot. When other things do not make sense or seem crazy or pointless, art and music tell us something different – especially when they reconcile the terrible things, make us see that it is still possible to go on living in spite of everything.

If something is worthwhile, it stands on its own merits: it does not need to disparage potential ‘rivals’. You do not establish the worth of association football by disparaging golf or cricket; you do not establish the worth of classical music by disparaging popular music, or vice versa; and you do not establish the value of reason and science by relegating intuition and all other forms of thought to a sideshow, a sort of childish whimsy, pretty but not to be taken seriously.

Art, poetry, music are modes of thought – or at least, I am compelled to call them that to draw attention to their equal worth to reason. I would prefer to say that they are responses to life – to the fact of finding ourselves alive and engaging with that – but that is beyond the narrow pale we have drawn round ourselves, centred on reason and the head, and denying the heart.

The very dichotomy, ‘head and heart’ is suspect, and like so many of its kind, it is made from one side only – it is an instance of what I have referred to above, the error of thinking that you establish your cause the better by disparaging what you see as its rivals, instead of on its own merits. The head fears the heart and is always concerned to keep it in its place, but the heart has no such misgivings. Doubt and scepticism, distrust of the senses, are the very foundation of Reason in the West; trust and faith are the concern of the heart.

It had not struck me before how far my introduction to philosophy – which was chiefly through Plato – was founded on this determination to discredit the senses, which is the same as discrediting intuition and one’s natural bent. The senses are not to be trusted – it is hammered home: there is the famous bent stick in water, which the mind (or head) knows is straight, but the foolish senses can only see as bent. Things are not as they seem; appearances are deceptive – that is what Western philosophy is built on (consider Descartes, in his determination to arrive at something of which he can be certain, his conviction that everything his senses tell him might be a lie).

And what do we arrive at? Man, the rational animal, the one creature whose head rules his heart, who can subordinate the passions to reason, who can remain cool and detached – it is our supreme piece of idolatry to imagine that it is in this that we are the image and likeness of God (think of Blake’s image of the Ancient of Days).

There is nothing wrong with reason – I am not going to fall into the trap of disparaging it – I hold with Aquinas that there can be no incompatibility between faith and reason, just as there is no true dichotomy between head and heart; but I do think we have got ourselves into a false position where we have, as it were, elected Reason as our dictator, and subordinated things to Reason which rightfully stand alongside it, equal in value – perhaps even greater – but quite different in operation.

As I have suggested elsewhere, much of this is reflected in our attitude to language and the way we teach it. When I was young and studying philosophy the thing that impressed me about language was that it was rule-governed – and if only we could spend a bit more time and exercise our reason on the matter, we could clarify those rules, make them truly effective, eliminate the idiosyncrasies that have arisen from generations of unreflecting use and arrive at a language that is purified, efficient, rational, and clear – the ideal instrument for thought.

Now, however, I see that as a mistaken perspective – what strikes me as important about language is not that it has rules and a structure but that it is intuitive – we acquire it instinctively: if every book were burned, every school demolished, our capacity to learn language would not be diminished one jot, because schools and books were established as a means of extending the use we make of language – they are not the primary means of instruction, but secondary.

Now I am not for a moment advocating a Taliban-like reversal of education: I only want to remind you of its place in the order of things. Language is ancient, instinctive, coeval with our humanity* – it is part of the expression of our humanity, and in its ‘natural’ form – speech** – it is bound up with art and music: any separation we make is artificial – these are colours on a spectrum, different aspects of the same thing, different sorts of human behaviour in response to life. Literacy is a good thing, books are a good thing, schools are a good thing – but they are not the only good thing, and we should be careful what we teach our children.

*I have modified my view since I wrote this, and now believe that what we think of as language is of relatively recent origin – about 2,500 years ago – and that it was preceded by a much more holistic mode of expression, which integrated expression, gesture, movement, rhythm, song, music and art.
** speech, I now think, is not the ‘natural’ form of language, but simply a facet of the holistic mode of expression described above: its current importance arises with the emergence of Language (as we now think of it) which results from the impact of writing on human expression.

A Way of Thinking

rotting apple

Poetry is a way of thinking.

By ‘poetry’ I mean not just poetry but everything that works in a similar fashion – by imagination and instinct – such as music and art generally (it’s handy to remember that ‘poetry’ just means ‘making’) – and by ‘thinking’ I mean rather more than the narrow sense in which we usually employ that word – thinking is the totality of what we do inside our head, of which ‘rational thought’ is only a subset.

An instance: this morning, I had an idea for a book. It came as it usually does, out of nothing, and then all at once began to burgeon (the best image I have of this is cells under a microscope dividing and multiplying with great rapidity) which is always exciting – you think, ‘there could be this – and then this – and this -’ it comes in a torrent, yet all seems to hang together; you feel the connections branching out all over the place, you sense how it would all work, without having to examine it too closely.

By the time I got back to the house, the excitement had subsided and a reaction had set in – again, this is familiar: a bit like the seed that falls on stony ground, some ideas spring up but do not have the soil to sustain them, so they wither as quickly as they came. And that thought – that this might be yet another disappointment, something of seeming promise from which nothing comes – brought to mind a poem by Seamus Heaney, Blackberry-Picking.

The first section, of sixteen lines, deals with the exuberant wild untrammelled joy of picking a great glut of blackberries; but the second part, only half as long, reads:

We hoarded the fresh berries in the byre.
But when the bath was filled we found a fur,
A rat-grey fungus, glutting on our cache.
The juice was stinking too. Once off the bush
The fruit fermented, the sweet flesh would turn sour.
I always felt like crying. It wasn’t fair
That all the lovely canfuls smelt of rot.
Each year I hoped they’d keep, knew they would not.

Now Heaney, of course, when he wrote that poem, had no notion of me standing on the doorstep reflecting on how ideas can suddenly fail of their promise, nor need he have had any specific notion of what the poem ‘meant’ or was ‘about’; the concrete experience of the blackberry-picking – that mad joy followed by disappointment and disgust (and the fact of its being a familiar sequence) – was what he sought to capture.

Nonetheless, the poem illustrates perfectly what I was thinking about the failure of promise, so it does ‘mean’ that; that is what it is ‘about’. Nor do I need to add ‘that is what it means to me’ because the whole point is that we are dealing with universals here – by which I mean experiences of a kind that every human being has had, or has the potential to have. What Heaney as a boy experienced with the blackberries is something that many of us have found elsewhere in life; so the poem is not exclusively about any one of those experiences, it is an expression of each of them and it unites everyone who has ever felt anything like that, regardless of whether he ever picked a blackberry in his life.

We can imagine that two such people might meet, and on reading the poem, would nod and exchange looks, as much as to say, ‘I see what he means’ or even just, ‘that’s true.’ And they might do the same on hearing a particular passage of music, or seeing a painting – they would recognise, if you like, that here is a concrete expression of a human experience, an experience they themselves have had; and the poem (or the music, or the painting) would connect them.

That is the kind of thinking that goes on in stories, in music, in poetry, in art – this instinctive grasping of human experience, which our fellow-humans recognise and relate to when they see it. Reason, which does not like instinct and abhors jumping to conclusions, cannot explain it very well and tends to disparage and dismiss it or find some way to marginalise and subjugate it, but in fact it is central.

(And my book idea? It hasn’t withered yet: we shall see what comes of it.)