The aim of all research, in the humanities as well as the sciences, is to arrive at results that are as accurate as possible. The history of Western thought is, in fact, epitomized by a continual paring away of information in pursuit of ever greater accuracy, summarized pithily as ‘less is more’. Yet accuracy remains as distant as ever, the movement towards it a frustratingly circular path of less being more and more being less.
If ultimate accuracy is impossible to attain – and this is exemplified by quantum physics, in which “accuracy” takes on a whole new range of meanings, linked to measurement and the observer – then the answer may lie in the opposite direction: approximation, rather than accuracy.
In nature and our place within it, is it approximation that keeps life on an even keel? On a person-to-person level, Graham Greene posed a question and answered it in his way: “Time has its revenges, but revenge seems so often sour. Wouldn’t we all do better not trying to understand, accepting the fact that no human being will ever understand another, not a wife with a husband, nor a parent a child?”1 Yet we do understand each other. Approximately. Well enough for day-to-day communication. We do so despite the disparate sensory information (hearing, seeing, touch, smell) that assaults us individually and that our brains process at different speeds through different neural structures. To that must be added information streaming in from different locations, so that each person perceives bits at a (slightly) different time from everyone else. Accurate, individual analysis would result in impossible noise; it is approximation, reflected in the approximate meaning of the words we use, that saves the day.
As John Stuart Mill put it, “There is no such thing as absolute certainty, but there is assurance sufficient for the purposes of human life.”2 Just as there is no real, objective colour red outside our perception, the word ‘red’ is close enough to be understood. Subjectivity in language, as in nature, is per se approximate.
As with language, so with feelings. We cannot feel someone else’s pain, however close we are to them and however graphically they describe it. We can relate to pain only as an approximation of what we, each personally, would feel. Similarly with taste and smell. How does one describe the taste of coffee? Or the taste of something possibly unfamiliar (say, alligator meat)? “It’s like chicken” (or “fishy chicken”) is a popular and humorous approximate description, since chicken is presumed to be a widely familiar taste.
Approximation is ubiquitous. Happy, sad, thirsty, rich, high, cold – all are approximations. Or take “fair”: what could be the meaning of “I bring my children up to be fair”? The word “fair” is so open to interpretation as to be almost meaningless by any standard. Is it fair for a municipality to charge for water? How much monthly income would be fair? If two nations dispute a piece of land, is there a fair solution?
It is clear, then, that language is an approximate communicative device. Looking for accurate meanings is fraught with problems. There are no objective meanings that we could simply look up; if there were, notions such as “consciousness,” “self,” “free will” or “time” would simply be matters of how language works within a logical system.
It was Ludwig Wittgenstein who famously and paradoxically placed language in that position, claiming in his Tractatus Logico-Philosophicus that philosophy boils down to nothing more than a series of linguistic puzzles. It is what people nowadays mean when they stop a discussion with “define your terms,” or triumphantly clinch an argument with “it’s all a matter of semantics.” Unfortunately, it isn’t. It can’t be. Replacing philosophy with lexicography is not the answer, since, as noted, language is approximate, fuzzy, and dissonant – like everything else in nature, it seems.
The approximation of language, an integral part of what it is to be human, reflects the multitudinous aspects of the self. It is all-pervasive, and keeps whole professions in business: lawyers, judges, court officials, philosophy and language departments with their critical-reasoning classes at universities and colleges, writers, journalists, not to mention politicians, whose every word has to be measured and is thereafter re-measured.
It so happens that even Wittgenstein changed his earlier conclusions, admitting in his Philosophical Investigations that the ideas in the Tractatus had perhaps been over-simplified, and that problems within philosophy need to be solved by looking at how language is actually used, at its ambiguity and at how it changes. No more immune to change than events and the humans who live through them, a word does not have a fixed definition, but is an evolving entity that carries its own history with it through time, picking up new nuances and discarding old ones through its approximate and pragmatic usage.
Not surprisingly, approximation itself spirals down into its own approximation when an attempt is made to define elements that are an integral part of their own “existence.” Instructive here are Gödel’s 1931 incompleteness theorems of mathematical logic, which demonstrate the inherent limitations of every formal axiomatic system capable of modelling basic arithmetic. These theorems have implications wider than their original application to mathematics: in essence, a consistent system of logic contains truths it cannot prove, and cannot prove its own consistency from within.
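Stated compactly, in standard textbook notation (a formulation supplied here for reference, not one original to this essay): for any consistent, effectively axiomatized theory T strong enough to encode basic arithmetic,

\[
(1)\;\; \exists\, G_T:\;\; T \nvdash G_T \;\text{ and }\; T \nvdash \neg G_T, \qquad\qquad (2)\;\; T \nvdash \mathrm{Con}(T),
\]

that is, there is a sentence that T can neither prove nor refute, and T cannot prove the statement Con(T) asserting its own consistency.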
The brilliance of the principle taken from Gödel is its simplicity. (But then, everything is simple once it has been stated or discovered.) Its beauty is its application within a wider principle. It explains why, for example, we cannot define consciousness, since defining requires standing outside that which we attempt to define; anything we say about it is conjecture following its own endless circular argument, reinforcing its own approximation.
The more we strive for accuracy, the further it moves away. Any line, however straight it appears, reveals its fuzzy edge the closer we get to it. This phenomenon of approximate accuracy applies even to the most ostensibly accurate measurements of what are accepted as fundamental physical limits in the universe. Take the speed of light (about 186,000 miles per second), which does seem to be an unbreakable boundary. But is it? What do we mean by stating that an object cannot exceed (in fact, reach) the speed of light? According to Einstein’s theory of special relativity, the faster an object travels, the greater its effective mass, so that as an object approaches the speed of light its mass grows without bound, as does the energy required to accelerate it, while time, for that object, slows to a standstill. But at that point we are no longer referring to an object that is going incredibly fast, but rather to something that has ceased being an object in any sense within time that has stopped.

Similarly, the theoretical lowest possible temperature – absolute zero, −273.15 degrees Celsius – is falsely accurate, since at that point the oscillations of molecules would become as slow as they could possibly be; classically, they would stop moving altogether, so that anything at that theoretical temperature is no longer an object in the ordinary sense. To claim, then, that there are limits within nature that are absolutely accurate is misleading; any theoretical limit leads to a change in the element being examined. This is in line with the uncertainty principle in physics, which posits a fuzziness in nature: a fundamental limit to what we can know about the behaviour of quantum particles and, therefore, about the smallest scales of nature. At these scales, all we can do is calculate probabilities for where things are and how they will behave. Probabilities are approximations par excellence.
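For reference, the standard relations behind these claims (textbook statements, not results derived in this essay) can be written as

\[
E \;=\; \frac{m c^{2}}{\sqrt{1 - v^{2}/c^{2}}}, \qquad\qquad \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}.
\]

The first gives the energy of a body of rest mass m moving at speed v: as v approaches c, the denominator tends to zero and the energy required grows without bound. The second, Heisenberg’s uncertainty relation, sets the floor below which position and momentum can never be jointly pinned down: probabilities, not certainties, are all that remain.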
Approximation as a defining part of nature is an oxymoron, but not a puzzle. It can be demonstrated in practical examples, both large and small, in day-to-day happenings and in history, literature, art and science. Approximation manifests itself in the fact that we cannot examine (or do) two things at once, so that every point is fleetingly replaced by another perceived point. A similar effect appears in quantum physics, where the uncertainty principle mentioned above says that measuring one property of an object more precisely makes measurements of other properties less precise. In fact, it prohibits us from ever seeing “the whole picture.”
History, the relating of events that happened in the past, should seemingly become less accurate the further back the events occurred. Yet history is no more accurate when referring to the recent past, since subjective interpretation plays an overwhelming part there. This is why today’s news is the least accurate description of events – fake news, anyone?
Generally, we live comfortably with approximation. If we know that 15,200 km separate the USA and Australia, this is an approximate fact. Likewise, if we Google the height of Everest, we are aware that whatever the information, it will be ‘true’ only approximately. Borders between nations are set out on maps and on the ground, and are accurately measured – until a dispute arises, when zooming in for greater accuracy will make it clear that the figurative lines in the sand were only approximately accurate.
Even scientists, whose business it is to aim for, and rely upon, extreme accuracy, know that measurements are only as accurate as the measuring devices used. They accept, and work with, the paradox of approximate perfection.
Whether looking for the existence of ever smaller particles will result in a description of the foundations of the universe is a moot point. In the meantime, the smaller the elements being sought, the more approximate their very existence.
It is a sobering thought that the millennia spent searching for accuracy may have been barking up the wrong tree. Rethinking approximation, rather than accuracy, as the foundation of our approach to the world is a shift that releases the pressure of looking for what can never be found. That approximation is a fundamental pillar of the universe is a delicious oxymoron. It is also a rude awakening to the possibility that a theory of everything could only ever be an approximation.
Notes:
1. Graham Greene, The Quiet American (London: William Heinemann, 1955).
2. John Stuart Mill, On Liberty (London: J.W. Parker, 1859; repr. New York: Dover Publications, 2002).