They're extremely attractive, but they can be dangerous. Before we can really look at why that is, we have to understand the nature of an idea.
There's a very real truth to the quotation in the header image, though I want to cast it in a slightly different context than Rosen was getting at. What he was saying, of course, is that everything about us stems from our experiences and that every topic we might study is really the study of the past. Literature, mathematics, science: all are disciplines that rely heavily on the past. Indeed, modern physics tells us that even our most immediate experiences are experiences of the past, due to the fixed limit at which information can travel. Even our pain, which we experience as immediate, takes time to get from our pain receptors, traversing the nerves to trigger neurons in the brain. I'm going to take a slight diversion here to show how critical this is.
As a musician who's spent considerable time working with other musicians in recording environments, this is quite apparent to me. I'm aware, though, that those who haven't experienced such environments with regularity may not have the temporal sense I've developed there, so it's worth looking at what happens.
One of the hardest things to achieve in a recording environment is synchronisation. When musicians are playing together, this isn't much of a problem, except when one or more of them are less than entirely competent at keeping time. What generally happens is that the timing ebbs and flows. This won't generally be noticed by many but, for example, when musicians play live, they tend to slow down slightly during drum fills over section changes, speeding back up to tempo on introduction of the new section. That's merely the most obvious instance of a general principle, namely that musicians' timing moves around. If they're competent, they'll generally go through these changes together.
In a recording studio, this takes on a slightly different complexion, because it's rare that the entire ensemble will be playing at the same time, except in the case of professional gatherings such as orchestras. What usually happens is that the rhythm section gets laid down first, sometimes starting with only the drummer. If that happens, the drummer will need a metronome or click track.
Sound travels at a finite speed, approximately 767 mph (about 343 m/s) in air, although this changes with temperature and density. A reasonable rule of thumb is that, at room temperature, sound travels at something like 1 foot per millisecond. In the technical jargon, the delay between a sound leaving a source and arriving at the ear (or indeed any delay of this type) is known as 'latency'.
Often, these days, we use digital drum kits to trigger sample players, so that we can muck about afterwards with the sounds.
So, drummer sits down, speakers are about 5 or 6 feet away in the average project studio. He starts laying down his drum track based on the click track. Because there is a delay of 5 or 6 ms, this can have some interesting effects.
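To make the arithmetic concrete, here's a minimal sketch in Python of that 1-foot-per-millisecond rule at work (the figures are illustrative; the speed of sound shifts with temperature):

```python
# Rough acoustic latency from speaker distance.
SPEED_OF_SOUND_FT_PER_MS = 1.125  # ~343 m/s at 20 degrees C

def acoustic_latency_ms(distance_ft: float) -> float:
    """Time, in milliseconds, for sound to cover distance_ft feet."""
    return distance_ft / SPEED_OF_SOUND_FT_PER_MS

for distance in (1, 5, 6, 10):
    print(f"{distance} ft -> {acoustic_latency_ms(distance):.1f} ms")
# 5-6 ft of speaker distance gives roughly 4.4-5.3 ms of delay,
# right around the threshold a very good drummer can feel.
```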
Most people can probably detect a delay of about 15-20 ms, but very good drummers will tend to have a sensitivity of about 4-5 ms. I'm sensitive, after many years of conditioning, to 3 or 4 ms. The average musician will be somewhere around 5-10 ms.
Anyhoo, let's assume a really cracking drummer with excellent temporal sense. He lays down his drum track for a song the band have been playing for years. He knows every note, plays it flawlessly week-in, week-out, and never drops a beat, even when practising on his own. Yet when he gets into the studio situation, he often flounders (note that I'm not saying this of all drummers). Why is this?
First, it's because he's struggling with the latency. Second, it's because he's having a hard time staying on the beat without the usual variations. The drummer is normally the timekeeping core of the ensemble, but now he's subservient to another core: the metronome.
There are some things we can do to mitigate this, but which course we choose depends on the musician. Some drummers work really well with a flashing light giving the tempo and, since light travels vastly quicker than sound, latency is reduced massively. We can also send the click via headphones, which gets it there about as quickly as the blinking light, because electrical signals travel down a cable at a substantial fraction of the speed of light. For the record, the technique that works best most of the time in my experience is simply for me to sit marking time with a finger: being the engineer, I'm closer to the source, more aware of the pitfalls, and can read the situation well enough to anticipate and key into the drummer's perception.
So, we labour away, and he gets his part laid down. Then come the rest of the band, one or two at a time. They have an even harder time, because the variations they usually expect aren't there. They tend to fall behind during fills, and run ahead when the section picks up again. They anticipate things derived from their experience of playing in the band but, of course, the band has never played to a metronome before, let alone to a drummer playing to a metronome.
The above should highlight the approach I want to take here. I want to go deeper than Rosen's ideas, and get to the meat of what it is that we're studying.
We have an interesting ability, literally inherited from a long evolutionary history stretching back many millions of years, and it's all to do with how we process space and time. The brain records sensory input from all our senses (of which there are considerably more than five, despite popular opinion), nicely temporally ordered. This ordering means that we can abstract over periods of time, remembering things that happened in the past, recognising patterns, and using those patterns to extrapolate into the future, projecting potential future patterns so that we can avoid pitfalls and work toward desirable goals.
Terry Pratchett once described humans as Pan narrans, the story-telling ape, and this is precisely what he was talking about. We tell ourselves stories about what happened in the past, and we tell ourselves stories about what will happen in the future. All of this arises directly from our temporal processes and pattern recognition. Herein lurks a danger, though, and that's because of the way our ideas interact with each other and how they interact with contradictory information.
You can think of ideas as having a property we'll call mass. That is, they have a certain resistance to any change in 'velocity'. You could even stretch the analogy further and say that different kinds of idea have different mass. What we're talking about here is something that we all have, and it's extremely difficult to defeat: cognitive bias. Cognitive bias comes in many forms, but the most pernicious of these is confirmation bias. Confirmation bias means that each of the ideas you have, both about the past and about the future, colours your perception of the world. Ideas are the lenses of the mind. When you meet a new idea, it's always seen through the lens of all the ideas you currently hold. Ideas that sit well with those will readily be taken in with minimal scrutiny, adding their mass to the whole. Ideas that don't sit well will struggle to come under the influence of the forces in the nucleus, as it were.
OK, so I'm stretching the analogy too far. Mea culpa.
The point is that ideas group together and find safety in numbers. The more safe, comfortable ideas you hold, the greater the inertia in those ideas.
There's a wonderful post written some years ago by a former creationist, Glenn Morton, about how this works. He imagines a demon that sits on the shoulder, diligently filtering out any evidence that might cause dissonance or contradict any of the ideas in the nucleus, preventing you from ever seeing it, while letting through any evidence that might confirm the nucleus, even to the degree that it will carefully interpret the evidence to make it fit. I'll link to his post at the bottom.
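To see how the demon pins an idea in place, here's a deliberately crude toy model in Python (entirely my own sketch, not anything from Morton's post): confirming evidence always gets through, while contradicting evidence is blocked with a probability that grows with the strength of the belief.

```python
import random

random.seed(1)

def demon_filter(belief: float, evidence: int) -> bool:
    """Morton's demon: confirming evidence (+1) always passes;
    contradicting evidence (-1) is mostly blocked when belief is strong."""
    if evidence > 0:
        return True
    return random.random() > belief  # weak beliefs let contradictions in

belief = 0.9  # a strongly held idea: 0 = disbelief, 1 = certainty
for _ in range(1000):
    evidence = random.choice([1, -1])   # the world supplies mixed evidence
    if demon_filter(belief, evidence):  # ...but the demon vets it first
        belief = min(1.0, max(0.0, belief + 0.01 * evidence))

print(f"final belief: {belief:.2f}")  # pinned at certainty, come what may
```

Even with the evidence split fifty-fifty, the belief climbs to certainty, because the demon only lets one side of the ledger through.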
There are other factors in our cognitive development having taken the course it did besides pattern-recognition, of course; a large brain with a good brain-to-body mass ratio, opposable thumbs, etc., but what really defines the way we interact with the world is our facility with patterns.
All of our knowledge about the universe comes to us from recognising patterns. Our development from foragers, to hunter-gatherers, to agrarian, to technological societies all stems from recognising patterns. Poetry, art, music, science, philosophy, logic, mathematics: all patterns.
We're so good at it, and at responding to it, that we even attach significance to events that are entirely meaningless or random. This is a useful skill in some respects, and gave clear advantages in evolutionary terms. When our ancestors saw what they thought matched the pattern of a tiger hidden in foliage, they took cover. It didn't actually matter whether or not there was a tiger there; simply being conditioned to respond to the appropriate pattern was sufficient to increase the odds of survival. However, it can be taken too far, and the result is what we call superstition. In the dim and distant past, this took quite a few forms, according to the best evidence we have. In Homer's Odyssey there are descriptions of 'auguries', which consist of predicting the future by reading bird-sign. If a hawk passes by on the left in the evening with one eye closed doing the secret handshake, yada yada...
The night sky is a wealth of information about the cast of ancient mythology, all rooted in patterns. Astrology is nothing more than connecting an occurrence to a celestial event and making a religion out of it. Dowsing, acupuncture, homoeopathy, alchemy, antivax and all manner of credulous wibble are founded in this superstitious drivel.
We're not alone in this, of course, and we can see high degrees of pattern-recognition ability in many species. There's a famous series of experiments conducted by BF Skinner in which pigeons exhibited superstitious behaviour. These experiments are a real eye-opener, and they tell us a lot about ourselves. In one that comes to mind, a pigeon is conditioned so that, if it turns in a certain direction, it receives a reward in the form of some grain. After some reinforcement, the pigeon will continue to turn in that direction long after the reward stops coming, or when the reward comes after a different action. What the experiments showed is that when the reward is unpredictable, there's no way to tell which behaviour actually triggers it, and that very unpredictability makes the conditioned behaviour more persistent and harder to extinguish. The same mechanism underlies addiction to gambling and other risk-related activities.
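As a rough illustration of how unpredictable rewards breed ritual, here's a toy simulation (my own sketch under simple assumptions, not Skinner's actual protocol): rewards arrive at random, yet whatever the bird happened to be doing gets strengthened, and one arbitrary behaviour tends to snowball.

```python
import random

random.seed(2)

# Three behaviours, initially equally likely.
actions = {"turn_left": 1.0, "peck_corner": 1.0, "bob_head": 1.0}

def choose(weights: dict) -> str:
    """Pick an action with probability proportional to its weight."""
    r = random.uniform(0, sum(weights.values()))
    for action, w in weights.items():
        r -= w
        if r <= 0:
            return action
    return action  # guard against floating-point leftovers

for _ in range(500):
    behaviour = choose(actions)
    if random.random() < 0.1:      # reward arrives 10% of the time,
        actions[behaviour] += 1.0  # regardless of what the bird did

print(max(actions, key=actions.get), actions)
# Early chance rewards make one behaviour more frequent, so it collects
# still more chance rewards: a 'ritual' emerges from pure noise.
```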
I was reminded of this when watching an episode of Derren Brown's Tricks of the Mind, in which he uses some of the same techniques on celebrities, with much the same result. I recommend watching the series, not least because it's almost entirely rooted in the principles we're discussing here.
Like the example of the tiger above, we're also extremely adept at spotting visual patterns where there is no pattern. This is especially true for faces, almost certainly as a result of our earliest imprinting as babies, learning to recognise our families. When we see faces in the clouds, it's a manifestation of this pattern-recognition. Similarly, any viewers of Top Gear will recall Clarkson extracting the Michael from the 'face' of Hammond's favoured wood-constructed Morgan. Psychologists have a name for this: pareidolia. This is nothing more nor less than our cognitive biases at work, as are all these abilities with patterns.
Cognitive biases are difficult to defeat, but not impossible. Cognitive behavioural therapy, for example, is highly useful in reducing some particular biases. Many of the stresses we face in our daily lives arise from cognitive bias. We anticipate situations and begin to 'catastrophise', tending to assume situations will turn out badly, and this escalates stress. CBT works by getting us to examine those situations and assess pragmatically whether the stresses we anticipate are real, or as bad as we might think. Often, simply by casting the same situation in more positive terms, we can make it less stressful.
The best mechanism we know of for reducing and eliminating bias is science. Because it's self-correcting, biases tend, in the long run, to be exposed and rooted out.
That's not to say that scientists are bias-free, of course. Bias manifests in science all the time. Einstein, for example, in what he called his 'greatest blunder' (and, as we've discussed previously, it really was a howler, despite the romanticism you'll hear in popular science books), introduced a term into the equations for general relativity simply because he thought the universe was static, and GR as it stood said it couldn't be. This 'Lambda' term, or cosmological constant, basically added a knob whose setting determined the rate at which the cosmos expanded or contracted, and Einstein set it so that the two balanced perfectly, all because he, along with pretty much all the physicists of his time, thought the universe was eternal and unchanging. Pure confirmation bias.
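For the curious, here's roughly what that knob looks like in modern notation (a retrospective sketch: the Friedmann form below postdates Einstein's static model):

```latex
% Einstein's field equations with the added Lambda term:
\[
  G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
\]
% In the Friedmann acceleration equation for a pressureless,
% matter-filled universe, Lambda acts as a repulsive term:
\[
  \frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\,\rho + \frac{\Lambda c^{2}}{3}
\]
% Demanding a static universe (\ddot{a} = 0) fixes the knob exactly:
\[
  \Lambda = \frac{4\pi G \rho}{c^{2}}
\]
```

Worse, the balance is unstable, like a pencil stood on its tip: the slightest nudge toward expansion or contraction grows, so the static model was doomed even on its own terms.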
His real greatest blunder, of course, was the huge amount of time and intellectual energy he exhausted in trying to debunk quantum mechanics, another manifestation of bias, but we've covered that at length elsewhere.
When Eddington read Einstein's work, he was immediately struck by it, but he was pretty much alone in the Royal Astronomical Society. Other notable fellows of the society in his day were extremely resistant to the 'German science', in what was pure prejudice, another form of bias.
Another good example is continental drift, the forerunner of plate tectonics, proposed by meteorologist Alfred Wegener in 1912 (even that isn't strictly accurate; a rudimentary glimmer of the idea goes back to Leonardo da Vinci around the turn of the 16th century, after he discovered fossil marine organisms on mountains). The idea was resisted for decades, and even ridiculed by many until, in the '50s and '60s, observations confirmed seafloor spreading, making plate tectonics the only game in town (aside from the asinine 'expanding Earth' nonsense, which I'll be covering in a future post).
There's a marvellous book on this topic, the brilliant The Structure of Scientific Revolutions by philosopher Thomas Kuhn, which deals specifically with the inertia of ideas in science and how old paradigms become extinct in practice. I highly recommend this work.
The point is that ideas have inertia, and can be extremely resistant to change. This is especially true of ideas from certain sources during our formative years. We're programmed to trust our parents implicitly. When they tell us we can trust somebody and take what they say as true, we tend to accept it at face value. This can be dangerous, not least because the ideas imprinted during this time are the hardest to shift. There's an old saw of the Jesuits, the Catholic church's educators, often attributed to Aristotle:

'Show me the child until he is seven and I will show you the man.'

The implication is clear, and carries with it the elucidation of how, for many decades - more than we can know - children were subjected to institutionalised abuse, much of which has been coming to light in the past few decades the world over, and continues to do so. I attended an event for survivors of institutional abuse in Ireland some years ago, hosted by the then president, Mary McAleese, at Áras an Uachtaráin, in which she talked about this as being the result of 'bad imprinting of children' (this, while new legislation for 'blasphemous libel' was being pushed through).
One final thought about a particular bias that we often think of as being something desirable, but which actually serves to increase inertia: common sense. Common sense can be a useful tool, but over-reliance on it can be catastrophic to intellectual progress, precisely because of the inertia it represents. Because no post is complete without at least three mentions of the wiry-haired brainiac, I'll leave you with one of his most famous quotations:
"Common sense is the collection of prejudices we accumulate by the age of eighteen."Thanks for reading. Nits and crits welcome, as always.
Morton's Demon