My dad, pushing me off on a wobbly first bike on the drive that ran alongside my childhood home: “Look where you’re going, not where you’ve been.”
I put my feet down, seeking some stability that little bit longer.
My brain is telling me I've seen people riding their bikes on telly, in books and in real life. I know this works.
My bodily experience is telling me this isn't working.
I've tried. I've pedalled. I've stopped.
At the time, I assume bikes aren't right for me. The physics makes no sense – how could two wheels possibly keep me upright when I'm clearly meant to topple over? My experience and intuition tell me one thing, while the knowledge being handed down (in a heavy-handed way) by my dad tells me another.
In his book Thinking, Fast and Slow, Daniel Kahneman maintains that "The accurate intuitions of experts are better explained by the effects of prolonged practice than by heuristics."
However, if our early practice is stumbling and tumbling or our perception of the experience is faltering or defective, then our future judgements and intuition can be skewed.
A few weeks later, I'm dragging the bike out to the garden. There are no parents to be seen or heard now. Somewhere in my childhood mind I'd realised that my dad's very sensible advice to "Look where you're going" had been in conflict with the other old classic, "Look at me when I'm talking to you."
My knowledge that it was possible to ride a bike, balancing on two wheels, had been undermined by the dissonant social rules I'd been following, which led to an impoverished experience and a questioning of both the original knowledge and the social rules.
What a childhood quandary.
Fortunately, my resilience and stubbornness took over.
Setting off now though, I build up from a one metre push off, to a five metre wobble, to ultimately doing multiple laps of the garden round the silver birch and back.
Some types of learning come naturally through experience, while others require formal instruction. David Geary calls these respectively biologically primary and biologically secondary forms of knowledge.
Unlike learning to walk or talk – skills we are, as a result of evolution, biologically wired to acquire – things like writing and complex mathematics require instruction or teaching.
Some knowledge, like how to ride a bike, sits in an interesting middle ground where instruction meets experience.
This kind of learning requires either ourselves or someone we're working with to make valid inferences about the experiences we're having, or the practice we're doing, and to highlight these to us or raise our awareness of them.
In the incident with the bike, I was fortunate to have made the correct inferences myself. This has not always been the case.
"Valid intuitions," Kahneman writes, "develop when experts have learnt to recognize familiar elements in a new situation and to act in a manner that is appropriate to it."
This means that invalid inferences can arise when:
- Something has occurred, or not occurred, to us in the past which prevents or limits our resilience or stubbornness from kicking in
- We, or someone supporting us, are inexpert
- We have no mental model(s) of success
- We see elements as being familiar when they are not
- There are elements which are familiar to us which we do not see, due to something in the external context or our own internal context
- There are elements which are familiar to us which we consciously or subconsciously choose not to see
- We are unaware how to overcome obstacles or challenges when they arise in a new situation
- We choose not to take on new situations or challenges, or even deny their existence
These scenarios can, more often than not, each be overcome. They may require us, though, to be accompanied in this process by someone else. More on this in a future post.
Unlike pure knowledge, which, as we discussed in the previous post, can be quite binary (you either know something or you don't), experience has a frustrating habit of training our intuitive thinking to jump to conclusions that our logical minds might question. Kahneman would recognize this as a classic conflict between heuristic thinking and rational analysis – the same conflict that makes us see patterns in random data or jump to conclusions based on limited evidence.
Sometimes, though, our experiences can actually create what Kahneman calls "cognitive illusions" – biases that interfere with our ability to accept new knowledge.
This is what makes it so hard to unlearn things. It can be very difficult to simply override our intuitive responses, built on past experiences – but we can learn to recognize when we need to engage our rational mind to check them.
This counterpoint between experience and knowledge isn't a flaw in our learning. Rather, it's an essential part of the arrangement. It's what drives us to test, to experiment, to refine our knowledge. It's what makes learning a dynamic process rather than just a simple accumulation of facts or heuristics. It's what sometimes requires us to call on someone else to accompany us in our thinking and/or doing.
So perhaps, instead of seeing experience and knowledge as being in competition or dissonant, we should view them as being more like sections in an orchestra. Sometimes they seem to be playing different pieces entirely, but always they are working together to create something more meaningful than either could achieve alone.