We often rely on our intuition, but is there any scientific explanation for a concept which puts instinct ahead of conscious reasoning? And how can we use it better in our everyday lives? Ben Newell, Associate Professor of Cognitive Psychology at the University of New South Wales, investigates and explains.
The word intuition is derived from the Latin intueor – to see; intuition is thus often invoked to explain how the mind can “see” answers to problems or decisions in the absence of explicit reasoning – a “gut reaction”.
Several recent popular psychology books – such as Malcolm Gladwell’s Blink, Daniel Kahneman’s Thinking, Fast and Slow and Jonah Lehrer’s The Decisive Moment – have emphasised this “power of intuition” and our ability to “think without thinking”, sometimes suggesting we should rely more heavily on intuition than deliberative (slow) or “rational” thought processes.
Such books also argue that most of the time we act intuitively – that is, without knowing why we do the things we do.
But what is the evidence for these claims? And what is intuition anyway?
Defining intuition
Albert Einstein once noted “intuition is nothing but the outcome of earlier intellectual experience”. In a similar vein, the American psychologist Herbert A. Simon (a fellow Nobel Laureate) stated that intuition was “nothing more and nothing less than recognition”.
These definitions are very useful because they remind us that intuition need not refer to some magical process by which answers pop into our minds from thin air or from deep within the unconscious.
On the contrary: intuitive decisions are often a product of previous intense and/or extensive explicit thinking.
Such decisions may appear subjectively fast and effortless because they are made on the basis of recognition.
As a simple example, consider the decision to take an umbrella when you leave for work in the morning. A quick glance at the sky can provide a cue (such as portentous clouds); the cue gives us access to information stored in memory (rain is likely); and this information provides an answer (take an umbrella).
When such cues are not so readily apparent, or information in memory is either absent or more difficult to access, our decisions shift to become more deliberative.
Those two extremes are associated with different experiences. Deliberative thought yields awareness of intermediate steps in a chain of thought, and of effortful combination of information.
Intuitive thought lacks awareness of intermediate cognitive steps (because there aren’t any) and does not feel effortful (because the cues trigger the response); instead, it is characterised by feelings of familiarity and fluency.
Is intuition any good?
Whether or not intuition is inherently “good” really depends on the situation.
Herbert A. Simon’s view that “intuition is recognition” was based on work describing the performance of chess experts.
Work by the Dutch psychologist Adriaan de Groot, and later by Simon and the psychologist William G. Chase, demonstrated that a signature of chess expertise is the ability to identify promising moves very rapidly.
That ability is achieved via immediate “pattern matching” against memories of up to 100,000 different game positions to determine the next best move.
Novices, in contrast, don’t have access to these memories and thus have to work through the possible contingencies of each move.
This line of research led to investigations of experts in other fields and the development of what has become known as recognition-primed decision making.
Work by the research psychologist Gary A. Klein and colleagues concluded that fire-fighters can make rapid “intuitive” decisions about how a fire might spread through a building because they can access a repertoire of prior similar experiences and run mental simulations of potential outcomes.
Thus in these kinds of situations, where we have lots of prior experience to draw on, rapid, intuitive decisions can be very good.
But intuition can also be misleading.
In a contrasting body of work, decision psychologist Daniel Kahneman (yet another Nobel Laureate) illustrated the flaws inherent in an over-reliance on intuition.
To illustrate such an error, he considered this simple problem:
If a bat and a ball cost $1.10 in total and the bat costs $1 more than the ball, how much does the ball cost?
If you are like many people, your immediate – intuitive (?) – answer would be “10 cents”. The total readily separates into a $1 and 10 cents, and 10 cents seems like a plausible amount.
But a little more thinking reveals that this intuitive answer is wrong. If the ball cost 10 cents, the bat would have to cost $1.10, and the total would be $1.20! So the ball must cost 5 cents.
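A quick algebraic check makes this clear: if the ball costs x, the bat costs x + $1.00, so together they cost x + (x + $1.00) = 2x + $1.00 = $1.10. That means 2x = $0.10, so x = 5 cents (and the bat costs $1.05).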
So why does intuition lead us astray in this example? Because here intuition is not based on skilled recognition, but rather on simple associations that come to mind readily (that is, the association between the $1 and the 10 cents).
Kahneman and Tversky famously argued these simple associations are relied upon because we often like to use heuristics, or shortcuts, that make thinking easier.
In many cases these heuristics will work well but if their use goes “unchecked” by more deliberative thinking, errors – such as the 10 cents answer – will occur.
Using intuition adaptively
The take-home message from the psychological study of intuition is that we need to exercise caution and attempt to use intuition adaptively.
When we are in situations we have experienced lots of times (such as making judgements about the weather), intuition – or rapid recognition of relevant “cues” – can be a good guide.
But if we find ourselves in novel territory or in situations in which valid cues are hard to come by (such as stock market predictions), relying on our “gut” may not be wise.
Our inherent tendency to get by with the minimum amount of thinking can lead to slip-ups in our reasoning.
Ben Newell receives funding from the Australian Research Council.
This article was originally published at The Conversation. Read the original article.