[Image courtesy of Geoff Oliver Bugbee]
Laurence Gonzales is the author of the book Deep Survival: Who Lives, Who Dies, and Why.
He says he'll talk today about "intelligent mistakes": why smart people do dumb things. His interest in this topic stems from his dissatisfaction with official aviation accident reports from transportation authorities.
On the one hand, the pilot wasn't stupid, but he or she did a stupid thing.
Gonzales is a pilot himself, and recounts how he was nearly killed by following a script that filtered out knowledge of a looming thunderstorm. He nearly flew into it.
We create these mental models, or behavioral scripts, that often get in the way of important information. Again, why do smart people do stupid things?
In some cases expertise contributes to those accidents. Once we've done something repeatedly, the learned behavior becomes automatic. It's the way we use models of the world, and not the world itself, to navigate. He offers a couple of examples of how we conflate two images of objects, which can get us into trouble when, for example, one of the objects is dangerous.
Intellectual knowledge can compete with actual knowledge.
"Most things we make mental models of fall into a category we call 'ignore.'" It's an efficiency thing, right? If we had to examine everything, far less would get done, and we have numerous mental models that help us navigate our surroundings. But the approach is also flawed, because it assumes that what has happened in the past will occur in the future, except when it doesn't.
Offering an example of a famous violinist playing at a subway stop, he explains that the models at work in the minds of passersby discounted the expertise right in front of them. "Many of our worst decisions in life aren't, really."
But at the moment of realization those mental models vanish instantaneously.
It was once thought that running a four-minute mile, or summiting Mount Everest, was impossible. Another characteristic of these mental scripts is that they're stable, even being passed along between generations. The famous 1999 collapse of the homecoming bonfire structure at Texas A&M, which killed several students, is an example. No one stopped to think critically about a 1920s tradition that had resulted in a dangerous structure 80 feet tall.
These kinds of mistakes result from a "very private conversation between the mind and body about what's occurring in the environment."
The scripts can be encouraged by reward systems that draw the behavior forward. Pilots, for example, look forward to blue skies, smooth flying, and greased landings.
On the other hand, safety can be extremely expensive for society, which leads to silly consumer warnings, which he lists to laughter.
He recounts how marine radar was expected to lead to fewer accidents, but accidents have since gone up, as captains trusted the radar and drove their ships faster.
Gonzales wants us to think of our world differently so those mental models don't intrude with catastrophic results. However thoughtful we may be, our intellectual powers can fail us at the worst times. Describing some training he underwent with survival experts, he recounts how the trainer frustrated him by moving so slowly. But in reality the trainer was moving slowly simply to take in the environment. He was just paying attention.
He concludes: learning doesn't just add something new; it changes everything you knew before.