There are three tech stories in the news this week that show just how weird our risk thinking can be.
We are easily distracted by unusual stuff. Our imaginations are taken captive by the bizarre and the dramatic, and our brains seem unable to resist such stories. Sometime in the next week, the Chinese Space Station Tiangong-1 – or “Heavenly Palace” – will become entangled with Earth’s atmosphere.
At a speed of 27,000 km/h, over 8 tonnes of ‘science’ is headed towards us for an uncontrolled re-entry. It might hit something on the way down – a plane, your kid’s school – or it might just plop into the ocean. Should we worry? How should we prepare?
The reality is, no-one can make practical preparations for this. It’s a risk we cannot control. It’s highly unlikely that whatever is left of the Heavenly Palace is going to affect anything when it comes down. The likelihood that you personally could be killed by this has been calculated as 1 in 23 trillion – or to put that in perspective, I am 165,000 times MORE LIKELY to win the EuroMillions lottery. Which is also weird, because I never buy tickets!
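For what it’s worth, that comparison checks out arithmetically. A minimal sketch – assuming the commonly cited EuroMillions jackpot odds of about 1 in 139.8 million, which are my figures, not the article’s:

```python
# Sanity-check the "165,000 times more likely" claim.
# Both odds below are assumptions for illustration.
P_DEBRIS = 1 / 23e12            # reported odds of being killed by Tiangong-1 debris
P_JACKPOT = 1 / 139_838_160     # commonly cited EuroMillions jackpot odds

# How many times more likely is the jackpot than death-by-space-station?
ratio = P_JACKPOT / P_DEBRIS
print(f"Jackpot is roughly {ratio:,.0f} times more likely")
```

The ratio comes out a little under 165,000, so the figure quoted above is a fair rounding – and either way, both events are vanishingly unlikely.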
So it’s a totally uncontrollable risk that is highly unlikely to affect any person, creature or property. But are we going to spend a long time speculating about it? You bet we are. Weird!
The second story: an autonomous vehicle was involved in a fatal crash during on-road testing. The public reaction blamed mad science, insisting that this level of risk-taking was reckless and unnecessary.
This sort of reaction is what I think of as “one-sided risk”. It only takes account of the things that could go wrong, the costs, the downsides. It’s also a way of ignoring the current situation, where human drivers regularly wipe out themselves and others.
“Stupid airplanes, always crashing, falling out of the sky and killing everyone.” Perhaps in the 1930s those sentiments would have been more valid, as planes did fail frequently.
But actively learning from failure, getting smarter, improving techniques, materials, procedures, communication, skills… these are the benefits of exploring risks so that we may reach the upsides: reliable planes that take us long distances, quickly, efficiently, safely.
Our world is going to change massively once autonomous vehicles become as developed as air travel has become. Yet instead of calling for greater pressure from transportation regulators to ensure that lessons are learned and that testing takes place under strict controls, we want to ban the risk altogether. We only seem able to see the cost, and are unable to process the enormous societal and environmental benefits that could follow.
Why do people seem incapable of getting past unbalanced, emotive decisions driven by short-term issues to consider the long-term good? Rational thinking is dismissed as irresponsibility: weird!
As the front cover of The Economist says, this week has been an “EPIC FAIL” for Facebook.
Without trying to be political, it’s also been a huge blow to the integrity of democratic elections that have taken place in the last few years, all over the world.
It seems that social media has encouraged us to commit our thoughts, feelings, attitudes and relationships to the safety of web pages which we believe we control. But on the other side of the server, people have been able to pull together this data to know us better than we know ourselves.
We’ve become vulnerable to manipulation, seemingly on an industrial scale. Societies are becoming polarized, and elections are being won in shocking, highly unlikely circumstances.
The technology that we thought was giving us freedom, seems to have been subverted to control a proportion of society so that election results can be forced. If a sense of fairness is lost, then what follows is anger and chaos. The consequences could be huge.
But when faced with really big risks, risks of complexity that affect many different stakeholders, the sort of risks that really should not be ignored, we tend to think that they are someone else’s problem, or we deny their significance, so we can focus on doing easier things. Weird!
So there we have three weirdnesses. We get distracted by the unusual but irrelevant. We shut down risk-taking when something goes wrong, instead of recognizing that error is the price for learning. And we avoid thinking about the big and complicated risks that could really hurt us, in order to satisfy ourselves that we are doing something, even if it is a bit trivial in comparison.