The experts are ruining the world.
The technocrats we’ve counted on to run the machines, institutions and systems of our advanced civilization have not only failed to protect us from disasters, they now admit that they don’t have a clue as to why they failed.
In just the past few weeks, we’ve seen top experts confess that they don’t know what they are doing.
This lede taken from an article in MarketWatch has half the story right, but badly misses the most salient point. The news story driving the article is a set of exculpatory statements made recently by a diverse group of “experts” disclaiming their responsibility in each of a series of recent disasters. Here are a few of the cases they looked at:
> – Top securities regulators and the leaders of the exchanges confess to Congress that they don’t know what caused the sudden drop in stock prices last Thursday, a market meltdown that destroyed about $1 trillion in wealth in a matter of minutes.
>
> – “Fabulous Fab,” a clever trader at Goldman Sachs, confesses that he didn’t understand the implications of his own financial creations, the kind of paper that forced the global economy to its knees.
>
> – Top executives from the world’s largest car maker confess that they don’t know why one of their best-selling vehicles sometimes accelerates out of control.
>
These stories are accurately reported, but the writer’s conclusion that these people have failed to keep the trust the public placed in them is misplaced. The public should not have relied only on the experts who designed or built the systems that failed. All of these systems are technically complex, that is, their behavior cannot be predicted by the very design model that was used in creating them. The real culprit is our culturally rooted blind faith that the world is a machine that we know enough about to manipulate and control.
We should, by now, have begun to understand that the promise of gains in efficiency or technological prowess needs to be taken with a grain of salt. It may take experts to design such systems, but it always takes a different sort of competence to govern them. That’s what is missing in this otherwise excellent article. The very set of traits that experts bring to the design of systems is exactly the opposite of those needed to keep them under control. Expertise is in part defined by a sense of infallibility, and in most cases that confidence is justified. As a result, the public and those responsible for running its key institutions make the understandable, but erroneous, assumption that the experts are the right ones both to build and to keep watch over the systems. Wrong.
As I have written in my book and on this blog, complex systems require different governance schemes than machines do, however complicated those machines are. The critical rule for complex systems is to assume that you do not know how they are going to behave, especially when stressed. This means that those running the show must always proceed cautiously, carefully monitoring the current state. Limiting the extent of allowable behavioral variation will minimize the chance that the system collapses or undergoes some sort of regime change. In most of the cases mentioned in the article, this translates into the absolute need for some sort of regulation, that is, controls that keep the system from moving too far from the region of known, as opposed to predicted, behavior.
What we don’t need is what the writer wants: “smarter or more honest experts.” Those we already have are smart and mostly honest enough. When they confess that they can’t explain what has happened, they are finally being honest. They are only doing what we ask of them. We listen to them on the upside, but on the downside we’re pretty deaf to them and to other “non-experts” who have learned much about how these systems tick by watching them perform. As long as the systems produce oil from the seabed or make money magically appear, those who call out warnings are dismissed as Cassandras or, in the current lingo, wimps.
It’s not too late to regain control of our world, and we need the experts to do that. After all, when the plane hits a flock of geese and the engines go out, you’d still rather have Captain Sullenberger in the cockpit than Ron Paul.
Yes, we still do need the experts, but we also need the wimps. Captain Sullenberger, in my sense, is much more the wimp than the expert. He got good at what he does through long experience, coupled with careful observation. The experts were those who designed the airplane and assumed that the probability of losing both engines to an errant flock of birds was too low to worry about. I’d rather have the wimps in the cockpit when I fly than the designers.