Be careful what you wish for

Photo by Toimetaja tõlkebüroo on Unsplash

Have you ever gone on a diet, tried to exercise, or tried to become more productive, only to end up fatter, lazier, and less productive?

Then, welcome to the law of unintended consequences, a world where good intentions can have nefarious results.

During British colonial rule in India, venomous cobras were a problem, so the government tried to get rid of them by offering a reward for every dead cobra. Soon, some clever entrepreneurs started breeding cobras in order to claim the prize. When the British eventually realized this and scrapped the scheme, the breeders released their now-worthless cobras, and the wild population increased.

This became known as the cobra effect, and it illustrates the pervasive tendency of complex systems to backfire when we meddle with them, however good our intentions.

There are numerous examples of this in politics, the economy, the environment, and many other domains.

  • The introduction of invasive species like rabbits in Australia wreaked havoc on local habitats because the rabbits had no natural predators.
  • Prohibition of alcohol in the US did little to curb consumption and fueled organized crime.
  • Interventionism in the Middle East, intended to pacify the region and reduce terrorism, has had the opposite effect.
  • Viagra was developed to treat a heart condition, but it is now used to treat erectile dysfunction.
  • The microwave oven was invented by accident when an engineer working on radar technology after WWII noticed it melted the chocolate bar in his pocket.

We tend to think that the experts — economists, scientists, engineers, politicians — know what they are doing and have a clear mental model that matches reality. But this is just an illusion. Most of the time they play it by ear, proceed by trial and error, and often stumble on results they weren’t looking for.

There is also a human tendency toward interventionism: when in doubt, take action and figure out how to clean up the mess later. This seems to be the sentiment behind the wars in the Middle East, some medical interventions, and some environmental interference, despite the evidence that it is very often better to do nothing.

There are several reasons for this lack of vision:

  • Naivety. We delude ourselves into believing we have instantly found the solution to the problem. We are biased toward action, so intervention always feels like the best response, but we are unaware of our blind spots — we don’t know what we don’t know.
  • Rush. For every problem, we find a solution, any solution, quick! Action always seems the best option instead of reflection. This can lead to irresponsible and indiscriminate action which does more damage than good.
  • Cause and effect. We behave like amateur detectives: we know who is guilty before CSI does. If everything is so obvious, how come the experts don’t see it?

Common sense is the least common of all the senses

  • Black and white. We tend to oversimplify, to take shortcuts, to fall prey to cognitive biases. This makes us go extreme, all or nothing, but reality is not binary and keeps creating waves that increase the complexity.
  • Complexity. Reality is more complex than we think. We’ll never be able to fully understand it. Not that long ago our ancestors came down from the trees and now we pretend to understand quantum physics. We must accept that there are some limits to human knowledge and we need to try to survive in a world we don’t understand.

First, we must understand that complex systems — the human body, the environment, the economy — are almost impossible to understand at a deep level. Economists, sociologists, and especially politicians need to be more humble about what they think they know and admit they can’t see the whole picture yet.

We all need to realize that, by pressing that button, there will be intended and unintended consequences that are impossible to predict with the information we have now.

Second, we need to experiment with low-risk ventures, sandbox our terrain, and tinker in a controlled environment. Do everything in incremental steps: trial and error, and slow, steady improvements.

Third. Understand risk. Risk is impossible to predict and difficult to avoid. The only thing we can do is design systems that are robust and prepared to survive worst-case scenarios. Don’t try to predict the tsunami; move 10 km inland.

Fourth. Avoid top-down approaches. Next time you hear an academic professor of economics lecturing about quantitative easing, run away quickly. Top-down systems are based on the naive and arrogant assumption that their designers have the perfect model of reality, even though that model is pure fiction.

The Soviet Union was an attempt to simplify and optimize the economic system from the top down. Like any mental model, it didn’t work outside the mind.

Fifth. Keep your eyes open. Often, what you find by chance is more important than what you were looking for in the first place. This is true even in science. Many scientific discoveries were made by chance while looking for something else — insulin, penicillin, Teflon, Velcro.

America was “discovered” while Columbus was looking for a route to the Indies.

This is called serendipity, and it illustrates perfectly how making a plan and sticking to it rigidly can blind you to better outcomes.

Sixth. Forget about efficiency. Trying to optimize everything is counterproductive: when something is too streamlined, there is no backup in case of failure.

Globalization is, among other things, about efficiency. Every country specializes in a small part of the production process and thus the world can benefit from abundant goods at a low price…until Shit Hits The Fan, like now.

In the middle of the COVID crisis, the main exporters of basic commodities like rice, wheat, and corn are in lockdown, trying to preserve what they have for their own citizens. If this continues much longer, it could lead to supply chain disruption and a breakdown of the system.

Seventh. Decentralization. The bigger an organization, the bigger its mistakes. Better to have many small errors that can be corrected easily than one huge, irreversible mistake. By spreading the risk across smaller units, the system becomes more robust.

If you look back at your life — and manage to avoid the narrative fallacy of justifying the decisions you made as intended, planned, and rational — you will have to admit that most of the things that happened to you were just coincidences. Luck plays an important role in life, although most of us prefer to cling to the illusion of control.

Perhaps you became a karate teacher just because there was a dojo near your house, or a lawyer because your friend talked you into it, or you fell in love with your partner by pure coincidence.

Everything you are — what you have, your values, ideas, and inclinations — is the result of pure chance. Had you been born in a different country, you would be a different person now.

Knowing this, don’t get trapped by dogma, and don’t try to forecast and plan every detail of your life as if you had total control.

Be open to serendipity. Keep your eyes open, try new things, tinker, take small risks, stay open to the gains, be opportunistic (in a good way), and keep your options open.

When you travel, don’t behave like a tourist, following the beaten path and taking selfies at the same landmarks. Turn off your GPS and improvise; follow your intuition. Get lost. Speak to a stranger. Your experience will be richer than that of those poor tourists following an umbrella.

Life is full of unintended consequences. I came to this world “by mistake”, according to my mum, and I’m glad to be here. Enjoy your life, don’t take it for granted, and keep your eyes open. Life is full of surprises, and some of them are really good.

