A Response to Black Swans
- “Knowledge” is a personal view of life that is absolute
- Knowledge often leads to a misunderstanding of our surroundings
- If we’re able to reduce what we know, we can look at the world objectively
- Consider the five reasons below for reducing what you know to increase your understanding
We live in an information-rich society. That sounds like a good thing, but if you read my syndicated article on improving more by learning less, it quickly becomes clear how negatively impactful an inundation of information can be.
With information comes the illusion of knowledge, and with knowledge comes the illusion of understanding. And when your understanding directly impacts your plan for life, it’s important to reduce your overexposure to information, and therefore, reduce your knowledge.
Before we go on, it’s important to define knowledge as: a personal infallible truth – a view of life that is absolute.
Knowledge is some idea or theory that we not only personally believe in, but also base our entire lives around. So when I say reduce what you know, I’m not advocating a lack of intelligence. Rather, I’m saying that we shouldn’t necessarily believe that our knowledge is right, and definitely not an absolute.
Great! Let’s continue, as we look at five reasons why it’s paramount to reduce what you know:
1. Ideas are sticky, even the bad ones
We love our ideas, and some of us love to tell people about them (not me, of course!). What’s tricky about ideas is that it’s easy to fall in love with them, even the bad ones. We all have our pet theories, and we grab onto them like they define our lives. It’s this rabid identification with ideas that causes us to miss out on the actual truth.
When we hold onto ideas and theories, we tend to misinterpret potential new information and viewpoints, often to our detriment.
“The problem is that our ideas are sticky: once we produce a theory, we are not likely to change our minds. When you develop your opinion on the basis of weak evidence, you will have difficulty interpreting subsequent information that contradicts these opinions, even if this new information is obviously more accurate.” – Nassim Taleb, The Black Swan
By no longer treating what you “know” as infallible truth, you approach the world with no prejudices, and more importantly, no blind beliefs. You’re able to see experiences, facts, and people for who and what they really are. It’s this ability to remain open-minded in the light of new evidence that actually makes us more informed.
2. Everything is cyclical, but nothing’s repeatable
Think of this in terms of economics (damn you, finance background): We can all agree that the business cycle and the economy are both cyclical, no?
There will always be economic booms and busts, and as a product of two major recessions, I’m pretty interested in the idea of another one. So, with how much accuracy can you predict the next economic downturn?
If history is any indication (which it is and it isn’t – important to note), then it’s pretty clear to see how hard it is to accurately predict the timing and ramifications of a downturn.
Leading economists and clerics may have predicted the 2008 crash, for example, but I’d look into how many times they’ve predicted a downturn. Probably every other year, leading me to believe that more knowledge does not make you a better predictor.
“The more information you give someone, the more hypotheses they will formulate along the way, and the worse off they will be. They see more random noise and mistake it for information.” – Nassim Taleb, The Black Swan
The point is that no level of information will let you forecast with accuracy when the next ’08 crisis will occur. But, if you know that an economic downturn will happen at some point (which we can agree is true), then isn’t that all the information you need?
The only knowledge we need to insulate ourselves from the next recession is that it’s going to happen one day. We don’t need to know why, when, or how bad. Prepare for it! Build a strategy that will allow you to take advantage of the situation when it occurs, since you know it will occur.
The fact of the matter is that history is cyclical, but it isn’t repeatable. The next downturn will have results similar to the tech bubble or the financial crisis, but will have enough nuances to cause us to “miss the signs.” Don’t ignore the general writing on the wall by looking in-depth for the color of the pen. Know that it’s there and plan for it.
3. Confirming the confirmation bias
Let’s face it, we’re all biased. As we now know, we have pet ideas we hold onto, even in the light of new evidence.
But even more alarming, we actively seek information that confirms our biased predisposition. So not only do we have a tendency to miss new information because of what we think we know, but we then look for evidence to support what we’ll probably believe anyway, contradiction or not.
The Skeptic’s Dictionary defines the confirmation bias as: “a type of selective thinking whereby one tends to notice and to look for what confirms one’s beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one’s beliefs.”
And when we’re looking for something we want to see and believe, again Taleb reminds us that “these instances are always easy to find.”
It’s easy to seek out information that makes you feel good about your thinking or outlook, and much harder to look for information that contradicts your knowledge and beliefs. By reducing what you know, instead of actively searching for confirming evidence, you look for contradictory evidence.
Either way, by looking for contradictions, you actually increase your understanding of the world around you. If there is contradictory information, then you have the ability to update your knowledge and viewpoint, and if there isn’t contradictory information, then huzzah, you were right all along!
“This,” Taleb says, “is true self-confidence: the ability to look at the world without the need to find signs that stroke one’s ego.”
4. Good stories are a narrative fallacy
Narrative fallacies are my pet ideas, and I’ll be the first one to admit it. Regardless, a narrative fallacy describes the human tendency to shove everything into a simple framework that makes sense.
Basically, it’s the story you tell yourself that gives order to the world. “All homeless people are smelly and unmotivated,” for example. Mostly true, but I’m sure you can find an extremely motivated and well-washed homeless person who has chosen to live a life on the road.
But you’re not going to walk by every homeless person you see and ask them if they have goals, or smell their hair. You’re going to assume that they are the scourge of the Earth, because that narrative makes sense to you. If you meet an engaging homeless person, it’s going to shatter your knowledge of this narrative, and since shattering a belief is scary, most of us choose to believe the simple narrative and move on with our lives.
“The fallacy is associated with our vulnerability to overinterpretation and our predilection for compact stories with raw truths.” – Nassim Taleb, The Black Swan
When we fail to reduce what we know, we live a life made up of narrative fallacies, where what we tell ourselves isn’t actually true. The narrative fallacy combines with the confirmation bias to cause a distortion of reality, where there is a difference between what you see and what is actually there.
5. The expert problem
I think we can explain this problem in one word: hubris. When we become an expert, we believe that our knowledge and understanding can account for everything we’ll encounter. We assume that our intelligence will provide us with absolute insight.
Sadly, not so. First, as mentioned previously, we reach a point where no additional pieces of information will make us better at forming conclusions:
In 1965, Stuart Oskamp gave clinical psychologists – who are experts by society’s understanding – various files, each containing increasing amounts of information about their patients. Contrary to belief, the accuracy of the psychologists’ diagnoses did not improve with the additional information. When it comes to sciences that deal with the unpredictability of humans, which many do, gut instinct is often more powerful than reliance on information.
Second, when we become an “expert” (by the way, who decides when someone is an expert – do they get an identifying tattoo?), our knowledge actually blinds us from seeing the greater picture.
For example, Taleb points out the work of psychologist Philip Tetlock, who studied political and economic experts. “His study exposed an expert problem: there was no difference in results whether one had a PhD or an undergraduate degree. Well-published professors had no advantage over journalists. The only regularity Tetlock found was the negative effect of reputation on prediction: those who had a big reputation were worse predictors than those who had none.”
Further, experts have trouble seeing events that come from outside their specialty, even though those events may have a huge effect on their specialty or life. It’s much better to seek a broader understanding of the world by choosing to know less than to pursue deep expertise that blinds you to your surroundings.
I should point out again that when I talk about knowing less, I’m not advocating ignorance. In fact, it’s just the opposite. When we put too much stock in our current knowledge, we begin to think in absolutes, rather than with an open, unprejudiced mind.
By reducing what you know, you accept that life is random and unpredictable, and approach it with open eyes. By stopping the information overload synonymous with today’s society, you give yourself the best chance to succeed. And to me, success trumps information, 10 times out of 10.