Ignorance Is Not Bliss: How Ignorance Kills Conversations

Last week, rapper B.o.B. made an extraordinary claim that left most of the scientific community, let me rephrase that, most of the world dumbfounded. It turns out that after all his research and years of scientific experimentation, the entertainer finally answered one of humanity’s deepest unanswered questions: what is the shape of the Earth? Well, to your unending surprise and amazement, it turns out that our planet is actually flat. Like a pancake. But that’s not all. The entertainer readily took to Instagram to present his mounting evidence. Oh, but it doesn’t end there. When confronted by pretty much one of the greatest scientific minds alive, Neil deGrasse Tyson, the rapper not only defended his point but went on to record a diss track to put the astrophysicist in his place. The consensus is in: the Earth is flat.

Well, you can probably guess how that went. In about fifteen minutes the Internet hilariously did its duty and mopped the flat floor with this guy. Unfortunately, in a world where mass communication makes it incredibly easy for news to spread like wildfire, the message was already traveling the airwaves. The damage had been done.

Although this guy managed to set us back about a thousand years and simultaneously put the American education system to shame (or in its place, depending on your point of view), nobody actually takes him seriously enough to change the science books. Still, there are many people out there who, due to B.o.B.’s high(ish) profile and influence, and their own limited knowledge, are willing to believe his incredible claim over someone who happens to be an authority in this particular area, among many others. It makes you wonder what would motivate someone like B.o.B. in this day and age to go against a planet’s worth of evidence and scientific proof with such confidence and swagger, especially when information is incredibly easy to access.

Well, as it turns out, this happens a lot more often than you might think. In fact, most of us do it daily: we take the knowledge we have and apply it wherever it may be needed, sometimes without realizing that we know very little about what we’re talking about. Essentially, most of us go through our days believing we know more than we do. We also underestimate our abilities in certain areas of our lives, and all of this compounds into social errors that ultimately affect how we present ourselves to the world, how others see us, and the toll our credibility takes among our peers.

While it’s perfectly fine to be skeptical and to question institutions and established ideals, we must be careful not to make claims that we cannot back up with hard evidence. Carl Sagan said it best: “Extraordinary claims require extraordinary evidence.”

Take note B.o.B.


…the more you think you know, the saying goes. Unfortunately, very few people know, or are aware, of this fact.

As I understand it, what happened with B.o.B. is what psychologists call “illusory superiority”, or the Dunning-Kruger effect: a person with little or no skill in a particular subject or domain believes their skill to be greater than it is, while a person with a high degree of skill tends to underestimate their own abilities or fails to recognize that others are not at their level. Note that what I’m referring to has nothing to do with intelligence and everything to do with information storage.

Why is this a problem?

If you’ve ever been in an argument you couldn’t win, not because of your own incompetence but because of your opponent’s indifference to their very own incompetence and their insistence on their limited evidence, then you understand that frustration firsthand.

In fact, we see this problem every day and everywhere, from street corners to scientific conferences. People who know very little are usually the ones making the most noise. There is a certain overconfidence that overrides any reservation of doubt, while those who know a bit more hold more reservations precisely because they know the margin of error they might find themselves in.

In a study conducted by psychologists David Dunning and Justin Kruger, the authors attribute the problem to a miscalibration of people’s abilities due to a “deficit in meta-cognitive skill”, in their own words. The study was done mostly with Cornell University undergraduates, which means there’s already a certain handicap present. It should translate to the rest of the population in some measure, although the paper makes no mention of a replication. But whereas Cornell students are more apt to know their own academic shortcomings in certain areas than the average high-school graduate, the study does clearly show that, and forgive me for being so blunt about it, the cure for your stupidity is, perhaps paradoxically, for you to not be so stupid about it.

The study revealed that when the bottom quartile of students (meaning those who were most confident and least competent) were given short training sessions to improve their logical reasoning skills, essentially providing them more information with which to determine their actual level of expertise, they almost magically improved the assessment of their own scores to match that of the top quartile of students. Note that the training packet did not make them expert test-takers; it simply gave them a guide for finding their own competence level. How does this translate to the rest of us? The study demonstrates something that should come as no surprise: whether you overestimate your knowledge of something or underestimate your own abilities, the only way to right the ship is to acquire more information! Basically, the more informed you are, the less willing you are to overstep your boundaries.

Of course, that’s easier said than done. After all, these students had a helping hand: they were provided the information necessary to become aware of their error, whereas the rest of us are left to come to this realization all on our own.

The Unregulated Market of Information

As it turns out, lying is not as good a social currency as some people might think. Some, of course, reap great benefits by doing it. But unless you are a politician, or have some Machiavellian goal in mind, lying or even spreading partial untruths can have a detrimental effect on your place in society. That is simply because people everywhere in the world recognize sincerity as an unspoken social rule that keeps us “within the tribe”, and violating it puts us outside of this circle, even, and ironically, when we do it ourselves.

The problem is that information is an unregulated market, especially in our time, and especially in places where information is free and widely available, as it is in most of the world today. Mass communication has made it easier for people to propagate ideas and messages while at the same time making it increasingly difficult to separate reality from myth. And honestly, nobody could blame you, when daily we are hit with more information than we can decipher.

The abundance of information has also had an adverse effect. It has allowed consumers to pick and choose the information that best suits them: information that confirms their suspicions, biases, likes and dislikes, and so on. No longer are we tied to the inconvenience of truth; now we are free to select only what we like. But it is actually worse than we imagine.

In a pre-Internet world, it was not only harder to propagate a message but also harder to access one. This kept everything tied to single strands of information that could be followed and scrutinized more easily. Today anyone with a computer can create his or her own ideological bubble, and they do, and thus isolated bubbles of information form, essentially keeping groups with the same frame of mind contained. This is where the Dunning-Kruger effect comes into play: it is easier to maintain the level of knowledge you have within your group than to spend any significant amount of energy looking into different, and possibly contradictory, claims.

When we look for information, some of us are looking for nothing other than what confirms what we already think or know, and we ignore the rest. That’s called confirmation bias. But when we are exposed to something beyond that, we are put at risk of leaving the social bubble we are part of, our tribe as it were. We may accept the new information or reject it, and the discomfort of holding it up against our existing beliefs is what’s called cognitive dissonance.

Two things can happen when these ideological bubbles crash into each other: they either burst violently or they merge. The best thing that can happen in any society is for these bubbles to merge because this means that ideas are exchanged and discourse is created.

The Burden of Authority

Too often people challenge the authority of experts as if they themselves were experts in those fields, because they have read one or two magazines, paged through a few internet journals, skimmed an article, or watched the Discovery Channel. I actually applaud these people, because authorities of any kind should be scrutinized. There’s a twist, though.

Unsurprisingly, experts in any field of study spend years, sometimes decades, perfecting their art. They challenge other thinkers, often challenge established ideas, and present their own, sometimes meeting severe opposition from their own colleagues. This happens everywhere in life, and it’s not a rare occurrence for innovators to be shunned by those who maintain the sanctity of knowledge. And because knowledge itself is so fragile and malleable, unlike facts, experts are often scrutinized for what they claim to know. Although it’s not necessary to be formally trained in an area, self-teaching can only get a person so far before they are met with an unavoidable fact: to improve one’s knowledge, one must be able to see an opposing position, or at least recognize that one’s picture of the world is not complete without the input of others, especially those who can back up their claims with expertise and hard evidence. And even then it might not work, just as it didn’t for the [very knowledgeable] proponents of the theory of cold fusion.

My point here is that although authorities can be harsh in criticizing new ideas, it’s not without precedent that those who make new, and at times improbable, claims end up changing the world.

Our beef with B.o.B.–or anyone else who claims to know more than the experts–is not with the claim itself, but rather with the [weak] evidence presented.

Every field and discipline that exists will always have a pyramid of authority that at times is hard to climb. One must be bold to attempt to climb it, but one has to go about it carefully. The pyramid can be indeed humbling.

The Power of “I Don’t Know”

With the wide availability of mass communication, it might sometimes seem that reliable information has to be mined, while disinformation and misinformation are always readily available. The hard part is knowing what you’re looking for and being smart enough to smell the bullshit wherever you may find it. Case in point: this blog.

If anything, embarrassment alone should be a powerful motivator to fact-check. Many people who have been publicly embarrassed by the Internet police can attest to that. As odd as it may sound, that unrelenting, unforgiving side of the Internet is the good side of shared mass communication. I’m speaking generally here, of course; it doesn’t happen every time, or to everyone. But rather than face public embarrassment or worse, there’s an easier solution.

Although to most of us the phrase “I don’t know” signifies ignorance, we as a society need to recognize that sometimes ignorance is the best thing that can happen. Honest ignorance, that is; willful ignorance is another matter. “I don’t know” opens the door to new possibilities and discoveries, and in certain situations it lends a certain trustworthiness to a person. It lets the world know that while you don’t know the answer, you will not make up bullshit just to appear knowledgeable, and that you are willing to find an answer. So never be afraid to utter it, and don’t condemn those who do.

The concept that most of us don’t seem to grasp is that people who speak as equals are not necessarily equally informed. In some cases one is detrimentally less informed than the other. So the problem in normal conversation is not ignorance; it’s closed-mindedness: willful ignorance, claiming to know something you are nowhere near sure of. People are so confident in their answers simply because that is all the information they have at their disposal. But the wrench that breaks the machine is the denial that there is any more information out there that they just haven’t had access to yet.

To me there are few things more corrosive in topical conversation than preconceived knowledge. People must be made aware that knowledge is ever-evolving and migratory. Even when we believe we know everything about something, there is always more to be learned. In fact, that’s what formal education is based on: building upon previous knowledge.

So next time you argue a point with someone, ask yourself if you have all the facts and if you truly know what you’re talking about, and be humble with the information you possess, because you never know when someone will know more than you and not be humble about it (and that includes the Internet). If only someone had told that to B.o.B.

And not that I would, but as for what you’re reading now, read it with skepticism, because you don’t know me; I could be bullshitting you for all you know.




Interesting Reads:

“What’s Wrong With Lying” by Christine M. Korsgaard, Harvard University. http://www.people.fas.harvard.edu/~korsgaar/CMK.WWLying.pdf

“Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments” by Justin Kruger and David Dunning, Cornell University (the Dunning-Kruger paper). http://psych.colorado.edu/~vanboven/teaching/p7536_heurbias/p7536_readings/kruger_dunning.pdf

“Perceptions in Politics.” http://thehill.com/opinion/john-feehery/266941-john-feehery-the-rise-of-the-misinformed-voter-and-donald-trump