Being wrong

Opinion by Adrian Liu
Feb. 28, 2019, 1:00 a.m.

As I sat down to write my first opinion piece, a familiar thought greeted me: I understand so little of what I’m writing about that I can’t have an opinion on it.

This thought had previously been most salient to me in discussion sections. I would stay quiet, spurred on by an insecurity: because I knew so little about the nuances of the topic, I was nowhere near qualified to state an opinion. My contributions would inevitably be oversimplified, miss crucial aspects of the relevant discussion and likely just be wrong.

The issue is surely too complex for me to understand, I would think — best to wait a few years, read more about this and then form some opinion on it after a PhD. And best to stay quiet until then.

I think this inclination of mine was born of a reaction to people, inside and outside discussion sections, who seemed overconfident in the correctness of their views. An overconfident, unconsidered opinion betrayed a disregard for the truth, in my view. Conversely, if I valued the truth, I should keep my mouth shut when I was unsure of it.

I came to realize, however, that this attitude, taken to the lengths I took it, was unproductive. Not wanting to express potentially untruthful opinions might have seemed a form of respect for the truth, but it also hampered my ability to seek it. In declining to have an opinion, I was throwing my hands up, intellectually speaking — saying the situation was too complex for me to judge, and then punting rather than plumbing the complexities at hand. I would assume there was something incomplete or incorrect in my understanding, but never figure out what, specifically, was wrong.

It was through writing essays, when I was forced to take an opinion and defend it, that I found myself getting closer to defending positions that seemed correct to me. In the process of articulating what I thought was the most reasonable position on something, I could find out where in fact my understanding was incomplete or incorrect. Once I had an opinion, and once I understood what that opinion was, I could understand the ways it was wrong. In section discussions, I had been avoiding this process and thus staying with my hazy intuitions and my hazy convictions that they must be somehow wrong.

It’s not fun to be wrong, and I think we all find ourselves avoiding being wrong at times. In my case, I would avoid voicing opinions. Let’s call me an “auditor.” In other cases, I see people voice opinions while disavowing them in order to avoid being wrong. Let’s call such a person a “devil’s advocate.” They will debate, but only from the viewpoint of the devil — that is, they will take up a viewpoint, but not own it as their own. It’s a hypothetical, a viewpoint that someone might hold. If it’s wrong, that’s fine because they never claimed to believe it themselves anyway. By holding no opinion at all, they’ve removed all possibility of being wrong.

Another group avoids being wrong by voicing their opinions and vigorously rejecting other viewpoints, seeking not the truth but the protection of their own views. Let’s call them “debaters,” since in a debate you first assume your side is correct, and then argue from there. Overconfident debaters must protect their self-conception of being always right: they shield themselves from any suggestion that they might be wrong by ignoring, mischaracterizing or dismissing opposing opinions. Insecure debaters act similarly, but from a place of fearing that their opinions are not robust enough to stand up to scrutiny. If they reject scrutiny out of hand, it can never challenge their views.

Academic discourse can be described as a collaborative conversation that aims at the truth. In this conception, both my former stance and the stance of the debater are woefully counterproductive: mine because there can be no conversation if one side is not talking, the debater’s because there can be no aiming at truth if either side thinks they’ve already found it. The devil’s advocate is slightly more useful — they can poke and prod at arguments, and if we can fortify our arguments to stand up to the devil’s advocate, we’ve made them stronger. But infernal advocacy should be undergirded by a genuine viewpoint; otherwise it is truly a devil’s viewpoint, tearing down opinions while making no attempt to build up new ones.

A productive conversation occurs in a middle ground wherein all sides are aiming at the truth, have opinions to bring to the table and are willing to be wrong.

Of these three properties of the middle ground, the last one — willingness to be wrong — is surely the most difficult. By presenting opinions in essays and receiving feedback, I gained a better understanding of all the places where I was liable to be wrong. But as I was wrong more and more often, the quotidian truth became increasingly salient to me: being wrong is not fun. I found myself veering toward the other side — toward wanting to defend my opinions against the suggestion that they might be wrong, to dig my heels in and insist that criticisms were misguided, that objections were based on misunderstandings of what I was saying.

In some cases, I would find myself convinced by a counterargument and then, apparently unwilling to admit I had changed my mind, continue arguing a position I no longer agreed with. Bemusedly, I would watch myself defending an opinion I no longer believed in, wondering helplessly why I was digging my own grave. Through these experiences I’ve gained some measure of empathy for the plight of the debaters. The desire to protect one’s viewpoint is certainly understandable, if unproductive.

I’ve introduced four types of discussion tactics born of not wanting to be wrong, and named their practitioners: the auditor, the devil’s advocate, the overconfident debater and the insecure debater. The list is surely not exhaustive, but these types give us examples of prominent strategies we are apt to come across in our discussions. In contrast to these, the mindset that is actually conducive to productive conversation aiming at the truth requires a willingness to put one’s opinions out into a forum in good faith and to accept that they’re probably wrong.

Good faith, in this context, is basically a commitment to truth: what you say should be what you think is closest to the truth, and you should challenge others only when you believe they could change their view to get closer to the truth. This also means that you are attached to your opinions only insofar as you see them as good approximations of how the world actually is, and that once you are convinced there is a better approximation, you resist the urge to hold on to your old view.

In practice, good faith means recognizing all the ways we are apt to fall short of these goals — I’ve noted four — and simply making an effort to avoid them. I suspect all of us can see ourselves in one of the four classes of discussion tacticians. Whichever one hides inside you, I urge you to join me in working to get to the middle. Such centrism in truth-seeking is not a cop-out. Let’s accept that we’re inevitably going to be wrong, and that it’s not going to be fun, but that we can at least be wrong together. Only then can we collaboratively inch closer to getting it right.

 

Contact Adrian Liu at adliu ‘at’ stanford.edu.

Adrian Liu '20 was Editor of Opinions in Volumes 257 and 259.
