"I feel like I'm not the sort of person who's allowed to have opinions about the important issues like AI risk."
"What's the bad thing that might happen if you expressed your opinion?"
"It would be wrong in some way I hadn't foreseen, and people would think less of me."
"Do you think less of other people who have wrong opinions?"
"Not if they change their minds when confronted with the evidence."
"Would you do that?"
"Yeah."
"Do you think other people think less of those who do that?"
"No."
"Well, if it's alright for other people to make mistakes, what makes YOU so special?"
A lot of my otherwise very smart and thoughtful friends seem to have a mental block around thinking on certain topics, because they're the sort of topics Important People have Important Opinions around. There seem to be two very different reasons for this sort of block:
- Being wrong feels bad.
- They might lose the respect of others.
Be wrong
If you don't have an opinion, you can hold onto the fantasy that someday, once you figure the thing out, you'll end up having a right opinion. But if you put yourself out there with an opinion that's unmistakably your own, you don't have that excuse anymore.
This is related to the desire to pass tests. The smart kids go through school and are taught - explicitly or tacitly - that as long as they get good grades they're doing OK, and if they try at all they can get good grades. So when they bump up against a problem that might actually be hard, there's a strong impulse to look away, to redirect to something else. So they do.
You have to understand that this system is not real; it's just a game. In real life you have to be straight-up wrong sometimes. So you may as well get it over with.
If you expect to be wrong when you guess, then you're already wrong, and paying the price for it. As Eugene Gendlin said:
What is true is already so.
Owning up to it doesn't make it worse.
Not being open about it doesn't make it go away.
And because it's true, it is what is there to be interacted with.
Anything untrue isn't there to be lived.
People can stand what is true,
for they are already enduring it.
What you would be mistaken about, you're already mistaken about. Owning up to it doesn't make you any more mistaken. Not being open about it doesn't make it go away.
You're already "wrong" in the sense that your anticipations aren't perfectly aligned with reality. You just haven't put yourself in a situation where you've openly tried to guess the teacher's password. But if you want more power over the world, you need to focus your uncertainty - and this only reliably makes you righter if you repeatedly test your beliefs. Which means sometimes being wrong, and noticing. (And then, of course, changing your mind.)
Being wrong is how you learn - by testing hypotheses.
In secret
Getting used to being wrong - forming the boldest hypotheses your current beliefs can truly justify so that you can correct your model based on the data - is painful and I don't have a good solution to getting over it except to tough it out. But there's a part of the problem we can separate out, which is - the pain of being wrong publicly.
When I attended a Toastmasters club, one of the things I liked a lot about giving speeches there was that the stakes were low in terms of the content. When I gave a presentation at work, I had to worry about my generic presentation skills, but also whether the way I was presenting it was a good match for my audience, and also whether the idea I was pitching was a good strategic move for the company or my career, and also whether the information I was presenting was accurate. At Toastmasters, all the content-related stakes were gone. No one with the power to promote or fire me was present. Everyone was on my side, and the group was all about helping each other get better. So all I had to think about was the form of my speech.
Once I'd learned some general presentation skills at Toastmasters, it became easier to give talks where I did care about the content and there were real-world consequences to the quality of the talk. I'd gotten practice on the form of public speaking separately - so now I could relax about that, and just focus on getting the content right.
Similarly, expressing opinions publicly can be stressful because of the work of generating likely hypotheses, and revealing to yourself that you are farther behind in understanding things than you thought - but also because of the perceived social consequences of sounding stupid. You can at least isolate the last factor, by starting out thinking things through in secret. This works by separating epistemic uncertainty from social confidence. (This is closely related to the dichotomy between social and objective respect.)
Read and discuss a book on a topic you want to have opinions about, with one trusted friend. Start a secret blog - or just take notes. Practice having opinions at all, that you can be wrong about, before you worry about being accountable for your opinions. One step at a time.
Of course, as soon as you can stand to do this in public, that's better - you'll learn faster, you'll get help. But if you're not there yet, this is a step along the way. If the choice is between having private opinions and having none, have private opinions. (Also related: If we can't lie to others, we will lie to ourselves.)
Before you're publicly right, consider being secretly wrong. Better to be secretly wrong, than secretly not even wrong.
(Cross-posted at LessWrong.)
Claim 1: "Be wrong." Articulating your models and implied beliefs about the world is an important step in improving your understanding. The simple act of explicitly constraining your anticipations so that you'll be able to tell if you're wrong will lead to updating your beliefs in response to evidence.
If you want to discuss this claim, I encourage you to do it as a reply to this comment.
Claim 2: The social inhibition against making strong claims can interfere with the learning process by making people reluctant to articulate their beliefs, for reasons mostly unrelated to epistemic humility.
If you want to discuss this claim, I encourage you to do it as a reply to this comment.
Claim 3: "In secret." Because of claims 1 and 2, if you notice yourself unable or unwilling to articulate and test your beliefs in front of others, you can get much of the benefit for comparatively little of the difficulty, by doing so in secret.
If you want to discuss this claim, I encourage you to do it as a reply to this comment.
Yes. I believe that this is a good and valuable method by which to build internal honesty, but I also claim that the benefits of public honesty are significantly underrated. (This is based on personal opinion and experience, and relates to your honesty vs authenticity essay.)
I've seen people who are able to consistently portray themselves as correct (whether by being very careful about what they say, or by lawyering the discussion after the fact - especially with highly technical knowledge) without ever accepting blame or culpability. This behavior can easily lead to success in a large, bureaucratic system with lots of rules and formality - because on paper you are always right, or the big winner. But anyone they've interacted with consistently - such as in a smaller, tight-knit group, or any situation where several people who deal with them in different settings can compare notes - will soon see them as deceptive or lacking authenticity. It's a sort of Bayesian reasoning where you have no proof regarding any specific incident, but the whole picture doesn't add up.
But having the respect of friends, family, and peers is, I think, more valuable to most of us in the long run than immediate success. And you don't get that by "being right" - you get it when others' accumulated experience and intuition totals to "I can trust this person." Demonstrating that you don't lie to yourself, that you don't lie to others, and that you either can't carefully massage everything to come out on top (or choose not to if you can) is key to wholesome long-term relationships of every kind.