Disinformation seems to have increased in recent years, perhaps dramatically. There are many reasons for this. One is sheer information overload. When there is a massive volume of information to process, people are more prone to fall victim to demagogues and other sources that exploit their biases. Another problem is the attention economy. Large tech companies have an incentive to send us the most engaging content, not the most accurate content. And those two categories are often very different.
While an ultimate solution to this problem has to rely on systemic interventions – I’m personally fond of nationalizing the large tech platforms to ensure they’re focused on the public good, and not maximum profits for a tiny number of robber barons – there are still things we can do as individuals to protect ourselves from disinformation. I’ve spent a lifetime trying to understand how information is transmitted from one human to another. So I thought I’d share some tidbits I’ve picked up, over the years, that might help others – or provoke a conversation as to whether the standards I’m using are the right ones.
Some of this is taken from Phil Tetlock’s work. Tetlock, the founder of the Good Judgment Project, has studied the best decision-makers on the planet. Other tips are things I’ve picked up from personal experience. (Treat this latter category as significantly less credible, as personal experience is notoriously unreliable, unless it comes from an evidence-based expert. I’d place myself very weakly in that category, partly because of my studies but also because of the sheer volume of people I’ve talked to, and tried to influence.)
Does your source change their mind?
Flip-floppers are widely condemned in public life. John Kerry may have lost his presidential race because of his history of changing his mind. And many people lauded Bernie Sanders for his principled stands over the years. It makes sense, of course, to trust people more if they don’t bend their beliefs to fit the political fads of the day. Even worse are people who are two-faced: who say one thing to one person and another thing to another, out of pure self-interest.
But that should not prevent us from truly appreciating people who are able to change their mind. Even the best minds, after all, make mistakes. And if someone seems unable to change their mind, and resistant to other ideas, that is a sign that their information might not be credible.
Do they change their minds in big ways or small?
We’ve all met people whose judgment changes rapidly. One day they are vegan. The next day they are on a carnivore diet. One day they love Joaquin Phoenix. The next day, they say he’s a sell-out and an asshole. There is nothing morally wrong with such rapid and dramatic shifts in judgment. But Tetlock and other researchers have found that, while people who have good judgment change their mind a lot, they change their mind in small ways.
There are at least two reasons this is important. First, any new information we receive, about an issue or person, has to be weighed against all the other information we already have. And an accurate process of “updating” our beliefs should weigh that new evidence against that long history. Rarely do we get new information so overwhelming that it outweighs everything we’ve learned before. But the emotional momentum of what’s recent and “available” often distorts our judgment. Credible sources avoid this bias.
Second, human beings tend to move in social waves; they get caught up in the fads of the moment. But this often distorts our judgment. We base our beliefs and opinions on what will give us social standing, rather than what we can ascertain to be objectively correct. People who tend to make small changes in judgment can resist this social momentum.
Do they consider the counter-arguments – and frame those counter-arguments in their strongest form?
In his book Think Again, the psychologist Adam Grant argues that good decisions require a certain form of self-doubt. Across many domains, the people with the best and most credible judgment appear to start not from where their beliefs are strong, but from where they are weak.
This is part of the project of this blog, of course. We want to “steel-man” other perspectives; in contrast, typically what you see in debate and discussion is a “straw man.” That is, a counter-argument that is so distorted and fake that it’s hardly an argument at all.
We see this in debates constantly, including in animal rights. We will say, for example, that there are no health benefits at all to eating meat. Or that there’s no scientific value in experimenting on animals. And we don’t bother to dive deeper and understand why our assertions might be wrong. We assert simply because it fits our pre-existing narrative and bias.
But this is a recipe for self-delusion that prevents us from understanding the world the way it is — and, even more importantly, trying to change it.
The sources we trust, then, should not do this. You should trust sources more when they make the best arguments for the other side, rather than merely attacking people who disagree as stupid or corrupt.
Are your sources the hero of their own stories?
This is something that I’ve just picked up over the years, so take it with a grain of salt. But I’ve found that, across a wide range of human behavior, the people who are always the hero of their own stories generally don’t get things right. The world is a complicated and messy place; so, too, are we. And sources that are credible should show this by admitting their own mistakes.
Technical mistakes, ones that relate to erroneous information, are the easiest to admit. For example, perhaps they read a table wrong, or just quoted the wrong number from a chart. Sociopolitical mistakes, relating to one’s beliefs about norms and people, are harder to admit and matter more. Was there a time in your life when your judgment about someone else was wrong? When you were too wedded to a political ideology that you later realized was a mistake?
But the most important mistakes to admit are mistakes of character and morality, precisely because they are the most costly. These are mistakes not of information or social judgment, but of our own ethical judgment. Was there a time that you prioritized your own well-being over the well-being of the people on your team, or even your family? Was there a time that you let your anger get hold of you, and it led you to hurt someone in a way that was unfair?
Phil Tetlock has written that the best decision-makers are the ones who admit their own biases. I think that’s true, but I also would argue that the biases that are the hardest to overcome are the ones that are most humiliating to admit. And while many of us will admit to having a fact wrong, far fewer will admit that they acted wrongly.
So when you look at a source, ask yourself whether they are the hero of their own stories, or whether they are sometimes the villain. If they’re open about the latter, then, perversely, they might be a source you can trust.