The title, in case you didn’t already guess, is fake news. There was no study. But think about your reaction when you read it. Raise your hand if you said “wait a minute, I always thought fake news was a huge deal but I guess this study proved me wrong. I’ll just change my mind without thinking about it at all.” Anyone? Yeah I didn’t think so. And if you aren’t convinced by a headline on a reputable publication such as this one (OK maybe not so much), are you really buying the fake headlines that the Pope backed Trump or that Hillary actually didn’t win the popular vote?
Recently there has been an uproar surrounding these fake headlines. Germany wants Facebook to pay $500,000 for every fake news story that shows up on its platform. California (of course) wants to pass a law that will make sure every high school teaches its students how to spot fake news stories. I wish those stories were themselves fake news, but they appear to be all too real.
Now there probably are some people who do read these fake headlines and don’t do their research. Maybe they’ll store it somewhere in the back of their mind and use it as evidence to support their positions in debates with their friends. But I suspect that the only people who believe a fake headline are ones who were already inclined to believe it before they read it. No study has been done, but I’ll make the claim anyway: Nobody changes their mind because of fake news.
(One qualification to the above point is that it may break down if real news were censored. Here I am thinking of a case where the government restricts the media so that propaganda becomes the only source of information. Obviously that would be a major problem.)
Perhaps more concerning is that people don't seem to change their minds because of real news either. They don't let the facts guide their positions, but instead seek out the facts that support the positions they already hold. Is believing a fake news story any worse than only believing the stories that confirm your preconceived inclinations?
In other words, the problem is not fake news. The problem is confirmation bias. Everyone’s guilty of it. I certainly am. How could you not be? With the internet at your fingertips, evidence supporting nearly any argument is freely available. And I don’t just mean op-eds or random blog posts. Even finding academic research to support almost anything has become incredibly easy.
Let’s say you want to take a stand on whether the government should provide stimulus to get out of a recession. Is government spending an effective way to restore growth? You want to let the facts guide you so you turn to the empirical literature. Maybe you start by looking at the work of Robert Barro, a Harvard scholar who has dedicated a significant portion of his research to the size of the fiscal multiplier. Based on his findings, he has argued that using government spending to combat a recession is “voodoo economics.” But then you see that Christina Romer, an equally respected economist, is much more optimistic about the effects of government spending. And then you realize that you could pick just about any number for the spending multiplier and find some paper that supports it.
So you're left with two options. You can either spend a lifetime digging into these dense academic papers, learning the methods they use, weighing the pros and cons of each of their empirical strategies, and coming to a well-reasoned conclusion about which seems most likely to be accurate. Or you can fall back on ideology. If you're conservative, you share Barro's findings all over your Facebook feed. Your conservative friends see the headline and think "I knew it all along, those Obama deficits were no good," while the liberals come along and say, "You believe Barro? His findings have been debunked. The stimulus saved the economy." And your noble fact-finding mission ends in people digging in their heels even further.
That’s just one small topic in one field. There’s simply no way to have a qualified, fact-driven opinion on every topic. To take a position, you need to have a frame to view the world through. You need to be biased. And this reality means that it takes very little to convince us of things that we already want to believe. Changing your mind, even in the face of what could be considered contradictory evidence, becomes incredibly hard.
I don't have a solution, but I do have a suggestion. Stop pretending to be so smart. On every issue, no matter what you believe, you're very likely either to be on the wrong side or to have a bad argument for being on the right side. What do the facts say, you ask? It would only be a slight exaggeration to say that they can show pretty much anything you want. I've spent most of my time over the last five or so years trying to learn economics. Above all else, I've learned two things in that time. The first is that I'm pretty confident I have no idea how the economy works. The second is something I am even more confident about: you don't know how it works either.