Take a look at this tweet:
I saw this in my Twitter feed because Glenn Kessler, fact checker for the Washington Post, retweeted it. I immediately thought it seemed weird. I've been skeptical of any analysis of survey results since I realized accepted practices in the field let you prove anything, no matter how untrue, just to attack groups you don't like (demonstrated here).
That wasn't what was at play here though. What was at play was that the results seemed too good. I know a lot of people believe the United States is growing more polarized, and that may be true, but look at those numbers. 92%? 94%? Even if those numbers were correct, how could we measure the liberalism and conservatism of an entire country well enough to know?
I'm not convinced we could. To investigate, I decided to look at the data. It turns out that's not available. The Pew Research Center doesn't publish their data so people can check their conclusions. They publish a summary of sorts, but it doesn't even include the information shown in their charts. It shows response rates for all respondents, not stratified results for Republican and Democrat respondents separately. That means readers don't even have the information necessary to plot the same graphs, much less verify them.
I think it's weird news articles get written based off work which cannot be investigated. I think it's weird for journalists to trust a source so much they decide verification is not necessary. But whatever. It's apparently an accepted practice. Journalists will just publish graphs people hand them if they like what those graphs show.
That's not just me being snarky and criticizing the lack of investigation. It's me pointing out the way these graphs are being used is completely bogus. The graphs were created by combining the results of ten questions. No explanation is given as to how these questions were originally chosen. No explanation is given as to why we should expect these questions to measure conservatism/liberalism. The report says:
The questions cover a range of political values including attitudes about size and scope of government, the social safety net, immigration, homosexuality, business, the environment, foreign policy and racial discrimination.
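The report doesn't say how the ten questions were scored, but a composite scale like this is typically built by coding one option of each question as +1 (the "conservative" choice) and the other as -1 (the "liberal" choice), then summing across questions. Here is a minimal sketch of that kind of scale; the coding scheme and the sample respondents are my assumptions for illustration, not anything Pew has published:

```python
# Hypothetical sketch of a ten-question ideology scale.
# Each answer is coded +1 (the "conservative" option) or -1 (the
# "liberal" option); a respondent's score is the sum across all ten.
# The respondents below are invented examples, not Pew data.

def ideology_score(responses):
    """Sum of +1/-1 codes across the ten questions."""
    assert len(responses) == 10, "expected answers to all ten questions"
    assert all(r in (1, -1) for r in responses), "answers must be coded +1/-1"
    return sum(responses)

# Two invented respondents:
mostly_liberal = [-1, -1, -1, 1, -1, -1, -1, 1, -1, -1]
mostly_conservative = [1, 1, 1, 1, -1, 1, 1, 1, 1, -1]

print(ideology_score(mostly_liberal))       # -6
print(ideology_score(mostly_conservative))  # 6
```

Notice that a single changed answer moves a respondent's score by two points. So if the "natural" answer to even one question shifts with economic conditions or policy changes, the whole scale shifts with it, which matters for the questions discussed below.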
Why are those the issues we need to look at to measure conservatism/liberalism? It doesn't say. It doesn't even say how we know the questions asked truly get at the conservative/liberal split on those issues. I don't think they do. The respondents were asked to select which options in these rows they agreed with more:
Look at the third line. A person is measured as "conservative" if he or she thinks poor people have it easy because of government assistance. The economy has gone up and down in the last 20 years. A person could think poor people have it easy one year and hard another without changing how conservative/liberal they are. Even worse, government benefits have changed over the last 20 years. Obamacare alone would change how people answer this question.
And the fourth question? The United States economy has changed over the last 20 years, and its debt burden has grown by an enormous amount. When the country's debt grows, of course how much it can spend on things, such as helping the needy, will change. And again, government assistance has changed. The amount of money being spent to help the needy has changed. Is it any surprise people's views on whether or not we need to spend more have changed?
How about that sixth question? The role of immigrants in a country's economy can change a great deal in 20 years' time. Why should people recognizing that be viewed as the country becoming more polarized? And what about the change in how immigrants are treated in those 20 years? The rights of illegal immigrants have been expanded in the last 20 years, including their rights to certain benefits. Does changing your view to reflect a change in the true answer really mean you've become more polarized?
These are just some of the obvious concerns with using these ten questions to measure how conservative or liberal people are. Nobody seems to have looked into them. The journalists at the Washington Post seem to have just ignored them. They don't even tell their readers what questions were asked. I'd check for myself, but again, the Pew Research Center doesn't provide its data so people can verify its conclusions.
Fortunately, the Pew Research Center did make a figure showing some of that data. They published a figure showing how Republicans and Democrats responded to the ten questions over the last 20 years:
As it shows, these ten questions don't all measure the same thing. Some questions, like how people feel about homosexuality, show Democrats and Republicans moving in the same direction. Other questions, such as the one question on foreign policy, show one party's views changing while the other's remain constant. Then you have questions like the ones I highlighted. All three questions I highlighted show an increase in "polarization."
For instance, a person who believes economic problems, growing debt levels and the introduction of Obamacare mean the government can't afford to do much more to help the needy is labeled as more "conservative." The change in the amount of money available, and the change in the amount of money being spent, are simply ignored. If a person's answer to the question has changed, this study and the Washington Post treat their position on the liberal/conservative spectrum as having changed.
That's how they produced the graph which started all this, a graph the author of the Washington Post article cited in the tweet says "explains everything you need to know about partisanship." Explains everything, seriously? It doesn't explain anything. It indicates people have become more polarized on certain issues, but look at which issues those are. Almost every single one of them can be tied to the changing economy and government policies of the United States.
The graph this author promotes as "so important" does absolutely nothing to show Republicans are becoming more conservative and Democrats are becoming more liberal. All it shows is when the reality of a country's situation changes, people's views on that situation change.
And this would be obvious to anyone who bothered to investigate the study before reporting on it. It would be obvious to a "fact checker" like Glenn Kessler if he, you know, checked the facts of the story. Instead, these journalists at a major newspaper promote a story which amounts to nothing more than, "Some people handed us some pretty charts, so we published them."