
The Poll Slayer: New Book Argues that Surveys are Simplistic But Humans are Complicated

February 3, 2016
by Eli Wolfe

It’s virtually impossible these days to imagine an America without those vaunted interpreters of the national mood: polls. They help determine the fate of political contenders, shape social policies, and gauge what the country is thinking. The aggregator realclearpolitics.com lists no fewer than 22 public polls in the past week focused on the Democratic presidential primary alone.

But are polls as useful as we think?

Not by a long shot, argues Robert Wuthnow, a Princeton sociology professor who’s been studying survey research since he was a grad student at UC Berkeley in the 1960s. In a new book, he not only charts the history of polling but takes a well-aimed swing at the billion-dollar polling industry.

While the nation currently seems obsessed with the parsing of political polls, his book focuses on surveys about faith. Its title, Inventing American Religion, suggests its conclusion: Polls about faith are next to useless in telling Americans anything about what they believe.

“Religion is complicated, it’s personal,” Wuthnow says. “To reduce it to a ‘yes-or-no’ public opinion question and then come out with these amazingly overstated headlines and conclusions that religion is dying in America, it just doesn’t make any sense.”

He characterizes the history of religious polling as short, Christian, and highly flawed.

Small religious congregations trying to learn more about their worshippers were conducting informal internal polls throughout the 19th century, but the first national poll on religious beliefs occurred in 1939, courtesy of George Gallup. It homed in on basic questions that he thought would be of interest to Christians: How many Americans believed in God? How many attended church on a weekly basis? How many owned a Bible?

Still, Americans initially just weren’t that interested. “I was actually surprised in going back and tracing the responses to the polls—nobody cared,” Wuthnow says. “Gallup did his best to make it a story, but it just didn’t get much attention.”

But such nonchalance began to change after World War II. Facing an anti-religious, communist hegemony in Eastern Europe, people in the United States craved scientific proof that they belonged to a devout, Christian nation. Statistics that had previously been disregarded—like the fact that 9 out of 10 Americans believed in the existence of God, or that only 1 in 100 considered themselves an atheist—suddenly became a cornerstone of American identity.

Over the next decades, religious polling expanded into an important subset of the polling industry. Pollsters collected data on how members of different religious groups felt about abortion, birth control, apostasy, divorce, and a slew of other issues. But, as Wuthnow notes, even as people were quizzed about their faith, the biased, Christian framework for religious polling was slipping into place.

This became most apparent in 1976, after the surprise election of President Jimmy Carter. Journalists zeroed in on the idea that Carter had received a swell of support from a hidden population of evangelical Christians dwelling in the American heartland. Following the election, numerous pollsters—led by Gallup—focused their research on evangelicals.

The emphasis on Christian-oriented polling was reinforced by the emergence of religious-marketing companies in the 1980s. Organizations like the Barna Research Group churned out polls commissioned by evangelical leaders. These polls were shared with the general public through televangelists such as Pat Robertson and Jerry Falwell, who helped to strengthen the public’s impression of the United States as an overwhelmingly Christian nation.

In Inventing American Religion, Wuthnow argues that religious polls have become so skewed by commercial and political forces that they don’t take into account the increasingly diverse religious landscape of the United States. His term for this phenomenon: Christian norming.


In general polling, “white norming” describes the bias among pollsters to ask questions that make more sense to white Americans than to people of other races. But Wuthnow says many polls perpetuate a similar problem with religion by asking questions tailored for white, Protestant Christians. “Christians, especially evangelicals, are much more comfortable defining themselves in terms of belief,” he says, noting that many religious polls simply ask respondents whether they believe in God, the Bible, salvation, etc. “If you’re Jewish or Muslim, that doesn’t work for you—your religion is more about practice, ritual, maybe even family tradition.”

Wuthnow is also concerned about the inherent fogginess of religious belief. For most people, describing personal faith is difficult, and may be impossible to do over the course of a five-minute phone interview with a stranger.

Take, for example, the “religious nones.” As the label suggests, these are individuals who identify as non-religious, but the beliefs within this group are actually quite diverse.

“Most of the people who say they have no religion wouldn’t consider themselves atheists or agnostics,” says Berkeley sociology professor Claude Fischer, who has read Wuthnow’s book. His own research has found that many so-called religious nones regularly attend religious services, pray, and even consider themselves spiritual. 

“You really have to pay attention to how these words are understood,” Fischer says. “The meaning of these words often changes over time.”

Another problem afflicting religious polling, and polling in general, is the alarming long-term drop in response rates, which experts blame on polling saturation. Given the proliferation of political and marketing survey research companies, Americans these days are rarely flattered to be asked their opinions by a pollster. Wuthnow reports that the United States has over 1,200 polling firms, which made more than 3 billion calls over the course of more than 37,000 polls in the last presidential election.

“The average poll now gets an 8 percent response rate,” Wuthnow says. “When you go out and you do a poll and don’t have any idea what 92 percent of the people think, it’s not going to be a very good poll.”

To be fair, a single-digit response rate doesn’t automatically negate the quality of a poll. Laura Stoker, Berkeley associate professor of political science, notes that researchers can compensate for low response rates with post-stratification weights, a technique that counts some respondents’ answers more heavily and others’ less heavily so that the sample’s makeup matches the known demographics of the population.

Up-weighting or down-weighting respondents’ answers isn’t an ideal way to produce data. But Stoker says it’s a necessary evil in an age when it’s unrealistic to expect even one in 10 people to respond to a survey.
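To make the idea concrete, here is a minimal sketch of post-stratification weighting. All of the numbers, group labels, and attendance rates below are hypothetical, chosen only to illustrate the arithmetic; they are not drawn from any poll discussed in this article.

```python
# Minimal sketch of post-stratification weighting. All figures are
# hypothetical and chosen only to illustrate the arithmetic.

# Known population shares for one stratifying variable (e.g., gender).
population_shares = {"women": 0.50, "men": 0.50}

# Counts actually observed among survey respondents.
sample_counts = {"women": 700, "men": 300}
sample_size = sum(sample_counts.values())

# Weight for each group = population share / sample share.
weights = {
    group: population_shares[group] / (count / sample_size)
    for group, count in sample_counts.items()
}
# women: 0.50 / 0.70 ≈ 0.71 (down-weighted); men: 0.50 / 0.30 ≈ 1.67 (up-weighted)

# Hypothetical unweighted rates of weekly church attendance by group.
observed_rates = {"women": 0.45, "men": 0.30}

# Weighted estimate: each group's answers count in proportion to its
# population share rather than its (skewed) share of the sample.
weighted_estimate = sum(
    weights[g] * sample_counts[g] * observed_rates[g] for g in sample_counts
) / sample_size
print(f"Weighted attendance estimate: {weighted_estimate:.1%}")  # 37.5%
```

The heavier the weights needed to patch over scarce groups, the noisier the resulting estimates tend to be, which is part of why Stoker cautions that weighting is not a panacea.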

“Trust in polls is on the decline in general,” Stoker says. “We can try to compensate for that, but it’s not a panacea.”  

Greg Smith, associate director of research at the Pew Research Center, acknowledges that declining response rates are an industry-wide concern. He even offers an example of how the trend is already affecting Pew polls: People who attend church regularly also happen to respond to polls more often, which can result in data overestimating the number of churchgoing Americans.

While he disagrees with sweeping assertions that religious polling is unreliable, he agrees its usefulness is limited. “If you want to know the full detail and the full depth of one person’s religious experience, or the full complexity of their beliefs about God, a nationwide survey isn’t going to be the tool you choose,” Smith says. “Surveys can’t tell us everything we might want to know about religion in the United States, but if done well, they can tell us a lot.”

He also adds that in recent years, Pew has conducted several nationwide surveys aimed at Muslim, Jewish, Hindu, Mormon and Catholic respondents.

“We’ve done a tremendous amount of work and expended a tremendous amount of effort in trying to understand and document and explore beliefs and practices and experiences of most non-Christian faiths,” Smith says.

Even so, Wuthnow insists this doesn’t solve the fundamental problem underlying the entire polling industry: Pollsters value quantity over quality.

“It’s just kind of like ambient noise,” Wuthnow says. “It’s something we’ve gotten used to and it’s out there. But whether we take it seriously? I’m not so sure that we do.”
