Trust Magazine

‘Defining the Universe’ Is Essential When Writing About Survey Data

Lessons learned

Winter 2019

The answer to almost any survey question depends on who you ask.

At the Pew Research Center, we conduct surveys in the United States and dozens of other countries on topics ranging from politics and religion to science and technology. Given the wide range of people we speak to for our polls—and the many issues we ask them about—it’s important to be as clear as possible in our writing about exactly who says what.

In research circles, this practice is sometimes called “defining the universe”—that is, clearly identifying the population whose attitudes we’re studying, whether those people are police officers in the U.S., Christians in Western Europe, or some other specific group. This kind of clarification can go a long way toward ensuring that readers interpret survey results correctly.

In many cases, the universe of survey respondents we’re talking about is relatively straightforward. If we say that nearly two-thirds of U.S. adults think the national economic situation is good, we’re referring to the views of Americans ages 18 and older. But in other cases, additional clarification and context may be needed, particularly for readers who aren’t schooled in the art of interpreting poll results.

Consider one of our recent survey findings about Facebook users. The survey found that 74 percent of adult Facebook users in the U.S. have taken at least one of the following three steps in the past 12 months: adjusted their privacy settings, taken a break from checking the platform for several weeks or more, or deleted the app from their phone.

But there are some caveats to bear in mind when considering this finding. First, not all U.S. adults use Facebook. Pew Research Center’s most recent estimate is that 68 percent of American adults use the platform. In other words, it’s not accurate to say that 74 percent of all U.S. adults have adjusted their Facebook privacy settings, taken a break from the platform, or deleted the app from their phone in the past 12 months. It is accurate to say that 74 percent of U.S. adults who currently use Facebook—that is, 74 percent of that 68 percent—have taken one of these three steps in the past year.
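The arithmetic behind that distinction can be made concrete. A minimal sketch, using the two percentages cited above (the variable names are illustrative, not Pew's):

```python
# Pew's estimate of the share of all U.S. adults who use Facebook
facebook_user_share = 0.68
# Share of adult U.S. Facebook users who took at least one of the three steps
took_step_share = 0.74

# The share of ALL U.S. adults who took one of the steps is the product:
share_of_all_adults = facebook_user_share * took_step_share
print(f"{share_of_all_adults:.0%} of all U.S. adults")  # prints "50% of all U.S. adults"
```

In other words, "74 percent of Facebook users" corresponds to only about half of all U.S. adults, which is why stating the universe matters.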

It’s also crucial to remember that this finding only refers to adult Facebook users in one country. Worldwide, Facebook claims more than 2 billion monthly active users, and our survey did not include respondents from countries outside the U.S. It also doesn’t reflect U.S. Facebook users who are younger than 18—a substantial population.

Another example of a survey universe that’s not always easily understood concerns U.S. politics. The Center and other polling organizations frequently conduct polls to find out Americans’ views on political topics. But these views can vary depending on whether the universe of interest is all U.S. adults, registered voters, or, in some cases, an even more narrowly defined group: likely voters.

Only about two-thirds of adults are registered to vote, and these adults tend to be more politically engaged than those who are not registered. That can translate into differences in attitudes.

Some polling organizations also report on the subset of the public they deem likely to vote, particularly for surveys conducted in the run-up to an election. This determination is complicated, but might be made by asking respondents questions about their voting behavior and intentions, or by using people’s past records of voting, among other things.

Since different polling organizations use different definitions of the term “likely voters,” it’s important to understand who is and isn’t included in the definition being used.

Survey data can be tricky for nonexperts to follow. It can be confusing for some readers to understand why 51 percent may not represent a “majority,” or how a survey of 1,000 people can tell you what an entire country thinks. One way to help readers interpret the many poll findings they encounter is to be as clear as possible about the group of people whose answers are being described. 
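Both puzzles above come down to sampling error. As a rough sketch, the textbook 95 percent margin of error for a proportion estimated from a simple random sample explains the intuition (this is the generic formula, not Pew Research Center's specific methodology, which also involves weighting and design effects):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# A 1,000-person sample pins down a near-50% proportion to within ~3 points:
moe = margin_of_error(0.51, 1000)
print(f"plus or minus {moe * 100:.1f} percentage points")  # about 3.1

# A reported 51% could therefore plausibly reflect a true value below 50% --
# which is why 51% in a poll may not demonstrate a genuine majority.
```

The same formula shows why a well-drawn sample of 1,000 can describe a country of hundreds of millions: the margin of error depends on the sample size, not on the size of the population being studied.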

John Gramlich is a writer and editor at the Pew Research Center.

This article was previously published online and appears in this issue of Trust Magazine.
