
Avoiding Survey Pitfalls: Pseudo-Data

posted Wed, Sep 12, 2018

By Don Zimmerman, Ph.D., Principal, Communication Science, Inc. 

The 2018 midterm election campaigns have begun. Political advertising dominates the media, and pollsters have launched multiple surveys to assess voting preferences.

Last week, I received a call from a political pollster asking my opinions about selected candidates and proposed Colorado legislation.

After a quick series of questions about national and state candidates, the surveyor began asking about ballot initiatives.

When the surveyor asked about one initiative, I pointed out that I didn’t know about it. She then read a lengthy explanation and began asking a series of complex, detailed and convoluted questions.

I had not had time to think about the ballot initiative.

I stressed that my response would be false information (pseudo-information) and that she should point this out to her supervisor.

That didn’t matter.

She then asked a series of questions about the initiative. The possible responses were strongly disagree, disagree, neutral, agree, and strongly agree.

I couldn’t respond, “I don’t know.”  Or “I haven’t thought about it.”

I answered “Neutral.”

The survey was collecting what has been called pseudo-data.
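To make the problem concrete, here is a minimal sketch, in Python and with hypothetical option labels, contrasting the forced-choice scale I was read with one that offers an explicit opt out. The opt-out answers are coded as missing by design rather than being disguised as "Neutral."

```python
# A minimal sketch, with hypothetical option labels, contrasting a
# forced-choice agreement scale with one that offers an explicit opt out.

FORCED_CHOICE = [
    "Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree",
]

OPT_OUTS = ["Don't know", "Haven't thought about it"]

WITH_OPT_OUT = FORCED_CHOICE + OPT_OUTS


def record_response(answer: str) -> str | None:
    """Keep substantive opinions; code opt outs as missing by design.

    Recording "Don't know" as missing keeps it out of the agree/disagree
    tallies instead of disguising it as "Neutral".
    """
    if answer not in WITH_OPT_OUT:
        raise ValueError(f"unlisted response: {answer!r}")
    return answer if answer in FORCED_CHOICE else None


print(record_response("Neutral"))     # Neutral: a real, if middling, opinion
print(record_response("Don't know"))  # None: missing, not pseudo-data in disguise
```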

Nearly 50 years ago, Steven H. Chaffee, a Stanford Ph.D. and one of the leading scholars in communication research, pointed out that researchers too often collect what he called pseudo-data.

What then is pseudo-data?

Pseudo-data is false data: it does not represent respondents' real awareness of the topic, their understanding of it, their beliefs about it, or, in this case, their potential voting behavior.

When planning surveys, researchers and pollsters should ask:

  • Who are the appropriate people to ask about the topic?
  • Is a survey the appropriate method to gather information about people’s awareness, understanding, attitudes, and related behaviors—voting in this case?

Then researchers and pollsters should ask additional planning questions, such as:

  • Will the potential respondents be aware of the topic? (A brief screening sketch follows this list.)
  • If so, are they aware of the details of the topic?
  • If so, how did they learn about the topic?
  • Is the topic relevant (i.e., important) to potential respondents?
  • What information or misinformation might potential respondents have?
  • Have they had time to think about the topic? To form opinions?
  • Would they then act on their opinions?
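To illustrate the awareness questions above, here is a minimal sketch, again in Python and with hypothetical wording and a hypothetical ballot measure, of skip logic that screens for awareness before asking the opinion items, so that respondents who have never heard of the measure are not pressed into manufacturing opinions on the spot.

```python
# A minimal sketch, with hypothetical wording, of skip logic that screens
# for awareness before asking opinion items.

def administer(ask) -> dict:
    """Run a tiny awareness-screened interview.

    `ask` is any callable that presents a question and its options and
    returns the chosen option (for example, an interviewer's entry screen).
    """
    responses = {}

    responses["aware"] = ask(
        "Before today, had you heard of Ballot Initiative X?",
        ["Yes", "No", "Not sure"],
    )

    if responses["aware"] != "Yes":
        # Skip the opinion battery: answers invented on the spot would be
        # pseudo-data, not opinions.
        responses["support"] = None
        return responses

    responses["support"] = ask(
        "Do you support or oppose Initiative X?",
        ["Strongly oppose", "Oppose", "Neutral", "Support",
         "Strongly support", "Haven't decided"],
    )
    return responses


# Example: a scripted respondent who has not heard of the initiative.
print(administer(lambda question, options: options[1]))
# {'aware': 'No', 'support': None}
```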

The answers to these questions begin to suggest the complexities and challenges of conducting survey research that provides valid and reliable information for decision making.

Other questions can be asked too.

Universities and colleges offer multiple courses on developing surveys, designing studies, collecting data, analyzing data, and interpreting findings. Academic publishers offer dozens of textbooks explaining how to conduct surveys, analyze the data, and interpret the results.

The posts that follow will explore methods for developing surveys for research and evaluation.