by Laurie R. Kuslansky, Ph.D.
Managing Director, Jury Consulting
Research has shown that a variety of individuals are not fully represented in telephone surveys, especially Democratic, young, nonwhite, and urban voters, who can be the hardest for pollsters to reach.
In addition, the migration from landline phones associated with home addresses to portable cell phones unrelated to home addresses compounds the problem of reaching and surveying a representative sample using traditional approaches to phone surveys.
And not all venues are created equal. Some have more hard-to-reach residents; some have more cell phones replacing landlines.
In general, “It has become increasingly difficult to contact potential respondents and to persuade them to participate”: the share willing to participate in phone surveys has dropped from 36% in 1997 to 9% now.
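That drop has a direct fieldwork cost. A rough calculation (the quota of 400 completed interviews is an illustrative assumption, not a figure from this article) shows how many dial attempts the same survey now requires:

```python
# Illustrative arithmetic: attempts needed for a fixed quota of completed
# interviews as the response rate falls. The quota is hypothetical.
def attempts_needed(completed_target, response_rate):
    """Expected number of contact attempts for a given completion quota."""
    return completed_target / response_rate

quota = 400  # hypothetical completed-interview target
print(attempts_needed(quota, 0.36))  # at the 1997 rate: ~1,111 attempts
print(attempts_needed(quota, 0.09))  # at the 9% rate: ~4,444 attempts
```

At 9%, roughly four times as much dialing is needed as in 1997 to field the same number of interviews.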
What are the consequences? There are many. To name a few:
Lower representativeness of people willing to respond to phone surveys requires taking additional measures to assure a representative sample, but the options aren’t ideal:
- Even if you weight the sample, you may simply be giving undue weight to those you reach who may not actually match the ones you missed;
- It may cost more money to buy additional samples to get a sufficient number of people who represent a particular category.
  - For example, if you are seeking to survey jury-eligible adults and voter registration is one of the requirements, you may have to spend extra money to buy a list of registered voters if not enough people in the normal random-digit-dialing (RDD) sample turn out to be eligible to be a juror, assuming they answer the phone and are willing to answer questions in the first place... and 91% on average are not.
- You may have to spend more money and/or time to reach a reasonably representative sample because of the low incidence rate (i.e., the share of people willing and able to complete the survey). Extended field efforts have improved the response rate from the 9% average to 22%, but added time means added cost.
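The weighting caveat in the first bullet above can be made concrete. A minimal post-stratification sketch (the demographic cells and shares are hypothetical, chosen only to illustrate the mechanics):

```python
# Minimal post-stratification sketch: weight each demographic cell so the
# sample matches known population shares. Cell names and shares are
# hypothetical. Note the caveat from the text: weighting scales up the
# respondents you DID reach; it cannot make them resemble the people
# you missed.
population_share = {"18-29": 0.20, "30-49": 0.35, "50+": 0.45}
sample_share     = {"18-29": 0.08, "30-49": 0.32, "50+": 0.60}  # young underrepresented

weights = {cell: population_share[cell] / sample_share[cell]
           for cell in population_share}
# Each young respondent gets a weight of 2.5, "standing in" for 2.5
# people and amplifying whatever is unrepresentative about them.
print(weights)
```

The design choice here is standard cell weighting (population share divided by sample share); the risk the article flags is exactly that the 2.5x-weighted respondents may not match the young people who never picked up.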
Are people who answer telephone surveys different from people who don’t?
Yes. Among other things, “people who engage significantly more in volunteerism and civic activity are more likely to agree to participate in telephone surveys than people who do not.” Intuitively, this makes sense.
What does this mean for telephone surveys of mock jurors?
- Anecdotally, the kinds of people unwilling to agree to participate in phone surveys are more similar to people unwilling to be on a jury than those who end up as jurors. Hence, those who respond to surveys are perhaps a better representation of likely jurors than those who do not.
- Financial information tends to be hard to gather in phone surveys in general. If that information is pertinent to a jury study, it can perhaps be gleaned indirectly from other factors (education, employment, home ownership, marital status, etc.). Instead of the specific dollar amounts, one might be able to code someone who has more vs. fewer markers of likely affluence vs. poverty;
- Many design a study’s sample by first referring to the latest (2010) Census data and American Community Survey (ACS) estimates, but these are somewhat off, especially in underestimating the growth of the Hispanic population. The problem is that such information is often the only, or best, data available. Options that may be more current include real-estate websites that describe communities, as well as anecdotal information from local counsel about a particular jury pool. Putting together the specifications for the polling sample is part of the art and science of polling.
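The indirect financial coding suggested above can be operationalized as a simple additive index. A sketch (the marker names and the equal scoring are illustrative assumptions, not a validated instrument):

```python
# Illustrative proxy for affluence: count markers rather than asking for
# dollar amounts, which respondents resist giving by phone. The markers
# chosen and their equal weighting are hypothetical.
AFFLUENCE_MARKERS = ["college_degree", "employed_full_time",
                     "owns_home", "married"]

def affluence_score(respondent):
    """Count how many affluence markers a respondent reports (0-4)."""
    return sum(1 for m in AFFLUENCE_MARKERS if respondent.get(m))

r = {"college_degree": True, "owns_home": True,
     "employed_full_time": False, "married": False}
print(affluence_score(r))  # prints 2: two of four markers present
```

A respondent can then be classified as having more vs. fewer markers of likely affluence without ever being asked a dollar figure.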
Are There Any Solutions?
Yes. “A new study by the Pew Research Center for the People & the Press finds that, despite declining response rates, telephone surveys that include landlines and cell phones and are weighted to match the demographic composition of the population continue to provide accurate data on most political, social and economic measures.”
In addition to the technical issues that depress response rates, one should also consider how easy or hard it is for someone to answer the questions. Shorter and easier are better than longer and more difficult. Questions that demand too much effort, or too many questions, are more likely to end up being asked but not answered.
Ironically, the data that are missing are precisely the data that would describe the people who don’t take surveys, and whether such people end up making much difference on juries.
Other articles related to mock trials, jury consulting and phone surveys on A2L Consulting's site:
- FREE DOWNLOAD: Storytelling for Persuasion - 144-page complimentary book
- 10 Things Every Mock Jury Ever Has Said
- Why Do I Need A Mock Trial If There Is No Real Voir Dire?
- 3 Ways to Force Yourself to Practice Your Trial Presentation
- 7 Questions You Must Ask Your Mock Jury About Litigation Graphics
- 11 Problems with Mock Trials and How to Avoid Them
- 12 Astute Tips for Meaningful Mock Trials
- Trending: Mock Trial Testing of Litigation Graphics AND Arguments
- 10 Suggestions for Conducting Mock Bench Trial Consulting Exercises
- Mock Trials: Do They Work? Are They Valuable?
- The 13 Biggest Reasons to Avoid Last-Minute Trial Preparation
- The Magic of a 30:1 Presentation Preparation Ratio
- 11 Surprising Areas Where We Are Using Mock Exercises and Testing
- Accepting Litigation Consulting is the New Hurdle for Litigators
- Nate Cohn, “Midterm Calculus: Why Polls Tend to Undercount Democrats,” The Upshot, N.Y. Times, 10/30/14, at http://www.nytimes.com/2014/10/30/upshot/why-polls-tend-to-undercount-democrats.html?_r=0&abt=0002&abg=1
- Pew Research Center for the People & the Press, “Assessing the Representativeness of Public Opinion Surveys,” 5/15/12, at http://www.people-press.org/2012/05/15/assessing-the-representativeness-of-public-opinion-surveys/
- Op. cit., Pew.
- Op. cit., Pew.
- See Katherine G. Abraham, Sara Helms, and Stanley Presser, “How Social Processes Distort Measurement: The Impact of Survey Nonresponse on Estimates of Volunteer Work in the United States,” American Journal of Sociology 114 (2009): 1129-1165; Roger Tourangeau, Robert M. Groves, and Cleo D. Redline, “Sensitive Topics and Reluctant Respondents: Demonstrating a Link between Nonresponse Bias and Measurement Error,” Public Opinion Quarterly 74 (2010): 413-432.
- National Council on Public Polls, post-election assessments of poll accuracy, at http://ncpp.org/files/NCPP%20Election%20Poll%20Analysis%202012%20-%20FINAL%20012413.pdf