Research article
First published online June 22, 2018

Response Rates for Surveys Completed With Paper-and-Pencil and Computers: Using Meta-Analysis to Assess Equivalence

Abstract

The increasing number of self-report surveys being collected using computers has led to a body of literature examining the response rates for computerized surveys compared with the more traditional paper-and-pencil method. However, results from individual studies have been inconsistent, and the meta-analyses available on this topic have included studies from a restricted range of years and did not use proper statistical procedures for examining comparability. Consequently, we conducted a meta-analysis with 96 independent effect sizes spanning over two decades of studies; we also assessed potential moderators. Comparability was determined using confidence interval equivalence testing procedures. The meta-analysis indicated nonequivalence, with those in the paper-and-pencil condition being almost twice as likely to return surveys as those in the computer condition. There was large heterogeneity of variance, and 11 of the 18 potential moderators were significant. Two meta-regressions yielded only two significant unique moderators: population and type of measure. Results highlighted issues within the response rate literature that can be addressed in future studies, as well as provided an example of using equivalence testing in meta-analyses.
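The abstract's core method combines two techniques: pooling study-level odds ratios under a random-effects model, then judging comparability by whether the pooled confidence interval falls entirely inside an equivalence region. The sketch below illustrates that logic in Python with hypothetical per-study counts and an assumed equivalence margin of OR 0.8–1.25; it is not the article's data or its exact analysis (which used Comprehensive Meta-Analysis), only a minimal worked example of confidence-interval equivalence testing applied to a meta-analytic estimate.

```python
import math

# Hypothetical per-study counts: (paper_returned, paper_not_returned,
# computer_returned, computer_not_returned). Illustration only.
studies = [
    (45, 55, 30, 70),
    (120, 80, 90, 110),
    (60, 40, 35, 65),
]

# Per-study log odds ratios and variances (Woolf method).
yi, vi = [], []
for a, b, c, d in studies:
    yi.append(math.log((a * d) / (b * c)))
    vi.append(1 / a + 1 / b + 1 / c + 1 / d)

# DerSimonian-Laird random-effects pooling.
wi = [1 / v for v in vi]
fixed = sum(w * y for w, y in zip(wi, yi)) / sum(wi)
q = sum(w * (y - fixed) ** 2 for w, y in zip(wi, yi))
df = len(studies) - 1
c_dl = sum(wi) - sum(w * w for w in wi) / sum(wi)
tau2 = max(0.0, (q - df) / c_dl)            # between-study variance
wstar = [1 / (v + tau2) for v in vi]
pooled = sum(w * y for w, y in zip(wstar, yi)) / sum(wstar)
se = math.sqrt(1 / sum(wstar))

# 90% CI, matching two one-sided tests at alpha = .05.
lo, hi = pooled - 1.645 * se, pooled + 1.645 * se

# Assumed equivalence bounds of OR 0.8-1.25 on the log scale;
# equivalence holds only if the whole CI sits inside the bounds.
delta = math.log(1.25)
equivalent = (-delta < lo) and (hi < delta)
print(f"pooled OR = {math.exp(pooled):.2f}, "
      f"90% CI [{math.exp(lo):.2f}, {math.exp(hi):.2f}], "
      f"equivalent: {equivalent}")
```

With these made-up counts the pooled odds ratio is near 2 and the interval lies outside the margin, so the check correctly reports nonequivalence; the key design point is that equivalence is a positive claim requiring the CI to fall inside the bounds, not merely a nonsignificant difference test.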



Biographies

Arne Weigold is an associate professor at Notre Dame College in South Euclid, OH. He received his PhD in experimental (cognitive) psychology from Texas Tech University in Lubbock, TX. His areas of expertise include data collection methodology, equivalence testing, and scale development.
Ingrid K. Weigold is a professor at The University of Akron in Akron, OH. She received her PhD in counseling psychology from Texas Tech University in Lubbock, TX. Her research interests include data collection methodology, scale development, and human agency.
Sara N. Natera received her MA in clinical psychology from Cleveland State University in Cleveland, OH, and currently works as a data analyst.

Supplementary Material

Supplemental material for this article is available online (weigold_online_supp.pdf).



Published In

Article first published online: June 22, 2018
Issue published: October 2019

Keywords

  1. paper-and-pencil
  2. computer
  3. response rate
  4. meta-analysis
  5. equivalence testing

Rights and permissions

© The Author(s) 2018.

Authors

Affiliations

Arne Weigold
Notre Dame College, South Euclid, OH, USA
Ingrid K. Weigold
The University of Akron, Akron, OH, USA
Sara N. Natera
Cleveland State University, Cleveland, OH, USA

Notes

Arne Weigold, Notre Dame College, South Euclid, OH 44121, USA. Email: [email protected]

Metrics and citations

This article was published in Social Science Computer Review.

Article usage*

Total views and downloads: 806

*Article usage tracking started in December 2016


Articles citing this one

Web of Science: 11
Crossref: 16

  1. Autonomy in the context of cognitive demands—is the resource becoming ...
  2. Nurses’ moral courage in Finnish older people care: A cross-sectional...
  3. Mode Effects
  4. Quantitative data collection approaches in subject-reported oral healt...
  5. Concurrent Mixed Modes: Response Quality, Speed, and Cost
  6. College students’ and Mechanical Turk workers’ environmental factors w...
  7. Graduating nurse students’ interest in older people nursing—A cross‐se...
  8. “Every little thing that could possibly be provided helps”: analysis o...
  9. How digital are ‘digital natives’ actually? Developing an instrument t...
  10. Web-based and mixed-mode cognitive large-scale assessments in higher e...
  11. Psychometric Properties of a Chatbot Version of the PHQ-9 With Adults ...
  12. Prevalence of Low Back Pain among Primary School Students from the Cit...
  13. Setting up Probability-Based Online Panels of Migrants with a Push-to-...
  14. Findings from Optometrists' Practices in Advising about Lifestyle Stud...
  15. Mode Effects
  16. The use of electronic incident reporting system: Influencing factors
