The Savvy Survey #1: Introduction

Jessica L. O'Leary and Glenn D. Israel

This initial publication in the Savvy Survey Series provides Extension faculty interested in creating savvy surveys with a brief introduction to survey design and the basic considerations involved in choosing a survey for program planning and evaluation. An overview of the entire series is provided in Appendix A.

Introduction to Surveys

Most Extension programs are designed to change participant awareness, knowledge, attitudes, or aspirations, often with the intent of producing subsequent changes in behavior. To plan and evaluate such programs, Extension professionals need accurate, reliable methods for identifying client needs and measuring the changes that result from program activities. Many methods exist for assessing needs and measuring change, including those using secondary data, key informant interviews, staff observations, and client self-reports. One of these methods is survey research.

Survey research refers to "any measurement procedures that involve asking questions of respondents" (Trochim, 2006, para. 1). A survey is traditionally used to gather information from a large group of people. From this information, a researcher can draw valuable conclusions about some aspect of the study's population. Because collecting information from everyone in a population of interest is usually impossible or impractical, information is instead collected from a small sample of that population using some form of questionnaire. By studying the sample, we can generalize to the larger population.

Self-Reporting in Surveys

Self-reported information is at the heart of the survey process. Client self-reports can be designed to capture either quantitative data (i.e., frequencies, averages, demographics, or variability in responses) or qualitative responses (i.e., success stories, explanations of problems, or personal insights).

Quantitative data are commonly collected from clients using questions where an answer is selected from a provided list. Qualitative data may either be gathered from questions that ask survey takers to provide a detailed response or through a different qualitative collection method (such as focus groups or interviews). Regardless of method type, a survey's main goal is to ask individuals to provide answers to a set of questions and to then record those responses in a meaningful way (Vogt, Gardner, & Haeffele, 2012). When deciding whether a survey is the proper tool for capturing the desired information, the main question to ask is "What am I trying to find out?" (Vogt et al., 2012).

Best obtained with a survey:

Are you trying to...

  • discover what happened, how often it happened, and to what extent it happened? e.g., How often do you (as a homeowner) water your lawn?
  • obtain responses to structured, short-answer questions? e.g., What is the process you use to water your lawn?
  • obtain data and information that are fairly easy for respondents to recall or create? e.g., How many years have you lived in your current home?

Studies that rely on these types of questions sometimes generalize findings to a larger population (i.e., suggesting that your findings are true not only for those you surveyed, but for anyone who would also fall into the population of interest).

Better obtained using another method (focus groups, interviews, journaling, etc.):

Are you trying to...

  • find out how or why something happened? e.g., Why did homeowners who lived in a particular neighborhood choose to adopt Florida Friendly Landscaping practices as a part of their HOA covenant?
  • obtain in-depth explorations of participants' views and perspectives? e.g., What do you (as a homeowner) think about the way climate change impacts your life?
  • obtain information that requires considerable thought and reflection? e.g., Imagine something you would like to change about your home landscape. What barriers would keep you from completing that change?
Responses to these questions help us understand a particular subset of a population rather than the population as a whole.

Making a Good Survey with the Tailored Design Survey Methodology

After determining that a survey is the best method for obtaining answers to your questions, the next step is to consider what it takes to make a good survey. A survey is far more than a list of questions. Instead, a good survey is one that has been intentionally and intelligently designed to obtain responses that provide high levels of quality and richness for each of the items included in the survey (Dillman, Smyth, & Christian, 2014).

County faculty want surveys that will produce high-quality responses from a large number of respondents. Dillman et al. (2014) developed a survey methodology that promotes the features necessary for generating both quality and quantity in responses. This methodology is referred to as the Tailored Design Survey Methodology (TDSM). The tailored design method is based on three fundamental concepts: error reduction, survey procedure construction, and positive social exchange (Dillman et al., 2014).

  • Error reduction: the tailored design method focuses on reducing four types of error (coverage, sampling, nonresponse, and measurement) throughout the survey process.
  • Survey procedure construction: the tailored design method encourages the creation of not just a high-quality questionnaire, but a set of survey procedures that interact and work together in order to get many clients to complete the questionnaire.
  • Positive social exchange: the tailored design method draws attention to the elements of a survey that can be enhanced through positive social exchange (such as survey sponsorship and the content of survey questions).

Survey Error

There are numerous sources of error that can creep into a survey. One goal in creating a savvy survey is to intentionally reduce the four common sources of survey error: coverage error, sampling error, nonresponse error, and measurement error (Dillman et al., 2014; Groves, 1989). The potential for these types of error must be addressed while developing a survey because any one of them is powerful enough to undermine the quality of information collected (Dillman et al., 2014). All surveys contain some degree of error, but the goal is to keep each of these error types as low as possible. Details for each of the four sources of survey error are provided below (Dillman et al., 2014).

Coverage Error

  • The survey must provide adequate coverage of the population of interest.
  • Coverage error decreases when all members of the population have a known, nonzero chance of being included in the sample for the survey (i.e., the sampling list contains all members of the population).
  • Coverage error increases when anyone excluded from the survey is different on any measures of interest from those who were included in the survey.

Sampling Error

  • Sampling error occurs when only a subset of the population is surveyed.
  • Sampling error is also highly dependent on sample size, so understanding how to determine an appropriate sample size for your study is quite valuable (see the illustration below).
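
As one illustration of how strongly sample size affects sampling error, a commonly cited formula (Cochran's sample-size formula for estimating a proportion, shown here only as an illustration; it is not taken from this publication or the Dillman et al. method) approximates the number of completed responses needed as

\[
n = \frac{z^{2}\, p(1-p)}{e^{2}}
\]

where \(z\) is the z-score for the desired confidence level (1.96 for 95% confidence), \(p\) is the expected proportion (0.5 gives the most conservative estimate), and \(e\) is the acceptable margin of error. With \(p = 0.5\) and \(e = 0.05\), roughly 385 completed responses are needed; halving the margin of error to 0.025 roughly quadruples that number, which is why sample size decisions deserve careful attention (see Savvy Survey #3).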

Nonresponse Error

  • Nonresponse error is a consequence of not receiving completed questionnaires back from everyone who received the survey. People who choose not to respond may be (and usually are) different from those who choose to respond, in ways that may be important to the study.
  • Reducing nonresponse error involves motivating all those sampled to respond. One option for reducing nonresponse error is using a mixed-mode approach to data collection; however, this approach must be thoughtfully conducted in order to avoid increasing other error types.

Measurement Error

  • Measurement error occurs when respondent answers are either inaccurate or imprecise.
  • Measurement error is often increased by poorly worded questions, poor visual layout, and other problems with questionnaire construction.

Survey Procedure Construction

When designing a survey, county faculty should focus on creating a set of survey procedures that interact and work together to enhance response rates. To develop an effective questionnaire, faculty should consider which topics are truly relevant to the study; the length, visual design, and layout of the questionnaire; and the organization and order of its items. Furthermore, each item included in the questionnaire should be carefully crafted to fit the context of the survey and the overall survey design.

In addition to the questionnaire itself, some features of a survey's design that county faculty will want to think about include

  • survey mode(s): how the survey will be completed by respondents (e.g., paper and pen, Web, telephone);
  • sample: how the sample will be generated and the number of units that will be sampled;
  • contact: how contacts will be made with potential respondents (e.g., mail, phone, electronic), how often and when those contacts will be made, and what level of personalization and visual design will be integrated into each contact; and
  • incentive: whether incentives will be included, what type and level of incentive would be appropriate for the situation, and the timing for providing the incentive.

Positive Social Exchange

When designing a survey, county faculty want to increase participation by establishing a social bond with the survey taker. There are steps they can take to strengthen this bond and get better results. According to social exchange theory, people tend to act based on the benefits they expect to receive from their participation (Dillman et al., 2014). There are three general considerations that should be made when designing the survey and related procedures: enhancing trust between the survey sender and potential respondent, increasing perceived benefits of participation, and decreasing perceived costs of participation. A few of the methods for enhancing positive social exchange suggested by Dillman et al. (2014) include

  • providing information about the study (increased benefits);
  • asking for advice or help from the participants (increased benefits);
  • expressing positive regard and appreciation for participation (increased benefits);
  • making the questionnaire interesting and convenient to respond to (increased benefits; decreased costs);
  • minimizing the request for personal or sensitive information (decreased costs); and
  • gaining sponsorship from an authority that is trusted in the eyes of the participant (trust).

Such efforts often result in an increase in participant responses.

In Summary

This initial publication in the Savvy Survey Series focused on introducing the basic concepts associated with survey design. Subsequent publications in the series will expand upon each of these topics in greater detail. For a brief overview of each publication in the Savvy Survey Series, see Appendix A.

References

Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and mixed-mode surveys: The tailored design method (4th ed.). Hoboken, NJ: John Wiley and Sons.

Groves, R. M. (1989). Survey errors and survey costs. New York: John Wiley and Sons.

Trochim, W. M. K. (2006). Survey research. In Research Methods Knowledge Base. Retrieved from https://conjointly.com/kb/survey-research/

Vogt, W. P., Gardner, D. C., & Haeffele, L. M. (2012). When to use what research design. New York: Guilford Press.

Appendix A

Savvy Survey Series Overview

The Tailored Design Survey Methodology serves as the basis for a series of survey-focused EDIS publications entitled the Savvy Survey Series. Twenty-two publications make up this series, with topics chosen based on state specialist expertise. A brief description of each publication in the series is provided below.

The Savvy Survey #1: Introduction

General summary of surveys, explanation of what makes a good survey, synopsis of the Tailored Design Survey Methodology, and series overview.

The Savvy Survey #2: Using Surveys in Everyday Extension Programming

Information about using surveys to inform program development (needs assessment), to support program improvement (formative and summative evaluations), and to capture satisfaction with programming efforts (customer satisfaction); use of the logic model to guide questionnaire development; and general data types (demographics, factual information, attitudes and opinions, behaviors and events).

The Savvy Survey #3: Successful Sampling

General overview about who should be surveyed. Topics include survey population, sampling frame, those outside the population of interest, over-coverage/error/bias, and how to define your sample.

The Savvy Survey #4: Details in the Design

Information about the number of times contact should be made, modes that are available for use, ways to personalize the process, and whether incentives should be used.

The Savvy Survey #5: The Process for Developing Survey Questions

Information about how best to use your logic model to create your survey.

The Savvy Survey #6: Writing Items for the Questionnaire

Synopsis of survey question considerations including measurement considerations (reliability/validity), choosing topics, types of questions, measurement types, creating indices for measuring attitudes and perceptions, and text or wording choices.

The Savvy Survey #6a: General Guidelines for Writing Questionnaire Items

The Savvy Survey #6b: Constructing Open-ended Items for a Questionnaire

The Savvy Survey #6c: Constructing Closed-ended Items for a Questionnaire

The Savvy Survey #6d: Constructing Indices for a Questionnaire

The Savvy Survey #6e: Understanding How Question Type Impacts Future Analysis

The Savvy Survey #7: Formatting Questionnaires

Summary of questionnaire design considerations including which questions to ask first, organizing the flow of thought through the instrument, navigating through the questionnaire with design elements, determining the appropriate length for the questionnaire, and overall visual design and layout issues in a questionnaire.

The Savvy Survey #8: Pilot Testing and Pre-Testing Questionnaires

Overview of questionnaire testing methods, including pilot testing, cognitive interviewing, think-aloud procedures, and retrospective debriefing.

The Savvy Survey #9: Gaining Institutional Review Board Approval for Surveys

Overview of the Institutional Review Board (IRB) process and protocols at the University of Florida, as well as necessary contact information.

The Savvy Survey #10: In-person-administered Surveys

Introduction to pen and paper questionnaire construction for situations where a sampling list is unavailable; considerations for one-shot contacts; formatting and visual design elements within the questionnaire.

The Savvy Survey #11: Mail-based Surveys

Introduction to pen and paper questionnaire construction; considerations for multiple contacts; formatting and visual design elements within the questionnaire.

The Savvy Survey #12: Telephone Surveys

Introduction to telephone questionnaire construction; considerations for multiple contacts; designing telephone interview scripts.

The Savvy Survey #13: Online Surveys

Introduction to online survey design and questionnaire construction; considerations for multiple contacts; common online Extension survey platforms; formatting and visual design elements within the questionnaire.

The Savvy Survey #14: Mixed-mode Surveys

Introduction to mixed-mode survey design and questionnaire construction; considerations for multiple contacts; common online Extension survey platforms; formatting and visual design elements within the questionnaire.

The Savvy Survey #15: Survey Responses and Data Entry

Overview of data entry considerations and potential issues; details on how to clean data.

The Savvy Survey #16: Data Analysis and Survey Results

Summary of data analysis options including programs for analysis, running descriptive analyses, and running inferential analyses.

The Savvy Survey #17: Reporting Survey Findings

Synopsis of best practices for reporting survey results, including expanding yearly reports of accomplishment and identifying and reporting to both internal and external audiences.

The Savvy Survey #18: Group-administered Surveys

Introduction to pen and paper questionnaire construction; presents the development and implementation of group-administered instruments, as well as preparing an introductory script, training survey administrators, and managing the survey process.


About this Publication

This document is AEC 391, one of a series of the Department of Agricultural Education and Communication, UF/IFAS Extension. Original publication date August 2013. Revised December 2016, December 2019, and July 2023. Visit the EDIS website at https://edis.ifas.ufl.edu for the currently supported version of this publication.

About the Authors

Jessica L. O'Leary, former doctoral candidate; and Glenn D. Israel, professor emeritus, Department of Agricultural Education and Communication; UF/IFAS Extension, Gainesville, FL 32611. The authors wish to thank Amy Harder, Ed Osborne, Marilyn Smith, and Nick Fuhrman for their helpful suggestions on an earlier draft.
