Ask an Epi: Tips and Tricks for Creating Online Surveys


This post focuses on specific aspects of creating online surveys for research and evaluation. It is meant to be a guide for programming your survey into an online platform after you have already developed the tool itself (survey questions, question types, answer choices, skip logic flow, introductory language, etc.). The tips and tricks below were developed based on experience with REDCap and Qualtrics, but they likely apply to other platforms as well.

1. Understand Your Electronic Survey Platform

  • Why: Familiarizing yourself with the capabilities and limitations of your chosen platform is important before beginning survey development. Moving a survey from one platform to another after it has been developed is time-consuming. Data analysis also benefits from seamless data export and compatibility between the survey platform and your analysis tools: if you can export the survey responses in a format that the data analyst can use directly, it will be a timesaver.
  • How: Explore the platform’s FAQ and help documentation to understand what it can and cannot do. Verify that it can export responses in formats such as .csv, .xlsx, .RData, .sas7bdat, or .dta, and check for integration capabilities; a quick test of an export is sketched below.
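
As a quick compatibility check, it can help to open a test export in your analysis environment before committing to a platform. Below is a minimal Python sketch, assuming pandas is installed and using placeholder file names:

  import pandas as pd

  # Placeholder file name; replace with a test export from your platform.
  responses = pd.read_csv("survey_export_test.csv")

  # Confirm the columns, row count, and data types look the way you expect.
  print(responses.shape)
  print(responses.dtypes)

  # Other common export formats can be read similarly:
  # pd.read_excel("survey_export_test.xlsx")
  # pd.read_sas("survey_export_test.sas7bdat")
  # pd.read_stata("survey_export_test.dta")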

2. Think About Variable Naming Conventions

  • Why: Consistent and clear naming makes data management and analysis easier.
  • How: Consider the clarity and relevance of variable names; in other words, avoid creating really long and confusing names. Try sequential naming (e.g., “eg_1, eg_2, eg_3”) for variables that belong to the same question set or scale, which also makes them easy to select programmatically (see the sketch below). Don’t change variable names once you have started collecting data.
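
One practical payoff of sequential naming is that scale items can be selected by pattern instead of typed out one by one. A small sketch, assuming pandas and using made-up variable names:

  import pandas as pd

  # Made-up responses standing in for an export with a three-item scale.
  responses = pd.DataFrame({
      "eg_1": [1, 2, 3],
      "eg_2": [2, 2, 4],
      "eg_3": [3, 1, 4],
      "age":  [34, 51, 29],
  })

  # Sequential names let you grab the whole scale with one pattern.
  scale_items = responses.filter(regex=r"^eg_\d+$")
  responses["eg_total"] = scale_items.sum(axis=1)
  print(responses)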

3. Establish Coding Processes for Variables

  • Why: Standardized variable coding supports efficient survey programming, data cleaning, and analysis. Good variable coding practices enhance data accuracy and avoid confusion.
  • How: Code special responses like “I don’t know” or “Refused” distinctly and consistently (e.g., 99, 999) to avoid confusion during analysis. If the answer choices are ordinal (e.g., all, some, a few, none), code the options in the same order for every question that uses them (e.g., 1, 2, 3, 4). Maintain consistency in coding across the survey; the sketch below shows how distinct codes pay off at analysis time.
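
Distinct, consistent codes for special responses make it easy to set them aside at analysis time. A minimal sketch, assuming pandas, a made-up question name, and 99 used consistently for “Refused”:

  import pandas as pd

  # Made-up ordinal item: 1=All, 2=Some, 3=A few, 4=None, 99=Refused.
  responses = pd.DataFrame({"q1_support": [1, 3, 99, 2, 4, 99]})

  # Labeled frequencies keep the special code visible...
  labels = {1: "All", 2: "Some", 3: "A few", 4: "None", 99: "Refused"}
  print(responses["q1_support"].map(labels).value_counts())

  # ...while numeric summaries treat it as missing rather than a value of 99.
  analysis = responses["q1_support"].mask(responses["q1_support"] == 99)
  print(analysis.mean())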

4. Create a Data Dictionary or Codebook

  • Why: You can help other parties (such as team members who will analyze the data or audiences that may see results summaries) understand your survey by developing a data dictionary, which also facilitates tracking survey versions and updates.
  • How: The specifics vary by project, but some general components include question wording, variable names, variable values, skip logic, and prompts; one possible layout is sketched below.
  • Link: An example of a simple data dictionary can be found here.
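
The layout can be as simple as one row per variable. The sketch below writes a one-entry codebook to a shareable CSV; the column names and the example entry are just one possible arrangement, not a required format:

  import csv

  # Hypothetical entry illustrating common codebook columns.
  codebook = [
      {
          "variable": "eg_1",
          "question": "How often do you ...?",
          "values": "1=All; 2=Some; 3=A few; 4=None; 99=Refused",
          "skip_logic": "Shown only if screener_q = 1",
          "notes": "Item 1 of the example scale",
      },
  ]

  with open("codebook.csv", "w", newline="") as f:
      writer = csv.DictWriter(f, fieldnames=codebook[0].keys())
      writer.writeheader()
      writer.writerows(codebook)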

5. Utilize Conditional Logic (aka Skip Logic)

  • Why: Skip logic can show or hide specific messages or questions for specific survey takers. Tailoring the survey flow improves engagement and data relevance because respondents don’t see questions that don’t apply to them.
  • How: Familiarize yourself with your platform’s conditional logic functionality, as it can vary significantly; the idea behind a skip rule is illustrated below.
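
Although every platform has its own syntax, a skip rule is essentially a condition on earlier answers that decides whether a question appears. The Python sketch below is a generic illustration of that idea, not any platform’s actual syntax, and the question names are made up:

  # Only ask about pet vaccination if the respondent reported owning a pet.
  def show_pet_vaccine_question(answers: dict) -> bool:
      return answers.get("owns_pet") == "Yes"

  print(show_pet_vaccine_question({"owns_pet": "Yes"}))  # True  -> show question
  print(show_pet_vaccine_question({"owns_pet": "No"}))   # False -> skip question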

6. Implement Data Validation

  • Why: Ensures the collection of intended data types, especially in open-ended questions where respondents need to provide exact data (e.g., dates, phone numbers, email addresses), and enhances overall data quality.
  • How: Restrict input fields within your survey platform to set the allowable formats for responses; the sketch below shows the kind of check such validation performs.
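
Most platforms offer built-in field types (date, email, phone number) that enforce the format for you; the rough sketch below only illustrates the kind of check that validation performs behind the scenes, using a simplified email pattern:

  import re

  # Simplified pattern for illustration; platform validation rules differ.
  EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

  def looks_like_email(value: str) -> bool:
      return bool(EMAIL_PATTERN.match(value))

  print(looks_like_email("respondent@example.org"))  # True
  print(looks_like_email("not an email"))            # False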

7. Incorporate Progress Indicators

  • Why: Indicators of survey completion progress can lower dropout rates by setting clear expectations for respondents.
  • How: Include visible progress bars or similar cues in your survey design. Each platform handles the survey progress options differently. 

8. Ensure Clear Communication

  • Why: Minimize respondent confusion and frustration with clear, constructive feedback on survey error messages.
  • How: Design intuitive and helpful error messaging within the survey interface. In REDCap, you can use “Field Notes” to clarify or further explain your questions, or to provide examples for your respondents; this can be very helpful when you are using a validated scale that you cannot modify. In Qualtrics, you can create hover boxes over text to provide further explanations for your questions.

9. Integrate Email Lists for Survey Distribution and Tracking

  • Why: Utilizing email lists for survey distribution ensures surveys are delivered directly to the intended participants, making it easier to manage outreach and data integrity.
  • How: Learn your survey tool’s email distribution capabilities or integrate with email tools such as mail merge to send surveys directly to your participants rather than using a generic link. This approach allows you to use unique identifiers, such as email addresses, to monitor who has received the survey, opened the email, and completed the questions; a sketch of a simple tracking list appears below.
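
Under the hood, a personalized distribution list is usually just a table of contacts with a unique token per person. The sketch below shows one way such a list might be assembled before loading it into your survey or email tool; the link format and addresses are made up for illustration:

  import csv
  import uuid

  # Hypothetical contact list; in practice this comes from your own records.
  contacts = [
      {"name": "A. Respondent", "email": "a.respondent@example.org"},
      {"name": "B. Respondent", "email": "b.respondent@example.org"},
  ]

  # Give each contact a unique token and a personalized (made-up) link.
  for person in contacts:
      person["token"] = uuid.uuid4().hex
      person["link"] = f"https://surveys.example.org/s/abc?t={person['token']}"

  with open("distribution_list.csv", "w", newline="") as f:
      writer = csv.DictWriter(f, fieldnames=contacts[0].keys())
      writer.writeheader()
      writer.writerows(contacts)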

10. Pilot Your Survey

  • Why: Testing uncovers potential issues, allowing for adjustments before widespread distribution.
  • How: Pilot test the online tool under conditions similar to the planned main deployment. Test your survey multiple times with different people, try different survey paths using skip logic, and confirm that the test data were collected properly and look the way you expect (see the sketch below).
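
A few quick checks on the pilot export can confirm that skip logic and data collection behaved as intended. A small sketch, assuming pandas and using made-up variable names:

  import pandas as pd

  # Made-up pilot export: the follow-up should be blank when owns_pet is "No".
  pilot = pd.DataFrame({
      "owns_pet":       ["Yes", "No", "Yes"],
      "pet_vaccinated": ["Yes", None, "No"],
  })

  # 1. The expected columns are present.
  assert {"owns_pet", "pet_vaccinated"} <= set(pilot.columns)

  # 2. Skip logic held: non-owners have no follow-up answer recorded.
  skipped = pilot.loc[pilot["owns_pet"] == "No", "pet_vaccinated"]
  assert skipped.isna().all(), "Skip logic let a non-owner answer the follow-up"

  print("Pilot checks passed")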

 

Expanding Survey Accessibility and Equity

1. Creating Multilingual Surveys

  • Why: Consider offering your survey in multiple languages to accommodate members of your audience who may be more comfortable responding in a language other than English. This not only respects their linguistic preferences but also enriches the data by including a broader spectrum of respondents.
  • How: Platforms like Qualtrics support multilingual surveys directly, whereas REDCap may require admin-approved plugins or separate surveys for each language to achieve similar functionality.

2. Accessibility Compliance

  • Why: Ensure that your survey is fully accessible by including alternative text for visuals and using clear, readable fonts. Also consider your use of language – in most cases it should be straightforward, concise, and understandable to a broad audience.
  • How: Avoid jargon, technical terms without explanation, and complex sentence structures such as double negatives.

3. Use of Inclusive Language

  • Why: Clear, inclusive, and respectful language ensures that all intended respondents feel welcomed and understood, which can then increase participation rates and the diversity of responses.
  • How: Ensure that your language does not inadvertently exclude people based on gender, race, ethnicity, age, ability, or other factors. When in doubt, consult up-to-date inclusive language guides or people representative of the communities you’re engaging with.

4. Post-Launch Recommendations

  • Create regular data backups post-launch to ensure data preservation. 
  • Implement a notification system to track new responses.
  • Periodically review collected data to detect potential issues; reviewing more frequently when the survey first goes live helps catch problems early. A simple backup sketch follows this list.
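
For the backup step, even a timestamped copy of each download goes a long way (some platforms also offer APIs for automated exports). A minimal sketch, with placeholder file and folder names:

  import shutil
  from datetime import datetime
  from pathlib import Path

  # Placeholder paths; point these at your actual export and backup folder.
  export_file = Path("survey_export.csv")
  backup_dir = Path("backups")
  backup_dir.mkdir(exist_ok=True)

  # Keep each backup under a timestamped name so nothing is overwritten.
  if export_file.exists():
      stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
      shutil.copy2(export_file, backup_dir / f"survey_export_{stamp}.csv")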

 

More on this subject:

https://www.qualtrics.com/support/survey-platform/survey-module/question-options/recode-values/

https://libguides.library.kent.edu/qualtrics/howto/coding

Andrés Hoyos-Céspedes, MPH, CPH

Epidemiologist II / Project Manager

Shanyin Yang, MPH

Epidemiologist I