Survey fatigue or over-surveying is something to recognize and guard against.
Students can feel burnt out from requests for their feedback about academic courses, campus events, and general college experiences — all on top of customer satisfaction surveys they receive from retailers.
Survey fatigue can lead to declining response rates, and data collection is only worthwhile when you gather responses of sufficient quantity and quality. Account for this whenever you coordinate assessment projects.
Before I say more, it is important to identify the most appropriate data collection method according to your needs. Although surveys are familiar to staff and somewhat easy to create, they shouldn’t always be your default assessment option. There are many data collection methods that should be considered in addition to surveys. First identify what you intend to measure, then determine the best way to measure it. (My favorite method is rubrics, but see page 25 of this assessment guide for more.)
When surveying is the most appropriate method, there are a number of considerations and strategies to weigh related to design, administration, use, and deliberate engagement of stakeholders. These elements make for good survey methodology while also increasing your response rates!
1. Only ask necessary questions.
Toss out nice-to-know questions and any question whose data you can’t pinpoint a specific use for.
2. Use skip logic to ask population-, behavior-, or response-based questions.
This way, students only receive relevant questions and don’t have to select N/A for items we know wouldn’t apply to them. (For example, fourth-year students get questions relevant to their experience rather than first-years’.)
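Skip logic is normally configured inside your survey tool, but the underlying branching idea is simple to sketch. Here is a minimal illustration in Python; the question IDs and routing rules are hypothetical, not from any particular platform:

```python
def next_question(current_id, answer):
    """Route a respondent to their next question based on a prior answer.

    Illustrative skip logic: question IDs and branches below are made up.
    """
    routes = {
        # After asking class year, branch to year-specific questions.
        ("q_class_year", "fourth_year"): "q_capstone_experience",
        ("q_class_year", "first_year"): "q_orientation_experience",
    }
    # When no branch applies, fall through to a shared question.
    return routes.get((current_id, answer), "q_overall_satisfaction")
```

The point of the sketch: every respondent path is defined up front, so no one ever sees a question that can’t apply to them.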
3. Determine what data you already have that doesn’t need to be collected by survey.
As long as you collect a common identifier (like a student’s email or ID number), you could obtain data like demographics, class year, and major from student information systems.
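As an illustration of that join, here is a stdlib-only Python sketch that attaches records from a student information system to survey responses using a shared email field. The field names and sample records are hypothetical:

```python
def attach_sis_data(responses, sis_records, key="email"):
    """Merge SIS fields (e.g., class year, major) into survey responses
    by a common identifier, so the survey itself never has to ask for them."""
    lookup = {rec[key]: rec for rec in sis_records}
    merged = []
    for resp in responses:
        combined = dict(resp)                       # keep the survey answers
        combined.update(lookup.get(resp[key], {}))  # add SIS fields on a key match
        merged.append(combined)
    return merged

# Hypothetical sample data
responses = [{"email": "a@school.edu", "satisfaction": 4}]
sis = [{"email": "a@school.edu", "class_year": "junior", "major": "Biology"}]
```

In practice you’d do this merge in whatever tool your office already uses (a spreadsheet lookup works just as well); the win is the same either way: fewer questions on the survey itself.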
4. Put demographic questions at the end.
These sorts of questions are quick and easy to fill out, so they take little effort to close out the instrument, but putting them early in the survey delays getting to the “point” of the survey. The only exception is when you need a demographic question for skip-logic purposes (such as capturing class year to determine follow-up questions).
5. Pilot your instrument with students to check the clarity of questions.
You may think that your questions make sense, but since you’re not the targeted survey taker, be sure to get feedback from your intended audience and adjust accordingly.
6. Test your instrument and administration method for the functionality of links and logic.
It’s always disappointing to check your survey results only to see a question was worded incorrectly, that you missed including a response option, or that the logic and links didn’t work. The good news is that, with testing, you can catch those omissions and errors before sending out your survey.
7. Strive to make your survey accessible via multiple devices.
Realize students may access and respond to your survey from various devices, including laptops, smartphones, and tablets. So, it’s critical to make sure your survey adapts and works well across all of them.
8. Be mindful of timing for communication and data collection.
Consider the academic calendar, student schedules, and student attention patterns when determining when to send out your surveys and ask for responses back.
9. Pre-announce, announce, and follow up.
To avoid sending an email invitation out of the blue, send a pre-announcement explaining the what, why, and when of the survey to students. Then, send your actual survey invitation and, beyond reminders, follow up to thank students for their attention, consideration, and participation. As a bonus, share some results and their implications in the follow-up.
10. Strategically send reminders to non-respondents.
Sending reminders only to non-respondents is key: when possible, don’t send reminders to everyone, as it will seem like you didn’t recognize the people who have already responded. Also, know that sending 2-3 reminders while the survey is active is plenty.
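Most paid survey platforms track non-respondents for you, but the underlying idea is just a set difference. A minimal sketch, assuming you have lists of invited and responded email addresses:

```python
def reminder_list(invited_emails, responded_emails):
    """Return only the invitees who have not yet responded,
    sorted for a stable mailing list."""
    return sorted(set(invited_emails) - set(responded_emails))
```

If your tool doesn’t offer this feature, exporting both lists and computing the difference keeps your reminders from landing in the inboxes of students who already took the time to respond.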
11. Personalize invites with names and other connections to the sender.
If at all possible, address the students by name in the invitation and share how their perspective is relevant and important to your data collection purpose.
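Personalization at scale is a mail merge. Here is one hedged sketch using Python’s `string.Template`; the invitation wording and fields are invented for illustration:

```python
from string import Template

# A hypothetical invitation template; placeholders are filled per student.
invite = Template(
    "Hi $first_name,\n"
    "As a $class_year student, your perspective on campus events matters to us. "
    "This short survey will help shape next semester's programming."
)

message = invite.substitute(first_name="Jordan", class_year="second-year")
```

Most survey and email tools expose the same idea as piped-text or merge fields, so you rarely need to script it yourself; the point is that each student sees their own name and a reason their voice matters.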
12. Use the invitation to motivate and garner interest from students.
Similar to the last point, use language that generates curiosity, empowers students’ voices, and increases their desire to engage with the topic. For example, a survey about future campus-wide events could stress the desire to allocate resources in ways relevant and aligned with student interests, making each response important for capturing diverse perspectives across the student body.
13. Talk about how data will be used.
Beyond just saying why you are collecting the data, talk about how you will use the results.
14. Explain how you’ll handle the data.
Be sure to include a note about confidentiality (that you’ll only share responses with relevant decision-makers) and anonymity (that there is no way to connect responses to any student’s name) if those are part of your methodology.
15. Be realistic about completion times.
Leverage survey technology estimates or use a stopwatch to time students piloting your survey, then include an estimate in the invitation of how long it will take to complete. This should motivate you to design only for necessary information so the survey stays quick, and it also gives you an opportunity to set transparent expectations when the survey is long or involved.
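Turning pilot timings into an honest estimate is a one-liner. A small sketch with made-up timings, using the median so one distracted tester doesn’t skew the figure:

```python
from statistics import median

def estimated_minutes(pilot_seconds):
    """Estimate completion time (in whole minutes) from pilot timings.
    The median resists outliers from testers who stepped away mid-survey."""
    return round(median(pilot_seconds) / 60)

# Hypothetical pilot timings: three typical runs and one interrupted one.
pilot_times = [300, 360, 330, 900]
```

Quoting the median rather than the mean keeps your “takes about 6 minutes” claim honest even when a pilot run was interrupted.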
16. Include and share data in your invitations.
A great way to generate interest and engage your readers is to offer more than text; share some data! For example, you could write about past student dissatisfaction with event scheduling that led to more convenient future offerings. This demonstrates that past data was reviewed, communicating to students that you will indeed review their data should they participate.
17. Include the first question in the invitation, if possible.
Some survey technologies allow you to embed the first survey question in the email invitation. Doing so is a great way to encourage participation and get response collection underway.
18. Get students’ perspectives on survey needs and projects.
To best inform your strategic thinking about survey priorities, make sure to get student input. As much as you may think you know what’s best or pressing for students, it helps to hear from them directly about priorities and needs.
19. Invite students to be part of the design process.
Students can be great design collaborators: they can help draft question language, give feedback on tone, and comment on anticipated student experiences. A later pilot review by other students is still valuable, since not all students are the same or think alike, and the student voice built into the design won’t necessarily shine through in the resulting survey.
20. Ask students for advice on how to administer the survey.
Do your homework to present them with options, then value their feedback on when to send the survey, which email address it should come from, and what marketing campaigns you might run beyond the invitations.
21. Integrate with existing student systems.
Use your learning management system (LMS), student engagement platform, and website to promote your survey and increase students’ access to it. Think about where students spend time online and meet them in their digital domain.
22. Invite students to interpret results.
Just as students have valuable perspectives on strategy and design, they can offer brilliant observations for interpreting results. Involving students reinforces that their voices are being heard, which could encourage future peer-to-peer promotion and motivation to participate in surveys.
23. Share results with students.
Be sure to re-engage students once data is analyzed to share the results and, even better, announce the improvements you plan to make based on the survey findings.
Engaging Faculty & Staff
24. Discuss the purpose and value of surveys.
Do not go through this process alone. If you’re collecting data on behalf of one or multiple offices, make sure those employees are aware of the purpose and value of the survey.
25. Get leadership on board and talking about these efforts.
Leadership involvement signals importance; include their voices and invite their presence and perspective in your process.
26. Share literature and resources about survey methodology.
Literature and research can reinforce the credibility and value of surveys. JMU’s instrument design information, covering psychometric properties like validity and reliability, illustrates how surveys can be designed with the utmost integrity. Resources like these can help equip, prepare, and build competence for folks to know more about and engage in the process.
27. Watch and report on response rates.
Sharing response rates with staff can keep them updated on data collection status. Plus, this data can be used to generate group and individual conversations on what is or isn’t working with your survey creation and promotion.
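The response rate you share with staff is a simple ratio, but it’s worth computing consistently across surveys. A minimal sketch (the counts below are hypothetical):

```python
def response_rate(responded, invited):
    """Response rate as a percentage, rounded to one decimal
    for easy sharing in staff updates."""
    if invited == 0:
        return 0.0  # avoid dividing by zero before invitations go out
    return round(100 * responded / invited, 1)
```

Reporting the same rounded figure in every update (e.g., "87 of 250 invited, 34.8%") makes week-over-week comparisons and cross-survey conversations straightforward.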
28. Make assessment a standing discussion in your office or department meetings.
Even if you are only issuing one survey a year, you can discuss data needs and priorities, content considerations, design process and review, updates on administration and data collection, analysis discussion, sharing results, identifying actions to be taken, and plans for future data needs.
29. Celebrate the results!
Reinforce the relevance and importance of the collected data in answering questions you had or providing essential information about your services and student learning. It can even be a big deal to celebrate a stellar response rate.
Using the Results
30. Share and discuss data with students.
Discussing results with students reiterates that they are stakeholders and not just your subjects of study.
31. Share and discuss data with staff.
Not everyone is going to be part of the survey process, so share broadly (but intentionally!) for awareness and to open the door for collaboration regarding data implications.
32. Use data to inform decision-making.
When designed well, each question will hopefully yield answers to essential questions and have implications to improve your programs and services.
33. Review results for validity.
Check that your questions gathered the anticipated data to satisfy your objectives, then consider what changes you should make for future surveys.
34. Inform future assessment plans with results.
Beyond future surveys, consider if results yielded from the survey necessitate any follow-up surveys or focus groups to learn more about a particular issue, topic, or outcome.
35. Leverage results in marketing, fliers, and emails to students.
Beyond informing future assessment efforts, results and actions for improvement can be used as part of marketing for your programs and services. When students know that programs are based on student-collected interests or have been improved based on past student feedback, they may be more likely to engage.
36. Set aside time for the entire life of the survey.
Remember: a survey is more than just design and data collection. Factor in time to plan, design, pilot and test, revise, administer, collect data, analyze data, report, and put together action plans based on results. By setting aside appropriate time for the life of your survey, you’ll be better able to engage and involve students and to execute a survey experience that garners strong participation and results.
37. If it makes sense, consider leveraging a captive audience.
Don’t forget you can carve out time for students to complete your survey while participating in your event, program, or class experience.
38. Brand your invitations and instruments.
Take time to think of the color, logo, and message you intend to convey to students as they receive and experience your survey. It can be simple; consistency in colors, font, survey naming convention, and tone with email invitations and instructions can go a long way.
39. Offer incentives for participation.
You could consider incentives for standalone surveys or a reward program for completing multiple surveys. But be aware that incentives can attract students who only want the prize, so be prepared for inauthentic or inaccurate responses from some students, and perhaps offer incentives only for big or critical surveys.
40. Cluster assessments.
Some institutions have had incredible success coordinating with leadership to arrange for assessment day(s) when there are no classes, but students and employees are asked to engage in various assessment activities all day. This helps encourage participation with the promise of focused surveying at a given time of year instead of spreading it out across the calendar.
41. Consider paying for survey technology.
While there are, indeed, free tools and software available for surveys, you usually have to pay to have access to features like skip logic, non-respondent reminders, invitation personalization or custom coding tools, and robust reporting.
Feel free to let us know if you have additional tips or tricks! We’d also love to know which tip is your favorite or will be a priority starting point for your surveys. Connect with us on Twitter @HelloPresence and @JoeBooksLevy.