I know how important data collection is; I do quite a bit of it myself!
But like most everyone else, I sigh a little when I open a new, highly-involved survey that I’ve been asked to fill out.
We all have pretty busy lives and completing such surveys can amount to writing an extensive personalized essay. Often, I wonder: Is someone on the other end actually going to read all these words?! Will they even consider putting my ideas into practice?
Students have these concerns too, and increased online learning may be prompting an overflow of surveys into their inboxes. So, it’s no wonder that some of your surveys might yield inconclusive results or, even more likely, extremely low response rates.
You may wonder what makes students decide to click and open up a survey, much less fill it out thoughtfully. One piece of the puzzle is the possibility of survey fatigue. So, let’s dig in, explore what survey fatigue is, and help you keep it at bay as you fact-find with and for your students.
What is it?
There are actually two separate kinds of fatigue associated with surveys.
One is survey-taking fatigue. It’s the principle that, if you are asked to take too many surveys without being offered any incentives, you’ll be less likely to even attempt them. There might also be whole demographics of students who won’t respond, such as those who hold full- or part-time jobs and thus are too busy to take surveys. This is obviously far from ideal.
There is also another form of survey fatigue that’s perhaps even more insidious: Students may still take surveys despite being tired of them. This could even happen within a single survey; the sheer length or intensiveness of the questions could push respondents to do the easiest thing — write short, unhelpful answers or skip questions entirely in order to get through to the end.
Joe Levy, executive director of assessment and accreditation at National Louis University (and fellow Presence writer), spoke with me about his own experiences of survey fatigue on college campuses. He pointed out some of the consequences that arise when survey response fatigue sets in.
“People may be more likely to quit the survey early or not give open-ended feedback,” says Joe. “They may not think critically or conscientiously and just plow right through. It’s something to try to combat.”
Both forms of survey fatigue affect college students. Student affairs professionals, faculty members, academic administrators, and even student leaders create surveys because surveys feel quick and easy to make. However, turning to surveys too quickly and too often contributes to the problem of survey fatigue, so it’s valuable to really consider whether a survey is vital before sending one out.
Why is it a problem?
Survey fatigue isn’t just annoying for respondents. It’s detrimental to assessment, as low response rates and half-hearted responses muddy the waters of the data. If your respondents aren’t representative of your whole student population, people will rightly take your results less seriously.
Polly Albright, an analyst at the University of Chicago’s Pritzker School of Medicine, has worked in assessment and data collection in both her current role and prior positions. She has seen how survey fatigue can impact data quality and told me all about it.
“The data quality goes down if you’re hounding people to take a survey,” she says. “And if the student just does it because they are told to, they will whiz through it. You won’t have valid data.”
These muddied results create a vicious cycle: the inconclusive data is discarded or ignored, the need for answers persists, and so yet another survey goes out.
I want to help you break the cycle of casual surveying — with tips from assessment professionals who can help you get great data through well-crafted surveys.
6 Ways to avoid it
The most important step in avoiding survey fatigue is what you should not do: Don’t create a five- or 10-question online survey form, send it to hundreds or thousands of students, and then promptly forget you did it, never using the data.
Instead, by combining thoughtfully created surveys with many other research methods, you can get valuable data without having to overtax your student population.
Some of these steps can occur before you even send out a survey, and others are for when you decide a survey is absolutely essential.
1. Focus groups and informal chats
You likely interact with students all the time. You may constantly communicate with a student leadership team about next semester’s activities or have students in and out of your email inbox or video conferencing “office hours”. Either way, you’ve got lots of time for informal chats.
Early on, when you’re first thinking about sending out a survey, consider if you can instead ask the students around you for their take on the issue informally. You can even seek out students who don’t normally enter your sphere for a 20-30 minute focus group. You’d be surprised by the depth of the answers you can get!
Just make sure you aim to represent as diverse a range of demographics as possible and remember that small sample sizes can easily be non-representative.
Orchestrating informal chats may seem difficult virtually. But remember that if your office uses social media, you can read the room with quick polls. Not every info-gathering effort needs to aim for academic-publication-level rigor.
“You can gather simple data through a one-question poll, saying ‘Student Life did x, y, z; did you think this was good?’ Your feedback loop doesn’t have to be a 50-question survey. You can gather useful information on the fly.” – Polly Albright
Generally, the lighter the load for students, the less likely your feedback request is to cause survey fatigue. When you are pretty sure about a path forward and just want some quick feedback, don’t put out a whole long survey. A one-question poll should work just fine!
2. Observation and artifact analysis
Assessment professionals want us to remember that there are so many more ways to solicit feedback and data than surveys.
Observation and artifact analysis, for instance, can produce valuable data through a critical reading of things — like meeting minutes, which are likely already taken and published. Those minutes can be a source of information about what students like, dislike, prefer, and want to change.
“Artifact analysis can include things like scanning social media feeds, blogs, and other elements,” Joe says. “This is research, too, but people don’t think about it as data collection… like they have to do something more.”
Using these methods puts the hard work on you, but all great data collection requires lots of work to unpack. Students will likely appreciate you saving them time by choosing a different method.
3. Previously gathered data
Another key point from our assessment experts is that your institution (along with its peer colleges and universities) has been doing a lot of research for a lot of years; there’s a chance that there is already apt research on the subject you’re trying to learn about!
“Surveying can be a knee-jerk reaction because people think they should do a survey,” says Joe. “But there may already be data collected that you can use.”
Before sending out a survey, figure out if the phenomenon you want to uncover might appear at other institutions or show up in your institution’s yearly data for accreditation. You’d be surprised how much is already known that you can use as a basis for making a move forward in your work.
4. Your institution’s survey calendar
Many institutions have a schedule for when they release major surveys related to assessment and accreditation.
Joe Levy and his assessment and accreditation colleagues schedule surveys carefully.
“We’re conscientious about the timing of surveys,” he says. “We want to be coordinating survey activity, especially for new surveys. We make sure to fit a new one into the existing calendar of surveys.”
If you know that you need to release a survey to a large number of students, talk with the assessment team about getting on that calendar and doing your research with lots of advance notice. This way, your students won’t have just worked through an important 20-page survey only to get your request — to fill out 10 more pages — the next morning.
5. Beta testing
If you indeed conduct a survey, make sure that you don’t set yourself up to later realize, “Huh, I guess no one understood that question!”
Psychometrics is the scientific study of psychological measurement, including how people interpret and respond to things like survey questions.
“There are people who spend their whole doctorate programs studying psychometrics and writing a good survey question,” Polly says. “You don’t realize the question you’re really asking until you get the results back.”
So, find one or two students whom you trust to be the first testers of your survey. Ask them to take it “cold” — without any supplemental instructions or explanations from you — and then tell you honestly what the experience was like.
If they thought questions were redundant, cut them or mesh them together. If they thought three long-answer questions in a row were exhausting, consider which one is most needed. The easier you can make the survey (without jeopardizing the things that make it important), the better.
Assessment professionals are here for you, too; they study psychometrics and know that the way you write a survey question matters a lot to how the responses come back. Talk to the assessment person on your team about how they can help with your survey creation!
6. Share the results
One way to help alleviate both kinds of survey fatigue is to showcase clearly how the last survey results turned into real action.
“Encourage communication before and after the survey, to let people know what’s coming, but also to share results from the past survey,” Joe says. “That can be part of your survey invitation. ‘These are the changes that we’ve made based on this feedback in the past, and we’d love to hear from you.’”
With these tips, I believe you’ve got the background and know-how to avoid survey fatigue while still tuning your choices to the needs of students. Both the data you collect and the way you collect it help your focus stay student-centered!