As human attention spans continue to shrink, it’s harder than ever to capture the attention of students. Many institutions still send out long surveys and offer incentives in the hope that students will complete them. That approach no longer works.
Getting students to actually complete surveys is one of the biggest challenges facing professionals at colleges and universities.
Institutions that over-survey students are missing out on a big opportunity. By providing different ways to collect data from students, you gain more insight into their student life experience and strengthen your relationships with them. A survey may be just a survey in their eyes; developing multiple feedback channels will help you understand what their relationship is like with your institution.
Here are some questions to reflect on:
- Do you think students actually want to fill out a survey?
- How many times have you personally raced through a survey just to get the freebie you might win at the end of it?
- When was the last time you filled out a survey when you were asked?
A well-executed survey has the power to uncover insights that you can use to hone your student engagement strategy, shape your programs, and gauge student satisfaction with campus life.
For those looking to conduct research or answer questions through surveys, the prospect can feel exciting and discouraging at the same time. It’s fun to imagine the range of answers you could receive, yet you also know you could be doing a better job of reaching a larger population of students.
There’s also competition with other departments’ surveying efforts on campus, which leads to the ever-destructive student survey fatigue. Survey fatigue doesn’t only impact professionals who are new to surveying; experienced surveyors struggle with it too. How can you expect to improve feedback or participation at events when students aren’t interested or don’t have the attention span to fill out these surveys? While oversaturation of surveys from campus constituents is partly to blame, survey quality is also a major factor.
Take a serious look at all of the surveys your institution distributes to students. How can you cut down? When you do have the opportunity to collect information through a survey, keep it short if you want to capture students’ attention.
“You get 5 questions. Focus on what you’re collecting and who you’re collecting it from.” -Darby Roberts, Assessment Professional
Survey bias is real. Every survey carries potential sources of error and bias, some of which we can measure and some of which sneak into our projects without anyone noticing. With every survey you employ, there is going to be some margin of error.
For example, your own opinions, perspectives, and identities come into play when creating a survey. As humans, our viewpoints inherently shape the creation, design, and analysis of surveys. That influence over how questions are framed and how results are interpreted can be a good thing, but it can also compromise the quality of the data you aim to collect. Without observing students’ behavior while they’re taking the survey, we can’t understand their reactions to specific questions and how they’re worded, or how they feel about the design or structure of the survey.
One type of survey I’ve seen employed at many institutions is the self-reported questionnaire. Self-reported surveys are attractive: they’re easy to use, typically free (SurveyMonkey, anyone?), and can easily be sent to large population samples. However, self-reporting carries a few dangers when we want to collect verifiable data from students.
For example, many students who fill out surveys are told the information is anonymous, but if there isn’t trust between students and administrators, they may still fear being judged by the information they submit. Even with students’ best efforts to be honest and accurate, the information they provide may be skewed or incorrect.
Relying on survey data alone negates its impact; we must triangulate data to validate what we collect (Gall, Gall & Borg, 2010). If you want to encourage participation and stop diluting your data through surveys, you need other ways for students to engage, such as focus groups, interviews, and data that already exists at your institution.
“Assessment is more than surveys. We often need varying data sources to understand what’s happening.” -Brian Bourke, Student Affairs Professional
There are a number of alternatives to surveys that provide a holistic view of students, including but not limited to: interviewing learning communities, individual interviews, quick polls at programs and events, rubrics for projects, and observing behavior. Below, we outline different ways to gather feedback and determine whether learning outcomes are being met.
Focus on Cohort Learning
Gathering feedback from students in cohorts or focus groups is helpful because these intentional learning communities start together, process together, and end together. Cohorts are beneficial because of the strong cohesion they can create between participants and the rich qualitative data that often arises in the form of storytelling.
For example, instead of sending a survey to a student who took a resume-building class, create a rubric and have them submit a new resume after they attend the class. Use this direct assessment to identify whether students achieved specific career development learning outcomes. Another way to assess student learning from a career development perspective is to focus on trends over time: seniors who come into a career services office during the first semester of their senior year may have different questions than seniors who come in for the first time during their last semester in college.
Another form of assessment professionals can use instead of surveys is observing behavior during meetings. For example, have staff take detailed notes after one-on-ones with students: whether in student conduct, academic advising, or a supervisory role, detailed notes help you understand whether student learning outcomes are being fulfilled.
Track Quantifiable Data Over Time
Collecting accurate data challenges the preconceived opinions we form about students. It allows you to take both a micro and macro approach to assessing student learning outcomes by identifying trends across the student body. Mapping student engagement helps you better understand the people who inhabit your campus environment.
Creating reports with student participation data and demographic information gives student affairs professionals and student leaders a better snapshot of engaged students, successful events, and whether student learning outcomes were met. At Presence, we believe in giving professionals the power to obtain accurate information through data visualizations that accurately represent what they want and need to track.
Experiment with Polls via Mobile
More than 50% of surveys or polls will be conducted on a smart mobile device in 2016.
This is a great time to experiment with surveys and polls on different devices, checking how they look on an iPhone compared to a Samsung. Keeping surveys concise makes them easier to complete for a student scrolling through their phone and helps reduce survey fatigue.
Tracking student involvement via mobile also helps triangulate the data you collect. When students check in to events via card-swipe technology, staff and administrators can see data analytics in real time. Professionals can compare the demographic attributes of students who check in via Presence, pair them with poll questions, and set them against student stories as a qualitative sample from the event.
These may seem like small factors, but they make a large difference to a student who can efficiently fill out a survey over lunch or while strolling across campus. Technology will continue to change the way we collect information on the college student experience.
It may be tempting to find the ‘answers’ in your data right away, yet it’s important to stop and reflect. Find the stories in the data, understand what the numbers are telling you, and then communicate that story to your audience.
Using more than one method to obtain data helps you identify themes and the answers you’re looking for. Many people employ surveys only because a supervisor tells them to, because it’s ‘always been done’ that way, or because surveys are the only way they think they can obtain data. Even then, after the surveys go out, some are never followed up on, or the person who created them decides the data is too hard to sift through or that there isn’t enough time.
If you feel like you have too much data in front of you, try paring down your broader goals. Create two or three action-based goals and accept that you can’t fix everything at once. Which goals align with the priorities of your institution or department? Start there.
It’s Not Them, It’s You
Students will take surveys that arrive in relevant e-mails, directly impact them, and contain interesting content. However, if students aren’t taking your surveys, or if you’re receiving inaccurate data, you must re-evaluate how to gather feedback through different platforms.
Assessment is here to stay, and investing in resources that help you diversify, triangulate, and validate your data will help you provide the best support for your students.
What are your tried and true practices for collecting data for programs and events?
We’d love to hear your experiences with using various platforms and how you’re incorporating new ways to collect data. Tweet us @CheckImHere! Thanks for reading.