Ask eAIR invites questions from AIR members about the work of institutional research, careers in the field, and other broad topics that resonate with a large cross-section of readers. Questions may be submitted to eAIR@airweb.org.
This month’s question is answered by Kimberly Thompson, Senior Director, Assessment & Institutional Research, University of Phoenix-Corporate Office.
The ideas, opinions, and perspectives expressed are those of the author and not necessarily those of AIR. Members are invited to join the discussion by commenting at the end of the article.
Dear Kimberly: I am now in an assessment role at my institution. How can my institution use survey data to make data-informed decisions and improve student success?
This is a great question. When we think about assessing student learning outcomes, we tend to focus on direct measures of learning, such as demonstrations of knowledge through presentations and projects, proficiency on exams, or other assignments designed specifically to measure students’ knowledge, skills, and abilities. But survey data can also provide valuable insight into students’ experience at our institutions.
Survey data can come from various sources, including institutionally developed instruments as well as national surveys. The primary advantage of developing surveys in-house is that the topics and questions can be tailored to provide highly relevant information. National surveys are beneficial because they typically include comparison data, which allows the institution to benchmark its results.
The best way to make survey results useful in decision making is to distill them into a few key findings rather than report everything at once. I have found that when I provide data tables, graphs, and other general summaries of survey data, the data rarely get used to inform decisions about curriculum, student support services, or other processes that impact students. When I pull out one or two interesting findings from a survey and present them in bullet points or other targeted communications, stakeholders become very interested and frequently ask for more information.
Another method I’ve found to be helpful for encouraging the use of survey results in decision-making is to lead small group discussions. Rather than gathering stakeholders together to watch a presentation of survey data, I bring groups of stakeholders together for a facilitated conversation or brainstorming session using recent survey results. There are several strategies that can help get folks interested and engaged in these discussions:
- Open with trivia-style questions about the results. Using the CIRP College Senior Survey, for example, you could ask, “What percentage of our seniors said they frequently ask questions in class?” or “What percentage of our seniors reported they fell asleep in class?” This strategy can be used with any survey.
- Show several key results in a bulleted list and ask the group, “What is the most surprising item to you from this list of results?” Be prepared with follow-up questions to keep the discussion going.
- Bring stakeholders together at preassigned tables of five or six, seating them with stakeholders from other disciplines or administrative units to encourage reactions to survey results from multiple functional-area perspectives. Then provide a few salient results along with a series of discussion prompts for each table to work through, and have each table share its reactions with the larger group. This is a good strategy when you are presenting results to large, cross-functional audiences.
- If your institution has an intranet or other internal communication vehicle such as a newsletter, provide survey results periodically in short “Survey Bites” (or another catchy title). This helps to socialize the information and keep it on colleagues’ radar screens.
- Another use of the intranet or newsletter is to pose questions in a poll; after an individual “votes,” share the correct response along with a short narrative about the finding. Questions could be similar to the examples in the first bullet.
No matter what techniques or strategies you end up using, the key to success is to get folks interested in the results and to help them see practical implications that are meaningful to their roles. You’ll know you are making progress when people start to ask when the next survey results meeting is, invite you to present more results, or ask you to join task forces, planning groups, or other key leadership meetings so you can share survey results.