The Changing Role Of Surveys: An Interview with Steven Felker
eAIR recently caught up with Steven Felker, Director of Institutional Research and Effectiveness at Thomas Nelson Community College, to chat about surveys. Thomas Nelson Community College is one of over 20 colleges in the Virginia Community College System, serving around 10,500 academic credit students each year and around 2,500 additional workforce training and continuing education students. About 25% of its students have a military affiliation.
What is your team’s role in survey research?
Felker: Our office, Institutional Research and Effectiveness, coordinates all of the college’s major surveys. We have several that we do every year, including a spring student experience survey that goes to all current students to find out about their experience at the College, their level of engagement with different programs and services, and their satisfaction with those services. We also have a graduate survey and a graduate follow-up survey that we send six to nine months after our students graduate. The latter focuses on what they’re doing at that point in time and whether they’ve transferred or entered the workforce in an area related to their program. We coordinate and work on all of those. We also run multiple ad hoc surveys each year, and we’re essentially involved in every survey that carries the institution’s name.
You mentioned a graduate follow-up survey. That’s often a challenging area in terms of response rates and availability of contact details for students post-graduation. What is your process for that?
Felker: We are revamping the way we do it, though our existing process has been pretty successful for us. We’ve been doing it as a postal survey, so we use the addresses that we have on record for our graduates. We typically send two mailings, and we’ve been getting fairly consistent response rates of around 20 to 25%.
Recently, we’ve improved the collection of personal student emails to the point where I think we’re almost ready to try a hybrid approach as a pilot. We will still do the mailed version, but we will also give the option of an online version via a short URL. We will email it out and distribute it in some other ways so we can get a sense of how all that works. We will likely try to move to an electronic format in the future.
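One natural way to "get a sense of how all that works" in a pilot like this is to check whether the online response rate differs meaningfully from the roughly 20 to 25% postal baseline Felker describes. Below is a minimal sketch of that comparison using a standard two-proportion z-test; the counts, sample sizes, and helper function are hypothetical illustrations, not part of the college’s actual process.

```python
# Minimal sketch: compare response rates across survey modes in a pilot.
# All counts below are hypothetical placeholders, not actual survey data.
from math import sqrt, erfc

def two_proportion_z_test(responses_a: int, sent_a: int,
                          responses_b: int, sent_b: int) -> tuple[float, float]:
    """Two-sided z-test for a difference between two response rates."""
    rate_a = responses_a / sent_a
    rate_b = responses_b / sent_b
    # Pool the two groups to estimate the common rate under the null.
    pooled = (responses_a + responses_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (rate_a - rate_b) / se
    p_value = erfc(abs(z) / sqrt(2))  # P(|Z| >= |z|) under the null
    return z, p_value

# Hypothetical pilot: 500 postal invitations vs. 500 email invitations.
z, p = two_proportion_z_test(responses_a=110, sent_a=500,   # ~22% postal
                             responses_b=85, sent_b=500)    # ~17% online
print(f"z = {z:.2f}, p = {p:.3f}")
# A small p-value (e.g., below 0.05) would suggest the two modes really
# do draw different response rates rather than differing by chance.
```

Running the mailing and the online version side by side, as the pilot describes, is what supplies the paired counts a comparison like this needs.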
Do you conduct exclusively in-house surveys, or do you participate in external surveys as well?
Felker: We use Qualtrics to build our own internal surveys, and most of the ones I’ve mentioned are our own instruments. We do participate in the Community College Survey of Student Engagement (CCSSE), typically every three years. This past fall, we participated in the #RealCollege Survey, a national survey of student basic needs out of the Hope Center at Temple University. It’s designed to get at food insecurity, housing insecurity, and those kinds of things that our students experience. We just got the results from that maybe a month ago, so we haven’t fully delved into them, but it’s really started some new and different conversations at the College. The results have led to an interest in expanding some of the community support services that we have, such as our food pantry. Our whole system, the Virginia Community College System, has just participated in a national diversity survey. It looks at all aspects of diversity and how inclusive the colleges are, from the perspectives of employees and students. This is another example of how we are trying to leverage an external instrument to get nationally normed results to see how we are doing compared to other colleges.
How have you been leveraging surveys in the pandemic to understand what your students are facing? Have surveys become more important in this past year for your institution?
Felker: I think they have. Partly that’s due to the pandemic, but we have also started to ask a lot of deeper questions about student experience. For a long time, the focus was on whether students were satisfied at the surface level with the different services and programs that we offer. Last year, because of the pandemic and emerging social justice issues, we started asking deeper questions about ourselves as an institution. For example, how accessible are we? Surveys can become more important and more challenging as you get to those types of questions. In the fall, we did a couple of surveys—one for employees and one for students—looking at the challenges they were experiencing through the pandemic and what they saw as their likely preferences coming out of the pandemic. For example, do students still want online support services? We started to look less at whether what we’ve done in the past has been helpful to the students and more at what our students and employees would like to see in the future. I think that’s an emphasis shift that we’ve gone through during the pandemic.
Is there anything you found through the surveys that surprised you and became an area of institutional focus?
Felker: One that we found a while ago, which we ended up using as a quality enhancement plan for our accrediting agency, is that we could really improve the way we’re doing academic advising. CCSSE results indicated that advising wasn’t working that effectively for our students, and we used some of our own surveys to get a more immediate pulse check on how things were changing, so we weren’t waiting three years for CCSSE to know whether one of the initiatives around advising was making a difference. One of the more recent things that has been kind of a surprise for us during the pandemic, or at least a surprise to me, was that when we asked our students about challenges during the pandemic, we expected to see a lot more technology challenges or access-to-technology issues. However, what they told us, which I guess I should’ve known because I experience it almost every day, is that a challenge has been having a dedicated, quiet environment for learning. When they are on campus, they have testing centers, the library, and other services and spaces that are open for students who want a quiet place to take exams and study, and those kinds of quiet spaces just don’t exist in the same way right now. That’s something we have really taken to heart. We have started to think about how we make that quiet learning space available to our students again, maybe even at a higher priority than offering services that have been fairly successful in an online environment.
What advice would you give an IR office that is looking to expand or improve its survey work?
Felker: I would suggest focusing on the end goal. Often we think we need a survey, but we don’t know exactly what question we’re asking or what we want to be able to do with the information. I have found that the more you get groups involved early in the conversation about the research questions, the better off you will be in the long run, both in terms of having actionable information and in terms of stakeholder buy-in, so offices are much more likely to use the results. We have found this incredibly useful, especially at a time when surveys are considered the quick answer to everything. Really engaging people in that longer-term conversation about what we want to learn and how we are going to use the results, I think that’s probably the most critical component of survey work.
Is there anything else you would like to add?
Felker: I recognize that surveys are important, but there’s such a propensity for going to survey data right now. I think that the more offices like ours can find ways to integrate surveys into existing research questions, but not rely on them as the sole source of data around those questions, the better we will be able to serve our colleges and students. We’ve got a lot of deep questions that surveys cannot necessarily address for us anymore. Through focus groups or other means, we have to have more interaction with those we are trying to learn from. That’s going to become increasingly important for us. I do think, though, that sometimes for someone like me it’s too easy to just want to go with the quantitative data. Surveys have really helped us understand the “why” behind students not returning, what barriers they are facing, and other insights that the data housed in our information systems are likely never going to tell us. I think there’s a lot of value there, but I would be a proponent of using a lot of different methods to gather information about our students.
Nic Richmond, Ph.D., has an extensive research and data analysis background that includes analyzing data on lunar and Mars crustal magnetic fields and applying quantum mechanics and solid-state physics to deep Earth studies. Since 2008, she has worked full time in higher education research, leading the Strategy, Analytics and Research team at Pima Community College, where she serves as Chief Strategy Officer.