NSSE: Revisions and Opportunities

The updated National Survey of Student Engagement (NSSE) launched in 2013. eAIR spoke with NSSE about the changes and resulting opportunities for colleges and universities.

Alexander McCormick is Senior Associate Director of the Indiana University Center for Postsecondary Research (CPR) and Director of NSSE. He is also an associate professor of educational leadership and policy studies at IU. Robert Gonyea is CPR’s Associate Director for Research and Data Analysis.


Interview by Leah Ewing Ross 

eAIR: How did the revisions to NSSE evolve?  

The revisions to NSSE reflect maturity in the conversation around student engagement. The transition from the five fairly broad Benchmarks of Effective Educational Practice to 10 new Engagement Indicators allows institutions to explore more specific summary measures. We elevated the ability to report on high-impact practices, and new topical modules provide opportunities to examine subjects of special interest while maintaining the option for comparison analyses. 

To keep NSSE relevant and useful, the survey needs to be refreshed from time to time. Our goal is what evolutionary biologists call punctuated equilibrium: we keep the survey stable for long periods to facilitate trend analyses, but we can't freeze it in its original early-2000s design. We understand that change disrupts trend lines and introduces complications, so we identify areas for improvement over time and implement changes only when they are well worth the disruption.

The revisions were informed by feedback from participating institutions. Even though the summary measures changed, relevant comparisons to peer groups are still meaningful. A document on our website cross-walks the changes so institutions can easily identify new items as well as those that were not changed or were slightly modified.  

eAIR: If an institution has never participated in NSSE, or hasn’t done so in a long time, what is particularly compelling about the revised version that might attract attention?  

We have ratcheted up the usability of NSSE data. NSSE is now web-based only, which allows institutions to invite all first-year students and seniors to participate rather than just samples of students. We have also added the ability to drill down to fields of study or, for larger institutions, to the school or college level.

Since 2010, we’ve produced academic major field reports at no additional cost. Also, the new institution version of our web-based interactive Report Builder allows users to query data by student and institution characteristics, conduct trend analyses (currently limited to the previous version of the survey for obvious reasons), and produce reports. This tool is particularly useful for institutional research offices that do not have the resources to purchase additional custom analyses from NSSE or to carry out sophisticated analyses. 

eAIR: What was the most fun part of revising NSSE?  

The fun part was redesigning our reports, which allowed the NSSE staff to be creative and responsive to user needs. The new layouts presented a natural opportunity to rethink the purpose, organization, and usability of NSSE reports. Previously, we provided institutions with lots of tables that may have required significant reformatting and interpretation in order to effectively share information with colleagues outside of IR. Our new reports provide colorful graphics and quick overviews in formats that can be sent directly to a president’s cabinet and other key stakeholders.  

Also, we’re extraordinarily proud of our NSSE colleagues, all of whom went the extra mile to make this happen. We spent two years piloting the changes while also running the regular survey, all without additional staff members. We are also indebted to some 70 institutions that piloted the revised survey, some of which engaged in side-by-side testing of the old and new surveys. Other institutions allowed us to conduct student focus groups and cognitive interviews as part of this process. The continued collaboration of a large number of individuals and institutions is vital to NSSE’s success, and the survey development process illustrates this perfectly. 

NSSE staff members welcome questions about the updated survey and would like to hear how they can help institutions move through the changes. For more information about NSSE, including the 2013 annual report, A Fresh Look at Student Engagement, visit the NSSE website.


Eric posted on 1/16/2014 2:12 PM
I would be interested in seeing how the changes to survey questions affect the psychometric properties of the NSSE regarding reliability and validity. Are there plans to present these properties at a relevant conference such as the AIR Forum like Chen did in 2008?
Bob posted on 2/20/2014 12:11 PM
Thanks for this question. NSSE analysts are currently working on fully updating the conceptual framework and psychometric properties of the survey. We should have these posted on our Web site within a few months and will certainly have this information available at the AIR Forum.

In the meantime, we have internal consistency and scale reliability stats for the Engagement Indicators on our Web site here: http://nsse.iub.edu/html/reliability.cfm. The Cronbach's alphas for the ten EIs range from .77 to .89 for FY students, and from .78 to .90 for seniors.
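[Editor's note: for readers unfamiliar with the statistic cited above, Cronbach's alpha measures how consistently a set of survey items hang together as one scale. A minimal sketch in Python, using made-up responses for a hypothetical four-item engagement scale (not actual NSSE items or data):]

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a multi-item scale.

    items: list of k lists, each holding one item's scores
           across the same n respondents.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)
    """
    k = len(items)
    sum_item_vars = sum(pvariance(col) for col in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent scale total
    return (k / (k - 1)) * (1 - sum_item_vars / pvariance(totals))

# Hypothetical data: 4 items, 6 respondents, response options coded 1-4
items = [
    [3, 4, 2, 4, 3, 1],
    [3, 3, 2, 4, 4, 1],
    [2, 4, 1, 4, 3, 2],
    [3, 4, 2, 3, 4, 1],
]
print(round(cronbach_alpha(items), 2))
```

Values above roughly .70 are conventionally read as acceptable internal consistency, which is why the .77-.90 range reported for the Engagement Indicators is reassuring.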

I hope this helps. Looking forward to Orlando!
Bob Gonyea