06.16.20

LMS Data and the Relationship Between Student Engagement and Student Success Outcomes

  • by Alexander Wagner, Director of Institutional Research, Lesley University

The urgency of using analytics within higher education to make strategic decisions is greater now than ever given the COVID-19 pandemic. This fall, many institutions will continue to provide remote teaching to at least a sizable portion of their student body. The forced shift to remote teaching and learning drives increased usage of, and reliance on, Learning Management Systems (LMS), as highlighted at a virtual May meeting of the California Association for Institutional Research (CAIR). Because of this increase, institutions need more robust data and learning analytics (the measurement, collection, analysis, and reporting of data about learners and their contexts) to better understand the relationship between student engagement in online courses and student success outcomes. Surveys of students, staff, and faculty alone do not provide the data needed to answer critical questions concerning online engagement and student outcomes.

We have been working for about two years on using LMS data for learning analytics. We started with a small segment of our online student population because we did not have the resources to purchase these capabilities from our LMS vendor or from consulting companies. Instead, we developed a cross-functional approach involving colleagues from units such as academic success, eLearning and instructional support, and IR. The result is our own approach to learning analytics, one that enables us to provide insights to many additional stakeholders.

Implementing this approach was not without challenges. We used two different SQL-based data extraction platforms provided by our LMS vendor. The first platform had a complex database schema and insufficient documentation, and this complexity prevented us from extending learning analytics to all of our online students. Then, in January 2020, we gained access to a second platform that provides improved access to LMS data. This second platform, free to us, uses a canonical data model with a much simpler schema, and our LMS vendor also provided free documentation and training for it. This allowed us to extract and process data on the use of our LMS more efficiently and effectively. That was, of course, very fortunate timing given the forced shift to remote learning.
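
To give a concrete, if simplified, sense of what such an extraction can look like, here is a minimal Python sketch that pulls event-level data from a canonical-model table with pandas. The connection string and the lms_events table and column names are hypothetical placeholders, not the vendor's actual schema.

```python
# A minimal extraction sketch. The connection string and the
# lms_events table/columns are hypothetical placeholders, not the
# vendor's actual canonical-model schema.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:password@lms-data-host/canonical")

query = """
    SELECT actor_id, course_id, event_time, event_type
    FROM lms_events
    WHERE event_time >= '2020-03-01'  -- spring remote-teaching window
"""

events = pd.read_sql(query, engine, parse_dates=["event_time"])
print(events.head())
```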

Since March, we have developed a mutual understanding of the data and its contexts, including how the data can best serve the varying needs of our cross-functional team. This required us to link our LMS data with our student information system (SIS) data, a linkage that had not yet been implemented in the canonical data model. While technical skills are necessary, they are not sufficient to develop and implement successful and sustainable learning analytics, or any analytics for that matter. This has been shown by Mitchell Colver at the Center for Student Analytics at Utah State University.
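
As a rough sketch of that linkage step, assume the LMS keys its records by an internal user ID and a crosswalk table maps it to the SIS student ID; the data frames and column names below are invented for illustration, not our actual schema.

```python
import pandas as pd

# Hypothetical frames: in practice these would come from the LMS
# canonical model and from the SIS, respectively.
lms_activity = pd.DataFrame({
    "lms_user_id": [101, 102, 103],
    "minutes_online": [420, 35, 180],
})
crosswalk = pd.DataFrame({  # maps internal LMS IDs to SIS student IDs
    "lms_user_id": [101, 102, 103],
    "student_id": ["S001", "S002", "S003"],
})
sis = pd.DataFrame({
    "student_id": ["S001", "S002", "S003"],
    "program": ["UG", "UG", "GR"],
})

# Link LMS activity to SIS records via the crosswalk.
linked = (lms_activity
          .merge(crosswalk, on="lms_user_id")
          .merge(sis, on="student_id"))
print(linked)
```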

Colver and his team’s Lifecycle of Sustainable Analytics1 inspired our approach to learning analytics; it argues that analytics work best when analytical tools are socialized through collaboration with end users. Implementing this in our case included developing an understanding of the context of LMS data and the needs of all of our involved stakeholders. In our experience, the fundamental operational needs of stakeholders were our most important initial issues to resolve in implementing learning analytics for online students: advising (e.g., do online students participate in their courses?), financial aid (e.g., which online students are “here,” and is it warranted to disburse Title IV aid?), and eLearning (e.g., are instructors meeting online course design recommendations?).
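
The advising and financial aid questions above, for example, reduce to a simple participation report. The sketch below flags rostered students with no observed LMS activity in a course; the frames are invented, and the flagging rule is purely illustrative, not a production Title IV determination.

```python
import pandas as pd

# Hypothetical event log: one row per observed LMS session event.
events = pd.DataFrame({
    "student_id": ["S001", "S001", "S002"],
    "course_id": ["ENG101", "ENG101", "ENG101"],
    "event_time": pd.to_datetime(["2020-04-01", "2020-04-20", "2020-03-05"]),
})
roster = pd.DataFrame({  # course enrollments from the SIS
    "student_id": ["S001", "S002", "S003"],
    "course_id": ["ENG101", "ENG101", "ENG101"],
})

# Last observed activity per student per course; rostered students
# with no events at all come through with a missing date.
last_seen = (events.groupby(["student_id", "course_id"], as_index=False)
                   ["event_time"].max()
                   .rename(columns={"event_time": "last_activity"}))
report = roster.merge(last_seen, on=["student_id", "course_id"], how="left")
report["no_participation"] = report["last_activity"].isna()
print(report)
```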

We have also been able to analyze LMS activity data, including extensive data on online sessions and course submissions, since we started to teach fully online this past spring. This allowed us to plan and implement interventions for students who needed assistance. Following the aforementioned lifecycle of sustainable analytics, we intend to provide our student-facing professional staff with the tools they need to intervene successfully in support of student success outcomes, promoting institutional change. This is a key aspect of the successful adoption of innovation within higher education institutions and, to a large extent, is no different from what research has advocated for decades about actively managing change (e.g., Armenakis2 and others).
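
A hedged sketch of what aggregating that activity into per-student engagement measures might look like: the event log and the flagging thresholds below are invented for illustration, not our actual measures or intervention criteria.

```python
import pandas as pd

# Hypothetical event log; "kind" distinguishes online sessions
# from assignment submissions.
events = pd.DataFrame({
    "student_id": ["S001", "S001", "S001", "S001", "S001",
                   "S002", "S003", "S003", "S003"],
    "kind": ["session", "session", "submission", "session", "submission",
             "session", "session", "submission", "session"],
})

# Per-student engagement measures: counts of each event kind.
measures = pd.crosstab(events["student_id"], events["kind"])

# Illustrative intervention rule (thresholds are arbitrary): flag
# students with at most one session and no submissions.
flag = (measures["session"] <= 1) & (measures["submission"] == 0)
print(measures.assign(needs_outreach=flag))
```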

A cumulative analysis of online student engagement following our spring break showed strong relationships between multiple measures of online engagement and grades for the past spring term. Not surprisingly, students with the lowest online engagement levels had the highest share of “DFW” grades. Likewise, the more engaged students were, the higher their share of “A” or “B” grades and the lower the share of students requesting passing grades. These relationships are so robust that we intend to use online student engagement measures as early predictors of student success in upcoming terms and to provide that analysis to our professional advisors for early interventions. Furthermore, we found strong predictive relationships between multiple online student engagement measures and our fall registration rates for non-graduating undergraduate and graduate students.
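
One simple form such an analysis can take is binning an engagement measure and comparing DFW shares across bins. The sketch below does this on synthetic data; every number here is made up for illustration and does not reflect our actual results.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Synthetic illustration only: minutes of online activity per student,
# and whether the term grade was a D, F, or W.
n = 400
minutes = rng.gamma(shape=2.0, scale=120.0, size=n)
p_dfw = 1 / (1 + np.exp((minutes - 150) / 60))  # lower engagement -> higher DFW odds
dfw = rng.random(n) < p_dfw

df = pd.DataFrame({"minutes": minutes, "dfw": dfw})
df["engagement_quartile"] = pd.qcut(df["minutes"], 4,
                                    labels=["Q1 (lowest)", "Q2", "Q3", "Q4"])

# Share of DFW grades by engagement quartile.
print(df.groupby("engagement_quartile", observed=True)["dfw"].mean().round(2))
```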

Registration rates are, of course, not student outcomes, and the spring-to-fall time frame is not part of the typical approach to retention and persistence analytics. We fully expect, however, that these data will give us strong insights into retention and persistence modeling once we have our official fall numbers. We also expect to gain insights that will allow us to include online student engagement measures in our traditional retention and persistence prediction models for incoming cohorts. This will be especially helpful if we find we need to continue to rely on remote teaching and learning.
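
Folding engagement measures into a traditional prediction model might look like the scikit-learn sketch below: a logistic regression on synthetic data, with an LMS engagement measure alongside a conventional predictor such as term GPA. The features, coefficients, and data are invented for illustration, not our actual model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Synthetic cohort: term GPA (a traditional predictor) plus an LMS
# engagement measure; 'registered' marks fall re-registration.
n = 500
gpa = np.clip(rng.normal(3.0, 0.6, n), 0, 4)
minutes = rng.gamma(2.0, 120.0, n)
logit = -4.0 + 1.0 * gpa + 0.005 * minutes
registered = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([gpa, minutes])
model = make_pipeline(StandardScaler(),
                      LogisticRegression()).fit(X, registered)

# Predicted re-registration probability for one hypothetical student
# with a 2.5 GPA and 90 minutes of online activity.
print(model.predict_proba([[2.5, 90.0]])[:, 1])
```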

References

1. Colver, M. (2018). The lifecycle of sustainable analytics: From data collection to change management. Utah State University, Office of Student Analytics.

2. Armenakis, A. A., & Harris, S. G. (2009). Reflections: Our journey in organizational change research and practice. Journal of Change Management, 9(2), 127-142.