Recently, the federal government asked institutions that received Higher Education Emergency Relief Fund (HEERF) funding to report data for year two of the reporting cycle. This year's request included multiple items that were not on the prior year's report. Aid offices that received this funding were required to report the data as instructed, outside of the quarterly reports. On the University of Alabama in Huntsville campus, we faced several challenges in producing the dataset within the timespan we were given. The lessons we learned, especially as a small office (one full-time programmer, one full-time assessment analyst, and one part-time analyst), are shared below as a reflection on how we as IR/IE professionals can continue to improve.
Lesson 1: Understanding Reporting Language
Many of our challenges with this specific project stemmed from confusion about the definitions in the report. For instance, the guidance in the instructions led many institutions to believe that IR/IE offices already collected these data, because the documentation for year two pointed to IPEDS as a data source, with the IPEDS keyholder as a point of contact. On further reading, however, the IPEDS guidance applied only to classifying students as degree- or non-degree-seeking, full-time or part-time, and by ethnicity, gender, and federal aid eligibility. Although the classifications were aligned with IPEDS, the timespan of the report was not; therefore, many institutional research offices faced dilemmas about how to pull and report these data. HEERF data used a calendar-year reporting timeline instead of an academic-year and/or fiscal-year timeline, which also caused difficulties for many institutions.
To break this down, our office met with our financial aid unit to determine the best way to classify the enrollment data, another office pulled HEERF funding since it processes student payments, and the financial aid office pulled eligibility and Pell data. After reporting the data, we realized that for year three, we needed a better method. Our campus went back to the drawing board and enlisted the assistance of another office outside of our institution to find the best way to pull all reporting factors together in one report.
Lesson 2: Data Should Come from a Single File Source
Another valuable lesson our team learned through this process is that data such as these should always come from one file source whenever possible. In this instance, multiple units worked on separate elements, and our only way to complete the task was to match multiple files. This can be problematic and can potentially cause duplication of records. IR offices do not necessarily deal with financial aid eligibility, and herein lies the question of who is responsible for such data on college campuses. A clear delineation of data governance would be opportune here, and such workarounds would be unnecessary if data custodians could assist programmers in retrieving and validating the data.
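The duplication risk described above can be sketched with a small, hypothetical example (the field names and records below are invented for illustration, not drawn from the actual HEERF files): when payment records and eligibility records are maintained by separate units and matched on a student ID, a student with multiple disbursements silently appears more than once in the merged output unless someone checks for it.

```python
from collections import Counter

# Hypothetical file from the payments office: one row per HEERF disbursement.
payments = [
    {"student_id": "A1", "heerf_amount": 500},
    {"student_id": "A1", "heerf_amount": 250},  # second disbursement, same student
    {"student_id": "B2", "heerf_amount": 400},
]

# Hypothetical file from financial aid: one row per student with eligibility flags.
eligibility = {"A1": {"pell": True}, "B2": {"pell": False}}

# Naive match: every payment row picks up its eligibility record, so
# student A1 now appears twice in the merged output.
merged = [{**p, **eligibility[p["student_id"]]} for p in payments]

# A simple duplicate check before reporting student headcounts.
counts = Counter(row["student_id"] for row in merged)
duplicates = [sid for sid, n in counts.items() if n > 1]
print(duplicates)  # A1 appears more than once
```

A single authoritative file, or a custodian-validated merge, removes the need for this kind of after-the-fact duplicate detection.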
Lesson 3: Data Custodians’ Role as Content Experts
IR offices that report to IPEDS focus their reporting efforts on students who have already received aid; therefore, students are already identified as eligible. The HEERF report’s definition of “eligibility,” however, included factors that many small IR offices might never analyze. For example, financial data are fluid. Many campuses treat financial data as transactional and outside the realm of IR reporting efforts. Balancing budgets and tracking fund types are not truly areas of expertise in the IR world, but the HEERF report required us to step outside our comfort zone of regular census reporting and become involved in quarterly and annual financial reports.
Although our programmers are highly skilled, it takes someone with an extensive array of knowledge to truly understand the tables on our financial side and to tell us what we need to pull and how those data should be interpreted. Since our office is small, we do not have the resources to truly understand every aspect of the entire campus and/or every table. Therefore, it is crucial for data custodians to be fully aware of the tables and data they own and maintain for projects such as these.
Lesson 4: Communication Is Crucial
Additionally, interpreting subsequent enrollment periods and having to default a student’s status one way or the other is not an easy task, especially when financial records are transactional. The tables seem straightforward for reporting, but on the coding side, we encountered many challenges outside the norm.
What is key for these types of requests is clear communication about ownership of, and responsibility for, the detailed reports that are then submitted to the federal government. These efforts begin with the federal government and work their way down to the stakeholders involved in reporting these types of data.
Lesson 5: Involvement from the Collective
In this case, the communication veered into asking ourselves who within the collective (the institution as a whole) should be involved, and at what point they should be included in the conversation. Although our office became involved in the reporting efforts during year two, we did not receive the request until about two weeks before it was due back to the federal government. Since our office was not involved with reporting during year one, we were also not privy to the data submitted during that cycle because the request did not involve census reporting on our end. During year two, it became apparent that the custodians of financial aid and finance needed our assistance. Also in year two, IPEDS keyholders received a message from the federal government indicating that the upcoming due date was about two weeks out. Yet there was little to no clarity about how keyholders should be involved.
With any type of grant or aid, we should automatically know that institutions will be responsible for reporting data back to the federal government. At the very least, keyholders should be aware when the timelines occur for these types of reports so that we are not blindsided by scenarios such as these and so that we have ample opportunity to develop and maintain the data needed for clean reporting efforts. As with any new reports, it takes time to build consistency. These reports are one-offs and temporary, but they also carry a heavy weight for an institution.
Lesson 6: Future Reporting Modifications
We can hope that lessons have been learned across the collective should we ever face situations like these again as a nation. If this reporting should have been added to our IPEDS timeline, then maybe in future reporting cycles we should seek modified temporary reporting calendars and also ensure our state keyholders review the data for accuracy. Opportunities to invoke change may be evident, and how we address future requirements should be discussed. If anything, we have all learned a great deal over the last two years, and we should make every effort to keep progressing forward.
Be aware, be prepared, and be ready to take on challenges such as these, however they may look within your organization. On the flip side of this, congratulations to those offices that assisted in these reporting efforts. Here’s to year three!
Melanie S. Simpson is Assistant Provost of Institutional Research, Effectiveness, & Assessment at the University of Alabama in Huntsville.