IR's Role in Advocacy

Ask eAIR invites questions from AIR members about the work of institutional research, careers in the field, and other broad topics that resonate with a large cross-section of readers. If you are interested in writing an eAIR article, or have an interesting topic, please contact eAIR@airweb.org.  

This month’s question is answered by Laura Fingerson, Academic Director of Institutional Effectiveness, Capella University.

The ideas, opinions, and perspectives expressed are those of the author, and not necessarily those of AIR. Subscribers are invited to join the discussion by commenting at the end of the article.

Dear Laura: What do you feel IR’s role is in advocating for certain groups of students? Do you believe it’s okay to highlight specific populations that may be experiencing problems at your institution, or do you feel compelled to “just provide the numbers?”

The short answer is: absolutely. As IR/IE professionals, I believe we should advocate for what we see in the data, whether a win or a problem, because our role affords us a unique position in our institutions. If not us, then who?

The long answer is in three parts:

First, as we increasingly democratize our institutions’ data so that stakeholders across the institution (students, faculty, and staff, as outlined in AIR’s Statement of Aspirational Practice for IR) can be data-informed decision makers, our data are examined at smaller and smaller levels of aggregation. Leaders of our schools and programs are interested in, and held accountable for, the metrics for the groups they are responsible for. Thus, leaders are not generally asking about groups that do not fall directly within their functional areas.

For example, at my institution, we do not have a single leader responsible for all undergraduate programs or all master’s programs, so it is up to IR/IE to show and interpret the data by degree level. For one of our Key Performance Indicator (KPI) metrics, we show the data in line charts by our six schools, for our six deans. Those trend lines can be all over the place and make no sense. When we show the same data in trend lines by our three degree levels (bachelor’s, master’s, and doctoral), we see a clear, logical, and actionable pattern.

Second, as IR/IE professionals, we are the ones who look at the data every day across the institution. Our lens crosses all of our institutions’ programs, schools, student groups, demographics, specialized accreditations, and more. We are uniquely situated to understand any one group’s data in the context of all the other data we see every day. Because of this, we can often spot issues before the campus leaders do. Conversely, we are also able to assuage leaders’ fears when the data may look scary, but are within the boundaries of what we see in other similar programs or across the campus.

One example that cuts across institutions is graduation rates, which are difficult to interpret for a single program in isolation because they need the context of the rest of the degree level, the institution, student risk factors, and the external higher education landscape.

Third, given our professional training and our access to data visualization and analytic tools, we have an obligation to use those tools to understand the data and make data-informed recommendations that serve our institutions and every stakeholder in them.

At my institution, there was a time when analysts were encouraged to simply present the data and let stakeholders do their own interpretation and make their own data-informed decisions. This is similar to an institution’s older-style Fact Sheet that might contain only tables of numbers. This approach did not last long for two primary reasons:

  • First, our decision-makers did not always have the analytic skills to interpret the data in order to make their data-informed decisions. Not all of our campus leaders have a background or competencies in data, research methods, or statistics.

  • Second, we have highly skilled analysts in IR/IE. To not use their expertise and insight to interpret the data and make recommendations is a missed opportunity.

Finally, embedded in many of our institutions is a mission of equity, where we commit to serve all students, regardless of their positions at our institutions or their backgrounds. When we find something in the data, whether it shows a problem or a success, we are obligated to bring it to light. Equity in higher education is a passion of mine, and my role in institutional research and effectiveness allows me to provide a vital data-informed perspective to equity work internal to my institution, and externally in higher education.
