Interview
07.30.21

Higher Ed Dashboards: An Interview with ASR Analytics

  • by Nic Richmond, Chief Strategy Officer, Pima Community College

eAIR talked to John Van Weeren and Heather Hutchings with ASR Analytics to hear their perspectives on dashboards in higher education. ASR provides analytic consulting services, with expertise in the fields of business intelligence, predictive modeling, and data mining—and in the design and development of customized technology tools that support forecasting, simulation analysis, optimization modeling, and data visualization.

What place do you think dashboards have in institutional research? For example, are there particular metrics you think we should all look at in this way?

For institutional customers, dashboards are a useful way to communicate about strategic initiatives. They enable institutions to bring a data perspective to strategic plans, which tend to be vague and "unmeasurable" unless someone from IR gets involved. Dashboards help provide the data, together with an explanation, story, and context.

The static fact book is also a candidate for conversion to dashboards that present the term-by-term or period-by-period status of standard operational and quantitative measures. For example, common fact book content that lends itself to a dashboard includes year-over-year total enrollment, enrollment breakdown by IPEDS race/ethnicity, fall-to-fall retention rate, percent of students receiving financial aid, total financial aid awarded, and so on.
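As an illustration, a metric like fall-to-fall retention is straightforward to compute once enrollment data is organized. The sketch below is hypothetical (the function and field names are illustrative, not from any particular student information system), assuming retention is defined as the share of a fall cohort enrolled again the following fall:

```python
# Hypothetical sketch: fall-to-fall retention from two lists of student IDs.
# "Retention" here means: share of the fall cohort enrolled the next fall.

def fall_to_fall_retention(fall_cohort_ids, next_fall_ids):
    """Return the fraction of the fall cohort present in next fall's enrollment."""
    cohort = set(fall_cohort_ids)
    if not cohort:
        return 0.0  # avoid division by zero for an empty cohort
    retained = cohort & set(next_fall_ids)
    return len(retained) / len(cohort)

# Example: 3 of 4 fall 2020 students re-enroll in fall 2021.
rate = fall_to_fall_retention(["a", "b", "c", "d"], ["a", "b", "d", "e"])
print(rate)  # 0.75
```

A real implementation would pull cohort and enrollment records from the institutional data store, but the definition itself stays this simple once the underlying data is integrated.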

Are dashboards as useful as some people would argue or are they somewhat overhyped?

Are dashboards overhyped? Yes, to the extent that nobody wants to talk about what is actually useful or how dashboards support strategic initiatives. The focus needs to be on the action or change people expect the data to drive, as opposed to "oh, that's a number. So what?" What you are supposed to do with the information is the key issue people overlook.

What are the top three things you think everyone should consider when designing a dashboard?

  • What is the purpose? How will the dashboard lead to or support an action? What are you going to do with this information?
  • Keep it simple, visual, clear, and timely.
  • Stay focused on specific audiences, with context and drill-down detail as separate pages/workflows. Avoid clutter and ensure the information fits on one page with no scrolling while remaining readable.

What are the top three things you think should be avoided?

Well, essentially the converse. Don't pick an area for analysis and throw every related metric up there. Take enrollment, for example. There are far too many perspectives with different goals and definitions to make sense as a single dashboard.

Avoid useless color, lines, and pictures, as well as multiple fonts, flashing, animations, 3D anything, and other distractions that do not directly contribute to the story or data understanding. This is the biggest issue with most dashboards.

Do not try to provide every possible detailed list report anyone would want within a single dashboard. Dashboards are not going to answer every question. Trying to do so creates complexity that makes dashboards less usable.

For an institution new to dashboards, what considerations are most important when picking software and determining where to start?

Every institution has data all over the place, in various systems, and integration is always a challenge regardless of the tool you choose. You want to find (internally, if already available) or pick a tool that lets you leverage all those data sources consistently without recreating the queries every time. You especially want to avoid duplicating derived data definitions, which leads to data inconsistencies and maintenance headaches, so a tool with a shared semantic layer is very beneficial. Recognize that the institutional data model is far more important than the dashboard tool itself: the work is always in organizing and integrating data for automation and consistency. People tend to focus on the tool and what is pretty, but most tools can make a pretty presentation. The hard part is always the data structure.
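The point about a shared semantic layer can be sketched abstractly: define each derived measure once, in one place, and have every dashboard resolve it by name rather than re-implementing the logic. The registry pattern and metric below are hypothetical illustrations, not the API of any particular BI tool:

```python
# Hypothetical sketch of a shared semantic layer: each derived metric is
# registered once, so every dashboard that asks for it gets identical logic.

METRICS = {}

def metric(name):
    """Decorator that registers a derived-metric function under a shared name."""
    def register(fn):
        METRICS[name] = fn
        return fn
    return register

@metric("pct_receiving_aid")
def pct_receiving_aid(students):
    """Percent of students with any financial aid award (illustrative definition)."""
    if not students:
        return 0.0
    awarded = sum(1 for s in students if s.get("aid_total", 0) > 0)
    return 100.0 * awarded / len(students)

# Any dashboard resolves the metric by name: one definition, no drift.
students = [{"aid_total": 1200}, {"aid_total": 0}, {"aid_total": 300}]
print(METRICS["pct_receiving_aid"](students))  # ≈ 66.67
```

The design point is that a change to the definition (say, excluding loans from "aid") happens once in the registry and propagates to every consumer, which is exactly the inconsistency problem a semantic layer exists to prevent.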

A second very important consideration is how you want to deploy dashboards. Will they be on a public site for large consumption by unknown/non-authenticated users, or will they be for internal users and employees only? Some tools have better licensing models for internal authenticated users versus anonymous public end users on an open website. Look for tools that don't penalize broad consumption with license costs if you plan to do a lot of public sharing.

How do you anticipate dashboards—and reporting in general—evolving in the next 5–10 years? What do we need to be ready for?

We are loath to prognosticate! But we are always skeptical of wild, too-good-to-be-true claims like "recommendation engines that tell you what to do" or "integrated AI automation," which take control away from the people making decisions. There are far too many pitfalls there, especially with "black box" analytics, where there is no visibility into the models, methodologies, or accuracy. That said, higher education is 5–10 years behind most industries, so look at where other industries are now and that's where you'll be. To us that means information and metrics presented as simple views integrated into the transaction systems and business processes. Data and analysis will be integrated into the data-processing workflow and visible in one place: the place where we take actions and need to make decisions in real time. Consider an admissions counselor updating an application's status to ready-for-acceptance and seeing, at the top of the page, the impact of that change on the makeup of the incoming class and how it influences incoming class quality and diversity goals. Or imagine a departmental manager approving a purchase requisition and seeing a graphical representation of that decision's impact on the remaining budget for the fiscal year. Previously, people wouldn't see these results unless they ran a separate report from a separate system.

 


Nic Richmond, Ph.D., has an extensive research and data analysis background that includes analyzing data on lunar and Mars crustal magnetic fields and applying quantum mechanics and solid-state physics to deep Earth studies. Since 2008, she has worked full time in higher education research, leading the Strategy, Analytics and Research team at Pima Community College, where she serves as Chief Strategy Officer.