If you’ve attended an edtech conference at any point since 2012, then you’ve likely seen some variant of Gartner’s analytics maturity model. While some fair criticism of the model exists, there are good reasons for its ubiquity. It successfully maps out the growth trajectory institutions face in their efforts to move successively through four types of analytics (Descriptive, Diagnostic, Predictive, and Prescriptive) in a way that is intellectually intuitive. The model regularly makes the rounds on Twitter and LinkedIn feeds because, like a good TED Talk, it takes the abstract concepts we interact with daily and wraps them in a tidy, understandable package.
Here’s the logic of the model: As an institution strives for increasingly sophisticated levels of analytics, the value of the information produced similarly increases. If executed well, an institution steadily ascends from a place of hindsight, through insight, and into foresight: knowing what happened in the past and why it happened, what will likely happen in the future, and how to influence that likelihood for the better.
As both a higher ed and edtech professional, I’ve worked with dozens of college and university leaders who cite this analytics maturity process as an aspirational goal, especially in matters of enrollment, retention, and persistence. They dream of leveraging their data to know which students they failed to retain, what factors and experiences those students had in common, and how they can predict and prescribe policies and practices that minimize risk for future students. In an era of declining enrollments across all of higher ed, predictive and prescriptive actions are a pragmatic necessity for institutions that are increasingly reliant on tuition dollars to simply stay open.
All the pieces would seem to be in place for widespread analytic success, right? A well-known model exists and can be readily applied to real-world problems facing higher ed. Why is it, then, that so many institutions struggle to achieve even the first level of Gartner’s model (Descriptive Analytics), let alone the higher, more “valuable” levels?
One reason, I suspect, is that Gartner’s model (like any model of analytics maturity) presumes that relevant, meaningful data is abundantly available to build upon. Institutions with actionable analytics aspirations must ask themselves three critical questions to test this presumption and determine whether their analytic foundations are aligned with their desired outcomes.
What data do you value?
In my time as a faculty member, one of my favorite activities with first-year students was an exercise that challenged them to consider their values. First, students were provided a list of 50 values and their definitions. These values were all framed in a positive light and included qualities such as status, wealth, adventure, independence, helping others, philanthropy, autonomy, community, and so on.
After reviewing the list, students were tasked with crossing off any values they did not immediately identify with. Inevitably, students would cross off just a few, leaving them with 40 or more values remaining. Through a series of guided discussion prompts, students were asked to iteratively reflect upon and simplify their lists, ultimately arriving at their final five values. The process of finalizing these five values was a healthy challenge. By reducing their lists, students were forced to prioritize; for example, they may have felt conflicted in admitting they valued security over adventure (or vice versa), but in doing so they were being honest with themselves.
For first-year students, the act of identifying these values was an early step in aligning their true values to their new higher ed environment, and they could then meaningfully invest their time and learning efforts (both in and out of the classroom) accordingly.
The same logic applies when asking what data your institution values. The temptation will be to respond as my students initially did, casting as wide a net as possible. However, knowing what specific data is valued is fundamental to effectively scaling up an analytics model, as the goal is to produce more of what you value. Let’s consider class attendance as a straightforward example: Common sense would dictate that class attendance data is a valuable starting point for an analytics model concerned with enrollment, retention, and persistence patterns.
What data do you collect?
As evidenced by my prior students, it is very easy to say one values something; it is more challenging to demonstrate that value through action. Investing in data collection is where the pivot from talk to action begins. Pausing to consider the alignment (or lack thereof) between the data you say you value and the data you collect provides an opportunity for institutional self-reflection and honesty.
For example, your institution may signal that it values class attendance through an official policy stating that attendance is mandatory (barring exceptional medical or legal circumstances). The question then becomes: Is class attendance collected in a systematic and scalable way that produces analyzable data?
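To make the distinction concrete, here is a minimal sketch of what “analyzable” attendance data might look like. The field names (student_id, course_id, session_date, present) are hypothetical, not drawn from any particular student information system; the point is simply one structured row per student per class session rather than free-form notes or paper rosters.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class AttendanceRecord:
    """One row per student per class session -- structured and queryable."""
    student_id: str
    course_id: str
    session_date: date
    present: bool

# A toy sample of systematically collected records
records = [
    AttendanceRecord("S001", "BIO101", date(2023, 9, 5), True),
    AttendanceRecord("S001", "BIO101", date(2023, 9, 7), False),
    AttendanceRecord("S002", "BIO101", date(2023, 9, 5), True),
]

# Because every record shares one schema, basic questions become
# one-line queries rather than manual audits:
absences = [r for r in records if not r.present]
print(len(absences))  # 1
```

Collected this way, the same records can feed dashboards, early-alert rules, or later predictive work without any re-keying.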
Here we arrive at a “gut check” moment: If your institution states that it values attendance data but doesn’t have a scaled, systematic way to collect it, then there is a misalignment that calls the true importance of this value into question. Perhaps your institution is actually investing those data collection efforts and resources elsewhere – if so, then that collection effort points to what is truly valued. In such a circumstance, you will need to go back and re-answer the previous question with your newly informed perspective.
This may be obvious, but it must be stated: Valuing data points alone is not enough to make an analytics model function. Both basic and sophisticated tiers of analytics require that valued data points be collected in a consistent, systematic manner.
What data do you presently act on?
Circling back to my work with first-year students, the follow-up to the values exercise was a class session on the nature of time and goals. In it, we discussed the tendency of our present-selves to outsource responsibility to our future-selves. Students could easily relate on a micro-level, thinking of their Friday afternoon tendency to put off an assignment for their future-Sunday-night-self to complete. Our actions in the present either move us toward, or divert us from, a life that is aligned to our values.
Our institutions may fall into similar patterns of thinking on a macro-level. We need to pause and consider what data we act upon in the present, regardless of what level of analytic sophistication we aspire to in the future. Institutions should think about this at both the aggregate and individual level. Continuing our attendance data example, if your institution values class attendance data and collects it in a scaled and systematic way, are the aggregate results of this data readily available for faculty and staff to consume? Are data-informed targeted interventions in place at the individual level, where appropriate?
If the answer to either of these considerations is “no,” then you will need to modify your expectations about when an analytics maturity model will prove its impact. We must address inertia today if we expect accelerated action tomorrow.
Alignment and analytics
The popularity of the Gartner model persists because it succinctly states many institutions’ aspirations. Predictive and prescriptive analytics are increasingly important for the welfare of any college or university, and embracing this reality will only be to an institution’s benefit. Pausing to consider the data your institution values, the data it collects, and the data it acts upon will provide real opportunity to correct any areas of misalignment, and in turn create a strong foundation upon which to build a successful analytics model.