If you’ve attended an edtech conference at any point since 2012, you’ve likely seen some variant of Gartner’s analytics maturity model. While fair criticism of the model exists, there are good reasons for its ubiquity. It maps out, in an intellectually intuitive way, the growth trajectory institutions face as they move successively through four types of analytics: Descriptive, Diagnostic, Predictive, and Prescriptive. The model regularly makes the rounds on Twitter and LinkedIn because, like a good TED Talk, it takes abstract concepts we interact with daily and wraps them in a tidy, understandable package.
Here’s the logic of the model: as an institution strives for increasingly sophisticated levels of analytics, the value of the information produced increases in step. Executed well, the institution steadily ascends from hindsight, through insight, and into foresight: knowing what happened in the past and why, what will likely happen in the future, and how to influence that likelihood for the better.
As both a higher ed and edtech professional, I’ve worked with dozens of college and university leaders who cite this analytics maturity process as an aspirational goal, especially in matters of enrollment, retention, and persistence. They dream of leveraging their data to know which students they failed to retain, what factors and experiences those students had in common, and how to predict and prescribe policies and practices that minimize risk for future students. In an era of declining enrollments across higher ed, predictive and prescriptive action is a pragmatic necessity for institutions increasingly reliant on tuition dollars simply to stay open.
All the pieces would seem to be in place for widespread analytic success, right? A well-known model exists and can be readily applied to real-world problems facing higher ed. Why, then, do so many institutions struggle to achieve even the first level of Gartner’s model, Descriptive Analytics, let alone the higher, more “valuable” levels?
I suspect one reason is that inherent in Gartner’s model (or any model of analytics maturity) is the presumption that relevant, meaningful data is abundantly available to build upon. Institutions with actionable analytics aspirations must ask themselves three critical questions to test this presumption and determine whether their analytic foundations are aligned with their desired outcomes.
What data do you value?
In my time as a faculty member, one of my favorite activities with first-year students was an exercise that challenged them to consider their values. First, students were provided a list of 50 values and their definitions. These values were all framed in a positive light and included qualities such as status, wealth, adventure, independence, helping others, philanthropy, autonomy, community, and so on.
After reviewing the list, students were tasked with crossing off any values they did not immediately identify with. Inevitably, students would cross off just a few, leaving 40 or more values remaining. Through a series of guided discussion prompts, students were asked to iteratively reflect on and pare down their lists, ultimately arriving at a final five values. Finalizing those five was a healthy challenge. By reducing their lists, students were forced to prioritize; they may have felt conflicted admitting they valued security over adventure (or vice versa), but in doing so they were being honest with themselves.