Institutions hoping to improve graduation success for students could look to the British government for inspiration.
In 2010, the British government, looking for innovative solutions to some of its most challenging social problems, formed the Behavioural Insights Team, better known as the Nudge Unit.
The Unit identified places where a combination of data-driven insights and simple interventions could have a positive social impact. For example, to help curb unnecessary antibiotic usage (which saves money and may help fight antibiotic resistance), the Unit sent letters from the country’s chief medical officer to prescription-happy doctors, notifying them that 80 percent of their peers prescribed fewer antibiotics than they did.
As a result, the doctors decreased their antibiotic prescriptions by 3.3 percent in a 2016 pilot program – a notable outcome, given England’s goal of decreasing antibiotic prescriptions by 5 percent over five years.
The UK did not originate this approach, nor is it the only nation to pilot it. Cass Sunstein, co-author of the 2008 book “Nudge,” which brought the approach to wide attention, worked with President Barack Obama on a similar initiative in the US.
Graduation success nudges for higher ed
Today, colleges and universities are beginning to adopt similar approaches to improve student graduation rates, though their “nudges” are based more on strong signals from the institution’s data and predictive insights.
Del Mar College is a community college in Corpus Christi, Texas, designated by the U.S. Department of Education as a Hispanic-Serving Institution (HSI). We partnered with Civitas Learning to better identify the strongest signals of student risk and take precise action in support of those students. Through our current Title V HSI grant program, we created a campaign focused on improving graduation outcomes among our Hispanic students.
We started by identifying students who were at least 75 percent of the way toward earning an associate’s degree, but at risk of not graduating due to academic or life challenges. Once identified, we quickly reached out to these students and provided the appropriate support.
In some cases, the nudge was as simple as reminding students who had enough credits to fill out their application for graduation. Other students, despite their previous academic progress, were shown to be at risk of dropping out before they walked across the stage. For these students, it was critical to intervene with the right resources, and with the right message.
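The two-step triage described above can be sketched in a few lines of code. This is a minimal illustration only: the field names, the 60-credit degree requirement, and the risk scores are all assumptions made for the example, not Del Mar College’s or Civitas Learning’s actual data model.

```python
# Illustrative sketch: split near-completion students into those who just
# need a graduation-application reminder and those flagged for fuller
# intervention. All names, thresholds, and scores are hypothetical.

DEGREE_CREDITS = 60  # typical associate's degree requirement (assumption)

def triage(students, credit_threshold=0.75, risk_threshold=0.5):
    """Bucket students at least 75% of the way to a degree by a
    hypothetical risk score."""
    remind, intervene = [], []
    for s in students:
        if s["credits_earned"] / DEGREE_CREDITS < credit_threshold:
            continue  # not yet close enough to completion for this campaign
        if s["risk_score"] >= risk_threshold:
            intervene.append(s["id"])   # at risk: needs targeted support
        else:
            remind.append(s["id"])      # simple nudge: apply to graduate
    return remind, intervene

students = [
    {"id": "A1", "credits_earned": 58, "risk_score": 0.15},
    {"id": "B2", "credits_earned": 51, "risk_score": 0.72},
    {"id": "C3", "credits_earned": 30, "risk_score": 0.80},
]
print(triage(students))  # → (['A1'], ['B2'])
```

The point of separating the two buckets is that, as described above, the message matters as much as the outreach itself: a student who only needs to file paperwork gets a different nudge than one at real risk of dropping out.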
As a result of the outreach campaign, we saw a 31 percent overall increase in the number of students graduating in spring 2017 compared to spring 2016. For Hispanic students, the increase reached 32 percent.
In higher education, it’s tempting to over-complicate student success programs. We forget the simplicity of a nudge and how effective that kind of outreach can be, especially when it’s timely and personalized. Instead of automatically sending the same message to all students, we can dramatically move the needle on student outcomes when our actions are informed by a stronger signal on what’s impacting our students.
This isn’t to say that all nudges are a quick fix or silver bullet, however. The same nudges don’t work for everyone or in every instance. It’s critical to have a platform to take action and measure impact.
As The Economist notes, the type of social comparison that led to lower antibiotic prescriptions in the UK and more energy conservation in the US is seen as unlikely to work in France. There, people, as one adviser to the French government explained, “have a tendency not to comply as easily with perceived social norms.”
Similarly, each higher education institution must look at its own population, and its own data, to understand where nudges can have the greatest impact for its students, and where students may need more than a nudge to stay on the path to graduation. However well-intended and directionally correct a practice may be, we cannot assume it works the same for all students.
To change behavior and outcomes, we must learn to look at our data and design meaningful, timely and personalized outreach.