8 in 10 colleges will use AI in admissions by 2024


Efficiency is the top reason schools are using AI, and many also hope to reduce bias in admissions

Key points:

  • Many admissions departments already use AI, and more are following suit
  • Still, many admissions professionals are concerned about the ethics around using AI

Since the latest version of ChatGPT launched last fall, there has been a lot of discussion about students using AI tools and the potential effects this will have on education. For example, last spring an Intelligent.com survey found that students of all ages are replacing some of their tutoring sessions with ChatGPT.

But what about the other side of the equation? It’s common knowledge that companies are using AI tools to evaluate applicants’ resumes, so are educational institutions doing the same? In September, Intelligent.com surveyed 399 education professionals with in-depth knowledge of their schools’ admissions processes.

Key findings:

  • More than half of educational admissions departments currently use AI; 82 percent will by 2024
  • Majority of schools using AI will allow it to have the final say on applicants
  • Efficiency is the top reason why schools are using AI in college admissions
  • 2 in 3 admissions professionals are concerned about the ethics of AI

82 percent of educational institutions will use AI in college admissions by next year

Among the nearly 400 respondents surveyed, 21 percent work at a private high school while 79 percent work in higher education. A higher proportion of those who work at a private high school say their school currently uses AI in admissions (79 percent) vs. those who work in higher education (50 percent). However, a higher percentage of higher education institutions than private high schools plan to implement AI (37 percent vs. 9 percent).

Fifty-six percent of all respondents say their school currently uses AI in their admissions process. Seven percent say their school plans to implement AI in admissions by the end of this year, 19 percent by 2024, and 5 percent by 2025 or later. Thirteen percent say their school has no plans to implement AI in admissions.

“When we hear about the use of AI in college admissions, it can conjure up fears of a machine without emotional sensibilities using some mysterious rubric to sort applications into ‘accept’ and ‘reject’ piles,” says Diane Gayeski, professor and higher education advisor.

“Indeed, there are currently AI tools in use that do exactly this, and we need to understand the advantages and risks associated with them. However, like all matters associated with the rapidly emerging set of programs called ‘AI,’ their purposes and processes are much more varied.”

Among respondents who work in higher education, a larger percentage of those employed at public schools say their school currently uses AI in admissions (55 percent) or plans to (34 percent) vs. those who are employed at private schools (38 percent currently use AI and 43 percent plan to).

Additionally, higher education employees who work at large schools are more likely to say their school currently uses AI in admissions (59 percent) or plans to (34 percent) vs. those who work at medium-sized schools (37 percent currently use AI and 45 percent plan to) and those who work at small schools (28 percent currently use AI and 44 percent plan to).

AI most commonly used to review transcripts and letters of recommendation

When all respondents whose schools currently use AI in admissions were asked what it is used for, 73 percent say the AI reviews letters of recommendation, 71 percent say it reviews transcripts, and 61 percent say AI is used to communicate with applicants.

Among all respondents whose schools plan to incorporate AI in their college admissions, 63 percent say AI will be used to review transcripts, 55 percent say AI will review letters of recommendation, and 54 percent say it will be used to communicate with applicants.

“As this survey points out, most institutions are now using or intend to use AI to evaluate students’ applications including essays, transcripts, and recommendations,” continues Gayeski. “But if you think that before AI existed, seasoned admissions experts or professors carefully read and considered each application, you’re wrong.

“Most institutions get thousands of applications, and they are often quickly reviewed and scored by temporary workers who use scoring tools provided to them by the admission office. They have ways to calculate scores for their overall grades, factoring in the rigor of a student’s chosen courses and strength of their high school.

“Colleges expect that applicants get assistance with their essays, and mostly look at them to see if they can explain some irregularity in the pattern of their grades (such as if they experienced a health crisis in their sophomore year which accounts for an unusual number of absences and some poor grades) or whether a student can really articulate why they have chosen a specific major or institution.

“Because colleges need to pay attention to ‘yield’ (the percentage of students who were accepted and then actually choose to attend their institution), applicant management systems track the time an applicant spends on their website and if they actually visited campus.

“Many colleges subscribe to services that analyze past classes, and they use that data to score applicants on the likelihood that they will be successful and how much financial aid they’ll need. Bottom line: a lot of the admission process is rather ‘mechanical’ even without AI,” Gayeski explains.

Majority say AI will have the final say on whether an applicant is admitted or not

Among respondents whose schools currently use AI in college admissions, 87 percent say the AI ‘sometimes’ (43 percent) or ‘always’ (44 percent) makes final decisions about whether to admit applicants or not.

Similarly, among respondents whose schools plan to use AI in college admissions, 74 percent say the AI is ‘somewhat’ (45 percent) or ‘very likely’ (29 percent) to make the final decision about applicants.

“AI tools are being used in many parts of the admission process,” says Gayeski. “Most colleges are using ‘chatbots’ to answer common questions of applicants when they access the admissions website; this allows for 24/7 quick and consistent answers without tying up staff resources.

“Platforms like Element 451 allow colleges to customize communication with applicants, and enable them to quickly email or text them with updates on the process or unique messages such as invitations to come on a campus tour that keep them engaged.

“Other tools provide automated review of transcripts, a process that is usually very time-consuming when applicants are looking to transfer in AP courses from high school or credits from other colleges. For instance, SIA claims that it can scan hard copies of transcripts and extract exact courses, grades, and credits in seconds without error.

“Going beyond this, platforms like StudentSelect actually scan essays and notes from interviews ‘to provide dozens of personality insights, skills, and other non-cognitive traits that paint a more complete picture about each of your applicants’ according to their website,” Gayeski says.

Efficiency is the top reason schools are using AI; many also hope to reduce bias in admissions

Among respondents whose schools are planning to incorporate AI, 85 percent say they plan to do so in order to increase efficiency in admissions, 42 percent say it’s to make more informed decisions, and 38 percent say it’s to decrease bias. Ninety percent of respondents in this group believe AI is ‘somewhat’ (57 percent) or ‘very likely’ (33 percent) to help with reducing bias in the admissions process.

When respondents whose schools currently use AI in admissions were asked why they chose to do so, 85 percent say it was to increase efficiency, 70 percent say it’s to make more informed decisions, and 56 percent say it’s to reduce bias in admissions. Ninety percent of respondents whose schools already use AI say it is ‘somewhat’ (22 percent) or ‘very likely’ (68 percent) that AI has helped with reducing bias in the admissions process.

“Human decisions are not without bias, and AI tools might actually be used to combat systemic racism or ageism,” Gayeski continues. “For instance, according to the chief technology officer of StudentSelect, their product eliminates age, racial and socioeconomic bias because it doesn’t look at dates, candidate names or ZIP codes, and it can actually be used to look at historic admissions data to identify underrepresented groups.”

Two-thirds are concerned about the ethical implications of AI

Despite the majority of respondents believing that AI can or has reduced bias in admissions, 65 percent of all admissions professionals surveyed, including those whose schools have no plans to implement AI, say they are ‘somewhat’ (35 percent) or ‘very concerned’ (31 percent) about the ethical implications of AI.

When respondents were asked to give any final thoughts about AI in admissions, write-in responses included the following. Please note that the quotes have been edited for clarity:

  • “My main concerns are in regards to the human element. It is not uncommon for circumstances to be taken into account and I’m not sure if AI can do that yet.”
  • “At present, the help that AI brings to our work is still very great. Whether it will have a negative effect in the future, we will have to see.”
  • “Can be a useful tool to assist in the process but should not have major decision making capability. Human insight into decisions is needed.”

“Using AI beyond routine communication or data analysis in the admission process is fraught with complications,” Gayeski says. “Machine learning systems can be created to analyze essays and grades using the kinds of rubrics we now provide to staff, and they can certainly complete those reviews more rapidly and accurately.

“However, they are generally based on algorithms profiling previous applicants who were accepted and who then succeeded at an institution. The University of Texas at Austin tried this with a home-built system created for and by their graduate program in computer science, but they dropped it because they realized it had the potential to replicate bias by merely admitting students like the ones they’ve admitted for decades.

“There have been a number of prominent critiques of AI systems for their implicit bias. For example, Amazon dropped its AI-powered recruiting tool because they found that it favored men over women applicants simply because, in the past, vastly more men had applied for tech jobs there.

“Like all software tools, AI-enabled admissions platforms reflect the kinds of decisions and criteria that humans already use, and are only as good as the data fed into them – ‘garbage in, garbage out.’ If the discussion about AI illuminates what is already a pretty inconsistent and opaque process, it’s already helped us make important strides,” Gayeski finishes.

This press release originally appeared online.

Laura Ascione