As colleges obsess over rankings, students shrug

The main factors in students’ school choices might surprise some in higher education.

When US News & World Report debuted its list of “America’s Best Colleges” nearly 30 years ago, the magazine hoped its college rankings would be a game-changer for students and families. But arguably, they’ve had a much bigger effect on colleges themselves.

Yes, students and families still buy the guide and its less famous competitors by the hundreds of thousands, and still care about a college’s reputation. But it isn’t students who obsess over every incremental shift on the rankings scoreboard, and who regularly embarrass themselves in the process. It’s colleges.

It’s colleges that have spent billions on financial aid for high-scoring students who don’t actually need the money, motivated at least partly by the quest for rankings glory.

It was a college, Baylor University, that paid students it had already accepted to retake the SAT exam in a transparent ploy to boost the average scores it could report. It’s colleges that have awarded bonuses to presidents who lift their school a few slots.

And it’s colleges that occasionally get caught in the kind of cheating you might expect in sports or on Wall Street, but which seems especially ignominious coming from professional educators.

The latest example came last week at Claremont McKenna, a highly regarded California liberal arts college, where a senior administrator resigned after acknowledging that for years he had submitted falsified college entrance exam scores to rankings publications such as US News.

The scale was small: submitting scores just 10 or 20 points higher on the 1,600-point SAT math and reading exams. Average test scores account for just 7.5 percent of the US News rankings formula.

Still, the magazine acknowledged the effect could have been to move the college up a slot or two in its rankings of top liberal arts colleges. And so it was hard not to notice Claremont McKenna stood at No. 9 in this year’s rankings, which to people who care about such things sounds much sweeter than No. 11.

“For Claremont, there is I would think a psychologically large difference between being ninth and 11th,” said Bob Schaeffer, a rankings critic with the group FairTest. “‘We’re a top 10 school’ (or) ‘we’re 11th or 12th.’ That’s a big psychological difference. It’s a bragging rights difference.”

If it was an effort to gain an edge, it backfired badly. Another popular list, Kiplinger’s “Best College Values,” said Friday it was removing Claremont McKenna from its 2011-12 rankings entirely because of the false reporting. The college had been No. 18 on its list of best-value liberal arts colleges.

Competitiveness may be naturally human, but to many who work with students, such behavior among fellow educators is mystifying. Contrary to widespread perceptions, they say, students typically use the rankings as a source of data and pay little attention to a school’s number.

“When I started in this business, I thought, ‘The rankings are terrible,’” said Brad MacGowan, a college counselor of 21 years at Newton North High School outside Boston. “But spending all this time with students, I just don’t hear that much about them. I’m sure it’s colleges that are perpetuating it.”

It’s hard to know how common cheating like that reported at Claremont McKenna is: while US News cross-checks some data against other sources, it relies largely on the colleges themselves to provide it. Modest fudging through selective data reporting is undeniably common, especially in law school rankings.

The most high-profile case of outright cheating involved Iona College in New York, which acknowledged last fall that it had submitted years of false data, boosting its ranking from around 50th in its category to 30th.

But most rankings critics say by far the most pernicious failure of colleges isn’t blatant cheating, but what they do more openly — allowing the rankings formula to drive their goals and policies.

Colleges, they argue, have caved to the rankings pressure in a range of ways. A big one is recruiting as many students as they can to apply, even if they’re not likely to be a good fit, just to boost their selectivity numbers. And they’ve showered financial aid on high-achieving, and often wealthy, kids with high SAT scores.

In the mid-1990s, roughly one-third of grant aid (the scholarships colleges of all types award from their own money) was given on grounds other than need, typically called “merit aid.” A decade later, colleges gave away three times as much money, but well over half of it was based on merit.

Yes, some colleges recruited better students, but there was a price to be paid. Consider a 2008 study by The Institute for College Access and Success, which examined the $11.2 billion that four-year colleges were awarding annually in grant aid. Of that, $3.35 billion went to merit aid.

That sum would have easily covered the $2.4 billion in unmet financial need that the colleges said their low-income students still faced.

Rankings critic Lloyd Thacker, founder of the group Education Conservancy, calls that a shift in financial aid from “charitable acts to competitive weapons.” Or, as Schaeffer describes it, “they end up giving the money to rich white kids.”

The vast majority of students attend college within three hours of home, so national rankings hold little meaning for them.

What matters? Usually more mundane or subjective concerns. One student who went to MacGowan’s office last week for a college planning meeting, junior Bridget Gillis, said she’d yet to even see a college ranking guide. Her criteria: “If they have my major, if it’s a nice campus, how big it is, if they have the sport I want to play in college (field hockey).”

The latest version of a huge national survey of college freshmen, conducted annually by UCLA’s Higher Education Research Institute, asked students to rate various factors affecting their choice of college. Rankings in national magazines came in at No. 11 among current freshmen, with roughly one in six calling them very important, well behind factors such as cost, size and location.

Those findings may be somewhat misleading. The leading factor cited, by almost two-thirds of students, was their college’s “academic reputation,” which can be hard to disentangle from its ranking.

A reputational survey accounts for 25 percent of a college’s score in US News, and fame from a high US News ranking contributes to reputation, even if students say the ranking itself wasn’t a factor. Such circularity is one of many things critics dislike about the US News methodology.

But the survey data do suggest students generally heed the magazine’s advice not to use the rankings to make fine-grained distinctions between schools.

“As someone who is asked every year to comment on the rankings, it seems to me that who cares most is the media,” John Pryor, who directs the UCLA survey, wrote in a blog post last year. “Second would be college presidents and development officers. Way down the list seem to be those who are actually trying to decide where to go to college.”

Thacker says the rankings do have negative psychological effects on students, though usually only on the top 10 to 15 percent who are applying to competitive colleges. But they have affected a much broader swath of colleges, which have been unable to suppress their competitive urges for the sake of the educational common good.

“It has more an impact on colleges, presidents and trustees than it does on students,” Thacker said. “The colleges have shifted resources and changed practices and policies that were once governed by educational values to serve prestige and rank and status.”

That effect, he says, is dishonorable, even if some colleges at least feel guilty about it. More than 80 percent of college admissions officers surveyed for a report last fall by the National Association for College Admission Counseling felt the US News rankings offered students misleading conclusions, and roughly the same proportion agreed they caused counter-productive behavior by colleges.

Yet more than 70 percent said their schools promoted their ranking in marketing materials.

That the highly regarded dean implicated in the Claremont McKenna scandal may have been driven to submit inflated test scores indicates the scale of the pressure surrounding the rankings, said David Hawkins, director of public policy and research at NACAC, the counseling group. That pressure comes from all corners of the university: trustees, alumni, presidents, even politicians.

“It’s clear from the (Claremont McKenna) story that admission offices are under pressure,” he said. “The key question is, how do you stop the madness?”

Bob Morse, who oversees the US News rankings as director of data research, says many of the behaviors the rankings have incentivized in colleges are benign. He points to universities like Northeastern and Southern California that have moved up in recent years through concerted efforts to improve their stats in variables that go into the formula — but which also are good for students.

Things like more small classes, programs to boost retention, higher faculty-to-student ratios. And why, Morse asks, should colleges be criticized for casting a wider recruiting net?

But even Morse, who says colleges paid the rankings little attention when they debuted in 1983, says he’s been shocked by how seriously they now take their standing, and the lengths to which they’ll go to move up.

“None of those things when we first started we had in mind would even happen or even could happen,” he said. “It’s evolved in ways that have taken on a life of their own. To us, it’s proof people are paying attention.”
