- eCampus News - https://www.ecampusnews.com -

Ranking of online college programs meets scrutiny

A University of Florida official called the U.S. News rankings 'messy.'

U.S. News and World Report’s first-ever “Top Online Programs” ranking has met sharp criticism from online-learning organizations, including one that said the lists were flawed by “scattershot methodology” that revealed a lack of expertise in the rankings process.

The magazine, which has published a national college ranking since 1983, included online college programs in a separate ranking this year [2], breaking the lists into categories: faculty credentials, student engagement and assessment, and student services and technology.

Colleges that ranked highly in every category were included in an honor roll listing.

Arizona State University [3] was tops in the technology category, Westfield State University [4] was ranked No. 1 in online faculty credentials, and Bellevue University [5] in Nebraska was best in student engagement.

Schools had to offer at least 80 percent of their courses online to qualify for the U.S. News rankings.

Campus technology officials hailed the inclusion of online college programs as a widely read acknowledgment that online classes have entered the mainstream and are no longer seen as inferior to traditional brick-and-mortar classes.

The U.S. News rankings haven’t escaped criticism, however, including a comprehensive critical analysis from distance-education advocates at the WICHE Cooperative for Educational Technologies (WCET) [6], a group that pushes for online course advancements in higher education.

In a Jan. 10 blog post, Russell Poulin, deputy director of research and analysis for WCET, pointed out that U.S. News received responses to fewer than half of the 2,000 surveys it sent to colleges. The bachelor’s degree rankings drew the most responses, with 194, making the rankings less comprehensive than they might appear.

“The small number of useful surveys speaks to the scattershot methodology used by U.S. News,” Poulin wrote. “We would have thought that they would have engaged experts to develop questions and then pilot test the survey on a sample of institutions.”

The U.S. News criteria would have been stronger, Poulin wrote, had the magazine consulted experts in online education “best practices” rather than relying on campus officials in charge of online college programs.

“They did interview people at distance-education programs, but that is far different than fully engaging people cognizant of online education policies in developing the questions,” he wrote. “As a result, this survey became a large-scale pilot test with questions that many institutions could not or would not answer.”

In a lengthy explanation of the magazine’s methodology [7] for the college rankings, U.S. News says it was careful about who participated in its first ranking of online college programs.

For instance, schools that launched online course offerings for the 2011-12 academic year were removed from the survey results “because of their inabilities to supply a full academic year’s worth of data.”

University officials also had their fair share of critical comments about the magazine’s lists.

Andy McCollough, associate provost at the University of Florida (UF), said the U.S. News rankings of online college programs were “messy” after the school was ranked alongside Pace University, Florida Institute of Technology, and Westfield.

“It’s a good start, [but] it’s a very difficult task,” McCollough told The Gainesville Sun [8], adding that the lists didn’t take into account the many areas of focus within UF’s undergraduate programs.

U.S. News also came under scrutiny for focusing its results on “inputs to the educational experience” rather than outcomes like student graduation rates, average grades, and student satisfaction with their school’s online college programs.

“With the focus on outcomes by accreditors, the Department of Education, and institutions, this seems like a major problem,” Poulin wrote.

U.S. News, on its website, said the publication’s editors “analyzed the quantity and quality of data collected to determine which questions could be used for rankings” once they had received responses to their survey from colleges and universities.

Poulin took issue with the magazine’s strategy.

“Instead of focusing on a few pre-tested questions that would lead to assessing quality, the survey was a smorgasbord of questions that they decided whether or not to use in the end,” he wrote. “That’s a fairly disrespectful use of staff time at our nation’s colleges and universities. Also, institutional personnel could have suffered from survey fatigue and not answered some questions—only to learn later that they skipped a crucial question.”