New brief discusses why comparing institutions based on standard factors may not work for today’s expectations

According to a new brief, major considerations must be taken into account in the Obama Administration’s proposed college rating system, the Postsecondary Institution Ratings System (PIRS), including students’ socio-economic backgrounds and the mission of the institution.

A number of key elements in PIRS have yet to be defined, as policymakers continue to seek input from the higher education community.

According to the Department of Education (ED), President Obama will seek legislative changes to the Higher Education Act to ultimately link student financial aid to institutional outcomes such as the percentage of students receiving Pell Grants, average cost of attendance, student loan debt, graduation rates, and/or transfer rates.

A 2013 White House fact sheet noted that the ratings would compare colleges “with similar missions,” but did not provide details on how colleges would be grouped.

ED has been tasked with developing and publishing the new college ratings system by the 2015-16 award year.

The brief, “Peers in PIRS: Challenges & Considerations for Rating Groups of Postsecondary Institutions,” commissioned by the National Association of Student Financial Aid Administrators (NASFAA), uses institutional case studies to illustrate some of the differences and similarities among colleges and universities.

It makes the case that any postsecondary outcomes need to be “corrected” for various inputs, such as the characteristics and backgrounds of entering students, and provides examples that speak to the feasibility of “mission” as a peer-group identifier.

“Having an accurate picture of student outcomes at similar institutions is a worthy goal,” said NASFAA President Justin Draeger in a statement. “But this must be done thoughtfully lest we do more harm than good. We know that comparing institutions—even those with seemingly similar missions—is not as simple as it appears. Our research shows that student demographics and needs vary widely, even at schools with very similar missions.”

The brief outlines three key considerations for the Administration when developing PIRS:


1. Input adjustment

According to the brief, “input adjustment” means adjusting outcomes to reflect inputs, such as the characteristics and backgrounds of entering students.

“Although indicators such as graduation rates, student persistence, and labor market outcomes are commonly used as measures of institutional performance, information about students’ academic preparation and other factors is often not taken into account,” explains the brief. “The failure to account for the characteristics of entering students and institutional mission can lead to misleading comparisons.”

In other words, says the brief, given the wide differences in the characteristics of students who enroll in postsecondary institutions, a true measure of value added during their education needs to take into account their starting points.

2. Determine the right peer group factor for comparison

If the goal is to assess institutional performance, notes the brief, the comparison variables might be different. For example, to determine the value added, academic background (SAT/ACT scores), student financial need (percent receiving Pell Grants), student demographics, and institutional characteristics (e.g., enrollment, Carnegie classification) might be used to calculate a predicted graduation rate for each institution, which can then be compared to actual outcomes.
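The predicted-vs-actual comparison described above can be sketched in a few lines. This is a hypothetical illustration, not the brief’s or ED’s methodology: the institutions, Pell shares, and graduation rates below are invented, and a single-predictor least-squares fit stands in for whatever multivariate model policymakers might adopt.

```python
# Hypothetical sketch of an input-adjusted comparison: regress graduation
# rate on the share of students receiving Pell Grants, then compare each
# institution's actual rate to its predicted rate. All data are invented.

def simple_ols(xs, ys):
    """Return (slope, intercept) of a least-squares line through the points."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Invented data: (institution, share receiving Pell, actual graduation rate)
data = [("A", 0.20, 0.80), ("B", 0.50, 0.55),
        ("C", 0.70, 0.48), ("D", 0.35, 0.62)]
pell = [d[1] for d in data]
grad = [d[2] for d in data]
slope, intercept = simple_ols(pell, grad)

for name, p, actual in data:
    predicted = intercept + slope * p  # what we'd expect given inputs
    print(f"{name}: actual {actual:.2f}, predicted {predicted:.2f}, "
          f"residual {actual - predicted:+.2f}")
```

An institution with a positive residual graduates more students than its inputs would predict; a raw graduation-rate ranking would miss that distinction entirely.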

“Grouping higher education institutions often differs depending on the goal of the classification,” says the brief. “[Yet], it is important not to use too many variables to define the peer groups, both for practical reasons and face validity. However, using a small number of variables can leave substantial differences among potential comparison colleges.”

The brief also emphasizes that those who use PIRS should “have the option of customizable comparison groups, based on comparing particular institutions on region, selectivity, programs offered, and student age.”

Some of the variables currently being considered for other higher-education data comparison groups, including by the IPEDS Technical Review Panel (TRP), include:

  • Use of distance education
  • Enrollment size
  • Selectivity
  • Region
  • Level (institutions whose highest award offered differs from the majority of degrees conferred)
  • Predominant undergraduate credential

3. Diversity can exist even within broad categories of institutions based on mission

Institutions in the same sector and state may vary widely in terms of the characteristics of their programs.

For example, the brief explains that at four-year institutions, the extent of research and the proportion of graduate students differ considerably, as does the extent of public service and extension programs.

“Community colleges have outcomes ranging from completion of certificates of varying lengths, completion of associate’s degrees, transfer rates to four-year institutions, and even non-credit work,” says the brief. “As a result of different missions, the mix of programs may differ substantially across colleges, which can distort comparisons even within broad institutional types. The mix of programs of varying levels and types as well as research and other activities will impact outcome measures.”

One strategy for deciding on the “appropriate way” to compare groups might be to use available data on institutional mission as the first cut, then program and student-related factors as the second cut, explains the brief.
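As a rough illustration of that two-cut strategy (not a method taken from the brief), the sketch below groups invented institutions first by stated mission and then by a coarse band of Pell-Grant share, a stand-in for the “program and student-related factors” of the second cut:

```python
# Hypothetical sketch of a two-cut peer-grouping strategy: first cut on
# institutional mission, second cut on a student-related factor (here, a
# coarse band of the Pell-Grant share). All institutions are invented.
from collections import defaultdict

institutions = [
    {"name": "A", "mission": "research",  "pct_pell": 0.25},
    {"name": "B", "mission": "research",  "pct_pell": 0.60},
    {"name": "C", "mission": "community", "pct_pell": 0.55},
    {"name": "D", "mission": "community", "pct_pell": 0.70},
]

def pell_band(share):
    """Second cut: split institutions into coarse Pell-share bands."""
    return "high-Pell" if share >= 0.5 else "low-Pell"

groups = defaultdict(list)
for inst in institutions:
    key = (inst["mission"], pell_band(inst["pct_pell"]))  # cut 1, then cut 2
    groups[key].append(inst["name"])

for key, names in sorted(groups.items()):
    print(key, names)
```

The design choice mirrors the brief’s caution: each added variable makes peer groups more homogeneous but smaller, so a real system would have to balance comparability against group size.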

For more information on the three considerations, as well as state-by-state institution comparisons, read the brief.
