Learning outcomes pilot study gathers data from 59 higher education institutions.
A new project focused on advancing learning outcomes has demonstrated that rubric-based assessment can be scaled and can yield valid, actionable findings about student learning. This information could be used to improve curriculum and assessment design and to strengthen program and course effectiveness in an effort to advance learning outcomes at colleges and universities.
These findings come from the pilot year of the Multi-State Collaborative to Advance Learning Outcomes Assessment (MSC) project, which launched in 2011 and is supported by the Association of American Colleges and Universities (AAC&U) and the State Higher Education Executive Officers (SHEEO) Association.
In its pilot year, the project engaged faculty at 59 institutions across nine participating states: Connecticut, Indiana, Kentucky, Massachusetts, Minnesota, Missouri, Oregon, Rhode Island, and Utah. (See www.sheeo.org/msc for the full list of institutions involved in both the pilot and continuing phases of the MSC work.)
As part of the pilot study, more than 7,000 samples of student work produced for assignments in students’ regular courses were uploaded to a web platform developed by Taskstream. A group of 126 faculty members was trained and then independently scored the work to produce a preliminary landscape analysis of student achievement at the participating schools.
Samples of student work were collected and evaluated for achievement in three important learning outcome areas: written communication, critical thinking, and quantitative literacy. The faculty members used common scoring rubrics—called VALUE rubrics—that were developed and validated by faculty as part of AAC&U’s Liberal Education and America’s Promise (LEAP) initiative.
“What this pilot study showed is that faculty from a variety of disciplines, from dozens of colleges and universities, from nine different states across the nation could assess the work students had done and evaluate it in a consistent and reliable way,” said SHEEO President George Pernsteiner. “There was no special test. There was no time away from the classroom. There was, however, a common understanding by faculty from diverse places and backgrounds of what constituted learning and whether students had demonstrated it.”
The MSC is part of AAC&U’s ongoing VALUE (Valid Assessment of Learning in Undergraduate Education) initiative originally launched in 2007. In its pilot year, SHEEO and AAC&U tested the feasibility of cross-state and cross-institutional efforts to document student achievement without using standardized tests and without requiring students to do any additional work or testing outside their regular curricular requirements. All the student work samples were assessed using common rubrics developed and tested against student work by teams of faculty at hundreds of individual institutions across the country.
In the MSC pilot study, 126 faculty from across the participating states and campuses used the common VALUE rubrics to evaluate student work and scored only work products produced by students from institutions that were not their own.
The pilot successfully demonstrated that:
• A wide array of institutions can develop sampling plans to provide reliable samples of student work from across a variety of departments and that demonstrate achievement of key cross-cutting learning outcomes.
• Faculty can effectively use common rubrics to evaluate student work products—even those produced for courses outside their areas of expertise.
• Following training, faculty members can produce reliable results using a rubric-based assessment approach. More than one-third of the student work products were double scored to establish inter-rater reliability evidence.
• Faculty report that the VALUE rubrics used in the study do encompass key elements of each learning outcome studied, and were very useful for assessing student work and for improving assignments.
• A web-based platform can create an easily usable framework for uploading student work products and facilitating their assessment.
• Actionable data about student achievement on specific dimensions of key learning outcomes can be generated via a common rubric-based assessment approach.
While the findings from the pilot study are not generalizable across the entire population of students in the participating states or nationally, the study found within the cohort of participating institutions some clear patterns in students’ achievement levels.
Using a 0–4 rating scale, much higher percentages of student work products were rated at either a “3” or “4” in four-year institutions than in two-year institutions in the project. Nonetheless, significant numbers of students nearing degree completion at two-year institutions demonstrated high or very high levels of achievement on key outcomes.
“The calls are mounting daily for higher education to be able to show what students can successfully do with their learning,” said AAC&U President Carol Geary Schneider. “The VALUE Multi-State Collaborative is a very important step toward focusing assessment on the best evidence of all: the work students produce in the course of their college studies. It is exciting and inspiring to see the results of this project. Higher education owes a debt of gratitude to the educational leaders and faculty members in the participating states who helped develop this pilot study and contributed to these illuminating results.”
See the MSC pilot summary slide deck for full information on demographics of students in the pilot study institutions, rubrics used in the assessments, and preliminary results of scoring of student work products.
For more information, see VALUE and Multi-State Collaborative on Learning Outcomes Assessment.
Material from a press release was used in this report.