- eCampus News - https://www.ecampusnews.com -

Should HED use this popular K-12 assessment?

Though prospective college students often rely on metrics such as graduation rates and standardized test scores to evaluate an institution’s offerings, before-and-after learning measurements might be more effective, according to new research.

All colleges and universities should administer such tests of learning — for example, a test that measures how students’ writing skills change over four years of college — according to a researcher at Rice University.

The researcher found that Rice undergraduates’ writing skills improved 7 percent over their college years. He suggested that college-ranking websites could help prospective students narrow their search by reporting how much students at various schools improve skills such as writing during their education.

That 7 percent improvement over the four-year college span was based on measurements of Rice undergraduates’ expository and persuasive writing skills collected during a nine-year study period.

The study is highlighted in the article “Improvement of Writing Skills During College: A Multiyear Cross-sectional and Longitudinal Study of Undergraduate Writing Performance,” which appeared in a recent edition of Assessing Writing.

“Colleges and universities seldom perform such before-and-after comparisons to see how much — or whether — students improve over their college years,” said James Pomerantz, a professor of psychology at Rice and a co-author of the study. “If you scour the web looking for information about how well students progress while pursuing degrees at America’s colleges, you will be hard-pressed to find a single school that provides this information.”


Pomerantz said that college-ranking systems compare schools mainly on factors such as graduation and retention rates, but noted that those rates can be inflated simply by lowering the standards required to receive a diploma. He also noted that rankings rely heavily on schools’ reputations, which can be out of date and are deeply subjective.

And while standardized test scores of entering students may be objective, Pomerantz said, top-ranked schools accept students who are so talented and well-prepared to begin with that their successful graduation and subsequent careers seem all but preordained.

“Thus it’s not clear whether selective colleges can claim any ‘value added’ credit — whether students graduate in better shape than they arrived at college,” he said. “This is why we were interested in creating a study to evaluate how student skills improved. Writing ability seemed like a good place to begin, since it’s a fundamental skill, and one that future employers look for closely in college graduates.”

In addition to finding that Rice undergraduates improved their writing skills, Pomerantz and his fellow authors determined that the improvement was consistent for male and female students alike, and for majors in the natural sciences and engineering as well as in the humanities and social sciences. The finding held whether the testing was done longitudinally (tracking the same students over their college careers) or cross-sectionally (taking a snapshot of upper- and lower-class students at one moment in time).

The study was conducted on 303 Rice undergraduates between 2000 and 2008 and used a method modeled on clinical trials for new medicines, including random selection of subjects and blind scoring of writing samples by Educational Testing Service professionals who were unaware of the purpose of the research. The test consisted of Rice undergraduates writing answers to multiple prompts designed to tap their expository and persuasive writing skills. Students wrote their responses in campus lecture halls under timed conditions, at a fixed time of day and on a fixed day of the year. Their responses were then transcribed so factors such as handwriting skill would not affect their scores.
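The value-added arithmetic behind a figure like the study’s 7 percent is simple: compare the mean blind-scored essay score of an entry-level sample with that of an exit-level sample. The sketch below illustrates only that calculation; the rubric scale, scores, and sample sizes are invented for illustration and are not the study’s data.

```python
def percent_improvement(entry_scores, exit_scores):
    """Percent change in mean score from entry-level to exit-level testing."""
    entry_mean = sum(entry_scores) / len(entry_scores)
    exit_mean = sum(exit_scores) / len(exit_scores)
    return 100.0 * (exit_mean - entry_mean) / entry_mean

# Hypothetical essay scores on a 1-6 rubric (values are made up):
freshman_sample = [3.9, 4.1, 4.0, 3.8, 4.2]  # lower-class snapshot
senior_sample   = [4.2, 4.4, 4.3, 4.1, 4.5]  # upper-class snapshot

print(round(percent_improvement(freshman_sample, senior_sample), 1))  # 7.5
```

The same function works for a longitudinal design by passing the same students’ entry and exit scores instead of two different cohorts.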

“While it is reassuring to learn that the major investment of time and effort in college — not to mention the large financial investment — pays off in a measurable way, more questions remain,” Pomerantz said. “How much do students improve in other skills, such as quantitative reasoning and critical thinking? Do students improve as much these days — with our new teaching resources and learning technologies — as they did 20 years ago? And how does the improvement measured at Rice compare with that at other institutions?”

Ultimately, Pomerantz hopes future research will encourage widespread, rigorous before-and-after testing of skills developed in college.

“These tests are not difficult or expensive to perform, and when carried out regularly they can show a school what parts of their educational system work well and what parts need adjustments,” he said. “Whether colleges have refrained from genuine value-added assessments because they have overestimated their cost and difficulty, or whether it was from fear that the results might not be flattering, it’s time to move forward and make this practice regular and universal. Without it, we don’t know whether students are actually improving in fundamental skills like writing. Some graduating seniors have expressed concern that their writing skills might have actually deteriorated in college from lack of use compared with their writing experiences in high school. Our results provide some reassurance that such fears are unfounded.”

The study was funded by The Spencer Foundation and is available online. Daniel Oppenheimer ’00, Franklin Zaromb, Jean Williams and Yoon Soo Park were co-authors of the study.

Material from a press release was used in this report.