Cracking the MOOC ‘assessment nut’

Massive open online courses (MOOCs) might not offer actionable lessons to professors and campus administrators until the experimental classes have a reliable assessment apparatus, a Penn State MOOC professor said.

Student feedback will be key to the growth of MOOCs.

Kathryn Jablokow, who headed Penn State’s “Creativity, Innovation and Change” MOOC that ended in November, said one of the central challenges was giving students feedback on how they performed in a course of that scale.

She spoke on a Dec. 2 faculty panel on MOOCs at the third annual World Campus Faculty Convocation at Penn State’s University Park campus. The MOOC, co-taught with Penn State engineering professors Jack Matson and Darrell Velegol, was the university’s first offered on the Coursera platform.

The more than 150,000 students enrolled in the Penn State MOOC experimented with peer assessment, a model derided by many in higher education, and completed optional projects to earn certificates for the class.

Jablokow said MOOC assessment still had a long way to go.

“If we can crack the assessment nut, it will enhance the quality of the experience tremendously,” she said.

MOOCs of every kind are still in an experimental stage, Jablokow said, meaning some approaches will work and others will prove fruitless as educators get a feel for how to conduct classes with tens of thousands of students.

“It’s an experimental process,” Jablokow said. “Some things work, and some things don’t. The power to do certain things with MOOCs is tremendous, but they’re not going to do everything well, and we shouldn’t expect them to.”
