4 reasons why online assessments aren’t there yet

New research indicates that both technology and student proficiency have a long way to go

As skills like problem-solving, communication, collaboration, and technological proficiency become more valued in today’s economy and with employers, more students are being asked to create “maker” homework and projects, highlighting their knowledge of a subject through highly personalized, creative work. But can digital tools accurately assess these works?

According to a new study, “Digitizing Practical Production Work for High-Stakes Assessments,” researchers C. Paul Newhouse and Pina Tarricone from Edith Cowan University in Western Australia aimed to determine whether highly creative student work could accurately and feasibly be tested through summative high-stakes assessments.

The researchers noted that, currently, the major “obstacles” to formally assessing student “maker” work are cost and accuracy—costs associated with both sending out the work to be formally assessed and time constraints on faculty; and accuracy associated with providing a fair and reliable judgment of subjective student work.

However, thanks to the current availability and lowered costs of digital technologies, it may now be possible to reduce costs and provide a fair high-stakes assessment for students’ creative works.

“For high-stakes external assessment it is important that the form of assessment reflects the requirements of the practical curriculum,” notes the report. “Typically, our societies have addressed problems by developing technologies, so it is reasonable to look for technological solutions to these problems of assessment. Could digital technologies contribute to solutions?”


Many factors must be considered for good assessments, says the report, such as comparable judgments between contexts, scores divorced from the subjectivity of assessors, and management of the work of large groups of students spread across wide jurisdictions.

And it’s these factors researchers kept in mind when conducting their study over three years. The study investigated the potential to use digital representations of practical work for summative assessment in two senior secondary courses: Design and Visual Arts. More about the methodology can be found in the report.

However, though students can digitize their creative works for digital assessments, there are still many obstacles to this method:

1. Students aren’t tech-savvy right away: Though students in the Design class found scanning their paper-based portfolios into digital portfolios straightforward, students in the Visual Arts class found that creating digital representations of their artworks through software was difficult and time-consuming.

Based on student feedback, the report suggests that this method could succeed if students were given more time to complete the digitization process and more training in how to do so.

Students also preferred using their own digital technologies to digitize their work rather than those provided by the course.

2. Campus digital technologies are often not adequate: As noted above, many students in both courses found the digitization process of their work easier and better represented when they could use their own digital technologies, rather than the ones chosen by the campus.

Students in the Design class agreed that using a color A3 scanner to convert paper portfolios to PDF files required little editing for scoring, but most teachers felt the quality of the scans was not adequate, lacking in resolution and accurate color representation. All agreed that the video and online technologies used by students worked best. For the Visual Arts class, it was “technically impossible to adequately represent each type of artwork in digital forms using SLR digital and video cameras and the specifications developed for the study,” explains the report.

3. How you assess digitally matters: Researchers found that the type of online assessment mattered for the work at hand; not all types of assessment could accurately gauge student skill and learning. Only one type, comparative pairs scoring, proved reliable.

According to the report, this unreliability is likely due to “the quantity and complexity of information in the portfolios, making it difficult to judge using a set of criteria.” And though the comparative pairs method of scoring worked well with the Visual Arts class, most students and teachers emphatically stated that they would not like their artwork to be assessed using digital representations, as they felt digital models were not accurate in representing their physical work.

4. Digital tech isn’t for every medium: The Design class preferred a digital portfolio over a paper portfolio due to “better alignment with the intentions of the course” and the opportunity to better showcase their tech-savvy to future employers. Visual Arts students, by contrast, saw the “value” in digital representation, but not for assessment purposes, since the “quality would be inconsistent and that subtle features of artworks would not be captured.”

The report concludes that, overall, it is feasible to use inexpensive technologies to create digital representations of creative work for the purposes of summative assessment. However, given student and teacher attitudes, the time needed, and the technology available, “much still needs to be done to gain acceptance of some of the main stakeholders, including students, teachers and the community.”

For more in-depth coverage of this study, read the full report.
