Mixed reality is poised to have a significant impact on instruction and on preparing students for the workforce.

At EDUCAUSE 2018, educators from Case Western Reserve University (CWRU) shared how the university is developing and implementing small- and large-scale immersive augmented reality and mixed reality learning resources with great success.

The projects stem from the university’s Interactive Commons, which explores how cross-departmental teamwork and new technologies can foster innovation and new ways of teaching and learning. So far, they have yielded a fair amount of data, along with increases in student engagement, time savings, and more positive learning experiences overall.

“The workforce is collaborative, and we need to communicate across disciplines–curriculum has to drive those interactions,” said Erin Henninger, executive director of the Interactive Commons at CWRU. “We want to think about what kind of classroom we’re putting our students in in the future–the classrooms we’ve been creating for 100 years may be doing those students a disservice.”

CWRU is already using Microsoft's HoloLens mixed reality headsets with students, and will deploy 32 HoloLens devices in its new Health Education campus, slated to open in July 2019. The headsets can be used in any number of ways, such as helping medical students explore human anatomy together in a mixed reality environment.

The Microsoft HoloLens gives students a new way of visualizing environments, from cadaver dissection to emergency response simulations and training.

“It gives us new ways to interact,” Henninger said. Those include virtual cadaver dissection; distance-based guided repair, in which a student works in one location while an instructor coaches from another; telemedicine; and more.

Easy integration played a key role in the success instructors have seen so far. “You need little to no training in order to use this,” said Mark Griswold, IC faculty director. “We wanted something simple to use.”

So far, the IC has run 20 trials with the HoloLens in areas such as human anatomy and physics.

One trial dealt with a human breast dissection, which is traditionally challenging in cadaver dissections because the tissue tends to deteriorate rapidly, Griswold said.

One student group used the HoloLens lab for a virtual dissection, while another group went into the cadaver lab; both groups took pre- and post-tests. The HoloLens group's post-test scores were about 8 percent higher than the cadaver lab group's, and 97 percent of the HoloLens group reported a positive experience, versus only 71 percent in the cadaver lab.

The two approaches could be used together, Griswold added, with students using the HoloLens lab first to avoid tissue deterioration issues, then working in the cadaver lab.

“People learn better when they’re engaged in groups,” Griswold said, noting that mixed reality approaches have been shown to enhance student achievement and long-term retention.

“We feel this engagement and collaboration better represent the jobs students will go into in the real world,” he added.

Another simulation revolves around emergency response training, in which students use a HoloLens to respond to a car accident and assess the surrounding environment and the victim’s injuries.

“We wanted to create a simulation that’s easy to use in classrooms and that’s scalable–something that could give us data on what students are learning,” said Sue Shick, an IC instructional designer. “This is a career and tech field, and we have to show whether students are meeting certain competencies.”

The simulation collects a surprising amount of data: where students are gazing and what they’re examining; how they move; speech data that gauges how calm they are and how engaged they are with the victim; turn-taking and collaboration with fellow students in the simulation; and knowledge assessment.

All this data is analyzed and helps faculty visualize and assess the learning taking place.

“One of the things we’re super interested in is thinking about multi-modal learning analytics and figuring out what the data said to us, and if it’s meaningful at all in terms of the 4Cs,” Shick added.

About the Author:

Laura Ascione

Laura Ascione is the Managing Editor, Content Services at eSchool Media. She is a graduate of the University of Maryland's prestigious Philip Merrill College of Journalism. Find Laura on Twitter: @eSN_Laura
