A university team is analyzing how data from voice and facial recognition technology could help instructors incorporate active learning.
A Carnegie Mellon University (CMU) assistant professor is using voice recognition technology that analyzes talk patterns to better inform instructors about what’s happening in their classrooms.
The voice recognition technology used in the classrooms will help teachers gain insights about what students are learning and whether they are collaborating and analyzing concepts, said Amy Ogan, assistant professor of human-computer interaction in CMU’s School of Computer Science.
Right now, the technology is focused on the sounds that occur in a classroom. It detects who is talking, when, where, and for how long: all the features of talk data, Ogan said.
In giving this technology to instructors, the goal is “to give faculty an understanding of who is speaking, where, when, and how, so they can incorporate more active learning into the classroom,” Ogan said.
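One way to picture the talk-data features described above (who is speaking, when, where, and for how long) is as a simple event record. This schema is purely illustrative; the article does not specify the project's actual data format.

```python
from dataclasses import dataclass

@dataclass
class TalkEvent:
    """A single detected stretch of speech. Field names are assumptions
    chosen to mirror the who/when/where/how-long features in the text."""
    speaker: str              # who, e.g. "instructor" or "student"
    start_seconds: float      # when, relative to the start of class
    duration_seconds: float   # how long
    zone: str                 # where, e.g. "front", "middle", "back"

# Hypothetical events from a few minutes of class
events = [
    TalkEvent("instructor", 0.0, 180.0, "front"),
    TalkEvent("student", 185.0, 12.0, "back"),
]

student_time = sum(e.duration_seconds for e in events if e.speaker == "student")
print(student_time)  # 12.0
```

Aggregating records like these over a lecture is what would let a dashboard summarize speaking patterns without storing what was actually said.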
Teachers receive a dashboard that displays data about classroom activity, which helps them determine how their actions affect student outcomes. The dashboard displays different lights, such as red and green lights, that correspond to how teachers might want to change or continue their teaching approach.
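A red/green light like the one described could be derived from aggregated talk time. The following sketch is an assumption about how such a rule might look; the feature names and the 20 percent threshold are invented for illustration, not taken from the CMU system.

```python
def dashboard_light(student_talk_seconds, instructor_talk_seconds):
    """Return 'green' if students have had a healthy share of the recent
    talk time, otherwise 'red'. Thresholds here are illustrative."""
    total = student_talk_seconds + instructor_talk_seconds
    if total == 0:
        return "red"  # silence: no interaction detected at all
    student_share = student_talk_seconds / total
    # Assumed rule: flag stretches where students spoke less than 20%
    # of the time as a cue to add active learning.
    return "green" if student_share >= 0.2 else "red"

print(dashboard_light(student_talk_seconds=200, instructor_talk_seconds=400))
# prints "green" (students held a third of the talk time)
```

The point of a traffic-light display is that it compresses the underlying talk statistics into a signal an instructor can read at a glance mid-lecture.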
The sensors analyze the overall sound of the classroom, not the specific things being said, Ogan said. “The reason for this is that as we test this technology, we want to protect students’ privacy and their educational data, and make sure privacy stays in place as we’re analyzing the sounds.”
For example, the sensors could pick up on whether more sounds are coming from the front of the classroom versus the back, and whether that correlates to a teacher being more attuned to students in the front rows.
“I think one of the things we’re noticing is that even if you are incorporating active learning, it’s very easy to focus on the students at the front of the classroom raising their hands, and this data can let teachers know whether they’ve got an equitable spread of participation across the classroom,” Ogan said.
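The "equitable spread of participation" check Ogan describes could be computed from talk time detected per classroom zone. This is a minimal sketch under assumed zone names and numbers, not the project's actual analysis.

```python
def participation_shares(talk_seconds_by_zone):
    """Return each zone's share of total detected student talk time."""
    total = sum(talk_seconds_by_zone.values())
    if total == 0:
        return {zone: 0.0 for zone in talk_seconds_by_zone}
    return {zone: t / total for zone, t in talk_seconds_by_zone.items()}

# Hypothetical lecture where the front rows dominate discussion
shares = participation_shares({"front": 540, "middle": 120, "back": 60})
print(shares)
# A heavily front-weighted distribution like this one (75% front) is the
# kind of pattern that would prompt a suggestion to engage the back rows.
```

Comparing these shares against an even split is one simple way a system could decide that participation is skewed toward students in the front.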
In between classes, teachers receive strategies that correlate to the data, such as ways to engage students in the back of the classroom who might not be as motivated to participate. Those strategies come from classroom literature on educational psychology and are sent to teachers’ phones.
Eventually, Ogan said, teachers will be able to use that data to influence their teaching in real time.
Moving forward, Ogan and her colleague, PhD student David Gerritsen, are working on techniques using cameras to detect patterns of emotion, posture, and other things students do in the classroom.
“This is a really great design challenge: we have to be able to take that data and display it to teachers in a way that isn’t distracting to them,” Ogan said. “We’re working with a university right now with lots of lecturing. When the system detects that students haven’t participated in a while, we flash a big red screen on the instructor’s laptop to notify them to incorporate some student interaction. That’s why we’re doing lots of work directly with instructors–so they have input on what the system does and how it looks. It doesn’t work if you don’t have buy-in from the instructors.”
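The "students haven't participated in a while" trigger quoted above amounts to a timeout rule. A minimal sketch, assuming a five-minute threshold (the article gives no actual value):

```python
def needs_interaction_prompt(seconds_since_last_student_talk,
                             threshold_seconds=300):
    """Decide whether to flash the red screen: true once students have
    been silent longer than the threshold. The 300-second default is an
    assumed value for illustration."""
    return seconds_since_last_student_talk >= threshold_seconds

print(needs_interaction_prompt(450))  # True: 7.5 minutes of lecturing
print(needs_interaction_prompt(120))  # False: students spoke recently
```

Keeping the rule this simple matters for the design challenge Ogan names: the signal has to be glanceable, not something the instructor must interpret mid-lecture.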
Ogan said she hopes to move to K-12 classrooms in the near future.