A university recently crunched all of the “Big Data” it had gathered on a course and made a surprising discovery.
Of the two professors who taught the course, one had significantly lower-performing students. Yet this was a professor who had won several teaching awards and was well respected by campus leaders.
What was going on here? the researchers wondered as they sifted through all of the data points at their disposal.
They could draw only one conclusion from the data at hand: student performance was poorer because the professor was not as good a teacher as everyone believed him to be.
The conclusion, it turns out, was wrong.
All the calculations and algorithms failed to account for something most professors now know almost instinctively: the course met at 8 a.m., so the class was made up largely of procrastinators who had waited until the last minute to register. Hardly a room full of overachievers.
“When you have 5,000 data points, how do you know what question to ask?” said Sherry Woosley, former associate director of institutional effectiveness at Ball State University, as she related this real-life incident.
That, she said, is one of the primary concerns as universities and colleges turn to Big Data technology to help make sense of all the information they have accumulated. More than 1,000 institutions are working with Big Data in some way, and universities are investing millions of dollars in supercomputers and data research centers.
Critics like Harper Reed, the chief technology officer of President Obama’s reelection campaign, warn that universities are being romanced — by adaptive learning platforms or computer companies like IBM — into financially backing what will turn out to be a fad.