
What are the “black swans” of higher ed-tech?


A panel of experts discusses where education technology could be headed in the future (for better or for worse), and how to react to these changes.

According to ed-tech experts, responding as flexibly as possible to often unpredictable trends and new technologies is critical for maximizing their potential positive impact on learning.

“Black Swans” was the latest webinar hosted by The New Media Consortium in its Beyond the Horizon series, and featured a panel of five experts moderated by Dr. Ruben Puentedura, founder of the educational consulting firm Hippasus.

According to Puentedura, a “black swan” is an event that is unpredictable beforehand, has a major impact on the world around it, and, in retrospect, is rationalized as though it could have been predicted all along. The economic downturn of 2008 is one example of a black swan.

In education, of course, any black swan event can make a huge impact, and thus, it is vitally important for decision makers to frame their thinking in a way that allows them to respond flexibly when that impact occurs. Rather than buckle under the pressure or simply revert to more familiar habits, it is important for higher education leaders to get creative and imagine how their institution might be able to change traditional processes to benefit from the unexpected black swan.

Black Swan Potential

According to the panelists, three areas of interest identified in the NMC’s Horizon Report could greatly affect higher education: net privacy, the network, and machine learning. Depending on institutional and societal preparedness, black swans can lead toward either utopian or dystopian versions of the future. Thus, the best- and worst-case scenarios were imagined and examined for each of the three hot-button technology topics.

Lev Gonick, CEO and co-founder of OneCommunity, led the discussion on the network and its potential. With the net as we know it about to celebrate its 50th year, Gonick predicted that change would only accelerate and that users would likely look back on the first 50 years as the most innocent, nascent, and utopian. Because the internet offers such tremendous potential for generating capital, more and more devices and services will likely be brought onto the network, with an estimated 150 billion devices and machines expected to connect in the next 5-10 years alone. For colleges and universities, that means the shift from BYOD to BYOA may be more critical than ever to manage effectively.

In a perfect world, Gonick believes, devices would share a person’s activities with the network so it could recommend activities and assist users with exercise, sleep, social engagements, and learning. Learning analytics are still in their infancy today, but they could play a much richer role in education in the future, delivering precise and highly personalized learning right from the network. Gonick even speculated that, with advances in the genomics field, humans could one day take pills or use implants to maximize their learning potential.

On the dystopian side, Gonick was careful to note that while “the future is here,” it is also unevenly distributed. In order to stop the technological gap from growing, the network should be evenly distributed, but it is currently hard to envision the gap closing as the network continues to advance. On another note, learners always need time to absorb what they have learned, but with the network so prevalent, as Gonick put it, “the human need for private space is under assault.”

Wendy Shapiro, associate dean of Learning, Design and Technology at the University of Massachusetts Boston, led the discussion on internet privacy. According to Shapiro, as society becomes increasingly reliant on the internet, just about every website tracks browsing information or shopping habits to some degree. Society, however, must decide how much weight to place on either side of the tricky balance between privacy and convenience, such as how much one wants search results tailored to personal preferences or whether personal information should auto-fill on websites. Being connected to the internet holds huge potential for learning when it comes to adaptive technologies and immersive experiences, but it is important that students don’t become mere analytic data to the point that they are no longer making decisions about their own learning, she explained.

In the utopian sense, explained Shapiro, the information collected by machines via learning analytics would allow students to take an active part in their learning, help them with retention, and ensure that material adjusts accordingly to bring them the greatest success. If education were truly tailored to every individual student, learning would be more efficient for the student in terms of both time and money. Furthermore, Shapiro speculated that in a perfect world, students could connect their information to a school’s LMS and every other service on campus while still being able to manage their own data and decide how it is used.

In her vision of a worst-case reaction to this black swan, Shapiro warned that as more and more education happens online, traditional methods of monitoring assessments are turning toward anti-cheating surveillance technologies (for example, software that monitors facial expressions, personal identification, internet browsing, and even knuckle movements), which could have a negative effect on the learner. Such an invasion of privacy could even turn learners away from the joy of education altogether. When it comes to privacy on the net, Shapiro reiterated how important personal identity is, as well as the need to establish boundaries that protect this identity from the Internet of Things.

Ed-tech consultant Bryan Alexander described machine learning, a combined enterprise of artificial intelligence and robotics, as the “boldest of the black swans.” As devices continue to push the boundaries of processing speed, Alexander stated that there was really no telling how far machine learning could evolve in the next 10 years. However, he identified personalized tutoring software, in the form of readily available and quickly downloadable apps, as likely to become extremely popular in higher education. These apps could lead to broader intellectual engagement for students, offering something closer to an equal meeting of minds, with tutoring software that remembers the learner.

Alexander’s view of a utopian future hinged upon machines assisting and complementing human endeavors; in other words, machines and humans compensating for each other’s weaknesses. For example, machines could take over repetitive tasks (writing form letters, etc.), freeing humans to pursue open-ended creative work, while machines would push past the expectations of their programming toward an appreciation of human style and eccentricities. Finally, with unlimited access to tutoring and knowledge, it is possible that scholarship could undergo another renaissance stemming directly from technologically advanced higher education campuses.

For his dystopian vision of the future, Alexander warned that artificial intelligence must be designed with a hard limit to ensure it never becomes better than, or attempts to replace, humanity. For instance, when machines completely supplant the need for human labor, both physical and mental, what happens to humans? Does a horrific class system emerge in which the 1 percent owns the machines and makes all of the money while the rest of the population can barely get by? Or worse, does a scenario like the one in the film The Terminator emerge? These are, of course, extreme examples of where things could go on a societal level, but the same ideas about limiting artificial intelligence matter on college campuses as well. If personalized tutoring became advanced enough, all faculty members could be reduced to adjuncts paid by the hour or minute, if students came to physical campuses at all.

Tom Haymes, director of Technology and Instructional Computing for the Houston Community College System, summed up his thoughts on these issues by noting that humans don’t always move at the speed of technological change when black swans emerge, and instead fall back on what they know out of fear of change. Haymes advised that training everyone involved in the institution is vitally important to taking full advantage of new ideas or technologies, and that higher education leaders should keep an eye on the direction of trends in order to lead smooth transitions for students, faculty, and society at large.

Additionally, Haymes identified transparency as a key component for avoiding fear-based responses to black swans, as well as their dystopian futures. An opt-in culture should also be championed when it comes to machines using personal data. As long as machines are regulated and taught to respect human learning and privacy rather than utilized as a replacement for human functions, the utopian futures imagined by the panelists seem possible, he said.

“Collective, collaborative thinking is our best way forward,” Alexander concluded.
