We all know that AI is already embedded in our daily lives. It processes very large data sets, learns from our behavior, helps us both create and identify “deepfakes” and promises to displace millions of jobs in the next few years. There are, of course, prototype autonomous cars -- even self-driving buses operating on some university campuses.
In the sphere of learning, artificial intelligence now drives adaptive learning models that personalize the learning experience. This is not a trivial advancement; it heralds a new era in pedagogy and practice. No longer will we need to aim our presentations at the middle of the class, ignoring both the advanced learners eager for more and the less prepared learners lagging behind. AI adapts and individualizes the presentation to meet students where they are, with the goal of bringing everyone to mastery of the learning outcomes for the class.
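The adaptation loop such models run can be pictured with a small sketch -- here a toy per-topic mastery estimate, nudged by each answer, with the weakest unmastered topic served next. The topic names, threshold and update rule are illustrative inventions, not any particular product's algorithm:

```python
# Minimal sketch of one adaptive-learning idea: track a per-topic mastery
# estimate for each student and always serve the topic furthest from mastery.
# All names and values here are hypothetical, for illustration only.

LEARNING_RATE = 0.3      # how strongly one answer shifts the estimate
MASTERY_THRESHOLD = 0.8  # estimate at which a topic counts as "mastered"

def update_mastery(mastery, topic, correct):
    """Nudge the mastery estimate for `topic` toward 1.0 or 0.0."""
    target = 1.0 if correct else 0.0
    mastery[topic] += LEARNING_RATE * (target - mastery[topic])
    return mastery

def next_topic(mastery):
    """Serve the unmastered topic with the lowest estimate, or None if done."""
    remaining = {t: m for t, m in mastery.items() if m < MASTERY_THRESHOLD}
    if not remaining:
        return None  # every topic mastered: the class outcome is met
    return min(remaining, key=remaining.get)

mastery = {"fractions": 0.5, "decimals": 0.5, "percentages": 0.5}
update_mastery(mastery, "fractions", correct=True)   # estimate rises to 0.65
update_mastery(mastery, "decimals", correct=False)   # estimate falls to 0.35
print(next_topic(mastery))  # → decimals (the weakest topic is served next)
```

A production system would replace this single number with a trained student model, but the loop -- estimate, serve, observe, re-estimate -- is the core of the personalization described above.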
Georgia Tech professor Ashok Goel demonstrated the viability of an AI virtual teaching assistant, “Jill Watson,” years ago, and he continues to refine the model. This may further place AI on the front line of directly engaging students in learning. The potential is enormous. AI can take on the tedious tasks of teaching, such as repetitively answering elementary and procedural questions (“When is the assignment due?” “How many pages are required for this paper?”). It is already making teaching faculty more efficient at their jobs.
All of these are huge steps forward. Yet these advancements pale in comparison with what is coming next!
A brain-computer interface (BCI) is just what the name implies -- a direct connection between the brain and a computer. There is no need for eyes, ears, keyboards or voice commands. A BCI perceives your thoughts and responds with actions whose results can be delivered directly back to the brain.
Jo Best, ZDNet’s tech expert reporter, writes, “Researchers have been interested in the potential of BCIs for decades, but the technology has come on at a far faster pace than many have predicted, thanks largely to better artificial intelligence and machine-learning software.”
Most early prototypes have been developed as brain-implanted sensors for people with physical impairments. More advanced efforts, such as Neuralink -- co-founded by Elon Musk -- take a different approach: they use external or implanted sensors to interpret thoughts formed in language but never vocalized, and direct them into a computer running an AI algorithm:
Like a lot of BCIs, Neuralink's was framed initially as a way to help people with neurological disorders, but Musk is looking further out, claiming that Neuralink could be used to allow humans a direct interface with artificial intelligence, so that humans are not eventually outpaced by AI. It might be that the only way to stop ourselves becoming outclassed by machines is to link up with them -- if we can't beat them, Musk's thinking goes, we may have to join them.
The direct connection of mind and computer raises many questions, such as: What is human memory? If you need only think a question to get an instant answer from a computer, is that the equivalent of virtual memory?
For example, you might think, “I wonder what the weather will be today” and you receive an instant response in your brain with the forecast. Nothing is vocalized. Nothing is heard. The response is instant, as if you knew the forecast all along and were just recalling it.
The director of the National Center for Adaptive Neurotechnologies, Jonathan Wolpaw, describes the current state of research:
"The unique thing about BCIs is that they provide the brain with a new kind of output, which is output from brain signals. Instead of driving muscles, you go directly to a part of the brain, measure its activity in one way or another, and you convert that into some sort of action. The individual pieces of the brain that have evolved, as far as we understand, with the sole purpose of controlling muscles, they are now being turned into the outputs themselves," Wolpaw says. "The basic question for BCIs is how well the brain can learn to do this new kind of thing that it wasn't designed for, and it wasn't evolved to do. And the answer up to the present is, it can sort of do it, but not all that well with our current methods," he added.
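Wolpaw's description -- measure brain activity, convert it into some sort of action -- can be caricatured in a few lines of code. In this sketch the “signal” is just a list of numbers and the decoder a nearest-centroid match against calibration templates; real BCIs decode noisy electrode data with trained machine-learning models, and every name and value below is hypothetical:

```python
# Toy sketch of the BCI output loop: measure activity, decode it, act.
# The calibration centroids stand in for averaged recordings made while
# the user intends each action; all numbers here are invented.

import math

CENTROIDS = {
    "move_cursor_left":  [0.9, 0.1, 0.2],
    "move_cursor_right": [0.1, 0.9, 0.2],
    "click":             [0.2, 0.2, 0.9],
}

def decode(signal):
    """Map a measured signal to the action whose template is closest."""
    return min(CENTROIDS, key=lambda action: math.dist(signal, CENTROIDS[action]))

print(decode([0.8, 0.2, 0.1]))  # → move_cursor_left (nearest template)
```

The hard part Wolpaw points to is not this mapping but the brain's side of it: learning to produce signals distinct and stable enough for any decoder to separate.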
As this technology is refined, what will it mean for education? If virtual memory and virtual instant recall can be created at speeds that already exceed the speed at which impulses travel along human nerves, what will be left to teach and learn at the advanced levels? If AI programs can apply critical thinking, philosophical frameworks and more, what is left for education? Who at your institution is considering the impact of BCI and AI over the coming five years? How will you prepare?