Online: Trending Now

    Unique biweekly insights and news review from Ray Schroeder, Senior Fellow for UPCEA

Affective Artificial Intelligence: Better Understanding and Responding to Students

Artificial intelligence is recognizing and responding to human emotions, oftentimes better than many humans.

August 21, 2019

As a longtime professor of communication, I am fascinated by the cognitive characteristics of artificial intelligence as they relate to human communication. Image processing, computer vision, speech recognition and pattern recognition are among the sophisticated processes by which artificial intelligence communicates with humans. Yet these constitute only the surface layer of the communication process.

One of the challenges in person-to-person communication is recognizing and responding to subtle verbal and nonverbal expressions of emotion. Too often, we fail to pick up on the inflections, word choices, word emphases and body language that reveal emotions, depth of feeling and less obvious intent. I have known colleagues who were insensitive to these signals, missing nonverbal cues that were obvious to more perceptive observers.

Kendra Cherry from Verywell notes that “research has identified several different types of nonverbal communication. In many cases, we communicate information in nonverbal ways using groups of behaviors. For example, we might combine a frown with crossed arms and unblinking eye gaze to indicate disapproval.”

And that brings me to just how artificial intelligence may soon enhance communication between and among students and instructors. AI in many fields now applies affective communication algorithms that help it respond to humans. Customer service chatbots can sense when a client is angry or upset, advertising researchers can use AI to measure the emotional responses of viewers, and a mental health app can measure nuances of voice to identify anxiety and mood changes over the phone.
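The basic idea behind such affect-sensing tools can be illustrated, very crudely, with a toy sketch. Real systems use trained machine-learning models over voice, text and facial data, not word lists; the cue lists and function name below are invented purely for illustration.

```python
# Toy illustration only: scores a message against hand-picked emotion-cue
# word lists. Real affective-AI systems learn these patterns from data.

EMOTION_CUES = {
    "anger": {"furious", "unacceptable", "ridiculous", "angry", "outraged"},
    "anxiety": {"worried", "nervous", "afraid", "unsure", "stressed"},
}

def detect_emotion_cues(message: str) -> dict:
    """Count cue words per emotion in a message (case-insensitive)."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return {emotion: len(words & cues) for emotion, cues in EMOTION_CUES.items()}

print(detect_emotion_cues("I am worried and a bit nervous about this grade."))
# {'anger': 0, 'anxiety': 2}
```

Even this trivial sketch hints at why the technology raises privacy questions: once a message is scored this way, an inference about a student's emotional state exists that the student never explicitly shared.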

“Machines are very good at analyzing large amounts of data,” explains MIT Sloan professor Erik Brynjolfsson. “They can listen to voice inflections and start to recognize when those inflections correlate with stress or anger. Machines can analyze images and pick up subtleties in microexpressions on humans’ faces that might happen even too fast for a person to recognize.”

Too often we fail to put ourselves in the position of others in order to understand their motivations, concerns and responses. Mikko Alasaarela posits that humans are poor at this kind of emotional reasoning: “We don’t try to understand their reasoning if it goes against our worldview. We don’t want to challenge our biases or prejudices. Online, the situation is much worse. We draw hasty and often mistaken conclusions from comments by people we don’t know at all and lash [out] at them if we believe their point goes against our biases.”

That can be a significant challenge in online classes. Too often, I fear, we miss the true intent, the real motivation, the actual meaning of posts in discussion boards and in synchronous voice and video discussions. The ability of AI algorithms to tease out these motivations and meanings could give us a much deeper understanding of learners' communication, and help us catch misunderstandings.

Sophie Kleber writes in Harvard Business Review, “In January of 2018, Annette Zimmermann, vice president of research at Gartner, proclaimed: ‘By 2022, your personal device will know more about your emotional state than your own family.’ Just two months later, a landmark study from the University of Ohio claimed that their algorithm was now better at detecting emotions than people are … Emotional inputs will create a shift from data-driven IQ-heavy interactions to deep EQ-guided experiences, giving brands the opportunity to connect to customers on a much deeper, more personal level.”

With AI mediating our communication, we can look toward a future of deeper communication that acknowledges human feelings and emotions. This could enhance communication in online classes even beyond the quality of face-to-face exchanges in campus-based classes. Algorithms that enable better “reading” of the emotions behind written, auditory and visual communication are already at work in other industries. It will not be long before they are available to enhance communication in online classes.

Are faculty considering how they might best use this added knowledge? Are you preparing faculty members for this prospect? Is your university prepared to address the privacy concerns that this technology raises?


Inside Higher Ed's Inside Digital Learning
