I have a confession. I tried McGraw-Hill’s LearnSmart Advantage software. It’s kind of fun.
I’m a little scared to admit this because I’ve been hard on adaptive learning software previously, going so far as to urge others to “resist” its adoption.
The LearnSmart trial module is on U.S. States and Capitals, and while the content is relatively basic, I believe the interface is consistent with how the software works for other subjects.
You’re presented with a question – sometimes fill-in-the-blank, sometimes multiple choice – and asked how confident you are that you know the answer, selecting from: Yes, Probably, Maybe, No-Just Guessing.
If you answer correctly, you receive points, with more points awarded the higher your stated confidence before the question.
It’s basically video game trivia. It actually reminded me very much of the bartop coin-operated trivia games I used to play in the '90s when I was in graduate school.
Let’s not get into why I was alone in bars playing coin-op trivia.
I spent 15 minutes playing around with LearnSmart Advantage, approximately 14 minutes more than I would’ve spent studying a list of facts about states and capitals. There’s no doubt that playing trivia games is more fun than reading flashcards or poring over lists of information.
I also picked up some things I didn’t know, or used to know, but forgot. I’m sure at some point in grade school I learned all 50 state capitals, except that between then and now, my brain had decided that Alabama’s capital is Mobile, rather than Montgomery. I also now know (or was reminded of) the fact that the U.S. Space & Rocket Center is in Alabama.
As a brute force fact crammer, the software seems useful.
But if this is adaptive learning software, I have to question who and what is doing the adapting, because it isn’t the software.
The longer I worked, the faster I went in order to more quickly move up the scoreboard displayed in the upper right corner of the screen.
This meant that I was “thinking” less. I also didn’t care about the context of the information I was learning, or how any of it fit together. My goal was to get the “right” point-worthy answer as quickly as possible. This actually resulted in some idiotic, knee-jerk mistakes, like declaring Iowa City the capital of Iowa, when I’m well aware it’s Des Moines.
Because I was thinking less, when I didn’t immediately know the answer, I was more likely to just quickly guess to get it out of the way so I could get on to the next question that I might know. The software was happy to correct me so I might remember it for next time.
This actually happened with the question about the location of the U.S. Space and Rocket Center. My first-blush guesses were either Texas or Florida, because I know that’s where space launches occur. Neither of those felt right, though, because I know that Houston (as in, “Houston, we have a problem”) is home to the Johnson Space Center, and Florida’s is named after Kennedy and located at Cape Canaveral. Not thinking of any alternatives, I went with Texas, and I was wrong.
Later on, though, I realized I could have figured out the answer if I’d just stopped and puzzled it out.
The key was my teenage crush on the actress, Lea Thompson.
I trust that I’m not the only teen of the '80s who had a crush on Lea Thompson, who played Marty McFly’s mother in the Back to the Future movies.
I loved her with the white-hot intensity of a thousand suns, and in my delusional pubescent brain, I was certain she’d love me back if we ever met and wasn’t put off by the fact that I was 15 and she was 24.
I could overlook it if she could.
One of the lesser-known movies of her mid-'80s heyday was her Back to the Future follow-up, the non-blockbuster Space Camp (also starring a very young Joaquin Phoenix). She plays a space camper who, along with her training team, is accidentally launched into orbit aboard the space shuttle without proper communications equipment and with only 12 hours of oxygen.
It’s a great/terrible movie and I loved it (and her), and what’s more, it takes place at Space Camp held at the U.S. Space and Rocket Center in … Huntsville, Ala.
This is information that had been housed deep in my brain, but because of the nature of the interaction with the software, I wasn’t working to access it.
The sales pitch on adaptive learning software is that the software adapts to what the student knows or doesn’t know and adjusts its presentation accordingly. But while working with the program, rather than engaging in the kind of relational and associative thinking that draws on what neuroscientists call “episodic” memory, I was relying on “semantic” memory, which is more like a big filing cabinet stuffed with the things we can recall relatively easily.
The process reminded me of another video game, Rock Band. As a one-time reasonably serious amateur drummer, I bring some skills to the game, and often have little trouble “passing” the challenges of the individual songs fairly quickly by playing the parts as I imagine they should be played if I had a real, rather than a virtual, drum set in front of me.
Except that I could never get perfect scores because there were inevitably parts where what the game wanted me to do and what I thought the music indicated I should do were not the same. My instincts said that a ride cymbal was right, but the program insisted on the crash.
It took a lot of training on my part if I wanted to approach “perfection” on the game’s terms. I had to undo what I knew as “right” or at least sounded right in my head, in order to please the structure of the game.
I wanted to play music. They wanted me to hit colored pads in a prescribed order.
There’s a difference.
I see the same limitations in asking our students to interact with software of this kind. To succeed, they need to adapt their thinking to the software flowcharts. The only way the software is adapting is to shove the flowchart you keep getting wrong back in your face until you get it right.
Now, it’s not like the capital of Iowa is open to interpretation, but I’d rather my students embrace a learning process that is open and exploratory, that fosters the development of personal curiosity in order to take ownership of the material.
I want them to connect the things they learn to each other, so they can draw a line (albeit an indirect one) from their movie star crush to the location of the U.S. Space and Rocket Center.
Isn’t that a kind of knowledge ownership that the software can’t actually facilitate?
Maybe this is a bias rooted in my discipline, writing, and how I wish my students to achieve self-regulation in the work, to eschew rigid templates, and instead see writing as a process where we’re oriented toward the audience’s needs, attitudes, and knowledge. I want them asking questions, not waiting for me to give them answers or tell them what to do.
I want them thinking as writers do.
I think the same must be true in other disciplines. Doesn’t Biology want them thinking like biologists, Engineering like engineers, Botany like botanists?
Even in subjects (unlike writing) where knowing lots of information is a necessary precursor, couldn’t this sort of learning be hamstringing students when it comes to later growth?
When we are interacting with software, even very sophisticated software like “sandbox” video games such as Grand Theft Auto, we are still in the world of flowcharts and objectives determined by the game. We must adapt if we wish to “win.” The right strategy to defeat an objective in Grand Theft Auto is the one the programmers put into the game.
And no offense to McGraw-Hill, but LearnSmart is no Grand Theft Auto. It couldn’t be and still make money. (GTA V by itself cost $265 million to develop.)
The LearnSmart software also plays into one of the chief developmental weaknesses in my students, their passivity. The age of standardized testing has trained them to wait for the authority to tell them the right answer. Software prompting them what to do is almost native behavior.
I bet the LearnSmart software feels very comfortable as well as comforting to them.
But comfortable is the last thing I want my students to be. Learning of the kind we strive for in higher education is more sandbox than flowchart, a place that should be open to a certain amount of play and exploration.
I’m sure assessments designed to test how much students learned from working with the software prove that they worked well with the software.
But this learning seems very limited to me. Worse than limited, even, uninspiring.
Students need to find their own way inside the material so they can create meaning in it that is unique to them. Put another way, they need to learn how to learn, something that is a custom job for each student.
This doesn’t happen for every course, but that’s okay. That’s part of what students go to college to discover, which pathways they want to build for themselves. Relying on software to do the human work makes this more difficult.
Students should be growing, not adapting.
I've done far more adapting to Twitter than the other way around.