Just Visiting

    A blog by John Warner, author of the story collection Tough Day for the Army, and a novel, The Funny Man, on teaching, writing and never knowing when you're going to be asked to leave.

Why 21st Century Learning Should Be More Like the 19th Century
March 7, 2014 - 2:11am

At Higher Ed Beta, Steven Mintz, Executive Director of the University of Texas System’s Institute for Transformational Learning, argues in “Five Ways that 21st and 20th Century Learning Will Differ” that we are in the midst of a data-driven educational transformation.

I would say that he offers predictions, except they aren’t so much predictions as commands, as he utilizes the future imperative throughout:

1.  A 21st century education will be geared toward 100 percent proficiency.

2.  It will rest on the science of learning. 

3.  It will be data-driven.

4.  It will be personalized.

5.  It will take advantage of technology in ways that truly enhance the learning experience.

It’s hard to be against Professor Mintz’s vision because it’s a kind of educational near-Utopia. Point number one is a kind of No Child Left Behind for higher education, where we will no longer rank students by grades, but utilize rubrics that emphasize “learning outcomes” that will “help all students master the skills and competencies that we seek.”

It’s a promise of higher completion rates and students who leave college with the skills and abilities that employers seek.

Point two is how we’re going to achieve this, apparently by embracing the “science of learning.” According to Mintz, “Recent research in neuroscience, cognitive and developmental psychology, and assessment has taught us a great deal about student learning, and instructors will increasingly be expected to apply these findings to improve their teaching.”

Mintz’s list of things that neuroscience and cognitive and developmental psychology have to teach us is long, and impressive, and I embrace those practices wholeheartedly.

My only quibble is that we didn’t learn these things thanks to data and science and we definitely didn’t need 21st century technology to uncover them. In fact, they go back to the late 19th century philosophy of John Dewey. As an example, Mintz’s criteria of “grounded cognition,” “higher-order thinking” and “mental modeling” are just jargony terms for Dewey’s hands-on learning and experiential education. Reacting against the rote memorization of the industrial age, Dewey believed in problem-based learning, and teaching lessons appropriate to each student's cognitive development and abilities.

Perhaps it’s reassuring to hear that neuroscience confirms what we’ve long witnessed with our own two eyes.

And to the extent that this technology helps frame the conversation about higher education around the student experience and learning, we should be thankful.

But here’s where I, personally, part ways with the techno-utopians. Professor Mintz believes that this personalized learning will be data-driven, and I think this is magical thinking, and also not really necessary.

The power of data is in aggregation and predicting trends. Your grocery frequent shopper card helps stores understand how many bags of Doritos and cases of Miller Lite they need over Memorial Day weekend. I have seen the power of these things up close in my former work in marketing research.

But what data doesn’t do, and can probably never do, and even shouldn’t do, is predict, with certainty, the individual behavior of someone walking into the store. Even if a shopper has purchased a case of Miller Lite every week for a year, we don’t know that what they need on a particular trip is another case of Miller Lite. Maybe they’ve decided to detox for a week, or a house guest showed up with a case of Miller Lite, which supplied their needs for that week.

As amazing as these tools are, human beings are not actually reducible to the sum total of our data. Anyone who believes otherwise isn’t thinking very seriously about the human experience. The advances in technology have perhaps made this a tantalizing possibility, but it is and forever will be a fantasy.

Additionally, to use my own future imperative, whatever marginal gains advanced education analytics may provide will be more than offset by the costs as we layer another administrative apparatus on top of the work of instruction. I can only imagine the infrastructure necessary to realize Mintz’s vision. Institutes for Transformational Learning for everyone! How many assistant provosts for data collection and privacy assurance are we going to need to manage these things?

Dewey has already given us a blueprint that works pretty well by recognizing that learning is an inherently social activity, and that the key to fostering an atmosphere conducive to learning is empowering teachers to put these sound pedagogical practices to work. To the extent that technology can help us foster this social interaction, I’m for it, but more screen time doesn’t make sense to me. Anything that separates teachers from their students is counterproductive, and as we’ve seen in the madness wrought by No Child Left Behind and the doubling and tripling down on Race to the Top and Common Core, the thirst for measurement and assessment is the enemy of genuine education.

I imagine Mintz and others who share his vision imagine they’re going to give teachers tools that will help them, but more and more it looks to me like they imagine that if they can just get the algorithm right, they won’t need us anymore.

Education mediated through machines, even sophisticated ones like computers, is mechanized, no matter what kind of fancy language or good intentions it’s cloaked in.

--

Were telegrams the 19th century's Twitter?

