
The Music Man. Willy Loman. Jay Gatsby. P. T. Barnum. Adam Neumann then; Elizabeth Holmes now.

Hucksters all.

The huckster is an American archetype. Originally, the word referred to anyone who sold small items door to door, but it came to describe someone who promotes or sells products of questionable value aggressively and dishonestly.

What’s especially striking is that popular culture’s view of the huckster is highly equivocal. We admire hucksters’ chutzpah. Their gall, cheek, impudence, audacity and brio strike many of us as commendable and worthy of imitation.

For Dale Carnegie and other proponents of salesmanship and self-improvement (beginning with Benjamin Franklin’s 1758 The Way to Wealth), a key to success is self-confidence and the ability to project a positive attitude.

The backslapper, the glad-hander, the con artist and the confidence man embody traits that we marvel at. Even as we claim to be repelled by their bravado, we stand in awe of their daring, boldness and guts.

Also, hucksters tend to come across as sincere. Hucksters rarely think of themselves as deceitful. No one believes their hyperbole, promises, exaggerations or lies more than they do. The most effective con artists, after all, are those who have messianic hubris and a savior complex.

Of course, ed tech proved to be fertile ground for technology evangelists. Intelligent robotutors in the sky. Personalized adaptive learning. Autograders.

Hucksters don’t simply prey on the vulnerable or the naïve. We’re all susceptible to the allure of the pitchman and the three-card monte dealer. All of us are gullible. All of us are credulous. We’re all vulnerable to hype and the futuristic. We all possess the will to believe.

That’s especially true now. Ours is a historical moment when the unimaginable strikes us as possible. After all, Silicon Valley firms did reinvent transportation, with Uber, Lyft and the electric car; banking, with PayPal, Venmo and bitcoin; retail sales, with Amazon; and even friendship, with Facebook.

Who’s to say, then, that it isn’t possible to reinvent teaching and learning?

If we can summon a car at a moment’s notice or get food and groceries delivered within two hours, shouldn’t digital technologies, learning algorithms, machine learning, predictive analytics and artificial intelligence let us accelerate learning, expedite time to degree and eliminate achievement and equity gaps?

Audrey Watters, who has been instrumental in calling out the bogus claims of ed-tech entrepreneurs on her Hack Education website, recently published Teaching Machines, a history of automated teaching tools and the chimera of personalizing learning, from Sidney Pressey’s mechanized test giver to B. F. Skinner’s behaviorist bell-ringing operant conditioning chamber that would allow students to learn at their own pace.

Hers is a cautionary tale of pitchmen who overpromised and underdelivered. Her book demonstrates not only that the history of educational technology is a forgotten history of failed experiments and flawed thinking, but also that ed tech is more than software or devices—it is a system of misguided assumptions, beliefs, language, practices and outdated psychological theories that rests on certain premises:

  • that learning can take place alone and in isolation and without teachers;
  • that learning outcomes can and should be standardized;
  • that education is reducible to content and skills and that learning is sequential, consisting of consecutive “atomic” steps that can be programmed in advance;
  • that audiovisual material, interspersed questions and positive reinforcement are sufficient to make learning immersive and interactive;
  • that digital technologies can accelerate and democratize access to high-quality education; and
  • that critical thinking and higher-order thinking skills are irrelevant to teaching precisely because they’re difficult to measure.

Her overarching argument is that despite its vows to personalize and customize learning, educational technologies tend to “strip away student agency and selfhood”—the autonomy to pursue one’s interests and pathway.

As Watters shows with lively prose and vivid anecdotes, as early as 1866, when a device to teach spelling received a patent, inventors were touting teaching devices as “magic wands” that could teach “arithmetic, reading, spelling, foreign languages, history, geography, literature or any other subject in which questions can be asked in such a way as to demand a definite form of words … letters … or symbols” (as a 1911 patent claimed).

Watters makes it clear that ed tech continues to bear the imprint of behaviorism and functionalism. Our current notions of nudges and of education as assessable competencies are, she argues, updated versions of earlier ideas that stand in stark contrast to the constructivist and inquiry ideas and the emphasis on creativity and individual expression embraced by many educators today.

Watters’s book also carries a potent political message: that ed-tech entrepreneurs have historically been staunch critics of schooling as it is. Their attacks on the “factory model” of education need, in her view, to be understood as thinly veiled criticisms of recalcitrant unions, Luddite teachers, unimaginative school bureaucrats and shortsighted legislators. They may speak about supporting teachers, schools and universities, but their goal is to profit at their expense. As for their talk about improved learning outcomes, their products tend to emphasize technological quick fixes that significantly oversimplify the complexities of teaching and learning.

But before we throw out the baby with the bathwater, we need to recognize that technologies can indeed enhance education. Most parents know firsthand the value of Khan Academy and BrainPOP tutorials or of Wikipedia.

Without a doubt, students do benefit from immediate, specific feedback and content tailored to their individual needs and interests, and instructors would benefit from off-loading their most mundane tasks. And, as my colleague George Siemens argues, education does involve trade-offs: with cost, efficiency and scalability on one side of the equation and a truly personalized education on the other.

In 1980, a decade before the introduction of the web browser and a year before IBM’s PC debuted, the South African–born MIT computer scientist Seymour A. Papert published Mindstorms: Children, Computers, and Powerful Ideas, which argued that computers could completely transform the way we teach. Computer literacy, Papert claimed, would combat mathphobia, replace rote learning with inquiry and exploration, and teach logic, functions, problem solving and conceptual understanding in ways that learners would actually find engaging and fun.

Yes, digital technologies can indeed change teaching—for better and for worse. On the plus side of the ledger, ed tech offers exciting new ways for students to construct and share ideas, practice skills, visualize data, annotate texts and make presentations. It can also mine data, to monitor student engagement and identify areas of confusion and misunderstanding, prompting timely interventions.

More negatively, as we’ve learned since March 2020, thanks to Zoom U, digital learning far too often saps the social interactions that, as the Soviet psychologist Lev Vygotsky insisted, lie at the heart of engagement, motivation, persistence and learning.

As even ed tech’s harshest (and most balanced) critics like Justin Reich, the author of Failure to Disrupt: Why Technology Alone Can’t Transform Education, acknowledge, educational technology has a valuable role to play in education’s future. But that’s only the case if it’s used as a creative pedagogical tool—to facilitate interaction, collaboration, analysis, access to resources and presentations—and as a way to free instructors from lecturing so they can devote their time to mentoring and scaffolding learning, and not as a replacement for the serendipity, improvisation, clash of interpretations and emphasis on human connection and development that lie at the heart of a genuine education.

Steven Mintz is professor of history at the University of Texas at Austin.
