
Just leave me alone!

I’m sure I lost count of how many times I shouted this during my childhood, running away from whoever was tormenting me in the moment. Teachers, parents, siblings, friends, strangers…all were on the receiving end of “Just leave me alone!” at one point or another.

One of the great projects of childhood is to become an autonomous person. School can and should be one of the places where we’re simultaneously most encouraged and most protected when it comes to developing this autonomy, working on our skills of self-regulation as we figure out who we are and what we believe. It is hours together with other small humans, interacting in a cooperative space under adult supervision, where ideally we are encouraged to identify and explore our curiosities.

Sometimes that means we need to just be left alone.

School as a place to develop autonomy has been gradually eroded over the years as a broad embrace of standardization and “accountability” has come to dominate our school systems, stripping teachers of the freedom they need to work with their students. At the same time, increasing inequality has left some schools lacking even the basic necessities to do their work.

School has never been perfect, and even what we mean by “school” varies significantly depending on circumstance and privilege, but I believe it to be undeniable that there is a greater disconnect between our values and our reality than there was prior to the inception of this era of school “reform,” roughly launched with 1983’s “A Nation at Risk.”

Some believe we can introduce technological solutions to address these problems. In a recent Request for Information, the Gates Foundation and the Chan Zuckerberg Initiative declare a desire to identify “state-of-the-art” technologies for use in schools.

Education reporter Benjamin Herold has been cataloging various examples of ed-tech which “tracks” students, their keystrokes, their behavior, and even their facial expressions, often without the student’s knowledge. 

One of these technologies is Emote, as described by Herold, “a mobile app that makes it easier for a wide range of school staff, from bus drivers to teachers, to record and share their observations of when students appear sad, anxious, angry, and frustrated.”

When a child arrives at school, if a staff member observes that they are angry or upset, this is logged in the app. Later, a teacher may see additional evidence, creating another alert. The goal, according to Emote CEO Julian Golder, is to prevent “escalation.”

Student behavior can also be tracked longitudinally. Maybe a student is grumpy or sleepy every Monday, suggesting something is amiss at home. The app will know.

Golder tells Herold, “There’s more interest than we can handle at this point.”

There is mounting evidence that school is demonstrably bad for students’ mental health. Rates of anxiety and depression are increasing. Each year, more students report being “actively disengaged” from school.

In my forthcoming book, Why They Can’t Write: Killing the Five-Paragraph Essay and Other Necessities, an early section discusses “problems,” the barriers that stand between the status quo and helping students become better writers. The first two chapters in this section are “The Problem of Atmosphere” and “The Problem of Surveillance.”

“The Problem of Atmosphere” discusses the evidence of the ways schooling has been separated from learning, learning having been sacrificed on the altar of standardization. Curiosity and self-regulation are devalued next to efficiency and a (very shallow) notion of “achievement.” School becomes a kind of gauntlet, an academic version of The Hunger Games where an unseen gamemaker determines your fate, seemingly by whim.

“The Problem of Surveillance” discusses the encroachment of real-time data collection and tools of surveillance – such as “parent portals” or apps like ClassDojo – into student spaces. These are part and parcel of the “problem of atmosphere,” as students are tracked and monitored throughout the entire school day. These technologies are already doing damage.

There is simply no evidence that real-time data collection or instant feedback is conducive to learning. Yes, we all benefit from instruction, but we also significantly benefit from private practice time, alone, where we are governed by our own intrinsic motivation.

Imagine a child besotted by learning a new subject or new skill, how they will squirrel themselves away in private, seeking space to explore, to screw up without scrutiny or comment, seeking help only when they have exhausted their own potential. This kind of space is necessary. It’s vital.

It’s human.

Now imagine a child interacting with something called FaceMetrics, and its Read2Play app. 

Utilizing video working in concert with AI, FaceMetrics monitors the child’s behavior as they read, tracking eye movement to see how carefully or closely they might be reading, possibly saying, “You missed some paragraphs,” if it suspects skimming.[1]

It will reward children with game time if they read diligently. “Well done,” the app will tell children, “you read carefully, and you deserve another half an hour in the game.”

Now consider that we know quite a lot about how to get children reading – reading aloud to them at a young age, modeling reading behavior, establishing a book-rich environment, allowing them free choice of reading, encouraging reading of any kind, etc… – and think about what happens when instead, we inject spyware outfitted with AI into the equation.

Does that seem sensible? Even if the technology works exactly as well as the developers claim, would it be helpful in achieving what we desire for those students? If a child is struck by a particularly interesting or evocative passage and they stare off into space, having a good, satisfying think, will the app chastise them for inattention? 

Reading should be its own reward, not something to plow through so they can get to the “game.” What happens to curiosity? To intrinsic motivation?

Go back to Emote, the scenario in which a staff member sees an upset child entering school. What if, instead of logging the emotion in an app, they have an actual interpersonal relationship with the student and can ask, “Is everything okay?”[2]

Sometimes “Just leave me alone!” is a demand for privacy, while other times, it’s a cry for attention. The only way to tell the difference is by knowing the child as an individual. Either way, what happens when children never have the experience of being alone?

I vowed to myself to stop writing so many “WTF is up with this ed-tech?” posts, because they’re all “WTF is up with this ed-tech?” But then I read about the plans of the well-heeled and powerful, alarm bells ring in my head, and I can’t stop myself.

I’m trying to be sensible and measured here, but the truth is, I want to scream. I cannot believe what is being done to students in the name of “progress” and “achievement.” The cart of “what technology can do” is many miles ahead of the “what students would benefit from” horse.

I don’t know what to say, other than it must be resisted and rejected.

If this doesn’t work, I may just have to start screaming.


[1] Of course, knowing how to skim a text is an important skill, but never mind: the AI can’t tell the difference between productive skimming and not reading at all.

[2] It seems as though none of these companies considers how the use of the technology impacts the school culture itself. It’s easy to envision a staff member or teacher spending all their time logging data on student behavior to satisfy the app, rather than using that time to relate to their students directly. Teachers in New South Wales, Australia, recently “put the brakes” on a “horrible” classroom tool that had them spending more time on data entry than teaching.

