Black Box Thinking: Why Most People Never Learn from Their Mistakes--But Some Do by Matthew Syed
Published in November 2015.
I read Matthew Syed's excellent book Black Box Thinking while I was participating in a discussion at the ELI Conference on Leading Academic Transformation.
We kept talking about how to construct successful examples of organizational change to support student learning. I kept wanting to talk about failure.
In my fantasy world academic conferences morph into big book club gatherings. We all read and discuss the same book - with the catch that the book is not about higher education. We read books that get us to think laterally about the challenges that we face. Each time I bring up this idea my friends say that nobody would have time to read the book before the gathering - but I am not giving up.
When Syed talks about the "black box" in Black Box Thinking, he is referring to the flight data recorder (which is actually bright orange) that all airplanes carry. If you engage in black box thinking, you thoroughly and honestly review all the actions that preceded any failure. Black box thinking is a catch-all concept for using data to learn from failure - as well as a cultural orientation that sees mistakes as learning opportunities.
In Black Box Thinking, Syed contrasts aviation with hospitals. The aviation industry has embraced the idea that accidents are learning opportunities. Pilots are required to report any accidents, mistakes, or failures to a central organization. This reporting is anonymized, pervasive, and consistent - and is followed up by rapid recommendations for changes.
Hospitals and doctors, conversely, have inconsistent policies and cultural orientations towards learning from medical error. Some medical organizations, such as Virginia Mason (profiled by Syed), have developed an activist culture around discovering, tracking, and learning from bad outcomes and mistakes. Other hospitals and health care systems, however, still do not have consistent policies or cultural orientations that view failures as learning opportunities.
The result, as discussed by Syed, is that flying has become one of the safest activities that people can engage in. While hospitals have become safer, the progress has not been nearly as fast as in aviation. Some estimates put medical error as the third-leading cause of death in the U.S.
What would higher ed look like if we applied black box thinking to our industry?
Randomized controlled trials are only rarely applied to the interventions that we make to improve learning and student success. We all love to talk about shifting our classrooms from passive to active learning environments, but how rigorous has our testing of these programs been?
One thing that I think about is how a badly executed flipped course is much worse than an average lecture course. My guess is that we have many more examples of poorly executed classes built on an active learning / constructivist philosophy than we have stellar examples. We would be wise to study these failures as well as the successes.
Higher ed has also spent billions (I assume - it would be good to know the real dollar figure) on educational technologies. How often do our technology implementation budgets have room built in for evaluation and assessment? At our professional conferences we tend to trumpet our successes - but we seldom dive into our failures.
I’d like to create a conference where the only topic is higher ed failures. We would gather with all of our stories of when projects, initiatives, and even whole institutions went wrong. We would dive deep into the internal mistakes and external forces that led to the suboptimal outcomes. The higher ed postmortem conference. Would you come?
What are you reading?