Reading 'Redirect' From an EdTech Perspective


March 4, 2013

Redirect: The Surprising New Science of Psychological Change by Timothy D. Wilson

Redirect, Timothy Wilson's excellent new book, can be read in two ways.  

The first is to understand this book as analysis of effective techniques to modify and influence behavior.  

There is a growing literature on the salience of internal motivation, framing, and choice architecture in understanding and shaping human action. 

Redirect, based largely on Wilson's own research as a professor of psychology at the University of Virginia, makes an excellent and important contribution to the growing body of books that synthesize academic findings on behavioral change across a range of social science disciplines.

I strongly recommend reading Redirect if you enjoy books such as Nudge, Predictably Irrational, The Power of Habit, Drive, and Sway.

The second way to read Redirect, and the one that I think is extremely relevant to the work that many of us do in educational technology, is as a warning against initiating programs or methods based not on empirical research but on common sense.

Much of what we do at the intersection of learning and technology is about change.  

We try to work with instructors to introduce changes in their courses that will increase student engagement and long-term retention.

We run workshops and engage in faculty training on methods to catalyze active learning, often using the new set of tools available to us, such as discussion boards, blogs, wikis, and other Web 2.0 platforms.

The fact is, very few of the specific interventions that we recommend have been empirically validated via experimental methods. There is nowhere near enough true experimentation going on in learning technology.

How many of the presentations we attend at conferences, whether delivered by peer institutions or by vendors, include a control group for comparison?

We can point to precious few controlled studies with random assignment of both instructors and students. Rather, we have our own experiences, biases, and theories to support our assertions about both investments in learning platforms and the effectiveness of particular course re-design methodologies.

This is not to indict the entire field of learning technology. I have seen examples of good experimental design in assessing the value of particular edtech interventions. NCAT, the National Center for Academic Transformation, is a leader in this area.

The sort of assessment and evaluation work that NCAT does is unfortunately rare in our edtech world. Most often, we lack both the resources and the time to pair our learning technology interventions with a control group. And randomly assigning our faculty to treatment and control arms is almost impossible, as we rely on our most innovative and energetic instructors to try new things.

What we learn from Redirect is how often even the best-intended social interventions fail. Wilson cites numerous examples of well-intended interventions, such as programs to curb teen drug abuse or risky sexual behavior, that end up backfiring in the real world.

Reading Redirect is a reminder that people in the change business, and that includes us edtech folks, need to prioritize empirical assessment (ideally with experimental design) of the higher ed initiatives that we support.

What are you reading?

