Learning Analytics and 'The Tyranny of Metrics'

The costs, and potential benefits, of postsecondary data-driven decision making.

January 9, 2019

The Tyranny of Metrics by Jerry Z. Muller

Published in February 2018.

I have this fantasy where I’m appointed as the newly created (and entirely fictional) Learning Analytics Czar. 

As Learning Analytics Czar, my first action is to have everyone on the Learning Analytics Committee read The Tyranny of Metrics.

I’m a fan of metrics. How else could I be Learning Analytics Czar? But I’ve learned over my 20 years in academia that the best way to win an argument is to argue the other side.  If you want to persuade an academic, start with all the ways that your argument is wrong. Want to start a new project or initiative? Lay out all the reasons that it may fail, and why starting is probably a bad idea.

Academics are contrary by training and personality.  We are deeply suspicious of consensus. If more than five of us agree on anything, we know that we all must be wrong.

Dr. Muller was inspired to write The Tyranny of Metrics by his own experience with ever-metastasizing institutional reporting requirements while chair of the history department at The Catholic University of America. The book takes us on a tour of data-driven disasters in higher education, K-12, medicine, policing, the military, business, philanthropy, and foreign aid. In all of these domains, an increased focus on metrics has produced a series of unintended and mostly negative consequences.

Time spent collecting data is time not spent on doing actual work. Dollars spent on analyzing data are dollars not spent on actual work.

Once numerical targets are established, people will modify their behaviors to meet those targets. These actions may result in good looking numbers but bad outcomes for humans.  Set a target for crime, and the police will under-report criminal activity.  Report on surgical outcomes, and surgeons will stop operating on higher risk patients.

The obsession with data has caused many organizations, including universities, to prioritize management by metrics. Numerical indicators replace judgment, experience, and intuition as the rationale for decision making.  The local knowledge gained by long-term employees gets undervalued, replaced by a faith in quantitative indicators and data-driven best practices.

The Tyranny of Metrics is the right starting place for anyone invested in learning analytics. Those of us who believe that educators and institutions should be making data-informed decisions must avoid repeating the mistakes of other industries.

Muller’s argument against prioritizing data for decision making, however, would have been stronger if he had more clearly articulated the pro-metric case. Are there ways that higher education might appropriately and productively fold assessment, evaluation, and analytics into its institutional practices?

The people that I know in offices of institutional research would be the last to argue that data should be collected and analyzed for its own sake. Instead, it seems to me that higher ed assessment and evaluation professionals are deeply committed to the mission of their institutions, and to the success of the students and faculty at the schools where they work. The Tyranny of Metrics would have been a stronger book if Muller had spent some time with institutional research teams, and with assessment and evaluation experts at learning organizations such as centers for teaching and learning (CTLs).

It is no doubt true that an obsession with metrics has caused organizations (including colleges and universities) to waste time, and to adopt behaviors that have unintended adverse outcomes.  We need not look any further than the US News rankings to see the perverse results of metric fixation.

It is equally true that making decisions in data-free zones is destructive. Educators should have the opportunity to use student outcome data to improve their teaching. Schools should be able to figure out where investments to enhance student learning are yielding good results.

What has been your experience with being asked to collect, synthesize, analyze and report on data in your area of higher ed?

Are numerical targets and quantitative performance indicators replacing judgment, autonomy, and wisdom as cultural drivers at your institution?

Can we find ways to integrate learning analytics into a liberal arts educational context?

What are you reading?
