For a rising generation of administrators in higher education, the heart of education is innovative technology -- and faculty get in the way.
In a recent speech, the new president of Carnegie Mellon University, Subra Suresh, revealed his administrative philosophy, remarking, “The French politician Georges Clemenceau once said that, ‘War is too important to be left to the generals.’ Some would argue learning is too important to be left to professors and teachers.”
The speech opened the inaugural meeting of the Global Learning Council (GLC), held at Carnegie Mellon in September. The GLC brings together high-level university administrators, government officials, and corporate executives who aspire to form an at-large advisory group, akin to the National Research Council, for higher education.
Suresh could have used the help of an English professor to unpack the analogy. Presidents and provosts would be generals, not faculty, who are the soldiers in the trenches, so the fitting parallel would actually be “education is too important to be left to administrators.”
On that count, I agree.
Suresh’s phrasing was not a slip but a frank statement — for him, faculty have little place in decision-making. And I think that it captures the leaning of many current initiatives touting innovation and technology.
The classic definition of the university is that it represents the corporate body of the faculty. Like the protagonist of Flannery O’Connor’s Wise Blood, who wants to establish the Church of Christ without Christ, the New Leaders of higher education want to establish education without educators. Or, more precisely, they want to call the shots and have faculty do what they’re told, like proper employees. To wit, few regular faculty members attended the conference (even if some of the administrators had started as, or occasionally did guest spots as, professors, it has probably been a while since they devoted much of their work time to that realm), and certainly no social or cultural critic of higher education was scheduled to speak. Rather than engaging much criticism or debate — which, after all, is a mission of the university, testing ideas — the event had the character of an infomercial.
The focus of the conference was to install technology in higher education as fast as possible, and the speakers included high-level figures from Google, Kaplan, edX, and various other companies with a financial interest in the changeover.
The only speaker who raised doubts about technology was a military person, Frank C. DiGiovanni, director of force readiness and training in the U.S. Office of the Undersecretary of Defense. In his talk he said that he found that, to be effective, education needs to “stimulate the five senses,” which does not happen with devices. In fact, he noted that there was a “loss of humanity” with them. He added in subsequent discussion: “I worry about technology taking over. The center of gravity is the human mind.”
It seemed a little ironic to me that the only person reminding us of a humanistic perspective was the military man, though it was clear that DiGiovanni had a good deal of experience with how people actually learned and that he cared about it.
The innovation mantra has been most prominently expressed by the business guru Clayton Christensen, who coined the phrase “disruptive innovation.” It has been the credo especially of tech companies, which come out with ever-new products each year. The theory is that businesses like those in the American steel industry failed because they were set in their ways, doing what had been successful before. Instead, even when successful, companies should disrupt what they’re doing. Hence, while Apple was making very good laptops, it went to the iPhone. Then to the iPad. Then to the Apple Watch.
Christensen has extended his theory to academe, in articles and his 2011 book, The Innovative University: Changing the DNA of Higher Education from the Inside Out (co-written with Henry Eyring). He basically sees higher education as set in its ways (hence the DNA metaphor) and ripe for a takeover by technology, and he holds up universities such as BYU-Idaho and the for-profit DeVry University as models for the future. He admits that Harvard University is still top of the line, but not everyone can go to Harvard, so, in cheery rhetoric (some of which is taken from the promotional literature of the colleges themselves), he sees these other schools doing what Walmart did to retail.
Christensen’s theory of innovation has been rebutted by Jill Lepore in a recent piece in The New Yorker, “The Disruption Machine.” She points out that most companies succeed because of sustaining innovation, not disruptive innovation. Apple, after all, still makes laptops, and U.S. Steel is still the largest steel company in the U.S. In addition, she goes on to demonstrate that a good deal of Christensen’s evidence is thin, not to mention that many of his examples of success have since gone belly-up.
Besides holes in the general theory, it’s also questionable whether the kind of innovation that applies to technological or commodity products is readily translatable to education. Cognitivists have shown that education largely works affectively, through empathy, which requires live people in front of you. One learns by imaginatively inhabiting another’s point of view.
Moreover, most institutions of higher education have a different role than businesses — more like churches, which in fact is the analogy that helped establish their independent legal status in the 1819 Dartmouth decision of the U.S. Supreme Court. Something other than consuming goes on at universities, which gets lost in the commercial model of higher ed.
Think of it this way: while I like to shop at Macy’s and hope it stays in business, I would not donate any money to it, whereas I have to universities and churches. Of course universities should use best business practices, but if they act primarily as a business, with a saleable product and positioning students as customers, then they abnegate this other role. This is an inherent contradiction that vexes the push to commercialize higher education.
This is not to say that there is no use for technology. The Open Learning Initiative, a project studying statistics pedagogy at Carnegie Mellon, shows that some online segments work better than large lecture sessions. But if you read its reports, it’s clear that the experiment essentially offers a flipped classroom, and in fact students probably gain more faculty contact than in the lecture model. It’s more like a return to a tutorial model. Who knew students do better with professors?
What the rush for innovation is really about, as Christopher Newfield, a leading critic of higher education, has pointed out, is not a better theory of change but a theory of governance. As Newfield puts it, “it isn’t about what people actually do to innovate better, faster, and cheaper, but about what executives must do to control innovative institutions.” It’s all about top-down plans of action, with the executive issuing a plan to disrupt what you’re doing, and subordinates to carry it out. Hence Suresh’s brushing aside those pesky faculty, who traditionally decide the way that education should be. That might be O.K. for a corporation, but it violates any standard idea of shared governance and academic freedom, which holds that faculty decide the direction of education.
It’s also about politics. The vision of higher education that the New Leaders of higher education would like to install is not a traditional horizontal institution, in which faculty are generally of equal power. (For instance, I’m a professor at Carnegie Mellon like Suresh, so technically I have the same faculty rights and determine the content of my courses and research, not him — and fortunately I have tenure, so he can’t fire me for writing this, which he could if it were a regular corporation.) Rather, it has become an oligarchical institution, reliant on business deals and donations. Business corporations, after all, are not democracies but oligarchies, with decisions running from the owners and executives downhill.
The oligarchical leaning of the New Leadership became clear to me in a talk by Luis von Ahn, a young computer scientist at Carnegie Mellon and MacArthur Fellow. Von Ahn was animated and funny, bringing fresh energy to the proceedings. He evidently had made a killing developing CAPTCHAs, those difficult-to-decipher wavy letters that verify you’re a human and not a bot online (in his PowerPoint he showed a picture of a man lying in a bed of money, which drew a lot of laughs).
Since then, he has developed and is CEO of Duolingo, a company designed to bring language training to people for free (or, more precisely, in exchange for their labor). It’s all online, and it’s self-funding: Duolingo sells crowdsourced translations from its students to CNN and other businesses in need of them, and the money keeps the company going.
Von Ahn had several tenets of education, the first of which was that “the best education money can buy should be free.” I was with him on that, but I was not so sure about the rest.
One was that the best education should “be in your pocket, not in some building.” Again, if education relies on social contact and empathy, then we need a place for it other than the shallow contact of a screen. Think of it from the bottom up: children learn from the synesthesia of sociality, and those who are regularly read to by parents learn to read the soonest. What would a child be like if you locked him or her in a room with a device?
Moreover, while a program like Duolingo might be good for picking up a reading knowledge of a foreign language, I wonder about its transposition to speaking. While von Ahn attests to good testing results online, languages, after all, are not formulae but social. Anyone who has learned a foreign language knows that it’s a much different experience when you’re there, in front of live people.
Still, Duolingo seems like a good thing and an exemplary use of online education. However, von Ahn had another tenet: that learning should come through a corporation, not through a government. He said that you cannot trust governments (most “suck,” and “other people’s funding usually comes with other people’s ideas and influences”), a view he drew from personal experience as an immigrant from Guatemala. That might be understandable in his individual case, but it is deeply troubling to anyone who has a Jeffersonian sense of higher education and believes that it should be a public right that cultivates citizens.
It boggles the mind to think that corporations would be better. What are the guarantees that they would be more free from “other people’s ideas and influences,” particularly of just a few people?
Perhaps if von Ahn is running them. (And still, he sold his previous project to Google, and one might question Google’s proprietary policies, which we have little recourse to alter.) Governments presumably are based on the will of the people, whereas corporations are based on the will of their owners, boards, and executives, oriented toward gaining the most advantage for themselves. A poor government might fail to represent the will of its people, but the problem then is the lack of democracy. By definition, corporations represent a small, self-interested group.
While von Ahn seems like an admirable person and has put some of his money into good causes, his statement was the credo of plutocracy: the rich and powerful should rule, and their good effects might trickle down. But I don’t trust corporations as much as he does, particularly since they have brought us our current world of severe inequality.
American higher education was conceived as a remedy to inequality in the period after World War II, with policy documents like the 1947 Truman Commission Report setting out a plan to fight inequality “in so fundamental a right as education,” spurring state and federal funding to expand high-quality public colleges and universities and allow a greater number of citizens to attend them for minimal tuition.
The new technology reinstalls inequality, with the wealthy (and a few high-scoring poor) receiving bespoke higher education at elite schools, but most of the rest getting theirs on a screen — with great graphics! like a game!
The Graduate Workers of Columbia on Friday told Columbia University that a majority of teaching assistants and research assistants have signed cards asking that the United Auto Workers local be recognized as a union. A statement from the union noted that if the university does not voluntarily agree to collective bargaining, the UAW could ask the National Labor Relations Board to conduct an election and (assuming a majority of the graduate students back the UAW) certify the union. A Columbia spokesman said that the university was not commenting on the UAW request.
In 2004, the NLRB ruled that graduate teaching assistants could not unionize at private universities. (State laws, which vary, govern the unionization of T.A.s at public universities, and many such unions have existed for a long time.) Supporters of graduate student unions have been looking for a test case -- particularly with an NLRB that is more friendly to unions than the board was in 2004 -- to reverse that ruling. A UAW unit at New York University was headed toward being the test case, but NYU agreed last year to a union election, and the case was withdrawn.
James Kilgore, whose successful adjunct career was interrupted last year at the University of Illinois at Urbana-Champaign, will be back teaching in the spring semester, The Chicago Tribune reported. He has been hired to teach a global studies course. Kilgore's teaching was blocked after word spread about his past (including jail time) for his role in the radical '70s group the Symbionese Liberation Army. But the Illinois board last month cleared the way for him to resume teaching, and he has now been hired back. The University of Illinois at Chicago, which has played no role in the Kilgore controversy, is currently in danger of losing a large gift from a donor opposed to his rehiring at Urbana-Champaign.
In today's Academic Minute, Gary Small, a professor of psychiatry at the University of California at Los Angeles, details the effects of increased screen time on teens.
In what sense does branching from your original field come with a punishment? Does the academy really want intellectual curiosity?
I am a historian, and I have published in Asian, Pacific, urban and American history. I don’t really consider myself an Asianist of the hardcore variety (my Mandarin is rusty and my Malay limited), and for all that world history is touted, hiring in that area often takes the old-style “Empire” form (“Britain and the World,” “France and the World,” “Iberian Empires,” or sometimes “America in the World,” which as far as I can tell is the new way of saying diplomatic history).
But unfortunately the academic world still has a need to pigeonhole us. A department will be hiring someone to teach, for instance, colonial North America or modern Germany, so obviously it wants someone with training in that area. (Never mind the fact that many of us, once in a job, will end up teaching things that are a long way from our specialization.)
Back while I was still at the University of Cambridge in 2007, Simon Schama published a book about the transatlantic slave trade. At a conference, one of the speakers held up the book, slapped it, and said, “How could he write this? He’s an expert on 17th-century Holland!” I thought my Ph.D. was a license to go anywhere in history. Hearing that comment, I wondered whether I had made a huge mistake.
My Ph.D. topic was something I stumbled into, more a compromise based on source availability and timing than a choice. I am proud of the project (and the book it became), but it’s not an area I wish to pursue further. So I work on different things. Fortunately, I’m now in a department where they don’t seem to mind what I research, as long as I’m publishing. But to grant agencies, I think I look a bit flaky.
And certainly to people like that conference speaker, I present an odd figure. I assumed that my training in history (in Susan Stryker’s words, a “black belt in looking shit up”) meant I could turn those skills to any period of history (language issues notwithstanding). I never realized I would be shackled to my Ph.D. topic for the rest of my life (perhaps because the historians I most admired, like Schama, are those who had displayed broad intellectual curiosity and turned their focus on widely divergent regions and periods).
In terms of history outside the academy, the general public wants broad declarative histories. Books on the theme of “the X that changed the world” are common (even histories of apparently small things have to be put on the grand stage). Meanwhile, in academe, our focus remains narrow. There was once a time when academic historians wrote broad narratives for dissertations. Then we turned to ever smaller elements of history, examined at a microscopic level. David Armitage and Jo Guldi have suggested we may be returning to the longue durée in academic works, but the return may be slow in coming.
I still believe that the training of a doctoral program should allow us to use those skills anywhere, allowing for the time required to get up to speed on the scholarship in a new field. After all, if I could do that in three years as a fresh graduate student, I should be able to do it again now (and probably quicker since I’ve done it before). It disturbs me that there are people who believe our ability to learn and grow as scholars should end the second we are handed our Ph.D.s (with our future publications just being further iterations of the same subject as our dissertation).
With the growing need for Ph.D.s to consider careers outside the academy, a broader perspective is useful -- nonprofits, think tanks and museums want broad skills and flexibility, not narrow interests. This means also having open-minded professors -- open to careers outside academe, and open to different fields.
Katrina Gulliver is a lecturer in history at the University of New South Wales. You can find her most of the time on Twitter @katrinagulliver.