Stop me if you already know this one.
You are explaining how to solve quadratic equations to a classroom full of 18-year-olds. Narrating as you map out the steps on a whiteboard, you occasionally scan the room for raised hands. No questions before you move on?
Of course you won’t find out until later, grading the tests, that in fact there was confusion in the poker-faced audience. By then it may be too late; even if a student manages to squeak by despite bombing quadratic equations, that gap in knowledge might cause a stumble down the line.
“It’s the Swiss cheese effect,” says Philip Regier, the dean of Arizona State University’s online arm. “You can’t have a big hole in your knowledge. If you get a C, you know 70 percent” of the material for one course. “But the missing 30 is likely to be important to passing the next course.”
Everyone, from administrative bean-counters to students themselves, has a stake in pulling off the perennial balancing act of getting students through courses without setting them up to fail subsequent ones. The economic, political and personal pressures of college completion are mounting.
And yet in many college classrooms, especially the largest ones, attempting to gauge how well students are absorbing new concepts as they are being taught still involves a combination of divination and educated guesswork. For all their aptitude and credentials, college professors cannot read minds.
But what if they could? What if a professor teaching algebra could look out at her students and, in the blink of an eye, see exactly what each student understands, and what they don’t?
It may sound like science fiction. But according to the companies that are selling increasingly sophisticated teaching tools, it is merely science.
And for the largest public university in the country, it is hardly fiction. Arizona State University has become ground zero for data-driven teaching in higher education. The university has rolled out an ambitious effort to turn its classrooms into laboratories for technology-abetted “adaptive learning” -- a method that purports to give instructors real-time intelligence on how well each of their students is grasping each concept.
This is what happens when the dawning age of high-volume data collection and analysis collides with higher education: College professors become less like mind-molders and more like mind-readers. But it also stands to do more than that.
The so-called Big Data movement, which has been largely co-opted by the for-profit education industry, will serve as “a portal to fundamental change in how education research happens, how learning is measured, and the way various credentials are measured and integrated into hiring markets,” says Mitchell Stevens, an associate professor of education at Stanford University. “Who is at the table making decisions about these things,” he says, “is also up for grabs.”
There are a few different ways to try to wrap your head around the implications of the Big Data movement in higher education. You can think about how it changes the experience of teachers and students in the classroom. You can think about how it informs the strategic thinking in university administrative offices. You can think about how it binds nonprofit universities and for-profit product vendors in new legal relationships.
Jose Ferreira thinks about Ethan Hawke.
“I think about ‘Gattaca’ all the time,” says Ferreira, the founder and CEO of Knewton, Inc., one of the more powerful players in the industry of education analytics.
Ferreira is referring to a 1997 film by Andrew Niccol. “Gattaca” takes place in a dystopian future where a person’s biometric profile — genetic data that is used as a gauge of natural ability — serves as the basis of what opportunities he will get. Hawke plays a scrappy underdog who, because of his poor genetic predispositions, is sorted into the “invalid” class upon his birth. He spends the film trying to shed the stigma and finally succeeds, but only by gaming the system.
It’s a weird reference coming from a man who is trying to help build a regime of predictive data analytics in higher education — especially since one of Ferreira’s goals is to create individual, psychometric profiles that would presume to say, with statistical authority, what students know and how they learn. Such records could theoretically follow those students into the job market, profoundly affecting how they are viewed by graduate school admissions committees and potential employers.
But we’ll come back to that.
Big Data stands to play an increasingly prominent role in the way college will work in the future. The Open Learning Initiative at Carnegie Mellon University has been demonstrating the effectiveness of autonomous teaching software for years. Major educational publishers such as Pearson, McGraw-Hill, Wiley & Sons and Cengage Learning have long been transposing their textbook content onto dynamic online platforms equipped to collect data from the students who interact with them. Huge infrastructural software vendors such as Blackboard and Ellucian have invested in analytics tools that aim to predict student success based on data logged by their client universities’ enterprise software systems. And the Bill & Melinda Gates Foundation has marshaled its outsize influence in higher education to promote the use of data to measure and improve student learning outcomes, both online and in traditional classrooms.
But of all the players looking to ride the data wave into higher education, Knewton stands out.
It is not just because Ferreira makes bold claims about the power of his software, although he certainly does do that. It is because between the company’s obsession with data-mining; its facilitation of the adoption of digital content and a “flipped classroom” model of teaching; its endorsement of self-paced learning; its implications for alternative credentials for job-seeking graduates; and its exceedingly intimate vendor-client relationship with a traditional nonprofit university, Knewton stands at the nexus of most major trends -- and many of the points of tension -- in higher education.
Under the leadership of Michael Crow, its iconoclastic president, Arizona State University has branded itself aggressively as a “new American university.” To the extent that such a university must cultivate close ties with outside business and technology interests, SkySong is its shining office park on a hill.
Everything about the 42-acre SkySong campus, built in 2008, screams “innovation.” The architectural centerpiece of the complex is a white, glow-in-the-dark fiberglass structure in the shape of a collapsing circus tent. The two finished buildings on the grounds house the offices for 79 private companies and eight Arizona State “units,” including the university’s Learning Science Institute and ASU Online, the massive online arm that the university co-runs with Pearson, the for-profit education company. “It is a place... where ideas and university research become new technology and commercial enterprises,” reads a SkySong pamphlet. “A place where the future is invented.”
The future is online and adaptive learning, says Regier, the ASU Online dean.
Which is in part why the university initiated a full deployment of the Knewton system, for both its online and “blended” courses in August 2011. It agreed to pay the company $100 for every registered student taking a Knewton-powered course, a fee the university passes on to students in lieu of the cost of a traditional textbook. There was no pilot phase.
That month the university put 5,000 first-year students into remedial math courses powered by Knewton. The pass rates for those courses jumped from 66 percent to 75 percent, and the university got a lot of good press.
Not everybody was thrilled. The decision to put the Knewton software to work in so many classrooms so quickly caught Wayne Raskind, the university’s former director of mathematics, off guard. “The contract was signed very quickly and without a lot of consultation, at least on my part,” says Raskind, who is now dean of liberal arts and sciences at Wayne State University. The former math director says he found out about the Knewton deal by reading an article on the student newspaper’s website while at a conference in New Orleans.
Raskind says he was not opposed to using adaptive tutoring software to standardize certain parts of the curriculum, especially developmental courses, which stretch the department's resources and in which standardization has some pedagogical merit. But the administration’s swift and unilateral decision to call in Knewton made him anxious. “It was never explained to us or to me why we had to implement something that was going to affect so many people, both students and faculty, in such a short time frame,” he says.
There was grumbling among the department’s rank and file as well, but dissent was more or less moot. By the time I visited Phoenix in the fall, Arizona State had committed to a full deployment of the Knewton technology, and Raskind, after being deposed as math director, had left for Wayne State. (He says there were a number of considerations that went into his leaving, but that the Knewton scrape did figure into it.)
The early returns from the Knewton-powered courses have varied widely from classroom to classroom, and from instructor to instructor, over the last three semesters. The math department has seen fewer than half the students pass one section of a course, while 100 percent passed another. Last spring, one remedial math course, MAT194 (now called MAT110), saw one section with a 33 percent pass rate and another with a 100 percent pass rate.
Ferreira insists that in general the professors presiding over the sections with poor pass rates were the ones who were using the Knewton technology least, though Arizona State could not confirm this with much certainty. “Some instructors adapted to the paradigm better than others, and so therefore that may have reflected itself in the different pass rates, but I think there’s a lot of data analysis yet to be done,” says Al Boggess, who has succeeded Raskind as the university’s director of mathematics. “It’s premature to know why certain sections did well and others did poorly.”
Arizona State is confident that the low pass rates in certain classrooms had to do with the instructors, not with any inherent flaw in the Knewton system. The university would not give me pass rates for the three courses in the semester prior to the Knewton implementation, saying there were confounding variables that would make side-by-side comparisons misleading. But it did send numbers showing that pass rates had increased in each of the three Knewton-powered courses from fall 2011 to fall 2012 (it did not provide section-by-section rates).
The evidence the university is touting may lack experimental rigor, but administrators are confident that Knewton works, and that it will only improve as instructors become accustomed to using it.
“Given the enormous changes that this project entailed for all parties at ASU, Knewton and Pearson, we did not expect to be where we wanted at the end of one term,” says Regier. Based on the major boosts in student success in certain sections, he says, “it was apparent that, if we continued to improve both classroom and technology processes and instructor training, the system would, over time, lead to ever better results.”
In the meantime the ASU Online dean has big plans for Knewton and its adaptive system. “We’re going to do it in macro and microeconomics, psychology, biology, chemistry and physics,” says Regier. He says there are plans for doing “an entire degree adaptively” — maybe the university’s nursing bachelor’s program, maybe engineering. Any discipline where the standards for proficiency can be “normed” is potentially in play.
“We’re at the very, very, very early stages of learning what can be done adaptively,” says Regier. “We’re going to push the envelope.”
Knewton’s reach extends well beyond Arizona State. Last year Pearson, the 800-pound gorilla of the educational content industry, inked a deal with Knewton to have the smaller company “power” Pearson’s MyLabs and Mastering software, which are now used by about 10 million students.
By then the cat was already out of the bag on adaptive learning. Given the resources Pearson had poured into MyLabs and Mastering, which are the crown jewels of the company’s digital product line, ordering the Knewton augmentation was a meaningful vote of confidence.
And Ferreira is a confident guy. He studied philosophy at Carleton College, then found his way into education by way of Harvard Business School, serving two stints at Kaplan, Inc., with an interlude as a desk trader for Goldman Sachs. He became interested in psychometrics because of an obsession with standardized tests. “I used to take them on the weekends for fun,” he says. “I love brain teasers. Some people do crossword puzzles, I took the GMAT.” He claims to have taken every standardized test there is. In fall 1993, he took the GRE so many times that he discovered a loophole and forced the Educational Testing Service to withdraw a series of questions from the test.
That was 19 years ago. Now 44, Ferreira is unassuming but handsome, and possesses the rare ability to talk rapidly and persuasively about his product without sounding oily or over-rehearsed. When he tells you that Knewton is capable of capturing orders of magnitude more usable data about its users than Google or Amazon, it does not sound like he’s pitching; he’s just telling you something interesting over coffee. When Ferreira bites into his sausage-and-egg croissant and says, “That is a great breakfast sandwich,” he uses the exact same tone of voice as when he tells you that Knewton plans to use psychometrics to turn higher education on its head. It’s not at all difficult to imagine how Ferreira has managed to raise $54 million in venture capital. (Knewton’s investors include Silicon Valley rock stars Peter Thiel and Sean Parker, as well as LinkedIn founder Reid Hoffman. Pearson is also an investor.)
“Here’s the big fake-out with data mining and education,” says Ferreira, chewing. “Everybody talks about how much data they’re getting in education. Well, sure — education is titanically large, so if you get just a few data points per user per day then you get a lot of data.”
But he thinks Knewton can do better. Much better.
The company’s data-collection engine measures the amount of time students spend on particular text, video and graphical objects, and ties that to how well they do on the subsequent tests and assignments. Once it picks up a pattern, it can begin making more thoughtful recommendations for content a particular student ought to see, and when.
As far as adaptive software goes, this is pretty standard stuff. Where it gets deeper is Knewton’s idea of a “knowledge graph”: a comprehensive map of how different concepts are related to one another in the context of learning. Using meta-tags, the company wants to give its publisher partners a way to classify their digital content “down to the atomic level.” This would give Knewton’s software a common language for describing the correlations between how students learn concepts -- concepts that may or may not seem to have much to do with one another.
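The structure Knewton describes can be reduced to a toy sketch: concepts as nodes in a graph, prerequisite links as edges, and content items tagged to the concepts they teach. Everything below (the concept names, the content labels, the unlock rule) is invented for illustration; it is not Knewton’s actual schema or algorithm.

```python
# A toy "knowledge graph": concepts as nodes, prerequisite links as edges,
# and content items tagged with the concepts they teach. All names here are
# illustrative, not Knewton's actual taxonomy.
from collections import defaultdict

prerequisites = {
    "fractional_exponents": ["integer_exponents"],
    "integer_exponents": ["multiplication"],
    "quadratic_equations": ["factoring", "integer_exponents"],
}

# Content tagged "down to the atomic level": each item maps to concepts.
content_tags = {
    "video_017": ["integer_exponents"],
    "worked_example_042": ["fractional_exponents"],
    "quiz_009": ["quadratic_equations"],
}

def unlockable(concept, mastered):
    """A concept is available once all its prerequisites are mastered."""
    return all(p in mastered for p in prerequisites.get(concept, []))

def recommend(mastered):
    """Suggest content for concepts that are unlocked but not yet mastered."""
    by_concept = defaultdict(list)
    for item, tags in content_tags.items():
        for tag in tags:
            by_concept[tag].append(item)
    ready = [c for c in prerequisites
             if c not in mastered and unlockable(c, mastered)]
    return {c: by_concept.get(c, []) for c in ready}

print(recommend({"multiplication", "integer_exponents"}))
```

The real system presumably layers statistical inference on top of a graph like this; the sketch only shows how tagged content and prerequisite links combine into a recommendation.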
"The ability of the corporate interest to gain access to data, hitherto strictly limited to the institution as the custodian of those data, begins to make things dicey.”
"In education, if you do the work, tagging everything down to the sentence, it starts bleeding data,” says Ferreira. The highest correlations are the most obvious: how students learn isosceles triangles can predict how they learn scalene triangles. And those are correlations that Knewton’s recommendation engine currently takes into account. But Ferreira envisions a future in which the long tail of obscure, Freakonomics-grade correlations are also incorporated into the company’s algorithms. How well students do on isosceles triangles is probably correlated to how well they do on subject-verb agreement, he says. How well students do in their first-year math courses is probably correlated with how well they do in a graduate course in journalism. “Not very highly, but it is,” says Ferreira. “It just is.”
This gives the company an advantage over Big Data’s current bannermen, Ferreira argues, since a user who is solving problems that have right and wrong answers generates more meaningful data than a user who is typing open-ended queries into a search engine.
“If somebody tagged every single sentence, every clause, every single word of every single webpage in the entire world to Google — which is what we’re asking publishers to do to us — if somebody did that, well, then it wouldn’t really help Google all that much, because there’s no real correlation between any of those sentences,” says the Knewton founder. “The words in the sentences are correlated in a way that make sense to a human being, but there’s no underlying taxonomy of concepts you could tag all that stuff to down to an atomic level. In education, there is.” In other words, education is not only another sensible landscape for Big Data to take root; it is an ideal one.
With this promise of psychometric nirvana, Ferreira’s pitch has taken on a cinematic quality. Ones and tens of usable data per user per day isn’t cool — you know what’s cool?
“Millions of data per user per day,” says Ferreira. “That’s what Knewton gets today. We get millions of data per student per day. Next year we think it’s going to be in the billions.”
Universities have long contracted with technology companies. But only recently did they start adopting those companies as “school officials.”
Microsoft is a “school official” at Dartmouth College and Cornell and Gonzaga Universities by virtue of the company’s contract to provide students with cloud-based communication and composition tools, according to its attorneys. Coursera is a “school official” at the University of Virginia, according to the contract it signed last year (and it seems safe to say that the company holds a similar legal status with most of the nearly three dozen other institutions for which it is producing MOOCs).
And, indeed, Knewton and Pearson are “school officials” at Arizona State, according to contracts obtained via a public records request.
The designation is a legal maneuver devised to let noneducation entities handle sensitive student data without running afoul of the Family Educational Rights and Privacy Act, commonly known as FERPA. Though rarely used as a basis for actual lawsuits, FERPA has long been viewed as nonprofit higher education’s Maginot Line: the last barrier between universities and outside vendors.
As companies such as Pearson and Knewton work their way closer to the core of the university mission, legal distinctions between the nonprofit institutions and for-profit vendors are melting away.
“The Family Compliance Office has recognized that institutions can designate other entities, including vendors and consultants, as ‘other school officials,’ ” reads Knewton’s contract with Arizona State. “Designated representatives of Knewton will be designated as ‘other school officials’ for the purposes of this agreement.”
Many institutions see intimate partnerships with vendors as necessary to competing in the 21st century. But as personal data becomes the currency of the information economy, others see the companies -- which, like Knewton, are often backed by Silicon Valley venture capitalists -- as interlopers that might wish to exploit student learning data for profit, despite their reassurances to the contrary.
Meanwhile the stakes of who sees a student’s education record, and what they do with it, have never been higher. Under a Knewton regime a student’s “education record” would not just comprise a transcript and some grades; it would be a “psychometric profile”: a strategic blueprint of her brain, describing her relationship to every single concept in every Knewton-powered course she takes, along with a raft of insights on how she absorbs and retains different kinds of ideas.
In the hands of Arizona State instructors or Knewton’s engineers, this information could be used to improve teaching and learning. In the hands of outside companies, it could be used to fashion more effective advertising campaigns.
The company hopes to roll out the individual user profiles sometime this year. Students would be able to log in and view their own profiles via a password-protected part of the company’s website. The point of the profiles, says Ferreira, would be to give students insights into their own learning styles -- insights that would compound over time as students took more Knewton-powered courses.
Knewton cannot do whatever it wants with student data. In its contract with Arizona State, the company assumes responsibility for protecting student education records as an extension of the university. If it fails to do so, “ASU and/or Knewton will take appropriate action against the designated representative that is similar to action ASU would take against one of its employees who disclosed or misused the educational records of its students,” according to the contract.
Meanwhile Pearson, in a separate contract with Arizona State, stakes an unrestricted claim on the “aggregate data” generated by students interacting with its digital content. “The parties acknowledge and agree that aggregate data is the property of Pearson and that Pearson may generate, use, and disclose aggregate data without limitation or restriction,” the contract reads.
Ferreira, for his part, categorically rejects the notion that Knewton might sell out students to advertisers, even if it could navigate around FERPA. The company “will never, ever, ever” sell student data to anyone, he says. And if Pearson were to try to do so, Knewton would “shut it down.” “We believe that these data should just be used to improve student learning, period,” says Ferreira.
Barmak Nassirian is not quite ready to take the Knewton chief at his word.
“Those are all seemingly reasonable safeguards,” says Nassirian, an independent higher education consultant and former staffer at the American Association of Collegiate Registrars and Admissions Officers (AACRAO). “Except that at some point somebody will figure out some lucrative mechanism by which money will be made by this material. Then those other considerations go out the window.”
Nassirian does not trust that the penalties the Education Department has at its disposal will adequately deter these for-profit “school officials” from exploiting their privilege, especially if those companies do so within the gray area of the law.
First of all, the department cannot touch the companies themselves. As for the universities, the department’s best instrument for enforcing FERPA is to withhold federal student aid dollars — a measure it could only justify in the case of a sensationally egregious misstep, says Nassirian. “It would have to be a fairly primitive, fairly brazen, unsophisticated and in-your-face kind of violation for the department to come in and punish the partner school, and for the school to terminate the relationship” with the vendor, he says.
While the law prohibits the companies from disclosing individual student records outright, it does not prevent companies from capitalizing on the privileged data in other ways that don’t necessarily violate FERPA, says Nassirian — declining to sell borrowed land while at the same time planting trees and selling their fruit, as he put it.
Even when pressed, the former AACRAO staffer struggled to paint a scenario where Knewton or Pearson would explicitly betray the trust of specific students to make a dime. (The companies themselves insist that the “aggregate data” is used only to improve their education products, and that it is in their strategic interest to play those insights close to the vest.)
Nassirian nonetheless finds it unsettling that students will have to trust those companies in the first place — especially given the intimacy of the data involved. He invited me to imagine a case where the Roman Catholic Church began outsourcing the sacrament of confession to a Silicon Valley-backed company that offered a killer confession-management product. The company assures the church that the confessions of individual parishioners will be kept private (except to the parishioners themselves), but it will also own all the data about what sins were being confessed and how, giving it an unprecedented view into the moral mind. Executives at the company say they will only use these insights to improve the product and create more satisfying, redemptive interactions between priests and parishioners. But legally nothing is stopping them from, say, packaging those insights and selling them to other companies.
“There is a confessional quality to the learning process,” says Nassirian. “The ability of the corporate interest to gain access to data, hitherto strictly limited to the institution as the custodian of those data, begins to make things dicey.”
When parents, lawmakers and advisers talk about getting students good jobs after college, it would be hard to find a better example than working for Knewton.
The company’s offices are located on a fashionable block of Fifth Avenue in lower Manhattan, in a building surrounded by high-end clothing stores. Inside, it looks like a magnificently well-appointed student union. There is a Ping Pong table, a posh lounge area and a fully stocked kitchen, complete with wall-mounted cereal silos and a kegerator full of Newcastle Brown Ale. The walls are giant dry-erase boards with mathematical symbols scrawled across them like hieroglyphics. Nearly everyone appears to be in their late 20s or early 30s.
It doesn’t resemble an office in the conventional sense. But Knewton, which competes for talent with Google and other big-time software companies, has no interest in punch-card culture. The company does not keep track of when people arrive at the office and when they leave. It does not even count vacation days. Ferreira believes that maximizing productivity is not the same as parking workers at their desks for a prescribed block of time. Some people are most productive before lunch. Some produce brilliant work in brief, erratic spurts. Some have their best insights immediately after smacking around a small celluloid sphere with a wooden paddle.
This is not just a management philosophy. It is the foundation of the company’s product. The idea behind Knewton’s adaptive learning software is to break the tyranny of formal education’s equivalent of the 9-to-5 workday: the 50-minute lecture.
In the context of learning, Ferreira says short attention spans are only a handicap insofar as the lecture format is biased against students who have trouble focusing for long, sustained periods of time. But the opportunity to succeed in school should not be limited to students who have that kind of stamina, he says. Academe may value exceptionally patient minds, but plenty of professions do not.
“If an employer wants bursts of occasional brilliance, like on a trading floor, a short attention span is [fine],” he says. “I was on a trading floor. Everyone had a short attention span. Everybody. It was like a clinic for people with attention-deficit disorder.”
In theory, Knewton should be able to identify different “phenotypes” of students, says Ferreira, such that “[w]e can basically say, ‘This kid over here has an eight-minute attention span for these types of concepts and a five-minute attention span for these types of concepts, and a 22-minute attention span for these types of concepts.’ ”
The platform should be able to diagnose other idiosyncrasies too, says Ferreira, such as which kind of media — videos, diagrams, graphs, text — help students learn best. It should be able to cross-reference that with other variables, he says, such as time of day and concept category. It should be able to intuit what a student needs to do her best, and nudge her in that direction.
Sometimes, though, it takes more than a software nudge to get a student back on track. Sometimes it takes an actual person.
“The rules don’t change,” says Irene (Apple) Bloom, crouching next to a student with her dry-erase clipboard.
The student, a young woman in a powder-blue hoodie, is having trouble with fractional exponents. She thinks that solving problems with fractional exponents involves different rules from other exponential equations. Not so, Bloom explains: the equation may look different, but the principles are the same.
Bloom’s MAT110 classroom looks a lot different now than it did three years ago. She used to spend a good chunk of her class blocks lecturing under the gaze of her students. Now she spends most of her time looking over their shoulders.
Bloom runs this class, and her other Knewton-powered course, MAT142, as “a pretty typical flipped classroom.” The “flipped classroom” refers to an ascendant pedagogical method that has professors assigning recorded lectures and other explanatory materials as homework, while using class time to shore up students’ understanding of the concepts with a mix of problem-solving exercises and face-to-face troubleshooting.
MAT110 is designed for students whose entrance exams indicated they were not quite ready for college-level math. Here at Arizona State’s campus in downtown Phoenix, the class is held in a windowless computer lab in the basement of the U.S. Post Office building. Each session is one hour long, and attendance is mandatory. But instead of listening to Bloom hold court from a lectern, the students take their seats in front of their computers and begin working independently on online tutorials through Knewton.
Bloom has been teaching at Arizona State since 1997. Her current title is “senior lecturer,” although in the last two years that has become a bit of a misnomer. Bloom considers herself built for the style of teaching the Knewton system demands. She is confident interacting one-on-one with students. She does not mind jumping around the syllabus — explaining the rules of expected value to one student, walking another through the process for estimating conditional probability and troubleshooting a third’s process for calculating normal curve distributions, all within the space of a single class.
Bloom floats around her classroom armed with a mix of old and new technology. Along with an iPad open to the Knewton app, she carries around a portable dry-erase board for her student interventions. The screen on her desk shows the names of the students who are behind the pace, but Bloom also comes to class with a black-and-white printout of her class roster with their names and faces highlighted.
“This is how I triage,” she says. “I always check the ones who are most behind first to see where their issues are.”
Historically, Bloom has not been an early adopter. “I remember teaching back in the late ‘80s and early ‘90s, when graphing calculators were introduced — ‘Oh my god,’ ” she says. “I mean, the hue and the cry! And I have to laugh because I was one of those people. When they switched AP calculus to using the graphing calculators, I was like, ‘This is just too hard!’ ”
Some of Bloom’s colleagues did not take well to the Knewton system at first. Beyond Raskind’s reservations about deploying the software so widely, so quickly, there were the usual rumblings about instructors — or, at least, instructor time — being supplanted by computers, says Boggess, who succeeded Raskind as math director. Some objected to the idea of relinquishing the control that comes with determining the pace and seasoning of a course, he says.
Perhaps as a result of the mixed response to the system, pass rates in the first semesters of the Knewton regime have varied considerably from classroom to classroom. “There are some faculty who have almost half their students off-track, and that’s just not good,” says Bloom. “And it’s very touchy — because they’re very sensitive, very defensive — to kind of suggest that there are things they could do to help get their students back on track.”
For all its autonomous elements, the Knewton system appears to be only as good as the instructors who are using it.
Knewton’s subject matter experts have broken the MAT110 syllabus into 52 distinct concepts. The students read and watch mini-tutorials and work on practice problems. When they feel ready, they can take “challenge tests” to unlock new concepts. Students cannot move on to a new concept without passing the last. As they do so, the software keeps track of what correlates with each student’s successful and failed attempts to learn and apply different lessons. (Note: Some information in this section has been changed to clarify that Pearson content is not used in MAT110.)
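The gatekeeping the system enforces — no new concept unlocks until the current one’s challenge test is passed — can be sketched in a few lines. This is a toy illustration, not Knewton’s actual algorithm; the pass threshold and class structure are invented assumptions.

```python
# Toy sketch of mastery-gated progression, NOT Knewton's real implementation.
# Assumptions: 52 linearly ordered concepts and a fixed pass threshold.

PASS_THRESHOLD = 0.8  # hypothetical cutoff for passing a "challenge test"

class MasteryPath:
    def __init__(self, num_concepts=52):
        self.num_concepts = num_concepts
        self.current = 0    # index of the concept the student is working on
        self.attempts = []  # history of (concept, score, passed)

    def take_challenge_test(self, score):
        """Record a challenge-test score; unlock the next concept only on a pass."""
        if self.finished:
            raise ValueError("course already complete")
        passed = score >= PASS_THRESHOLD
        self.attempts.append((self.current, score, passed))
        if passed:
            self.current += 1  # the next concept unlocks
        return passed

    @property
    def finished(self):
        return self.current >= self.num_concepts
```

In this sketch a failed test simply leaves the student parked on the same concept — accumulating the repeated attempts that an instructor would later see flagged on a dashboard.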
The idea is to mitigate the “Swiss cheese effect” that the traditional system creates by letting students move on to the next course despite not having properly mastered certain concepts. “I think where we’re going with a lot of this adaptive courseware, and proficiency and outcomes assessment is a world where there are two grades -- ‘A’ and incomplete,” says Regier. “You really want to say they’ve mastered fundamentals of this course and they can move on, or they haven’t and they can’t.”
If the semester ends before a student finishes all 52 lessons, then he has to hope a strong showing on the concepts he did learn is enough to pass the final. But if the student finishes all 52 concepts early, then he can take the exam and be done with the course, just like that.
By the time I visited Bloom’s classroom, in the third week of October, eight of 44 students had already passed the course. This is not uncommon. Most of the reports from Knewton’s first trials with Arizona State in 2011 focused on the aggregate bump in pass rates; underreported, says Ferreira, was the fact that 50 percent of students finished their coursework at least four weeks before the end of the semester.
This could play into policy debates over the relevance of the credit hour. For students here in the Post Office basement, the implications of “self-paced” learning are simpler. Maddy, a first-year student with red hair and thick-framed glasses, was on the brink of emancipation, with eight lessons still to go. She had blown through four “challenge tests” in the last week with an average score of 97 percent. I ask Maddy what she plans to do with the extra time.
“Sleep,” she says, smiling wearily. “Just sleep.”
With the Knewton platform handling the ushering duties, Bloom is free to lead from behind. She stands at a desk in the back of the room, at her own computer. She is also logged into Knewton, and she is also learning — about her 44 students.
From her instructor’s dashboard Bloom can see exactly how much progress each has made through the syllabus. In addition to the eight who had already finished, she can see that eight others are behind the pace, and 28 are roughly on track. Twelve students have not attempted a challenge test in the last seven days, and two are in “focus mode,” which means they have failed their current challenge test twice. She can click through for more details to see where each is going awry.
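The buckets Bloom’s dashboard shows — finished, focus mode, stale, behind, on track — amount to a simple classification over each student’s activity record. A hypothetical sketch; the field names, thresholds, and the precedence among overlapping buckets are my own, not Knewton’s:

```python
# Illustrative triage classifier. Field names and the ordering of checks are
# assumptions based on the dashboard states described above, not Knewton's API.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    lessons_done: int          # concepts passed so far
    expected_lessons: int      # where the syllabus pace says they should be
    days_since_last_test: int  # time since the last challenge-test attempt
    fails_on_current_test: int # consecutive failures on the current test

def triage(s, total_lessons=52):
    """Return the dashboard bucket a student falls into, most urgent first."""
    if s.lessons_done >= total_lessons:
        return "finished"
    if s.fails_on_current_test >= 2:   # "focus mode": failed current test twice
        return "focus mode"
    if s.days_since_last_test >= 7:    # no attempt in the last seven days
        return "stale"
    if s.lessons_done < s.expected_lessons:
        return "behind"
    return "on track"
```

Sorting a roster by this kind of bucket is essentially what Bloom does by hand with her highlighted printout: check the most-behind students first.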
By the time Bloom is crouched by the student’s desk with her iPad and dry-erase board, the student does not have to explain her difficulty. Bloom already knows.
Which is good because, as any instructor will tell you, students are not always so great at describing the nature of their confusion. Sometimes they won’t even try.
“This particular student has taken this particular assessment five times,” says Bloom over the low drone of keyboard-tapping. She points to a name on her screen, Naomi. Naomi is stuck on rational expressions, which means she’s five concepts behind the pace. “And she hasn’t got her hand up. She’s not asking for help. So a lot of times they won’t necessarily ask for my help. So it’s best for me to go and get them unstuck.”
Naomi sits across from Maddy. She wears a yellow Sun Devils tee shirt, athletic shorts and Nike sneakers and has her hair pulled up in a ponytail. I cannot tell whether Naomi is shy in general or just reluctant to talk about her struggles with a stranger who wants to tell the world about them, but in any case she speaks very, very softly.
Naomi is not a fan of Knewton. She prefers the way she learned math in high school, where the instructor talked and the students listened and the whole class moved at the same pace.
“I think this is worse,” she says. “I don’t like it.”
“Because we have to do this on our own. It’s like, in another class they would tell you what to do. And then you could just come to class and give it to them, you know? And they would grade it. But right now? I’m five lessons behind, and it’s, it’s — it’s on my own, too, you know? Fifty-two lessons. And sometimes I don’t even have time, like I’m thinking, 'Oh, I can do this in a day,' or something, so I just leave it for another day.”
I ask Naomi if it helps that her professor can see where she is going wrong. She says it does. But she also complains about feeling alone in her struggle to plow ahead, the long list of unfinished concepts casting a daunting shadow.
“I don’t know if you get me,” she says. “[In high school], the teacher would tell you what to bring to class — the homework. And it was just one thing. But right now it’s just on my own to get these 52 lessons.
“I don’t know if you get me,” she repeats, almost whispering now.
I realize how foreign I must seem to her. But as far as math goes, I do think I get Naomi. I got through algebra in high school, but it took me three and a half years. My college had a pretty thin math requirement, but I struggled to keep my head above water in two semesters of quantitative reasoning. Even when I sought extra help I had no clue what I needed to get back on track, and so my professors, sympathetic though they were, could only do so much to rehabilitate me.
“It’s embarrassing,” says Naomi when I press her on her mixed feelings about being under the constant surveillance of her professor. “They say this math is easy. To me it’s not. I’m not good at it. That’s why.”
What unites me and Naomi is simultaneously the easiest problem to understand, and the hardest to fix. Embarrassment, or fear, or apathy, can make a person want to disappear into the back of the lecture hall and ride things out. In the Knewton system, there is no place to hide, in class or out; no leeway to coast through a lesson or skip a concept that you just can’t seem to grasp. No bluffing.
“It makes you actually do it,” said Maddy, the early finisher. “And it’s just like, there’s no way you can actually not do it.”
Except that is not quite true. Up to 67 percent of the students in some of the first Knewton-powered courses at Arizona State found a way to not do enough work to pass. Self-paced learning, it turns out, is a double-edged sword. “One of the lessons learned was that the students who went too slow really fell far behind and couldn’t catch up later in the semester,” says Boggess. The department is now walking back its emphasis on allowing students to set their own pace, he says.
This amounts to a new twist on an old aphorism: You can lead a student to an adaptive learning platform, but you cannot make him learn. The elevator pitch for Knewton and other adaptive systems is that every student is capable of learning math if they are given the right information, in the right way, at the right time. But there are unwieldy variables in play that make this a hard problem — one that Arizona State’s “other school officials,” the eggheads at Knewton’s Fifth Avenue office, are still working to solve.
George Davis’s title at Knewton is “managing data scientist.” This encompasses a lot of things, but one of his duties is to keep students using the Knewton platform on the tightrope between the torpor of success and the despair of failure.
Which is good, because shortly after we start talking I could really use a spotter.
We’re sitting in a conference room at the Knewton headquarters. Jesse St. Charles, the head of analytics, is also here. The two are not much older than I am; Davis is 30 and St. Charles is 31. They were in the same research group in the same doctoral program at Carnegie Mellon University, focusing on creating computer models of complex human systems. St. Charles came to Knewton first, and then recruited Davis, who at the time was programming software for a hedge fund he had started with friends.
A large part of their jobs at Knewton involves asking and answering questions by interrogating the reams of student behavior data that are constantly flowing into the company’s servers. My questions, about what exactly the Knewton system is, are pretty basic. Then again, as any teacher knows, explaining something is not the same as getting someone to understand it.
“So a student at home logs on to Pearson’s website,” says Davis. “They have their content for that course — let’s say it’s an economics course. And they have a set of recommendations that we would generate before they do anything. And they come in and say, ‘O.K., here is the thing I’m going to start on.’ So they click on this, they go into that, they learn some material, they do some practice problems. Every time they interact with that product, a record of that interaction — what specifically happened, did they click here or there, did they get it right or wrong — whatever happens is basically written down digitally, sent to Pearson. Pearson sends it to us. We incorporate that into our view of the student and our understanding of what they know and what they don’t know, which could result in a new recommendation being sent back to Pearson if that’s what Pearson had asked for at that point. So basically that cycle happens for every user that is in this Knewton-powered [product]. So that’s what the data look like as they’re coming in. Something similar happens on the ASU side. They’re coming in through our system, but the same sort of activity happens: they interact, a record of that interactivity comes back to us, we use it to update our models of the student.”
I blink to squeegee the glaze out of my eyes. Ping Pong noises are once again audible through the door, and I feel sudden empathy for the ball.
Reading the transcript now I can see that Davis’s soliloquy was in fact an adequate description of the triangular relationship of data sharing and feedback between Knewton, Pearson and Arizona State. With some light editing it might be fit for a textbook. But at the time my comprehension was, for whatever reason, hopeless. Davis’s words ran together; I heard them as syllables separated from meaning. I started focusing too hard on eye contact, trying to send false positives by screwing my face into a picture of human understanding, all the while wondering why I wasn’t understanding, worrying about what that would portend for my article and my professional future, anxiously quantifying the resources my company had devoted to getting me here for the express purpose of understanding the sentences that were being spoken to me at this very moment, and wishing that the day would end so I could meet up with friends, drink beer, and sort all this out tomorrow.
It felt a lot like being in college again, actually.
“Let me just ask: What’s the thing that you’re trying to get your head around?” The question comes from David Kuntz, the vice president of research and adaptive learning at Knewton, who had entered the room during my reverie. Kuntz is 48 and slight of frame, with a gray goatee and black-rimmed glasses.
The question is so mercifully straightforward that it jars me into lucidity. I tell Kuntz that I am trying to figure out what he and his colleagues actually do in this office, and how that relates to what Bloom and her students are doing in the basement of the Post Office building and what Regier is doing in his executive perch in SkySong. I tell him that I want to be able to understand these relationships in a way that will allow me to express them as anecdotes -- or, if need be, analogies (or personal narrative, but only as a last resort) — because I am not a data scientist and neither are most of my readers and anyway this is really complicated stuff that only freaks are able to hold in their heads all at once, no offense.
The three of them start to work the problem. “If there were more time, then a diagnostic might make a lot more sense,” says Davis. They might quiz me on the principles and limitations of predictive modeling, try to gauge the level of complexity at which I am prepared to comprehend Knewton, try to recommend a sequence of tutorials and exercises that would lead me to an applied understanding of the scientific, emotional and legal concepts that underlie its partnerships with Pearson and Arizona State. (Maybe my patience with this stuff is bad in the afternoons, or after train rides, or when there is Ping Pong being played within a 12-foot radius.)
But there is no time for that now. Davis and St. Charles and Kuntz have blocked out an hour to talk to me and I’ve wasted about 40 minutes of it bluffing my way through an aimless conversation while retaining approximately nothing.
“The choice to do a diagnostic instead of the risk of jumping right into the material is a choice about the tradeoff of certainty of getting enough information to make a valuable recommendation,” says Davis. “And that’s very much what we run into with students. They have due dates, they have a time limit for the entire course. And we have to choose ... between spending time understanding the student better and spending our time directly helping the student and moving them farther forward.”
Gamely, the three scientists tried troubleshooting my learning deficit by becoming a kind of human version of their adaptive learning system, presenting answers to my query in different modalities (St. Charles drew me a diagram; Davis humored me with an analogy) and trying to gauge my understanding using available measures (i.e., asking, “Does that help?”).
It was a crude approximation of Knewton’s system, but the protocol was basically the same: establish a learning goal, figure out what I already know, figure out how I best learn, and intervene when I get stuck — all within the constraints of time and ability.
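That protocol, with time and ability treated as explicit parameters, can be caricatured as a loop: spend attempts on a concept until it clicks or the clock runs out. A deliberately naive simulation — every name here is invented, and the pass model is a coin flip standing in for real inference:

```python
# Caricature of the tutoring protocol: goal, attempts, intervention, all
# under a time budget. Invented for illustration; not Knewton's model.
import random

def adaptive_session(concepts, ability, time_budget, seed=0):
    """Simulate a learner working through ordered concepts under a time budget.

    ability: probability (0..1) that any single attempt succeeds.
    Returns the list of concepts mastered before time ran out.
    """
    rng = random.Random(seed)
    mastered = []
    for concept in concepts:
        passed = False
        while time_budget > 0 and not passed:
            time_budget -= 1               # every attempt costs one time unit
            passed = rng.random() < ability
        if not passed:
            break                          # out of time, stuck on this concept
        mastered.append(concept)
    return mastered
```

Run with ability near 1, the simulated learner finishes early, like Maddy; near 0, the whole budget drains on the first concept, like a student stuck in focus mode.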
But this is not the whole equation. As parameters, time and ability can be quantified. Embarrassment is trickier. So is apathy. So is fear.
It is not as though the Knewton scientists are not aware of these variables. They are just harder to isolate, which makes them harder to control.
“There’s lots of unobservable things that we care about,” says St. Charles. “In fact, most of the things we care about are unobservable — not directly, anyway. We can infer things about them, things like engagement. But in terms of trying to understand the things like, why might a student disengage? Maybe they’re bored.… Maybe they don’t have context. Maybe they don’t know why they have to learn this, maybe I have no notion of why this is relevant to my future life, or my goals, or anything like that.”
This is where Knewton runs up against constraints of its own. Kuntz, Davis and St. Charles, with the help of their team of in-house “pedagogical experts,” are in a prime position to study and interpret macroscopic patterns in student behavior across huge swaths of the company’s user base. They can see what a student is doing, then interpolate what they are likely to do next. The “why” of what students do is harder for the system to recognize, at least in individual cases.
But sometimes the “why” is the most important part. Each student struggling to wrap her head around quadratic equations is the same, and each one is different. If anything, my brief visit to Bloom’s classroom at Arizona State left me with a greater appreciation of her role as an interpreter; deciphering and allaying the confusion of individual students where the Knewton algorithms could not.
“I think that for a truly struggling student, if there’s not teacher intervention, they will fail,” says Bloom. “The technology is not the be-all end-all. And that’s why I get frustrated when I hear faculty say we’re being replaced. Because what our data are showing is that actually the faculty is incredibly important. How the teacher manages the class, how they interact with the students, how they troubleshoot. And that makes a tremendous difference in how the students are.”
There is one problem Jose Ferreira cannot figure out the answer to, and it troubles him.
Ferreira is confident that Amazon Web Services and the Arizona State “school officials” at Knewton will not expose the billions and billions of data points he plans to collect from students.
But what if the students decide to expose themselves?
While the company cannot lawfully turn over student data to outside interests, it can turn those data over to the students. And if Knewton does that, it could start a domino effect that could dramatically change the way some employers size up job candidates, says Ferreira.
“Everyone’s going to want to see this,” says Ferreira.
Currently college transcripts, with their course titles and grades, are of little use to most employers, he says. But what if, down the road, an Arizona State student took a series of Knewton-powered finance courses, then applied for a job as a bank analyst?
“Imagine a transcript that says, Here are the 12,000 concepts in finance that we teach at our university — I can tell you, to the percentage, how strong this student is in each one, and how fast she learned each one, how well she retained each one, how naturally sticky finance concepts are to her,” says Ferreira. “I can tell you everything about what she knows about finance and how she learns it best.”
The bank would probably be interested in having a look. So would the admissions department at a master’s program, if that’s what the student were applying for instead.
Some people think this is great -- the ASU Online dean, for one. “I’d like to have this record of proficiency and let students and employers decide how to use it,” says Regier. “I think that would be a good thing.”
Ferreira is less enthusiastic. He is worried that if some students share their profiles, employers and graduate admissions officers will begin demanding that everybody who took Knewton-powered courses in college include those deeply descriptive profiles in their applications, looking askance at applicants who refuse to do so.
“That scares me a lot,” says Ferreira. “I’m sort of damned if I do, damned if I don’t here. I’m sitting on this wealth of data for — each student is sitting on a wealth of data. If Knewton says, ‘O.K., any student can decide for themselves whom to share it with,’ then all students actually lost the ability to not share it. Because every school and every employer will demand it.”
Naturally this vision of the future, and Ferreira’s professed anxiety, flatters his company. The notion that Knewton, specifically, might become so widely used that graduate programs and employers will come to expect — no, demand — that candidates cough up their Knewton psychometric profile seems far away, if not far-fetched. Knewton still has to persuade textbook companies other than Pearson to do the yeoman’s work of tagging their digital content to suit its “knowledge graph.” In any case, there will be many students who make it through college without ever confessing the contents of their learning minds to Knewton.
Even at Arizona State, the company’s greatest institutional ally, there is no guarantee that Knewton-powered courses will spread significantly further than the low-level, quantitative courses where it has currently taken root. If the university’s old and new math directors agree on one thing, it is that demand for the sort of efficiency and standardization that Knewton provides lies primarily at the remedial and introductory levels. “I doubt it’s ever going to evolve to even close to a one-size-fits-all paradigm,” says Boggess, the current director. “Students learn in too many different ways.... What works for freshman-level stuff, which is where the Knewton project is right now, doesn’t necessarily work at the upper level.”
But even if Big Data does not manifest as an all-encompassing monolith, the issues of efficiency, privacy and pedagogy surrounding the arrival of increasingly sophisticated teaching tools will be relevant to many people, on many campuses, in coming years. And as science fiction, the idea of a Knewton regime highlights the hope and fear of Big Data in higher education.
Which brings us back to “Gattaca.”
At the end of that film, Ethan Hawke’s character, Vincent, successfully finagles his way onto a spaceship despite having been deemed unfit for that job by the government’s eugenics regime. How he fares as a member of the spaceship’s crew despite his physical and mental handicaps is not part of the film. But Ferreira reads this as a happy ending. Earlier in the film, Vincent had challenged his genetically “valid” brother to a swimming contest, winning on pure grit.
“The whole point of the movie is you can’t measure heart,” says Ferreira. “You can only do so much with data. We’ll be able to measure what you know. We’ll be able to measure how well you know it. We’ll be able to measure how fast you learned it. But there are things that we can’t measure.… What Knewton doesn’t have is higher-order thinking skills, passion, imagination, drive. I didn’t build Knewton as a way to measure people. I built it as a way to help make their homework better and their classes better.”
In the end, Ferreira says, adaptive software and the psychometric data it accrues are only tools for teachers to help students, and for students to help themselves. It cannot turn lousy teachers into good teachers. It does not necessarily cure fear, it does not necessarily fix apathy, and it cannot measure heart. But in the complex human systems of the new American university and its “other school officials,” where many try hard and some get by, it might help make everyone’s best a little better.