Measuring Success in New Ways
Given the complexities of graduate education, it can be hard to measure program success in meaningful ways. Traditional external reviews track measures such as time to degree and completion rates every five to 10 years at large research institutions, but students and faculty are rarely asked deeper questions about curriculum relevance and program goals. A new effort at the University of Minnesota seeks to change that by establishing ongoing, qualitative models of assessment centered on students and action. If successful, the pilot Graduate Review and Improvement Process (GRIP) could be instituted on a voluntary basis across the university next year.
“I think there are clearly quantitative measures that people like to see and use when we assess doctoral education,” said Henning Schroeder, vice provost and dean of graduate education. “But students in these programs need to be able to have the opportunity to take risks and need to experience how frustrating it can be to be engaged in research, and this cannot be measured through an easy quantitative approach. This is what GRIP gives us in campus feedback in a qualitative way.”
Eight departments across eight colleges at Minnesota signed on to pilot GRIP this year in both doctoral and master’s degree programs. Designated students and faculty members work within each department, as well as with a core group of GRIP student and faculty leaders at the university level, to develop program-specific assessments and action items centered on three themes: goals of the program and whether curriculums reflect those goals; engagement with qualitative and quantitative evidence of program success; and creation of an internal “state of the graduate program” report and focused plan for improvement. Department GRIP teams also meet regularly with teams from other pilot departments to discuss their progress.
GRIP comes at a crucial time for graduate education, with universities seeking to rethink their doctoral programs to reflect the changing nature of the academic job market. Graduate programs in the humanities are in the crosshairs of this debate, including those at Stanford, which announced efforts this month to streamline doctoral humanities programs from the traditional seven years to five and better prepare students for jobs outside academe. But the sciences have taken note as well: the American Chemical Society recently issued a report urging significant changes in the structure, curriculums and financing of graduate programs in chemistry to better align the interests of students, institutions and the discipline. Echoing Stanford, the society encouraged universities to compress their doctoral timelines to five years at most.
Ronald Ehrenberg, director of the Cornell Higher Education Research Institute, said a program like GRIP could offer a way to "flip on its head" the traditional review model that involves ranking graduate programs by external standards, “whether it is publications of faculty relative to elsewhere or mythical rankings or placement of graduate students and the preoccupation – the obsession – with placing graduates at top universities, which is of course unrealistic for most places.” Internal models of assessment offer an opportunity to “rethink what we are and what we’re going to value and how we’re going to measure it," and could be particularly effective for graduate programs -- such as those in the humanities -- that haven't traditionally focused on readying students for a variety of job markets, he said.
Although GRIP is still in its infancy, Schroeder said eventual outcomes could include similar recommendations for curricular flexibility as a program “quality marker.” For example, a history Ph.D. candidate who knows he wants to specialize in 19th-century imperialism may not need to spend years studying history from antiquity to the present, he said. This kind of less instruction-intensive doctoral program is already the standard in Europe. (Schroeder also noted that the university already has taken steps to allow graduate students to begin research earlier, in response to student complaints, identified through GRIP, that one Ph.D. program required more courses than comparable programs elsewhere and limited students’ ability to do research.)
While departments involved in the program have carried it out in varying ways, GRIP’s basic features include workshops and advice from university experts on program assessment, evaluation colloquiums for student leaders, and research “tool kits” with survey instruments and focus group protocols.
The program was inspired in part by the Carnegie Foundation for the Advancement of Teaching’s Carnegie Initiative on the Doctorate, an effort to improve graduate education for which Chris Golde served as a lead scholar. Golde, associate vice provost for graduate education at Stanford University, said that although GRIP is specific to Minnesota, it embodies some of the principles of her research – namely that assessment should be tailored to local needs and involve graduate students. She called students “secret agents of change” who approach problems with lots of energy but without some of the baggage that faculty and administrators can accumulate over time. Faculty can take for granted that they know what students want out of their programs, she said, and can resist simple changes out of a perception that they would require more time or resources than they actually do.
Schroeder said the Carnegie initiative’s focus on students was in line with the university’s tradition of student involvement in shared governance (it also wrapped up just prior to the decentralization of graduate studies at Minnesota in 2009, when the Graduate School relinquished program authority to individual colleges). Additionally, he said, getting students involved in real research can only benefit them later on in their careers, whether inside or outside academe.
Marta Shaw, a Ph.D. candidate and research assistant in the College of Education and Human Development’s department of organizational leadership, policy and development, helped establish GRIP in a pre-pilot phase last year within her department and is helping other departments stand up programs as a student leader. Working on GRIP has been a great complement to her research on comparative higher education, she said, and the value of involving graduate students in reform can’t be overstated.
“Faculty often talk about [graduate students] as the future of their discipline, but that’s not exactly true; we’re also the present of the discipline,” Shaw said. “At a time when there’s so much knowledge and it’s advancing so fast, everyone has to learn from each other.”
Such a program is especially pertinent right now, Shaw added, as students want to know what programs are doing to adjust to the changing realities of the academic job market.
Leah Hakkola, another GRIP student leader and Ph.D. candidate within Shaw’s program, said the department already has identified and instituted through GRIP ways to help students improve their career prospects, such as connecting them with alumni and local professionals face-to-face and on LinkedIn.
“Traditional graduate education review is based on very finite and limited quantitative metrics that may not be relevant in today’s global landscape and economy,” Hakkola said in an email interview. “A student-centered process addresses these issues and benefits students and faculty in moving toward tangible change in a way that was extremely difficult in the traditional [summative] process.”
The Carlson School of Management is one of the colleges piloting GRIP this year. Srilata Zaheer, the Carlson dean, said the school is still finalizing its goals and procedures for assessment but is looking forward to collecting data this spring.
“Generally, we serve our students well, so I was not concerned that we had major problems in our programs that needed fixing,” she said in an email. “Still, our current feedback system mechanisms are nonsystematic and largely anecdotal…. The idea of GRIP, to develop the goals and their assessment internally within the program, promises to be a more useful approach.”
Even so, GRIP already has produced some interesting results. Schroeder said one unnamed program was shocked to discover that only one of its 50 students aspired to be a faculty member at a research institution. Because faculty had been teaching as if their students were going to join the professoriate, they are now rethinking their instructional approaches. And Jean King, consultant and faculty evaluation lead for GRIP, as well as professor and director of graduate studies within the department of organizational leadership, policy and development, said she was surprised to discover during GRIP’s pre-pilot phase last year that students had more than a few suggestions for their orientation program.
Instead of a “rah-rah” program introduction, King said students through GRIP requested more information-oriented sessions on funding and an opportunity to connect with their cohorts. King said she and colleagues were able to institute such changes this year.
In addition to offering immediate feedback and opportunities for action, King said another important feature of GRIP is that it’s “fun.”

“People love to talk about their programs, just like they like to talk about their children or pets.”
Schroeder said eight programs was a good start for GRIP, but he’d like to see more departments volunteer next year. And there could be a good chance of that: Schroeder said one dean who volunteered for GRIP skeptically is now one of its biggest supporters. “The feeling is that it’s a process owned by ‘you,’ and not the central administration coming down on ‘us,’ ” Schroeder said. “They’re doing this to improve their programs, and not to avoid punishment from [a central office]. So the buy-in could be different.”
Both Schroeder and King agreed that GRIP shouldn’t ever replace traditional, summative assessments, but rather serve as a complement. Schroeder compared external reviews that happen every seven to 10 years to soup served to a customer in a restaurant, and ongoing assessments to tastes that help the cook season the dish before it ever leaves the kitchen.
Golde also offered a restaurant analogy, comparing qualitative assessments such as GRIP to Michelin stars, and summative assessments to external health inspections. “Those are two different things,” she said. “You need the Board of Health to make sure there’s not rats in the kitchen, but really what we’re after is Michelin stars.”