England Seeks to Measure Learning

More than 70 institutions are testing different measures of student learning amid a new government effort to evaluate universities on teaching quality.

September 9, 2016

Questions of how much and what students learn in college have proven frustratingly hard to answer. What methods are most appropriate for measuring gains in student learning, and what can’t be well measured?

In England, a higher education funding and regulatory body is funding a series of research projects involving more than 70 colleges and universities to test various potential measures of learning gains. The research projects come at a time when the Conservative government has embarked on a controversial effort to evaluate universities on “teaching excellence.”

The Higher Education Funding Council for England is launching the biggest of the research projects this fall; it is expected to involve about 27,000 undergraduate students at about 10 institutions. Ross Hudson, the project manager for HEFCE’s learning gain program, said students will be tested at various points throughout their undergraduate careers using three different measures:

  • a problem-solving and critical thinking test designed by Cambridge English Language Assessment (the test provider declined to answer questions about the test, referring them to HEFCE);
  • survey questions on student attitudes and noncognitive skills (Hudson said these questions will focus on “different aspects of their learning experience including attitudes, openness and motivation”); and
  • survey questions on students’ engagement with their studies.

In addition to this nationwide project, HEFCE is funding 13 different institutional and multi-institutional pilot projects to measure learning gains.

These projects, summarized in brief here, involve a range of methodologies, including grades, tests of academic skills and critical thinking, engagement and well-being surveys, and tests of noncognitive traits like resilience and “employability identity.”

Some of the projects focus on specific aspects of student gains related to topics like employability, work-based learning and the development of research skills. Some employ established tests and methodologies such as the CLA+ (an updated version of the Collegiate Learning Assessment), the National Student Survey and the U.K. Engagement Survey, while some use homegrown instruments or a combination.

Chris Millward, the director of policy at HEFCE, said England’s move from a free to fee-based higher education system has, coupled with increasing participation, fed a growing desire for data on student learning gains.

HEFCE is administering the first full-scale trial round of the new Teaching Excellence Framework in 2017. The exercise in its first iteration will rely on three “core metrics” that will be benchmarked to take into account different institutions’ student characteristics and subject mixtures: 1) data on satisfaction with teaching drawn from the National Student Survey, 2) retention rates and 3) rates of employment or continuing study six months after graduation, drawn from the national Destinations of Leavers From Higher Education survey. In addition to these core measures, TEF evaluators also will consider additional “contextual” evidence submitted by individual higher education institutions on issues related to teaching quality, the learning environment, and student outcomes and learning gains.

The development of the TEF has been controversial. As Times Higher Education has reported, university leaders have raised concerns that the TEF will be administratively burdensome and that the proposed metrics will be poor proxies for measuring teaching excellence across a diverse range of institutions.

“Nobody is against teaching excellence, but I think some people are afraid that thinking about teaching excellence becomes too simplistic and that some people may believe that we will be able to create league tables [or rankings] of universities based on how well they can achieve teaching excellence or learning gains of students,” said Jan Vermunt, a professor of education at the University of Cambridge who is involved with one of the HEFCE-funded research projects (see below).

“There’s not one way to achieve teaching excellence. There are different ways,” Vermunt said.

The government has framed the TEF as a tool “to provide clear information to students about where the best provision can be found and to drive up the standard of teaching in all universities.”

“The TEF will provide clear, understandable information to students about where teaching quality is outstanding. It will send powerful signals to prospective students and their future employers, and inform the competitive market,” said a government white paper published in May. In that paper the government also outlined its plans to link performance on the TEF to tuition rates by granting universities that perform well on the TEF permission to raise fees in line with inflation.

“The Teaching Excellence Framework could be seen in a sense as the next stage of reforms in higher education in this country,” said Millward, of HEFCE. “The current government came in with a strong view that there might not be the information available for students to understand different levels of quality in higher education.”

Millward said it’s too early to know how findings from the HEFCE-supported research on learning gains could inform metrics used in future administrations of the TEF. He stressed that the learning gains research will take years, with findings emerging gradually, while the TEF is taking off now.

But a technical paper published by the government in May states that “as new measures of learning gain become available, we anticipate that these will also feed into the TEF in future years.” A footnote for that sentence links to a website about the HEFCE-funded research on learning gains.

“I think there’s a lot that we can learn from the project that will really support universities’ own learning and teaching improvement activities and help them understand better the progress that students make in the different contexts in which they’re working,” Millward said. “That’s quite different from a kind of national metric of learning gain that could be used in a teaching and excellence framework. We’re really a long way from that.”

Christina Hughes, the pro-vice-chancellor for teaching and learning at the University of Warwick and the leader of one of the HEFCE-funded projects, described learning gain as “a new concept for the U.K.”

“Of course it’s going to be even more significant given that we have the Teaching Excellence Framework coming our way and learning gain potentially being part of that,” Hughes said. “It then becomes a national policy initiative. This is partly why HEFCE’s funding all this work. It’s partly capacity building; it’s partly trying to test out different models and methodologies.”

“It may be that we are all subjected to a particular definition of a particular kind of learning gain or we are certainly asked to produce quantifiable data around learning gain through the Teaching Excellence Framework,” Hughes said. “From an institutional point of view, we want to understand what learning gain is, what the best ways of measuring it are, what the issues and challenges of that concept are, so that we can be informed in our use of it.”

Below are a few examples of specific pilot projects, which are entering the second of three years of HEFCE funding.


A project led by the Open University is using data related to learning analytics -- on student participation in online classes, for example, or on library log-ins.

Bart Rienties, a reader in learning analytics at the Open University, a distance learning institution, said the project primarily relies on analysis of data already being collected by his university and the other two institutions in the project, Oxford Brookes University and the University of Surrey.

“All three universities have data of what students are doing in terms of virtual learning environment engagement, so what we’re testing is can we indeed identify patterns of some students becoming more engaged in a virtual learning environment while others are moving in a different direction,” Rienties said. While the focus is on virtual learning, Rienties said the study is not looking only at online courses, but also at various online learning environments created to support traditional face-to-face classes. Do students download lectures posted online with lecture capture software, for example, or do they download lecture notes prior to class? Are they participating in online discussion forums?

The project is examining affective, behavioral and cognitive measures, mining sources such as student satisfaction surveys (affective), data on student participation and class contributions (behavioral) and grades (cognitive). It also includes a qualitative component, including interviews and the use of student diaries.


A project involving 18 members of the elite group of universities known as the Russell Group, led by the University of Warwick, includes four components. The largest piece is being led by the University of Cambridge, where researchers are in the process of developing an instrument intended to measure student learning gains across five main domains.

“The first one is the extent to which students exhibit deep thinking or deep learning,” said Vermunt, of the University of Cambridge and editor in chief of the journal Learning and Instruction. “Part of it is critical thinking, but it is much broader -- how do they relate to theories, how they think for themselves, how they try to be analytical, be critical, go further than just the subject matter and try to think for themselves.”

“A second important one is what we call self-regulation,” he said. “Do they self-regulate their learning, their thinking. Time management would be an important element of it, let’s say their autonomy, independence in thinking and learning.

“A third component would be more affective, so let’s say engagement, curiosity, the degree to which they engage with their studies, research curiosity -- the degree to which they like research, they want to find out things, they want to generate new knowledge.

“A fourth is the social communication aspect of being a student and a graduate. We focus on communication competency but also social embeddedness, how are they embedded in a social environment, how well can they communicate both verbally and in writing.

“A fifth and the last one is what we would call epistemological development, so their views on the nature of knowledge. Do they have a strongly dualistic view of right and wrong, or are they more open-minded and do they realize there are different truths and one truth is not necessarily better than the other?”

Vermunt said the design of the survey instrument, which will partly make use of existing instruments, is being informed by qualitative interviews conducted with a total of 40 different students over the past year. The first wave of the survey will take place this fall. The plan is to survey about 3,000 students in four disciplines -- business, chemistry, English and medicine -- at both the undergraduate and graduate level at about a dozen different Russell Group universities up to three times over two years.

The other three components of the pilot project, which are being led by other universities in the 18-university group, focus on questions related to employability.


A separate project involving 14 different institutions relies on data on student career planning and work experience already being collected by universities during the online course registration process.

Bob Gilworth, the director of the Careers Group, a kind of centralized career services office at the federal University of London, which is overseeing the project, explained that the participating institutions ask students to answer questions related to the kinds of work experience they have obtained and where they are in the career planning process. Students are asked to identify which of a series of statements most reflects where they are in what Gilworth describes as a three-part process of deciding what they want to do, planning how they might get the right skills and experience to make that happen, and competing effectively.

Gilworth said the research question for the HEFCE-funded project is whether data on how students progress in their career-related thinking and planning can be used as an indicator of learning gain. He said the jury is still out on that, but a main takeaway is that “the biggest single issue is around career choice and career planning.”

“A lot of the noise around employability is around skills, and that’s perfectly understandable, but the big issue that students need to confront and are often struggling with is career choice and career planning,” he said. “When you look at the data, you see that having close to half of students going into their final year still in the ‘decide’ phase is pretty common.”
