A new study in Proceedings of the National Academy of Sciences suggests that there’s a dependable way to foster long-term improvements in students’ critical thinking skills. Researchers at Stanford University and the University of British Columbia developed a framework built on repeated cycles of comparing data sets, or data and models, and deciding how to act on those comparisons. They applied the learning structure to 130 students in an introductory physics lab.
During a series of simple physics experiments, the students received instructions to compare new data to existing data, and to decide how to act on those comparisons based on statistical tests. For example, students used a stopwatch to time a pendulum swinging at two different amplitudes. Rather than just collecting data and comparing them to equations in a textbook, as a control group of students did, the students in the modified course were instructed to make decisions based on the comparison. What should they do to improve the quality of their data and better explain the difference between their results and the equation in the textbook? Students chose everything from conducting more trials to putting the team member with the biggest finger on stopwatch duty. Their data improved, along with their understanding of the process.
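The compare-then-decide step the students followed can be sketched in code. The snippet below is an illustrative sketch, not the study's actual protocol: it uses a simple t-like statistic (difference between two means divided by their combined standard error) and made-up pendulum timings, with rough decision thresholds chosen for illustration.

```python
import math

def mean_and_sem(samples):
    """Return the sample mean and the standard error of the mean."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return mean, math.sqrt(var / n)

def t_score(mean_a, sem_a, mean_b, sem_b):
    """Difference between two measurements in units of combined uncertainty."""
    return abs(mean_a - mean_b) / math.sqrt(sem_a ** 2 + sem_b ** 2)

# Hypothetical pendulum-period timings (seconds) at two amplitudes.
low_amplitude = [1.42, 1.44, 1.41, 1.43, 1.42]
high_amplitude = [1.48, 1.47, 1.49, 1.46, 1.48]

m1, s1 = mean_and_sem(low_amplitude)
m2, s2 = mean_and_sem(high_amplitude)
t = t_score(m1, s1, m2, s2)

# Rough rule of thumb: below 1, the results agree within uncertainty;
# above 3, they clearly differ; in between, the sensible decision is
# to improve the measurement (more trials, better technique) and retry.
if t < 1:
    decision = "consistent: results agree within uncertainty"
elif t > 3:
    decision = "distinct: investigate the model or the method"
else:
    decision = "inconclusive: improve the measurement and retry"

print(f"t = {t:.1f} -> {decision}")
```

The point of the exercise is the third branch: when the comparison is inconclusive, students must decide how to act on their data rather than simply recording it.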
Even after the instructions were taken away, the students in the test group were 12 times more likely than a group of 130 students the previous year (the control group) to propose changes to improve their data or methods. The test group students also were four times more likely to identify and explain a shortcoming of the model using their data.
The test group students demonstrated similar critical thinking skills in a second course the next year, suggesting that their learning was long-term. Lead author N. G. Holmes, a postdoctoral researcher in physics education at Stanford, and her co-authors argue that the framework they developed could be adapted to a range of settings beyond physics.
Holmes said via email that "giving students the space to make decisions about how to follow up on an experimental result, with careful guidance, ingrained critical thinking long-term. … I think this adds to the existing literature a concrete, yet simple way to structure how these skills can be taught with lasting improvements. It is a demonstration of how to teach expert-level skills in context that can be generalized outside a particular classroom."
Lots of departments want to know what they’re doing right for non-tenure-track faculty members, what they can do better and how that climate affects student learning. But how to measure it? The Delphi Project on the Changing Faculty and Student Success at the University of Southern California, which works with adjuncts and administrators on these issues, fields such questions all the time. So it created a self-assessment tool called Departmental Cultures and Non-Tenure-Track Faculty.
The anonymous survey tool can be used by provosts or administrators, department chairs, non-tenure-track faculty members themselves, or unions to understand departmental climates on campus. It collects information on basic demographics of non-tenure-track faculty members, such as length of service, whether respondents are part-time or full-time, and if they work primarily on campus or online. Questions on departmental culture explore treatment by tenure-track faculty members, participation at faculty meetings, salary and pay, hiring practices, communication, mentoring, and levels of institutional support. There’s a separate subsection for online-only faculty.
Based on non-tenure-track faculty members’ responses, departments fall into one of four “cultures” for adjuncts the Delphi Project has identified elsewhere in its research: destructive, neutral or invisible, inclusive, or “learning” (in which tenure-track colleagues view and treat non-tenure-track faculty members as true peers). The tool includes descriptions of various aspects of departmental culture within each, in part for the benefit of departments looking to improve their climates and therefore improve student learning. For example, departments with learning cultures employ intentional hiring practices and offer professional development that's not limited to campus events, resulting in less turnover and more successful recruitment of quality faculty. Destructive departments, meanwhile, are constantly hiring and offer no professional development.
“The four cultures that the survey is designed to get at are linked to student learning in research,” Adrianna Kezar, professor of higher education and director of the Delphi Project, said via email. “We know it can really help campuses, and they have been asking for such an instrument, so we want to get the word out!”
Kezar added, “The destructive cultures are obviously very negative to student outcomes. The invisible one also is fairly problematic. What is surprising is even the inclusive culture does not fully support student learning. I think most people are in the invisible culture and a few moving to inclusive -- but the goal is to reach the learning culture.”