Want to be a more effective teacher? There’s an app for that. Or, at least, there soon may be.
“Classroom Sound Can Be Used to Classify Teaching Practices in College Science Courses,” published this week in Proceedings of the National Academy of Sciences, previews a new tool that measures the extent to which professors use active learning in their classrooms. Scholars involved in the study hope to make the tool into an iPhone application so others can work to increase their use of high-impact teaching practices. For now, the tool is available online.
“It’s really hard to change if you don’t measure what it is you’re starting with,” said the study's co-author, Kimberly Tanner, professor of biology education at San Francisco State University. “It’s like trying to lose weight without a scale. To make changes you need some really quick feedback.”
Active learning happens when students participate in classroom discussions and solve problems, rather than just listening passively. And previous studies suggest that active learning results in greater learning gains and student retention rates than lecture-only courses. So Tanner and dozens of other researchers across natural science, technology, math and engineering fields and institutions worked to create and test a machine-learning algorithm that uses sounds to identify teaching styles in college and university classrooms.
They argue that there’s a particular need for their tool in the natural sciences, since hundreds of millions of dollars have gone toward improving STEM teaching nationally in hopes of keeping students -- especially underrepresented minorities and women -- in the so-called pipeline. And while all evidence suggests that significant learning gains can be made by many professors incorporating even a little active learning into their courses, the study says the “extent to which large numbers of faculty are changing their teaching methods to include active learning is unclear.”
The new tool is called Decibel Analysis for Research in Teaching, or DART. It reports what types of activities are going on in a classroom by classifying sound waveforms, down to half-second audio samples, into three categories: single voice, multiple voice and no voice. Lectures and question-and-answer periods count as single voice and are indicative of a nonactive teaching style. Multiple-voice samples, including discussions and transitions, are considered active learning, as are no-voice samples, such as when the entire class is engaged in a silent writing activity.
Essentially, DART computes the volume and variance of sounds in a classroom. Average volume with high variance indicates one person speaking at a time -- lecturing or otherwise not engaging students in active learning. High volume with low variance, observed in multiple-voice pair discussions, for example, signals active learning. Low volume with low variance also signals active learning, as all students are likely engaged in a quiet task.
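That volume-and-variance rule of thumb can be sketched in a few lines of Python. The thresholds and the amplitude-envelope features below are illustrative assumptions for the sake of the sketch, not the published DART model, which learns its decision boundaries from thousands of human-annotated class recordings.

```python
import numpy as np

# Hypothetical thresholds on a normalized [0, 1] amplitude scale;
# the real tool fits its boundaries to annotated training data.
VOLUME_THRESHOLD = 0.05
VARIANCE_THRESHOLD = 0.01

def classify_window(samples):
    """Classify one half-second window of audio amplitude values.

    Returns 'single voice', 'multiple voice' or 'no voice'.
    """
    envelope = np.abs(np.asarray(samples, dtype=float))
    volume = envelope.mean()
    variance = envelope.var()
    if volume < VOLUME_THRESHOLD and variance < VARIANCE_THRESHOLD:
        return "no voice"        # quiet and steady: e.g. silent writing
    if variance >= VARIANCE_THRESHOLD:
        return "single voice"    # one speaker: loudness rises and falls
    return "multiple voice"      # loud but even: many overlapping voices
```

For instance, a window of near-zero amplitudes classifies as "no voice," a window alternating between loud bursts and silence (one speaker pausing between sentences) as "single voice," and a uniformly loud window (many simultaneous conversations) as "multiple voice."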
The idea behind DART is that professors don’t have to guess how much active learning they’re asking their students to do, but can actually measure it to a relatively precise degree. Based on an initial study of 1,486 class session recordings from 67 community college and four-year university STEM courses, DART proved about 90 percent accurate in classroom settings both big and small. In other words, the algorithm was nearly as good at determining what kind of learning was happening as were the human annotators in the large-scale study, which covered 1,720 class hours and 49 instructors.
Perhaps surprisingly, the amount of time spent on active learning was higher in courses for biology majors than in courses for nonmajors. The authors take this finding as proof that DART can be used to study teaching styles across more disciplines, institutions and course types going forward. All courses in the study were taught by professors who had completed STEM-teaching professional development.
Over all, the professors fared well in their pursuit of active learning. While single-voice instruction accounted for a majority of class time in every course, 88 percent of analyzed courses used active learning in at least half the class sessions. Female instructors were more likely to engage their students in active learning than were their male counterparts.
Tanner said that professors sometimes don’t mean to dominate class time with lectures, but passion for their subject matter can unwittingly lead them away from active learning. DART is a clear, objective measure of how often that’s happening, she said.
The Association of American Colleges and Universities works to promote high-impact teaching practices, among other goals. Lynn Pasquerella, president, said via email that these practices should be "infused throughout a student's entire curriculum," and DART's value is that it offers a "point of information" for faculty members who are committed to engaged learning.
"If faculty tend to overestimate the amount of time their students are engaged in active learning processes, DART can provide data that will prompt the redesigning of assignments and foster enhanced student engagement," she said. "Learning outcomes can then be assessed comparing courses that rely most heavily on active learning with those that are dominated by lectures. We know that high impact practices have a disparately positive effect on students from underrepresented groups. As a result, there is significant potential for this tool to advance the equity imperative in STEM and beyond."
Again, the paper suggests that DART could aid “systematic analyses” of the use of active learning in classrooms, and says that its relative simplicity, affordability and ability to protect student and professor privacy (capturing sound types, not course content) make it ideal for such a pursuit. Tanner emphasized that it's a tool to improve one's teaching and learn more about the profession, and said it shouldn't be used by external parties for evaluation or punitive purposes.
“I think that DART will allow us to ask questions about how things are and aren’t changing in higher ed,” she added.