
In a previous post I described the general philosophy of our lab as one focused on building “Stairways to Heaven”; that is, tools that allow us to transform education from where it is now to what we’d like it to be.  That post was a little nonspecific, so in this post I present three of our current “stairways” in a little more detail.

Our most established staircase, peerScholar, was originally developed to bring written assignments back to our very large (then about 1,000 students) Introduction to Psychology class.  However, it has since developed into much more than a writing tool.  In fact, we see it as a “multivitamin of learning objectives” in the sense that it routinely exercises (and can also assess) virtually all of the core learning objectives we hope to develop in our students.¹

Within peerScholar, students first create and submit a digital composition, and then see and assess (i.e., mark and provide feedback on) a random, anonymously presented subset of compositions submitted by their peers.  This process gives them direct practice with a number of important cognitive skills (critical thought, creative thought, self-reflective thought, expressive communication of ideas).  The process can end here, with student grades based on the average peer rating, thereby requiring little time or effort on the part of faculty: better education with less work.  Direct research supports the fairness of the grades and the efficacy of the tool in enhancing critical and self-reflective thought and, despite not being designed for this purpose, it also enhances students’ sense of community.
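
For readers who want to see the mechanics, here is a minimal sketch of that two-step flow in Python.  The function names are invented for illustration; peerScholar’s actual implementation is, of course, more sophisticated.

    import random
    from statistics import mean

    # Illustrative sketch only, not peerScholar's actual code.
    def assign_reviews(student_ids, reviews_per_student=5):
        """Give each student a random, anonymous subset of peers to assess."""
        assignments = {}
        for student in student_ids:
            peers = [s for s in student_ids if s != student]
            assignments[student] = random.sample(peers, k=reviews_per_student)
        return assignments

    def peer_grade(ratings):
        """A submission's grade is simply the mean of the peer ratings it received."""
        return mean(ratings)

A production system would also balance the assignments so that every submission receives the same number of reviews; the sketch above only guarantees that every student assesses five peers.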

However, the process can also continue by allowing students to revise and resubmit their work, informed by the comments they received from their peers.  This extension gives them even more practice and introduces new skills (e.g., receptive communication, formative revision), but it does require an expert grader to evaluate the product and the process, so it does not reduce workload to the same extent as the two-step version.  That said, the time commitment is similar to that of grading essays, and this is a much richer educational experience.

In contrast, mTuner is an online tool we use to reinforce students’ understanding of concepts while simultaneously eliminating misconceptions.  The name was inspired by the process of tuning a guitar (or ukulele), wherein one tests the first string: if it is in tune, you leave it as is; if it is out of tune, you “tune it up” before moving on.  Similarly, mTuner embodies everything we know about “assessment FOR learning” to tune our students’ minds … mindTuner, mTuner.

The basic premise of “assessment FOR learning” is that students are never more engaged than when we test them and, thus, testing represents a fantastic context for learning.  Within mTuner, students first see a multiple-choice question without the alternatives, and they are asked to type what they think the answer is.  This makes them think rather than go straight into recognition mode, and this act of generating an answer has been shown to enhance subsequent learning.  Once they type their answer, the alternatives appear and the item looks like any other multiple-choice question.  Students select what they think is the best answer, and if they are right (i.e., in tune) they are told so, their score is incremented, and they see a short explanation that re-emphasizes why that particular answer is right.  Hence we reinforce correct content knowledge.

The more interesting aspect is when they choose incorrectly.  At that point they are given a “hint,” which can take a number of forms.  In my class, if the question came from my lectures (all of which are recorded), we literally return them to the part of the lecture related to the question.  If the question came from the text, we show them the relevant page of the eText.  They use whichever resource they are given to figure out the right answer and are then offered a second chance for half marks.  If they choose correctly this time, they are told so, their score is incremented, and they see the explanation.  If they still choose incorrectly, we tell them which answer is right, along with the explanation of why it is right.  That is, we correct their misconception before it can be learned too deeply; we tune up the dissonant string.
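
Putting the last two paragraphs together, the life of a single mTuner question can be sketched as follows.  This is a deliberately simplified Python version with invented field names, not mTuner’s actual code:

    # Illustrative sketch of the question flow described above.
    def run_question(q):
        input(q["stem"] + "\nType what you think the answer is: ")  # generation step
        print("Alternatives:", ", ".join(q["alternatives"]))
        first = input("Your choice: ").strip()
        if first == q["correct"]:
            print(q["explanation"])   # reinforce why the answer is right
            return 1.0                # full marks: the string was in tune
        print("Hint:", q["hint"])     # lecture clip or eText page in the real tool
        second = input("Second chance: ").strip()
        if second == q["correct"]:
            print(q["explanation"])
            return 0.5                # half marks on the second attempt
        print("The right answer is:", q["correct"])
        print(q["explanation"])       # correct the misconception before it sets in
        return 0.0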

The final tool we use extensively is Digital Labcoat.  This tool gives students hands-on (i.e., active learning) experience with the scientific method, including exposure to statistics and the role it plays in hypothesis testing.  Students first provide data via a questionnaire, then mine their own data by performing analyses that test hypotheses they specify.  Each analysis is performed on a subset of the entire dataset, so in a subsequent phase students attempt to replicate interesting results reported by their peers.  Finally, once our 10 most interesting, replicable results are obtained, students are asked to provide theoretical accounts of the findings, or to vote on the accounts provided by others (i.e., crowd-sourced voting).
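
The subset-then-replicate logic is worth a sketch of its own.  Here is a toy Python version (with hypothetical names; the real tool wraps all of this in a point-and-click interface):

    import random

    # Illustrative sketch only, not Digital Labcoat's actual code.
    def split_sample(rows, fraction=0.5, seed=None):
        """Each analysis sees only a random subset, leaving fresh data for replication."""
        rows = list(rows)
        random.Random(seed).shuffle(rows)
        cut = int(len(rows) * fraction)
        return rows[:cut], rows[cut:]

    def replicates(hypothesis_test, rows):
        """True only if a result found in one random subset also holds in the rest."""
        explore, holdout = split_sample(rows)
        return hypothesis_test(explore) and hypothesis_test(holdout)

Because each analysis runs on only part of the data, an “interesting” result may simply be noise, which is exactly why the replication phase matters pedagogically.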

Students really enjoy Digital Labcoat, and they do much more work with it than we ask them to.  They also strongly endorse statements like “Thanks to Digital Labcoat I now feel I really understand how science works” and “After doing this assignment I feel less worried about the statistics courses I have to take.”  They also claim to have a much better sense of the usefulness of graphs and numbers in coming to scientific answers.

Phew!  Those were quick descriptions, but they at least give you a sense of three of the staircases we have built and now use in my 1,800-student, blended-learning Introduction to Psychology course at the University of Toronto Scarborough.  Thanks to these tools, I feel I am giving my students an incredibly rich learning experience, one that supports their development of cognitive skills while also making sure they learn the course content well.

There are more staircases to come, but we are very proud of the ones we have built to date.  We are currently honing these tools and building the resources (i.e., websites, manuals, learning communities) that will allow us to share and support them widely.  Interested educators should feel comfortable sending me an e-mail.

¹ http://www.heqco.ca/SiteCollectionDocuments/Taking learning outcomes to the gym_ENG.pdf

Steve Joordens is Director of the Advanced Learning Technologies Lab at the University of Toronto Scarborough.
