Student presentations are a common feature of many courses, but presentation quality varies dramatically. Nearly every student has endured text-heavy PowerPoints read verbatim and doubted the credibility of a presentation’s content. Yet, student presentations are pedagogically important; they provide students with an opportunity to take ownership of an issue and improve their public speaking ability – a valuable, employment-related skill.
Faculty members often urge students to meet for assistance with their presentations, but only the outliers show up. Detailed instructions for producing quality presentations sometimes go unnoticed or ignored. Even dedicated, high-achieving students can miss the mark come presentation day. The end result is a waste of valuable instruction time. Fifteen minutes of ineffective student-to-student instruction multiplied by 25 student presentations equals more than six person-hours of lost learning.
Who is at fault? An episode of “The Apprentice,” which aired fall 2010, provides a possible answer. Donald Trump assigned two teams the same task. One team failed miserably. In the boardroom, Trump showed no mercy to Gene, who had done a poor job presenting, or Wade, the project manager who had selected Gene, but who had failed to verify Gene’s ability to perform this important task.
Each week Trump fires one person. Should Trump fire Gene, the unprepared presenter, or Wade, the project manager who failed to ensure quality control? In what was described as a shocking move, Trump fired both men. However, his decision was sound; Gene performed poorly, and Wade, who was ultimately responsible for the quality of the project, failed to do his job.
What if a student performs like Gene? What should happen if a student provides erroneous, irrelevant, and unimportant information, fails to provide credible references, and is unable to provide answers to basic questions? Who should be "fired" – the student who delivered an unacceptable presentation, the professor who had no advance knowledge of the presentation’s content and allowed it to proceed during class, or both?
From my experience, requiring students to meet with the professor at least one week prior to their presentations in order to obtain permission to present is an effective method that dramatically improves student presentations and ensures more effective use of instructional time. It can be framed as a business meeting in which the vice president (professor) requests a meeting to review the work of the lead presenter (student) prior to presenting to an important client (the class). This meeting might even be graded. Certainly, a VP would not wait until the big presentation to evaluate the work of the lead presenter.
The purpose of these meetings is not simply to evaluate and approve student work. The meetings provide an opportunity to assist the student inside the "zone of proximal development"; I see what the student is able to do without assistance and what he or she can achieve with assistance.
At the start of each individual meeting I address an e-mail to the student and then add notes, links to videos and articles, and electronic documents archived in desktop folders. Although most undergraduates have grown up in the information age, many of these so-called “digital natives” do not demonstrate the ability to sift through data and identify what is important. Although I have mentioned in class that the founder of Wikipedia discourages academic use of this community-generated encyclopedia, it still appears on slides. Fortunately, each Wikipedia citation is left on the cutting room floor.
Determining the credibility of other websites involves asking students, “What do you know about this organization? What is their mission? Who is responsible for the content?” We explore the site to find the answers. Once a source is found to be credible, deciding what information to include is guided by the question, “Knowing that memory is imperfect, what will students retain from your presentation one year later?”
For presentations in my class, students must carefully select one or more videos and show clips that total five minutes. We discuss the credibility of the video and determine whether it repeats what the student will discuss. Viewing the video is essential. Prior to the adoption of my current policy, a student began to show an inappropriate video during his presentation. The video included profanity and lacked any apparent educational value. I asked him to pause the video and explain why he chose the video and what we could expect to see. He replied, "I don’t know – I haven’t seen it." Now, before the video’s debut in class, I say, “Tell me about the video. Why did you choose this video and not another?”
During the meeting I ask students to answer the discussion questions they plan to use. Often, the questions are duds and their answers brief. We refine the questions with Bloom’s Taxonomy and higher-ordered learning outcomes in mind and generate discussion questions that are more likely to inspire passionate debate.
After three semesters of observational data, the improvement has been unmistakable, and the early results of an Institutional Review Board-approved study indicate that 84 percent of students agree or strongly agree that the meeting was beneficial and 77 percent agree or strongly agree that the meeting helped them to avoid procrastination. One student who had completed over 70 credit hours wrote, "This was the first required faculty-student meeting I have encountered in my college career. It was highly beneficial…. If there was no meeting, my presentation would have been a major disaster." A graduate wrote, "By setting an earlier 'due date' I avoided throwing together a presentation the night before I actually had to present it." The highest compliment came from a student who blurted out in class, "These are better than many professors’ presentations."
I have found a number of benefits to required meetings with students beyond the improved quality of the presentations themselves. These face-to-face meetings typically leave me with a greater sense of a personal relationship with the student, and I would venture to say the feeling is mutual. Taking the time to meet outside of normal class hours clearly indicates to students that the professor cares. It also gives them a better idea of the rigor that underlies the peer-review process – how their professors’ scholarship thrives on the constructive criticism of others – and how this can ultimately elevate the quality of their own work. Finally, it might be considered a “high-impact practice” that opens minds and improves retention. Although there may be no panacea for subpar student presentations, the lesson I learned from "The Apprentice" – that I am at least partially accountable for the quality of students’ presentations – has improved the classes I teach and the quality of my relationships with students.
Christopher A. Hirschler is an assistant professor of health studies at Monmouth University.
In my sophomore literature class, I read a passage aloud from perhaps our best-known slave narrative, Narrative of the Life of Frederick Douglass, An American Slave, in which Douglass characterizes the nefarious effects of slavery on his new mistress, Sophia Auld:
The fatal poison of irresponsible power was already in her hands, and soon commenced its infernal work. That cheerful eye, under the influence of slavery, soon became red with rage; that voice, made all of sweet accord, changed to one of harsh and horrid discord; and that angelic face gave place to that of a demon.
But then I stopped and asked, "What does the word commenced mean?" Silence. "What about infernal?" Silence. "Accord?" Embarrassed smiles all around.
In the past I would have given my standard lecture about looking up words instead of relying on something my students call "context clues," which I take to mean anything that prevents them from stopping, briefly, to do it the old-fashioned way. They have told me that they learned about "context clues" from previous teachers. I ask them what the word "context" means. Silence.
Douglass intimates that the worst part about slavery isn't the work or the whippings or the cold or the hunger or even the literal shackles. It's neither the blood nor the rapes. No, it's the compulsory ignorance, the full force of a system that understands slavery can only exist by the deprivation of learning, the absence, as it were, of light.
So I asked them: "What’s it like to be slaves?" I wasn't referring to Douglass, and I think some of them knew it.
As a child Douglass overhears his master, Hugh Auld, tell the naively benevolent Sophia to stop teaching him to read: "A nigger should know nothing but to obey his master — to do as he is told to do," Auld tells her. "Learning would spoil the best nigger in the world" and "would forever unfit him to be a slave." This is the moment of enlightenment for Douglass as he discovers through serendipity and keen discernment what he had always pondered: "to wit, the white man's power to enslave the black man." He resolves to learn to read, reasoning that compulsory ignorance is the tool that keeps him and his fellow slaves in bondage.
"It is hard to have a southern overseer," Douglass’s contemporary, Henry David Thoreau, wrote in Walden; "it is worse to have a northern one; but worst of all when you are the slave-driver of yourself." Although Thoreau refers to physical labor that fails the test of self-enlightenment, his larger point applies to my students who, too, seem explicitly bent upon achieving their own contemporary version of metaphysical enslavement. Both Douglass and Thoreau would recognize and lament this mentality, and walk away confused by the disheartening juxtaposition of material affluence and imaginative poverty. And then they would use words to write about it.
It bears asking, though, what such students might be enslaved to, or by. Dangerous ideas? Not likely. The latest in chic outerwear for the fall season? Too late. Without sounding overly prejudicial, it is difficult to conceive of much that would fundamentally threaten their defensive sense of self-assurance, which is often no such thing. What I want to say here is that I am not always sure what I would like to free my students from — figurative slavery notwithstanding — since so many of them seem blissfully happy in their formidable selves. It's freedom to I’m concerned with.
Complicating my bewilderment is that I have no transgenerational ax to grind, knowing as I do that the cry of English professors over their students' supposed failings is pretty standard fare for well over a century at least, and anyway, the topic simply isn’t that interesting before the third beer.
So here's what I want, in part: I want my students to become interesting people — that is, more interesting than they already are. I want to be able to talk to them in 10 years about Frederick Douglass, and if they aren’t into Frederick Douglass I would wish that they have a passion about something, as I think many of them will. Most important, my foremost desire is for them to have the tools to express their passion, whatever that passion may be. One of these tools is vocabulary; the more important other is curiosity.
You have an English professor, a text, and a class. You ideally have the formula for some kind of reimagining of the self, the world, the text — even the professor. And a choice gets made not to make that transformation, not out of any inchoate philosophical positioning (echoing Bartleby the Scrivener’s "I would prefer not to"), but, well, just because. I would prefer not to. Or, more reasonably, the students choose not to out of fear, having failed in their previous attempts, or because the words themselves are another in a long list of obstacles familial, cultural, and structural.
But imagine, too, how Douglass's autobiography would look had he made the same choice not to pay attention to the signs around him. It would look like silence, the kind of silence we used to see on walls in New York City in the early and middle years of the AIDS crisis: Silence = death, a morbid equation that would touch Douglass at his very core and about which, I am certain, he and Thoreau would have much to say.
Thus his communicative power — indeed, any communicative power — is tied inextricably to literal and figurative liberation; it is liberation’s proximate and ultimate precondition. Sadly, many of my students miss the nuances of Douglass’s story because its function as literary text shuts down that act of communication. Think of it: an aesthetic and polemical text — no, a book! — a slave narrative that misses its mark because the author, himself an escaped slave with no formal education, uses words too well. The very words that helped to free Douglass are now the mark of another form of enslavement. I try to encourage my students to think of the profundity of a boy, then a man, who was everywhere unrecognized as a boy or man until his escape, and even then he remained of questionable status. His devotion to learning as a slave in fact allowed him to occupy the space of all those who kept him from such learning. He took power.
I want my students to take the same power, even if it seems significantly less is at stake. Maybe that's the problem. Maybe it only appears as if nothing other than a letter grade is on the line. If so, that’s exactly where we as educators have failed. We have to find a way to free them and ourselves. Why keep pretending? Why continue the charade? Slavery, as Douglass tells us, affects everyone, including the masters whose tyrannical assumption of power corrupts even the beneficent Sophia Auld. I want my students to free me, too. They can only do this by assuming and wielding the power I would most readily concede. Take it, I want to tell them. Kill me.
William Major is professor of English at Hillyer College of the University of Hartford.
Apple recently unveiled its digital book-authoring program, iBooks Author, and I’m scared.
The last three years, which I have dedicated to pursuing a Ph.D. in instructional design and technology with a focus on interactive digital text, have given me a new perspective on the delicate balance that is necessary for classroom technologies to be productive and fruitful rather than novel and superficial. The seemingly endless hours that I have spent reading journal articles, writing papers, reading book chapters, taking in lectures, reading conference proceedings, and reading some more have left me feeling as though I have earned some sort of badge that licenses me to make qualified observations about new educational technologies.
But that’s just the problem; you don’t need to be qualified. iBooks Author allows any Apple user to design and develop an interactive, multitouch textbook. No design experience necessary.
I should be ecstatic that a layperson is able to design instructional products with applications that, until recently, required a professional programmer to develop. The digital revolution is finally upon us!
Not exactly. I’m concerned that the act of creating a digital book for students will impede their progress toward the learning benchmarks expected of them. Let me put it this way: when was the last time you saw a well-designed, engaging PowerPoint presentation in which the speaker did not read the words verbatim off the slides? This is my point. We have allowed everyone to become an instructional designer.
This phenomenon is occurring much more broadly. We are encouraging everyone to become an expert on everything. When I feel a swollen lymph node on my 3-year-old daughter’s neck, I don’t immediately call her pediatrician. I consult WebMD. I’m convinced it is a severe case of lymphoma until the pediatrician assures me that her body is just fighting off a cold. He prescribes more vitamin C.
When I hear that the Dow Jones Industrial Average has once again dipped below 10,000, and it is only going to get worse, I jump on to my eTrade account and start selling. I’m not a stock trader. I don’t know anything about the stock market. Nor am I a physician. So why am I acting like one? Because anyone can be an expert, and instructional design is no exception.
I teach at a small university and an even smaller community college in the Southeast. Every semester in my brief five years of teaching, I have been assigned course sections accompanied by a blank Blackboard (or Moodle) shell and told to design a course. Not once have any of my Blackboard (or Moodle) course sites been evaluated, and most have never been viewed by anyone but my students.
The idea that instructors are somehow incapable of violating basic instructional design principles is naive. What percentage of our nationwide faculty has heard of the split-attention effect, redundancy principle, contiguity principle, cognitive flexibility, or even cognitive load? Now, instructors are expected to be subject matter experts and instructional designers. The two are not synonymous, and the results can be detrimental to learning. iBooks Author is giving creative license to everyone, with or without instructional design experience.
For instance, iBooks Author touts the ability to embed multiple-choice quizzes into the text, yet research on inserting lower-level, recall-type adjunct questions into text has been mostly inconclusive since the 1960s. At best, such questions have a minimal effect on comprehension; more likely, they add extraneous cognitive load. A more desirable widget would allow the user to interact with the text generatively, that is, by producing unique paraphrases, summaries, or analogies.
One more caveat: if you are going to use iBooks Author to design and develop that bestseller you have always wanted to write, be prepared to sell it only in the iBookstore. That’s right. By creating your book in the iBooks Author output format, you are entering an exclusive licensing agreement with Apple. Check the fine print.
Let me be clear: I love Apple. I admire its pursuit of innovations in educational technology. In fact, I composed this rant on an iPad. So, I suppose iBooks Author is not completely negative. It opens the discourse on interactive text in education. But the thought of anyone being able to develop entire textbooks for class use on his or her MacBook worries me. Interactive, customized, and adaptive text should be the next educational technology milestone, but not like this.
We are all going to continue to embrace and applaud Apple’s newest, sleekest application, because Apple is masterful at luring educators to its sexy designs and technology clique. But we should recognize that iBooks Author is not an instructional tool that supports proven instructional design theory. And as a result, we will continue to build an increasingly accessible virtual world where we can act as professional instructional designers, physicians, and stock traders: no experience necessary.
So I will leave you with something to think about: Technology doesn’t make us experts. Let’s recognize that a teacher is not inherently an instructional designer. Let the designers design, and teachers teach. Besides, teachers don’t get paid enough to do both.
Alan J. Reid is a Ph.D. student in instructional design and teaches English courses at Brunswick Community College and Coastal Carolina University.
Submitted by Ryan Craig on February 3, 2012 - 3:02am
Over the past few weeks, the news media has been abuzz over two developments in higher education that some in the chattering class foretell as the beginning of the end of degree programs.
First, MIT announced that it would extend its successful OpenCourseWare initiative and offer certificates to students who complete courses. Like OpenCourseWare, which has provided free access to learning materials from 2,100 courses since 2002 (and which, with more than 100 million unique visitors, has helped launch the open education movement), MITx will allow students to access content for free. But students who wish to receive a certificate will be charged a modest fee for the requisite assessments. The kicker is that the certificate will not be issued under the name MIT. According to the university: “MIT plans to create a not-for-profit body within the institute that will offer certificates for online learners of MIT coursework. That body will carry a distinct name to avoid confusion.”
Then, Sebastian Thrun, an adjunct professor of computer science at Stanford who invited the world to attend his fall semester artificial intelligence course and who ended up with 160,000 online students, announced he had decided to stop teaching at Stanford and direct all his teaching activities through Udacity, a start-up he co-founded that will offer online courses from leading professors to millions of students. Udacity’s first course is on building a search engine and will teach students with no programming experience how to build their own Google in seven weeks. Thrun hopes 500,000 students will enroll. He called the experience of reaching so many students life-changing: “Having done this, I can’t teach at Stanford again. I feel there’s a red pill and a blue pill. And you can take the blue pill and go back to your classroom and lecture your 20 students. But I’ve taken the red pill, and I’ve seen Wonderland.”
Just as the Web 2.0 boom is recapitulating much of the excitement and extravagance of the dot-com boom, we get the funny sense we’ve seen this movie before. Take a look at this excerpt from a dot-com era New York Times article with the headline “Boola Boola, E-Commerce Comes to The Quad,” which anticipates Professor Thrun’s announcement by 12 years:
"We always thought our new competition was going to be 'Microsoft University,' " the president of an elite eastern university ruefully remarked to a visitor over dinner recently. ''We were wrong. Our competition is our own faculty.'' Welcome to the ivory tower in the dot.com age, where commerce and competition have set up shop… Distance learning sells the knowledge inside a professor's head directly to a global on-line audience. That means that, just by doing what he does every day, a teacher potentially could grow rich instructing a class consisting of a million students signed up by the Internet-based educational firm that marketed the course and handles the payments. ''Faculty are dreaming of returns that are probably multiples of their lifetime net worth,'' said Kim Clark, dean of the Harvard Business School. ''They are doing things like saying, 'This technology allows someone who is used to teaching 100 students to teach a million students.' And they are running numbers and imagining, 'Gee, what if everyone paid $10 to listen to my lecture?' ''
It was a heady time, and many in higher education really believed the hype that brand-name institutions would grow to hundreds of thousands of students and that “rock star” faculty would get rich teaching millions of students online. Twelve years later, the only universities with hundreds of thousands of students are private-sector institutions whose brands were dreamed up by marketers in the past 30 years, and the only educator who has become a rock star through the Internet is in K-12, not higher education (more on him in a moment). So what happened?
The currency of higher education is degrees, because degrees are the sine qua non of professional, white-collar, high-paying jobs. The difference between not having a degree and having a degree is hundreds of thousands of dollars in lifetime earnings. So what happened is that Professor Thrun’s antecedents, like Arthur Miller, the Harvard Law professor, found that while they might offer courses, faculty cannot offer degrees. And their brand-name institutions have continued to prioritize avoiding “confusion” over extending access. Even MIT, the most forward-thinking of the lot, will ensure its new offering cannot possibly be construed as an MIT degree.
The noise emanating from these recent announcements boils down to this: when the chattering class meets Professor Thrun, it’s love at first sight. The notion that they might take a Stanford course for free recalls their youthful days at similar elite universities. But of course, these educational romantics already have degrees. And when Udacity begins charging even modest fees for its courses, Professor Thrun may find this group resistant to paying for lifelong learning.
On the other hand, you have the much, much larger group of non-elites who need a degree. The United States, once the global leader in the number of 25-34 year-olds with college degrees, now ranks 12th, while more than half of U.S. employers have trouble filling job openings because they cannot find qualified workers. The outsized importance of the degree itself over the university granting the degree or the faculty member teaching the course is the simplest explanation for the explosion in enrollment at private-sector universities.
As a result, the notion that certificates or “badges” might displace degrees in any meaningful timeframe is incorrect. Even in developing economies, where there is truly a hunger for knowledge in any form and where the degree may not yet be as central to the evaluation of prospective employees, the wage premium from a bachelor’s degree is even higher: 124 percent in Mexico, 171 percent in Brazil and 200 percent in China, compared with a mere 62 percent in the U.S. Degrees are definitely not disappearing; they’re not even in decline.
There are two important respects, however, in which this movie is different. The first must be credited to the first online “rock star” educator: Salman Khan, founder of Khan Academy. If you haven’t had the pleasure of watching a Khan video, you haven’t missed much in the way of the simulations, animations and expensive special effects many dot-com pundits predicted would dominate online learning. A Khan video is short, just a few minutes, and teaches a single concept. It does so by showing Khan’s hand on the whiteboard while you hear his narration – an approach that is especially effective for math. Professor Thrun’s online course builds on Khan’s innovation, and the resulting andragogy is remarkable.
With regard to the more important innovation, here’s what Professor Thrun had to say in his announcement:
We really set up our students for failure. We don’t help students to become smart. I started realizing that grades are the failure of the education system. [When students don’t earn good grades, it means] educators have failed to bring students to A+ levels. So rather than grading students, my task was to make students successful. So it couldn’t be about harsh, difficult questions. We changed the course so the questions were still hard, but students could attempt them multiple times. And when they finally got them right, they would get their A+. And it was much better. That really made me think about the education system as a whole. Salman Khan has this wonderful story. When you learn to ride a bicycle, and you fail to learn to ride a bicycle, you don’t stop learning to ride the bicycle, give the person a D, and then move on to a unicycle. You keep training them as long as it takes. And then they can ride a bicycle. Today, when someone fails, we don’t take time to make them a strong student. We give them a C or a D, move them to the next class. Then they’re branded a loser, and they’re set up for failure. This medium has the potential to change all that.
So when Anant Agarwal, one of the leaders of the MITx effort, notes that “human productivity has gone up dramatically in the past several decades due to the Internet and computing technologies, but amazingly enough the way we do education is not very different from the way we did it a thousand years ago,” the major advance he has in mind is not rock star professors lecturing to millions, but rather that the online medium lends itself perfectly to a competency-based approach.
The shift from “clock hours” or “seat time” to competency-based learning is just around the corner and much more fundamental to higher education than the explosion of online delivery itself. Awarding credits and degrees based on assessed competencies will significantly reduce time to completion and therefore increase completion rates and return on investment. More important, it ensures that students actually have mastered the set of competencies represented by the degree they have earned. Though not without significant challenges, this approach has the potential to revolutionize degree programs and all of higher education from within. That’s the real Wonderland adventure. And we don’t need to take a pill to find it.
So have we seen this movie before? Turns out this one’s a sequel. But this is that very rare occasion when the sequel is much better than the original.