When I started teaching writing at City College in 2002, I took a poll and every one of my students had a cell phone. I told them that I didn’t have one. It was meant to be a humanizing detail, an icebreaker. I explained that I thought phones you carry with you were invasive and distracting, and thus dangerous. I should have, but didn’t, ask them to write about the topic.
Instead, I took advantage of the good feelings in the room as an opportunity to outline my cell phone policy, strictly enforced for years: No cell phones in class, ever. If I saw one out or heard a ring, I would ask the student to leave. I wanted to make the point that while students are in class, or doing anything for that matter, they should give the task at hand their undivided attention. It should be noted that there was no equivalent policy against doodling, staring out the window (in the rare instances when there was one), or staring at a classmate’s tight clothes.
In 2006, after my first child was born, I was able to resist getting a cell phone for another two years, to the amusement and frustration of my friends and family. Finally, when my wife was pregnant with our second child, and I was commuting 90 miles upstate twice a week to teach, I came home one day to find a pay-by-the-minute phone activated for me. For a year and a half, I used the phone only when necessary (it was amazing how difficult it had become to make plans with someone — "We’ll meet around eight, somewhere downtown; I’ll text you the exact time and place around then. Oh right, you still don’t have a cell phone"). At the beginning of each semester, I’d show off my bulky, bare-bones phone to classes so they could have a laugh at how primitive it was, and then I’d still drop the hammer on my strict cell phone policy.
But then, all of a sudden, something changed in me: I finally wanted a decent cell phone. And no, it wasn’t because I wanted an iPhone. It was just that I wanted a phone I could do things with, like kill smug pigs with exploding birds or find out where the traffic was in Queens without waiting for the radio’s report on the 8s or 1s of the hour. So I bought a relatively cheap smartphone.
I didn’t tell my students, nor did I dare, for fear of setting a bad example, pull it out in their presence. But within a week, the students must have somehow sensed something different in me. Requests never made before began popping up. During an open-book reading test, a student asked if he could use a downloaded version of the book on his phone. All right, I told him. A few days later, in a different class, as I was putting an assignment’s instructions on the board, a student asked if he could take out a cell phone to snap a photo of the instructions instead of writing them down. Why not?
Now I start every semester teaching the difference between the registers of social and academic English by having students translate their informal, acronym-filled text communiqués into formal academic prose. They get it instantly. "LMAO" gets changed to "I find that funny." "OMG shes such a skank" becomes "Wow, she is dirty." The longer and more incomprehensible the message, the more I learn. How else would I have known that "whip" can mean car or that "white boys" may refer to rolling papers?
All of these experiences had only suggested to me that cell phones might be useful as educational tools to a very limited extent. For some time, I continued to believe that by and large they still didn’t belong in the classroom. Until recently, that is. A few months back, I was listening to a radio program about Tony Schwartz, a New York field-recording specialist, whose work dates back to the 1950s. I was in the car, stopped at a light, and without a pen to write down information about an upcoming event on Schwartz, I pulled out my phone and, in an instant, recorded a voice memo. Later, after listening to the voice memo, I was reminded that I wanted to record a poem I had been working on. I printed out a draft, and instead of opening my laptop, I took out the phone.
Perhaps the most frustrating part of teaching writing is reading student papers filled with myriad obvious anacoluthons. The mistakes themselves are not what frustrate. Instead, it’s how these mistakes suggest the students didn’t bother to read their own work even once before handing it in; it’s how the students have ignored the most oft-repeated proofreading advice given by writing instructors at all levels (and one I repeat with each assignment): read it aloud before you hand it in. But sitting there with the printed poem and the cell phone, it suddenly occurred to me: I can make them do this. I can make them read their work aloud and demonstrate it by uploading the audio file online, all before they hand in a final draft of a paper.
Last semester, teaching developmental writing at Queensborough Community College, I gave my students explicit instructions on how to record files of themselves reading their papers, both on their phones and in the computer lab. I also demonstrated how fast they should read, making a point of using a document that had errors. When I reached an error, which from the snickers I could tell they had all heard too, I would interrupt my deliberate cadence and ask for suggestions on correcting the mistake. After correcting it, I would start recording again, repeating the process until I produced a reading without any writing mistakes (as opposed to reading mistakes, which I say are fine as long as they are corrected with a rereading).
The student responses to these assignments have been mostly positive. Some students already struggle with technology, however, and are none too eager to use it more. Others resent having to do more work than they think a writing assignment should require. A fair number, though, have grown to appreciate how much this technique helps them -- and not only in finding grammatical mistakes. More than one student has reported noticing that their arguments or narratives don’t make sense when read aloud. My own sense is that the progress students made this semester has exceeded that of previous semesters.
I should say the process isn’t a panacea. One student, who has explained in other writing projects that she is in college because her parents are forbidding her to go to cosmetology school, uploads files that sound like an LP sped up to 45 rpm. I reply to her postings, asking her to slow down her reading. She has slowed down some, but not completely. Regardless, she has done the work, and while at the beginning of the semester I could barely understand one of her sentences, she is now writing papers that still contain a number of run-on sentences, but clear ones.
Another student, who is registered with our SSD (Services for Students with Disabilities) office, writes almost exclusively in simple sentences. If I assign a two-page paper, maybe I’ll get two complex sentences from him. When I listen to his readings, his voice is flat and mechanical. There were very few mistakes in his sentences to begin with. My challenge of helping him learn how to subordinate and coordinate thoughts has not been met by these recording assignments. Still, many students have become competent and confident writers.
I have started grading the papers while listening to the students’ readings of them. When the students read at a deliberate pace, I have more than enough time to highlight mistakes and begin my comments in the margins. When the information from the student is clear and cogent, I put a check next to the sentences. Again, my sense is that I have never gone through so many papers giving almost nothing but checks. At least three times this semester, I’ve written on a student paper, "This is one of the best papers I have ever read in this class." I can’t remember writing that once before in my two years at Queensborough.
Recently a colleague at Queensborough stopped by my office to chat. He mentioned he had to ask someone to leave his class that day because she was using her cell phone during class. He knows of my strict cell phone policies of the past. “It just drives you crazy,” I sympathized with him.
He then noticed the headphones on my desk. “What are you listening to?” he asked. “Student papers,” I told him. I explained the project and how I was encouraging students to use their cell phones to make the recordings. “I’ve finally surrendered,” I offered, apologizing for abandoning the no-cell-phone-ever hardliners.
Generous interlocutor that my colleague is, he mused, “I guess you’re right, we have to find a way to make them useful in class.” That wasn’t what I was thinking or intending, but there you go.
Jed Shahar is assistant professor in the Basic Education Skills Department of Queensborough Community College of the City University of New York.
A group of companies and higher ed groups on Tuesday announced a project aimed at expanding Internet capabilities at rural colleges and universities across the country. The project, called AIR.U., would increase the broadband available to those institutions and their neighbors by harnessing the unused frequencies, called "white space," of defunct television channels. The partners in the deal, which include Microsoft, Google, the New America Foundation and the National Institute for Technology in Liberal Education (among many others), are billing the project as an altruistic effort to "[upgrade] broadband offerings in those communities that, because of their educational mission, have greater than average demand but often, because of their rural or small town location, have below average broadband." The first networks are expected to come online early next year.
Recent events at the University of Virginia following the decision of the institution's governing board to remove its president after only two years in office have brought to light some questionable claims that have been animating educational reformers lately.
In a statement justifying the Board of Visitors’ decision, the Board’s rector, Helen Dragas, asserted that U.Va.’s president, Terry Sullivan, was unwilling to make the kinds of changes necessary at a time when universities like Virginia are facing an “existential threat.” The times, Dragas claimed, call for a bold leader willing to impose “a much faster pace of change in administrative structure, in governance, in financial resource development and in resource prioritization and allocation” than was Sullivan. “The world,” Dragas believes, “is simply moving too fast.”
Dragas’s comments echo those of many reformers who believe that new technologies are producing “disruptive innovation,” forcing universities, which are supposedly conservative by nature and controlled by faculty who are invested in outmoded ways of doing things, to transform themselves to meet the needs of the 21st century.
The claim behind this is, quite simply, that new technologies alone have made it possible to transcend the “traditional” campus model. Whether through new learning technology on campus or through distance online education, technology will cut costs, improve access, and completely reframe the foundations of the American academy: tenure, shared governance, and the centrality of brick-and-mortar classrooms and flesh-and-blood faculty.
The problem with this refrain is that it ignores the importance of ideas and politics. As Mark Blyth of Johns Hopkins has written, disruptive innovation is not just a technological act, but one in which new contexts enable people with pre-existing ideas for reform to push their pre-existing agenda. In short, ideas can be used to determine what gets defined as a crisis and what gets defined as the appropriate solution.
In this case, the constant iteration of “disruptive innovation” provides an aura of inevitability to issues that are to be worked out on the ground through politics, on the one hand, and through decisions by specific colleges and universities, on the other. There is nothing inevitable about how new technology is used, what its goals are, and who will control it: these are all matters that someone -- or some group -- will decide.
The refrain of disruption is misleading. It suggests that the “traditional” college is on its last legs, when this is far from being the case. In fact, there is nothing inherently disruptive about new technologies. At their best, as the New York Times's David Brooks rightly notes, new technologies can improve colleges by allowing teachers to be more effective within classrooms.
At their worst, new technologies will be used to replace the human dimension of education with machines. But those are choices that we, as citizens, policy makers, and members of the academy are free to make. Change may be inevitable, but the direction and meaning of change are not.
For decades critics of higher education have sought to limit the centrality of the liberal arts -- and the humanities in particular -- in the American college curriculum. For decades, critics of higher education have sought to eliminate tenure and reduce shared governance to make universities more accountable to managers. For decades, critics have sought to rely on market-based ideas both to fund universities and to determine which programs are worth funding. None of these criticisms relied on new technologies. They were present in the 1970s and 1980s.
What we are really seeing is not necessarily a moment of disruptive change.
Rather, those who are already hostile to the academy are invoking the idea of disruption to convince the rest of us that the changes they desire are inevitable.
The new technologies are an excuse; the reality is that many of the changes being imposed on universities across America -- and exposed in the debates at the University of Virginia -- are not about technology and disruptive innovation but about those who have a particular vision of American higher education and want to see it happen.
In short, it’s about politics and values, and there’s nothing inevitable about those.
Johann Neem is associate professor of history at Western Washington University. He received his Ph.D. in history from the University of Virginia in 2004.
While it may go against the grain for faculty members who aren't digital natives, Paula Dagnon and Karen Hoelscher explain how to find out whether creating an electronic portfolio of your work is right for you.
The Bill & Melinda Gates Foundation on Tuesday announced $9 million in grants for "breakthrough learning models" in higher education:
The awards include:
$3.3 million to EDUCAUSE for four winners of the Next Generation Learning Challenges' latest RFP. These winners include state systems, four-year and two-year programs, and all have signed up to deliver significant improvements in completion at scale, at affordable tuition rates.
$3 million to MyCollege Foundation to establish a nonprofit college that will blend adaptive online learning solutions with other student services.
$1 million to Massachusetts Institute of Technology to develop and offer a new, free prototype computer science online course through edX, a joint venture between MIT and Harvard, and partner with a postsecondary institution that targets low-income young adults to experiment with use of the course in a "flipped classroom."
$450,000 to the League for Innovation in the Community College to develop and pilot a national consortium of leading online two- and four-year colleges that will help increase seat capacity in the community college system and support more low-income young adults in attaining a postsecondary credential. The consortium will initially include Coastline Community College (CA), the University of Massachusetts Online, Pennsylvania State World Campus and the University of Illinois-Springfield.
In a recent Wall Street Journal interview about college costs and online learning, Stanford University President John Hennessy said, "What I told my colleagues is there’s a tsunami coming. I can’t tell you exactly how it’s going to break, but my goal is to try to surf it, not to just stand there." Stanford and other elite institutions, such as Harvard University, Carnegie Mellon University and the Massachusetts Institute of Technology, are not sitting back and waiting for technology to disrupt higher education — they are out there experimenting with both delivery formats and cost. They are part of the change. This is why they are elite. They boldly anticipate. And they have the wealth, confidence and unassailable market niche to do so.
But are they looking in the right place for that tsunami? I would argue “no!” Much of their current effort is directed at experimenting with online learning. This is a necessary component of the massive change that potentially will reconfigure higher education in the United States. Princeton and Stanford Universities and the Universities of Michigan and Pennsylvania have combined to form Coursera, offering selected courses to the public free of charge. Harvard and MIT have announced a new nonprofit partnership, known as edX, to do the same. Carnegie Mellon is offering its Open Learning Initiative (OLI) to the public.
But all of these efforts are not the tsunami. Open online learning is merely a tool that adds variety to how education is delivered. And many 18-21-year-olds and their families still believe — despite the rhetoric to the contrary — that a college education is as much about maturing in a residential setting as it is about learning or getting a job.
No, online learning may be part of the current, but the tsunami itself will be something different. The tsunami will come from a notion as old and as distinctive as American education itself. The notion about which I speak is that education takes place not just in the classroom — and now through a computer, iPad or smart phone screen — but literally "everywhere, anywhere, anytime."
Yes, education happens in schools and colleges, but it happens also in the home, on the job, at places of worship and through individual initiative. Education also is never finished. A degree offered decades ago — even a few years ago — is obsolete with respect to up-to-date factual knowledge (critical-thinking skills, leadership skills in a residential setting and historical knowledge stay relevant, however). The "anytime" in a distinctively American education means that there is an imperative to amass knowledge through a lifetime and demonstrate acquisition.
Now, imagine that a highly respected, unassailable institution or set of institutions offers a set of completion exams at the bachelor’s level to anyone everywhere, anywhere, anytime. One need only look at the GED, or to some extent the Western Governors University, to say this is possible. Of course, a GED probably doesn’t have the "prestige" of a regularly earned degree and the WGU is still a new model. But we are talking here about what is possible over time with experimentation, improved technologies and unrelenting public pressure to offer an undergraduate education at a more reasonable price than currently predicted.
Necessity clearly still drives invention. Imagine that this move is made by those extremely prestigious research universities currently at the initial stages of experimentation with online learning, open access and the awarding of certificates. Imagine that these universities find a way to match online the high level of academic achievement on their residential campuses, are secure in knowing that there will always be sufficient students who want a traditional residential experience on their campuses, and convince their alumni and the public that their coursework on campus and online is academically equivalent as far as the transfer of knowledge is concerned. Would they ultimately leave money on the table in times of ever-increasing financial constraint and unrelenting demand to fund pioneering research? Would they refrain from total market dominance?
Imagine the moment when these completion exams permit a person to assemble learning from a variety of academic institutions and life experiences to complete a degree. At that moment, the monopoly of institutions over source and cost loosens, and the student gains control of how knowledge is to be gained and at what price. At that moment, the sources of learning are severed from credentialing. At that moment, American higher education is radically changed.
A tsunami is in the making, but it will encounter a wall of resistance in yet another defining characteristic of American higher education — a 24/7 residential learning and living experience that aims not just to transfer knowledge to 18-21-year-old students, but also to guide their maturation into citizenship. This pushback will be located squarely in the historically prestigious liberal arts colleges and in those institutions, like the Ivies and the major research universities, confident in securing undergraduates regardless of alternative developments because they have the wealth to afford what always was. But this wall of resistance does not run very deep when it comes to all students. All the governors and other policy makers embracing WGU and other forms of recognition for prior learning, as well as online learning, seem quite willing to give up that residential experience, at least for other people’s children.
This residential learning is often inefficient, costly and repetitive, and that is because many developing young people are emotionally and intellectually unpredictable during undergraduate years. The mission for much of 18-21-year-old undergraduate education is to move these students to another level of maturity and corresponding engagement. It is a worthy pursuit. It is education for democracy.
The tsunami is close to shore. The warning siren is sounding. But the outcome is not evident. A barrier — albeit increasingly thin — formed by commitment to undergraduate residential education for democracy confronts a wave of convenience and necessity defined by centralized credentialing, dispersed sourcing of knowledge and learner-controlled pricing. This is the wave to surf and the shoreline to protect.
William G. Durden is president of Dickinson College.