I will threaten financial, if not physical, punishment for the next student who sits opposite me and glibly announces that s/he desires "time off." These seniors then expect me to find them a fellowship for the self-proclaimed period of inaction.
A fundamental error underlies their logic. If one takes time "off," no project exists for a foundation to fund. No one hires an employee because they "need a break." It equates to writing "Objective: Naps" across the top of a resume.
If students want to become doctors, they can and should pursue research and service interests after they complete their undergraduate degree and before they enroll in the memorization marathon known in the US as “M1.” They need the context and motivation to endure the peculiar form of torture we require of aspiring physicians.
In neither theory nor practice should such efforts carry the belittling label "time off." If working with blind babies is "time off," it means such service has no value in and of itself. It exists merely as a means to fill a vacuum while a med school applicant flies around to interviews. Who would wish such a selfish person to treat a child, let alone offer fellowship funding for the egotistical endeavor?
The problem with perceiving any activity other than enrollment in a course as “time off” persists after students return to the academy. Last week, I sat in a room with doctoral students from the recently arrived to the nearly finished. When they introduced themselves, many referred to their “time off” between undergraduate and graduate school. I pressed them to explain what that meant.
They had jobs. To them, this didn’t count in their evolution as individuals or their training as scholars. However, their non-academic experience plays a profound role in each. The dismissive “time off” label strips their activities of value; diminishes a job in data entry to the equivalent of a coma; and renders young scholars inarticulate defenders of the skills and interpretive perspectives they acquired and can bring to bear as a result of their years of labor.
I fully endorse those who take time to explore new places and learn new skills. For those who develop a strong rationale for why they need to go wherever to do whatever, someone out there may well be willing to pay them to do it. Fulbright awards go to those who know why they wish to research, study, or teach; but not to those who want a State Department stipend for time spent sunning on a foreign beach.
Most of those who desire “time off” don’t mean Margaritas in the Mediterranean. They mean a job, an income, and some independence before they return to the infantilizing aspects of graduate education.
All of these require "time on" the job, whether that is high finance-low ethics, high ethics-minimum wage, or the myriad permutations in between. The capital accrued, cultural and/or financial, remains after the retreat from autonomy to lecture halls and reading lists. When students deny their capital, they deprive themselves of the advantages they gained over those like me who proceeded apace from commencement to convocation without "time in" the ever-elusive "real world."
As someone who didn't take "time off," I am admittedly an odd person to sound the alarm about the phrase's semantic sloppiness. Nonetheless, the inherent snobbery in the assumption that any activity outside credit hours toward a degree lacks value fills me with revulsion. If only medical training counts, my life must be meaningless in the eyes of the students before me. If only doctoral degrees count, then graduate teaching assistants must despise the majority of those they teach. Something would most certainly be "off," indeed awful, in such a world.
Elizabeth Lewis Pardoe is a member of the University of Venus editorial collective and an associate director of the Office of Fellowships at her undergraduate alma mater, Northwestern University, where she teaches History and American Studies. For more, follow @ejlp on Twitter or go to http://elizabethlewispardoe.wordpress.com.