Matthew Daniel Eddy’s fascinating paper “The Interactive Notebook: How Students Learned to Keep Notes During the Scottish Enlightenment” is bound to elicit a certain amount of nostalgia in some readers. (The author is a professor of philosophy at Durham University; the paper, forthcoming in the journal Book History, is available for download from his Academia page.)
Interest in the everyday, taken-for-granted aspects of scholarship (the nuts and bolts of the life of the mind) has grown among cultural historians over the past couple of decades. At the same time, and perhaps not so coincidentally, many of those routines have been in flux, with card catalogs and bound serials disappearing from university libraries and scholarship itself seeming to drift ever closer to a condition of paperlessness. The past few years have seen a good deal of work on the history of the notebook, in all its many forms. I think Eddy’s contribution to this subspecialty may prove a breakthrough work, as Anthony Grafton’s The Footnote: A Curious History (1997) and H. J. Jackson’s Marginalia: Readers Writing in Books (2001) were in the early days of metaerudition.
“Lecture notes,” Eddy writes, “as well as other forms of writing such as letters, commonplace books and diaries, were part of a larger early modern manuscript world which treated inscription as an active force that shaped the mind.” It’s the focus on note taking itself -- understood as an activity bound up with various cultural imperatives -- that distinguishes notebook studies (pardon the expression) from the research of biographers and intellectual historians who use notebooks as documents.
Edinburgh in the late 18th century was abuzz with philosophical and scientific activity, but the sound in the lecture notes Eddy describes came mainly from student noses being held to the grindstone. For notebook keeping was central to the pedagogical experience -- a labor-intensive and somewhat costly activity, deeply embedded in the whole social system of academe. Presumably the less impressive specimens became kindling, but the lecture notebooks Eddy describes were the concrete embodiment of intellectual discipline and craftsmanship -- multivolume works worthy of shelf space in the university library or handed down to heirs. Or, often enough, sold, whether to less diligent students or to the very professors who had given the lectures.
The process of notebook keeping, as Eddy reconstructs it, ran something like this: before a course began, the student would purchase a syllabus and a supply of writing materials -- including “quares” of loose papers or “paper books” (which look pocket-size in a photo) and a somewhat pricier “note book” proper, bound in leather.
The syllabus included a listing of topics covered in each lecture. Eddy writes that “most professors worked very hard to provide lecture headings that were designed to help students take notes in an organized fashion” as they tried to keep up with “the rush of the learning process as it occurred in the classroom.” Pen or pencil in hand, the student filled up his quares or paper book with as much of the lecture material as he could grasp and condense, however roughly. The pace made it difficult to do more than sketch the occasional diagram, and Eddy notes that “many students struggled to even write basic epitomisations of what they had heard.”
The shared challenge fostered the student practice of literally comparing notes -- and in any case, even the most nimble student was far from through when the lecture was done. Then it was necessary to “fill out” the rough notes, drawing on memory of what the professor said, the headings in the syllabus and the course readings -- a time-consuming effort that could run late into the night. “Extending my notes taken at the Chemical and Anatomical lectures,” one student wrote in his diary, “employs my whole time and prevents my doing any thing else. Tired, uneasy & low-spirited.”
As his freshman year ended, another wrote, “My late hours revising my notes taken at the lectures wore on my constitution, and I longed for the approach of May and the end of the lectures.”
Nor was revision and elaboration the end of it. From the drafts, written on cheap paper, students copied a more legible and carefully edited text into their leather notebooks, with title pages in imitation of those found in printed books. The truly devoted student would prepare an index. “While many of them complained about the time this activity required,” Eddy writes, “I have found no one who questioned the cognitive efficacy that their teachers attached to the act of copying.”
Making a lecture notebook was the opposite of multitasking. It meant doing the same task repeatedly, with deeper attention and commitment at each stage. Eddy surmises that medical students who prepared especially well-crafted lecture notebooks probably attended the same course a number of times, adding to and improving the record over a period of years.
At the same time, this single-minded effort exercised a number of capacities. Students developed “various reading, writing and drawing skills that were woven together into note-taking routines … that were in turn infused with a sense of purpose, a sense that the acts of note taking and notebook making were just as important as the material notebook that they produced.”
You can fill an immaterial notebook with a greater variety of content (yay, Evernote!), but I’m not sure that counts as either an advantage or an improvement.
As a lifelong champion of civil rights and a firm believer in fighting for what is right, I applaud our young people for the various protests they have undertaken in recent years, such as Occupy Wall Street and Black Lives Matter. Recently, the young brothers and sisters of MRC Student Coalition at Matteo Ricci College, Seattle University, have taken up such a fight based on curriculum concerns. This protest, however, has become personal for me, since it is in part centered on my autobiography entitled Nigger, and the fact that some students became offended when Jodi Kelly, dean of Matteo Ricci College, recommended Nigger to a student to read.
While I strongly support their right to air their grievances, I ask these students to ask themselves if the scale of their movement is appropriate for a curriculum discussion. Can students adequately connect a recommendation to read my autobiography with their larger curriculum issues?
I am not offended by Dean Kelly's use of the word “nigger.” In fact, I am pleased that she has the foresight to want to give these young men and women the knowledge, insight and experience of a civil rights activist that might just help them understand life a little better. I am disappointed that they seem to have stopped at the title instead of opening the book and reading its contents. Years ago my mama told me, “Son, sticks and stones can break your bones, but names will never hurt you.” I grew up thinking that Richard was what they called me at home, but my real name was Nigger.
That’s why I named my autobiography Nigger, because it only echoes what “they” called me -- it doesn’t define who I am. People called me nigger in 1964 when I marched with Martin Luther King Jr., when I sat in the Birmingham jail with him and when I walked across the Pettus Bridge in Selma, Ala. When I fasted for 72 days protesting the Vietnam War, the white folks and even some black folks said, “Look at that crazy nigger.”
I have frequently said that sometimes black folks focus on the wrong injustices. For example, black folks tolerate Howard University being named after General Howard, who became famous for killing Indian children, and Spelman College being named after the grandmother of Nelson Rockefeller. Many times we rise up for injustices that are not the most oppressive. I frequently speak on college campuses and explain that we were fighting for liberation, not education. A liberated mind requires a deeper historical and analytical understanding about the good, bad and ugly regarding America’s past, and its future.
I tell students they should be concerned that some of their classmates can’t walk down the streets in certain cities without the fear of being shot by both gangbangers and misguided police officers. They should be concerned about violence and sexual assault on college campuses. The National Rifle Association wants to arm students -- and that doesn't bother you? Students should be concerned about the exorbitant cost of education and, subsequently, student debt -- the next financial crisis. Also, students should be concerned about a job market that is not going to be waiting on them upon graduation.
Some students want to punish Dean Kelly for giving them some good advice. Somehow, her advice and the hypersensitivity to her suggesting my book by its title managed to get dragged into the curriculum debate. By adding a hot-button racial component to that debate, the students managed to water down their main objectives. Movements need to be clear and well defined, yet they rarely are.
I have read the students’ list of demands, which include issues such as “Classrooms which encourage healthy academic discourse.” That includes fostering dissent, analyzing diverse narratives, discussing the intention of others and dealing with microaggressions. Classrooms must also provide space to discuss how books from previous generations may be problematic yet remain very much connected to modern-day issues.
There’s a blurb on the cover of my book from The New York Times. It says, “Powerful and ugly and beautiful … a moving story of a man who deeply wants a world without malice and hate and is doing something about it.” I’m proud that Dean Kelly recognized my autobiography as something that students of all races can learn from. I am also proud and pleased that my autobiography Nigger was among her collection of books.
I do not presume to know more than professional educators. Yet it appears to me that Dean Kelly was encouraging healthy academic discourse. I ask the students to ask themselves if they objectively considered Dean Kelly’s real intentions and if they themselves are willing to engage in academic discourse. This seems like an issue that is bigger than one dean. It also seems that if both sides were to have meaningful discussions, they would find common ground.
The person who has read and is now recommending my autobiography to her students is most likely not racist. My communication with Dean Kelly leads me to believe that she respects students’ intelligence enough to share with them the autobiography of a black man who was honest enough to name his book Nigger.
Students, your dean didn’t name the book -- I did. I am hopeful that my autobiography will become required reading at Matteo Ricci College -- and I am certain that it will be enlightening. I’ll even provide the books for free. Continue to be respectful and peaceful, loving and lovable.
God bless you,
Dick Gregory is an American civil rights activist, social critic, writer, entrepreneur and comedian.
Arguably, no concept in higher education policy today is more popular, or commands broader consensus, than institutional “skin in the game”: the idea that colleges need to be on some sort of financial hook when their students don’t succeed.
Students and families are spending near-record amounts on postsecondary training, yet students are dropping out and defaulting on loans at disturbingly high rates. Mix in high-profile collapses like Corinthian Colleges and near-daily stories of college graduates struggling to find employment, and we get policy makers coming to the disheartening conclusion that our higher education institutions are incapable of doing the very thing we expect of them -- creating capable graduates -- unless threatened with financial sanctions.
Yet is this really the case? Colleges spend a lot to recruit and retain students, and every student who leaves without completing represents lost time, money and effort -- and replacing that student requires still more recruitment and retention dollars. Students who don’t finish, or who complete but struggle to find employment, create negative reputational outcomes that institutions must invest both time and resources to counteract. Plus, when those same students leave with loan debt and struggle to repay it, the institution may yet again spend dollars and effort on default prevention services.
Put it all together and it’s pretty clear that when students fall off a successful education path, institutions pay a very real financial price. But this is exactly what having skin in the game entails. So why are we pushing for policy and regulation to accomplish what’s already taking place?
Making colleges pay a second time for poor outcomes doesn’t make much sense, although critics will counter that market-driven financial penalties simply aren’t doing enough to change institutional behavior. To believe that, however, we have to believe that institutions, as producers, actually prefer to see some of their education outputs fail.
That’s awfully strange. If institutions could control how much students learn, then why would they consciously choose to send unprepared graduates into the labor market where they struggle to find and keep employment? And if they could control who graduates and who doesn’t, what economic rationale do they have for producing a mix of graduates and dropouts? If they really had a choice, why would they ever produce anything other than graduates?
Colleges today face a continuous barrage of criticism about whether they provide value for money, and so we’re left to ask under what circumstances colleges that capably control learning, degree completion and postgraduate employment outcomes would actually opt to produce substandard products. Does a business approach that thrives on threats of greater regulatory scrutiny exist? Does a “student failure” model bringing about additional enrollment management, default prevention and reputational costs make operational sense?
It’s pretty obvious that if institutions could control the types of outcomes that skin-in-the-game proposals want to improve, they’d already be doing so. What college or university wouldn’t benefit from high graduation rates, stellar job placement statistics and graduates who earned enough money to comfortably pay off their student loans?
It’s also why the argument that the financial costs institutions already face just aren’t harsh enough doesn’t make much sense. It’s like suggesting that my dog doesn’t speak English because I’m just not spending enough time teaching him. The outcome and process we’re trying to link don’t fit the way we think they do.
What’s missing from the equation is the idea that academic success is a two-way street: students’ academic preparation, motivation and effort do as much to shape the outcomes we care about as the resources institutions provide them. Absent that idea, the obvious consequence of policies that hold colleges accountable for outcomes over which they share only partial control is that colleges put their effort into the things they can control -- which, in this case, means picking the students they think are most likely to succeed.
All of this means that the losers from skin-in-the-game proposals end up being students who have less academic preparation and who come from underresourced school districts. We actually end up creating undermatching by putting greater pressure on colleges to pick “winners” and discouraging them from taking chances on individuals who may benefit the most from the type of education they offer.
It’s also likely to hurt institutions that have open admissions policies and that currently enroll larger percentages of minority and nontraditional students. Community colleges, with their limited state budgets and high transfer rates, would suffer most, but so would any college drawing large populations of students from disadvantaged communities. In the long run, those institutions could face unsustainable financial and reputational costs.
There’s certainly a place for risk sharing in higher education, which is why institutions currently pay the real financial costs I described earlier. But if what we care about is making institutions more responsive to students’ long-run needs and expectations, then the solution lies in policies and practices that make such goals their focus.
Income-Share Agreements (ISAs) -- whereby colleges finance their students’ education in return for a fractional share of those students’ future income -- are a good example. They create not only financial penalties but also financial rewards for institutions that help students achieve long-term, sustained success. Driving more institutional revenues through ISA-style agreements also discourages the kinds of deceptive marketing practices that policy makers believe institutions engage in since colleges and universities would, over time, end up having to financially absorb the costs of misrepresenting their programs’ job placement prospects.
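The incentive the essay describes can be made concrete with a back-of-the-envelope sketch. All figures below (the 5 percent income share, the 10-year term, the sample salaries) are hypothetical illustrations, not terms from any actual ISA program:

```python
# Hypothetical income-share agreement (ISA): the college finances a
# student's education and in return receives a fixed fraction of the
# graduate's income for a set number of years. All numbers are
# illustrative assumptions, not real program terms.

def isa_revenue(annual_income, share=0.05, years=10):
    """Total payments the institution receives over the ISA term."""
    return annual_income * share * years

# A graduate earning $60,000 a year repays more than one earning $25,000,
# so the institution is financially rewarded for producing graduates who
# achieve sustained success -- and absorbs the cost when they don't.
strong_outcome = isa_revenue(60000)   # 60000 * 0.05 * 10 = 30000
weak_outcome = isa_revenue(25000)     # 25000 * 0.05 * 10 = 12500
```

Under these assumed terms, the institution’s revenue rises and falls directly with its graduates’ earnings, which is the alignment of incentives the essay is pointing to.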
It’s easy to think that simply imposing penalties on bad actors will fix the problem, yet the logic has to be there to justify the approach. The basis on which today’s risk-sharing proposals are being crafted doesn’t meet the standards of sound policy. We owe it to both colleges and students to craft policies that work toward, not against, the system’s overall objectives.
Carlo Salerno is a Washington, D.C.-based education economist and private consultant.
We need only two things to convince our communities, public officials, local employers and parents of students and prospective students about the value of a degree in the humanities: stories and data.
In the humanities, we have always used stories well. We can assemble lots of anecdotes about our graduates and how, now that they’re gainfully employed, they use what they learned in our classes. Anecdotes are clearly not enough, however. We’re definitely not winning the public relations contest about what aspects of public higher education are worth investing in. So how can we supplement our good stories with good data, while keeping the discussion firmly rooted in the humanities?
In an effort to share strategies and to get better at making the case for the value of humanities education, a group of about 40 humanities faculty members and administrators, local employers, and public humanities representatives in southern New England got together recently. We talked about what student success in the humanities looks like, how we could measure what it gives students and how we would know when we’ve helped students to achieve it.
The question of student success is on everyone’s radar these days, and the discussion usually refers to retention and graduation rates. Our discussion in New England pointed a different way, however. We wanted to bring employers into the conversation to help them to understand what our students are learning and to help us to learn what they value in new employees. That is especially important for those of us who take issues of racial and economic diversity seriously. As Karen Cardozo, assistant professor of interdisciplinary studies at Massachusetts College of Liberal Arts, pointed out at the meeting, if we can show that humanities degrees have value in the workplace, we can assure working-class students, first-generation students and students of color that following a passion for history, philosophy, literature or music can lead to a good job, too.
Here’s how our meeting went:
First, we assembled by tables, trying to make sure an employer and a public humanities representative were at each table. (Public humanities representatives include those who work at museums, state National Endowment for the Humanities affiliates, cultural councils and the like.) Employers from publishing, local government and local small businesses also participated. (We hope to involve some larger employers next time we meet.) We also mixed in representatives of two- and four-year colleges, as well as public and private institutions.
Each table considered one question at a time, and we then discussed our answers in the group as a whole. Here are the questions:
What can a humanities graduate do?
What (else) should a humanities graduate be able to do?
How can we make sure students graduate with this knowledge or these skills?
How can we measure or assess whether they can do what we say they can do?
It was great to have employers at each table, and we moved them around between groups for each question so the tables could get different perspectives. Some of the employers were already savvy about what a humanities education delivers; others weren’t sure what exactly constitutes the humanities.
Together, we compiled a list of the skills that we think graduates have cultivated in their humanities education:
Writing skills, with style
Cultural competencies, intercultural sensitivity and an understanding of cultural and historical context, including on global topics
As part of our list, we also agreed that graduates should have the ability to:
Construct complex arguments
Provide attention to detail and nuance (close reading)
Ask the big questions about meaning, purpose, the human condition
Communicate in more than one language
Understand differences in genre (mode of communication)
Identify an audience and communicate appropriately to it
Be comfortable dealing with gray areas
Think abstractly beyond an immediate case
Appreciate differences and conflicting perspectives
Identify problems as well as solve them
Read between the lines
Receive and respond to feedback
Then we asked what we think our graduates should be able to do but perhaps can’t -- or not as a result of anything we’ve taught them, anyway. The employers were especially valuable here, highlighting the ability to:
Use new media, technologies and social media
Work with the aesthetics of communication, such as design
Perform a visual presentation and analysis
Identify, translate and apply skills from course work
Perform data analysis and quantitative research
Be comfortable with numbers
Work well in groups, as leader and as collaborator
Identify processes and structures
Write and speak from a variety of rhetorical positions or voices
Support an argument
Identify an audience, research it and know how to address it
Know how to locate one’s own values in relation to a task one has been asked to perform
They also mentioned a need for better technological, project-management and conversational and interview skills.
We also discussed creating tables that would link the knowledge, skills and aptitudes of the first two questions to the kinds of work students might do after graduation, task by task. We’ve assigned that work to the participating employers.
To make sure that our students can graduate with the knowledge and skills we want to see, we know we would have to make some changes to the way our degrees are structured. Some of the changes we talked about at the meeting were:
Providing more faculty development to help professors be more explicit and intentional in language about the skills being taught
Creating a one-credit course on the relation of humanities to work and the professions
Using required courses (general education) and events (orientation) to introduce the need to connect courses and skills
Being intentional about double majoring, adding minors that enable students to pair professional training with humanities
Using successful alumni in programming
Integrating student employment with academics, through course work or portfolio reflection
Infusing reflective writing into courses
Encouraging community engagement with the curriculum
Providing avenues for student creativity to demonstrate higher-order skills
Taking on the idea of maker space—what are the humanities making?
Giving students self-assessment skills
Developing portfolios that include both work and reflection linking course work to other kinds of engagement, such as employment and student activities
Providing structured work-shadowing opportunities
Creating local employer/faculty advisory groups to determine workforce needs and establish a common language
Building reflection, work, community engagement and shadowing into the credit structure
Capitalizing in four-year colleges and universities on work already being done at two-year institutions
The final task at our meeting was to come up with ways to measure whether we are doing what we say we are doing now, as well as if we pursue the changes we want to make. We developed the following list:
Alumni surveys, to determine short- and long-term impact of humanities education
Student surveys, at entry and exit, about how their ways of thinking have changed
Internship supervisor surveys
Determining whether local employers hire our graduates, why or why not, and whether those graduates have the needed knowledge and skills
Using capstone courses to assess ways students have been asked to combine humanities and work
Gathering information that can contribute to big data. Who else is collecting what we seek, and how can we combine their data with ours?
That was the most difficult assignment, and it’s the shortest list that our group developed. That, of course, was not surprising. Assessment has always been challenging, as any regional accreditation team can tell you.
But we had started the afternoon asserting that we want the general public to support humanities education and to understand the value of what we do, and so we knew we must find good ways to collect evidence. That’ll be a topic in our next meeting.
We agreed that the next step, when we reconvene in May, will be for all of us to have made some progress on our own campuses toward both adding education in the new skills we think humanities students need and finding ways of measuring our success.
If you’re working on humanities student success initiatives, what tactics are you trying? With whom are you working? Are you getting any traction in your institution or region?
Making the case for the humanities can start on the campus, but it ultimately has to convince funders, parents and employers, too. We’re hoping to make southern New England the first Humanities Success Zone in the country -- where an employer with some job openings asks, “What kind of person would add some real value to our company beyond the specific skills we need for this job?” We want the answer that springs to mind to be: a humanities graduate.
Paula M. Krebs is dean of the College of Humanities and Social Sciences at Bridgewater State University, in Massachusetts. On Twitter, she is @PaulaKrebs.
It has long been a truism in American higher education that the junior and senior years sit at the top of the curricular pecking order. That is when the major is taken and, frankly, that is where most of our senior faculty really prefer to teach.
First year, on the other hand, is seen by many of us as less important. And because of this, guess who is often assigned general education and introductory courses? Adjuncts, graduate assistants and our most junior faculty.
It’s almost as though the introductory and general education courses that define the first two years of college are something students get through as quickly as possible so that they can get to the good stuff in their third and fourth years -- that is, upper-level courses and the major.
But this view is out of sync with what many prospective college students and their parents are thinking. In a book I recently wrote about the transition from high school to college, virtually all of the high school seniors I interviewed, along with their parents, hoped that the first year of college would be a major step up from what they were doing in high school. But they are often disappointed.
At many colleges and universities, first-year students take large introductory courses in classes of 100 or more. Teaching is usually done by an instructor lecturing in front of the classroom while students dutifully take notes later to be regurgitated on a quiz. There is very little class participation involving discussion and debate. Writing anything over a few pages is unusual.
Arizona State University has gone even further, offering a Global Freshman Academy that allows first-year students to take their courses via MOOCs (massive open online courses). Students won’t even have to leave the comfort of home to complete their first year! The first year is seen as a means to an end, with the end being upper-level courses and the major.
But I would argue that the first year of college is far more important than this -- perhaps, in some ways, just as important as the final years of college.
Why do I believe this?
First year is when college students get a sound, cross-disciplinary grounding in the liberal arts and sciences, especially those who go on to vocational majors like engineering or nursing. The liberal arts are where they learn how to think critically and how to communicate effectively, skills that are crucial for a generation that will have many different careers in their lifetime.
Attrition is at its highest between the first and sophomore years. When I was a college president, 20 percent of first-year students at my institution didn’t return for their sophomore year. Some transferred, but many dropped out of college altogether. Why does this happen? In far too many exit interviews I have seen, dropouts say that they found their first-year classes meaningless.
I will never forget the admissions tour I took at a well-known university with my youngest daughter. We were in the university’s amazing library, and the tour guide, a sophomore, was bragging about the fact that most of his teachers were graduate assistants. “They’re really cool,” he said, “and understand our generation,” whereupon a mother standing next to me uttered sotto voce (but loud enough for everyone to hear), “Why am I paying a small fortune to have my child taught by someone who is only a couple years older than she is?”
That parent was articulating what many parents I interviewed for my book were saying: for $50,000 or more per year, the expectation is that their children will be taught by experienced faculty with the requisite credentials, not by part-time employees or graduate students.
Of course, many of the instructors assigned to introductory or general education courses, including adjuncts and graduate students, are quite capable teachers. But I believe that first-year students could really benefit from also being taught by senior faculty members who excel in the classroom. In many ways -- and I know this is heretical -- assistant professors who just completed their Ph.D. dissertations are probably the most capable of teaching courses in the major, which require up-to-date knowledge of the discipline. Senior faculty, on the other hand, who through wisdom and experience have a wider view of the world, are, in my opinion, the most qualified to teach general education courses designed to give first-year students a broader perspective on human knowledge and, in the process, excite them about what will come later.
Increasingly, colleges are coming to see the crucial importance of the first year. At one college I feature in my book, the freshman writing seminar is largely taught by the college’s most distinguished and experienced senior faculty, who are handpicked because they are also master teachers. First-year advising is also being given a new emphasis. At far too many colleges, advising is relegated to new faculty who have limited knowledge of the curriculum or to adjuncts who have equally limited office hours. But many colleges, realizing that solid advising reduces attrition, are assigning experienced faculty who are skilled at advising or professional advisers to first-year students.
For these colleges and universities, the first year has been given a new priority.
I’d like to end by saying that there is money to be raised by rethinking the first year, which should make presidents who are reading this article happy. I believe that philanthropic individuals and foundations, concerned about the cost of higher education and the human waste when students prematurely drop out and don’t graduate, will respond to programs that support first-year students and keep them in college. I’m talking about:
Innovative first-year general education programs that challenge and excite first-year students through active learning (including discussion, debate and writing) so that they don’t want to leave college.
Endowed writing centers and other support systems that can save kids who come to college with academic deficiencies.
Endowed first-year opportunity programs that keep underserved and first-generation students in college.
Attrition is enormously expensive. A college of 2,000 students like my own that loses 20 percent of the first-year class potentially forgoes $5 million or more in tuition, room and board, which for many colleges is more than the development office raises each year in the annual fund.
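The arithmetic behind that $5 million figure can be sketched roughly as follows. The college size (2,000) and 20 percent attrition rate come from the essay; the first-year class size and the per-student annual charge are assumptions chosen to show how a total of this magnitude arises:

```python
# Rough sketch of the attrition cost cited in the essay. Only the 2,000
# enrollment and 20 percent attrition figures are from the text; the
# class size and annual charge below are illustrative assumptions.

first_year_class = 500      # assume ~1/4 of a 2,000-student college
attrition_rate = 0.20       # 20 percent don't return (from the essay)
annual_charge = 17000       # assumed tuition + room + board per student
remaining_years = 3         # revenue lost over the sophomore-senior years

students_lost = int(first_year_class * attrition_rate)        # 100 students
forgone_revenue = students_lost * annual_charge * remaining_years
print(forgone_revenue)      # 5100000 -- roughly the $5 million cited
```

Under these assumptions, losing 100 students costs the college about $5.1 million in forgone tuition, room and board over their remaining three years, which is consistent with the essay’s estimate.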
In summary, by putting more energy and resources into the first year I believe we keep more of our students in college and thereby cut down on the enormous human waste when otherwise good students prematurely leave college with outsize debts they can’t pay back because they are unemployable. At the same time we improve our bottom line by not losing so much in tuition dollars. Most important, we graduate students for whom education from the very beginning is a pleasure, not a hardship to be endured.
Roger Martin is president emeritus and professor of history at Randolph-Macon College. He is the author of Off to College: A Guide for Parents. This essay is based on a presentation at the Council of Independent Colleges’ Institute for Chief Academic and Chief Advancement Officers.