I turn and pause, and I know I have only one step to take.
She sees before I even understand it that I’m blocked. My face is frozen, my eyes blank. Vera, on her toes, raises a forefinger. We’ll catch the next go-round.
My dance instructor is ever encouraging and perfectly tactful, but I burn with shame.
The other couples are moving. They occasionally stumble, but they continue. I’m blocked the way a wind-up toy is blocked by a wall, its feet nevertheless churning. Even when there is no literal wall, there is sometimes a figurative wall in front of me, and it feels as if my feet have only the one-two rhythm, side to side. I cannot go forward or backward, although one of my feet must. But which one? And then what?
The motion has to flow, and I cannot. It is the ever-intensifying faintheartedness of leaping into a cold lake or off a diving board, or while driving onto a freeway trying to merge with a stream of cars.
I would go, but I can’t.
“We’ll try again,” says Vera. She offers her hands, and I can do that, I can take them.
I’ve never been able to dance unself-consciously except while clowning it up for my own kids and nieces. I signed up for Vera’s dance lessons because I wanted to please my wife. Suzanne deserves a husband who can dance in public, I tell myself -- the thought of which increases rather than diminishes the pressure.
Get loose, Bob! Relax!
After all, as an English professor at a community college, I continually encourage my hesitant students to let go, to write freely -- not to answer or please me but to show themselves what they’re thinking. I tell them they can write their panic right down on the paper.
“But I’m stuck.”
“Write, ‘I’m stuck …’ and then ask the question you know I’d ask: ‘Why?’ And then continue with ‘because …’ And go on like that. You always have the room to write what’s going on in your head right now.”
And Vera has said pretty much the same to me: I always have the room to move my feet and create the space I want, that remembering the steps or forgetting them needn’t stop me from moving. My feet still work, and the music keeps playing, and she or my wife is waiting for me …. So while trying to decide where to move, don’t stop -- just move!
“It’s life, it’s rhythm, and you can do it with or without the music, with or without the steps,” says Vera. “You’re confused because you’re blocking your body with your own thoughts.”
But “I’m not thinking,” I want to tell her. I’m shutting down. I’m trying to listen to her, to the music, to my memory.
In the classroom, I sympathize with my students when I see them stuck, but I know now it’s nothing compared to the truckloads of pity I feel for myself on the dance floor.
I try to trick the students into experiencing (not “learning” but doing!) various ways of dealing with anxieties about writing. But Donald -- 19 years old, a native speaker of English, not unintelligent, obeying the law I wish I would obey on the dance floor -- stumps me. He plunges in, never looks back, no hesitations, no regrets!
In class, I watch him, and I’m impressed. His pen is moving all the time, ink covering line after line. But when I get home to my rhythmically consoling rocking chair where I do most of my marking, I read on and on, confused by Donald’s plainspoken nothingness and carelessness. One phrase thoughtlessly follows another: “Nick was idean in a boat on water and don’t know the idean lady and the husband takes his ax out and makes the baby born and the doctor named George saw this and liked smoking with ideans.”
He even carelessly miscopies the title of “Indian Camp,” calls the author not Hemingway but “earnest” and hops and skips along, summarizing the story into long curls of nonsense that would bring him up short if he only registered what he was saying. But he tumbles on: he is scrupulous about not looking at what he’s written because, he tells me one day, it would freeze him.
“Did you read what you just wrote?” I ask as he rises up, half crouched, writing the last phrase of what was supposed to be a response to, not a summary of, the story.
“To be honest, no,” he responds, walking up and handing me the paper.
“You have to.”
“Really, Professor … it’s forward or nowhere. I can’t.”
The other students are listening.
He avoids my eyes but makes tiny shakes of his head. I don’t like putting him on the spot. “Hold on to this,” I say, pointing to the paper, “and take a break. And come back and try to read it just the way you read somebody else’s work.”
He takes it and goes back to his desk. He sits a moment, then gets up and puts it in his folder and slides the folder into his backpack. He nods at me, and then he walks past with his pack and says, “I’ll be back in a few.”
Is Donald really to be the person I model myself upon on the dance floor? He doesn’t come back for a week!
I ask him the following Tuesday, “Do you still have your response to ‘Indian Camp’?”
“The what?” He looks for a minute in his bag. “This?” He pinches it like a dirty diaper and reluctantly offers it to me.
But wait, there’s also Marya, who dashingly composes first in Ukrainian and then translates into a peculiar English -- an English she has never heard or read before. Back home, she has told me, she wrote hundreds of essays, and though I try to discourage her and other ESL students from composing in their native languages, she tells me that the Ukrainian is a constant stream that can’t be shut off.
Her writing for class is a kind of argument with English, as if she imagines she’s showing the English grammar how to reform itself: “According to me, Nicholas is child who both neither death nor being born knowledge they have given him to understand. Father, by me, is made pathos by act of many sufferings seen by son.” She’s in touch with her own private “Ukrainlish” and -- this is true (she’s argued the point with me) -- I usually understand her perfectly, so why fuss?
I try to explain: “Because the writing makes me …” I cringe (which is a visual aid I hope she gleans) and try to think of another word that conveys cringe. “It reads with a thick accent,” I say, “and while I love accents in speech, I can’t help thinking we don’t want to show it in writing if we can help it.” I don’t mention how she unrepentantly crowbars English grammar into places it’s never been.
I decide I could try Donald’s method as a dancer and go blank and ignore my own incoherent moves, or I could try Marya’s method. I could wrench my partner (my wife, lucky girl!) around the floor, listening to an inner rhythm that bends the music and my partner to it. Or I could keep doing what I have been doing, stumbling and halting, my face borscht red, my wife and Vera hopefully and anxiously awaiting my improvement.
Bob Blaisdell is a professor of English at Kingsborough Community College of the City University of New York.
As someone who teaches young adult fiction at a university, I am troubled by the recent crop of opinion pieces about adults who read this genre. At Slate, Ruth Graham wants anyone over 18 to be embarrassed to enjoy YA (as those who study, catalog, or publish the genre call it). And on the opinion page of The New York Times, in a piece plaintively titled “Adults Should Read Adult Books,” Joel Stein writes, “I’ll read The Hunger Games when I finish the previous 3,000 years of fiction written for adults.” Over at The New Republic, at least, Hillary Kelly thinks you should have the courage to read whatever the hell you want.
I like Kelly’s commitment to seeing some continuity between adulthood and childhood. However, both those who defend the adults who now read YA and those who attack them seem to assume that such readers have suddenly departed from a long-established norm of adults reading novels written for adults only.
Whatever you think of YA’s mixed-age readership, there is one thing you should know: there is no 3,000-year history of fiction written for adults. There is barely a 100-year history of such fiction. The adult novel is a relatively new invention, one that is not much older than YA itself. So all the adults now skulking or striding proudly down the ever-expanding YA aisle are not in fact breaking with a long tradition of adult reading. If we look back a couple of centuries, we find that in many ways YA’s mixed-age readership is perfectly normal for the Anglo-American novel. Fiction about young people triumphing over adversity in morally satisfying ways has long been default reading for people of any age who read fiction at all.
Look at the title page of Samuel Richardson’s 1740 breakthrough novel Pamela, which shattered sales records and inspired Pamela fans, teacups, and a multitude of other consumer tie-ins. It proudly announces that the book was written in order to improve “the YOUTH of BOTH SEXES.” And its protagonist, Pamela Andrews herself, is a beautiful and indomitable 16-year-old who confronts the perils, sexual and otherwise, of a hostile world, winning a resounding finale of emotional and material rewards. Sound familiar?
The winning adventures of one plucky young protagonist or another play out through two centuries of the Anglo-American novel. And it’s not just these characters and the arc of their plot that resemble YA, but the ages of these novels’ actual readers as well. From Pamela’s day through the end of the 19th century, these novels were devoured by readers of all ages. They promised to teach moral lessons to inexperienced young people, and we have records of children as young as nine weeping over Pamela. But masses of older people read them as well.
Until recently, even boundaries between more specialized children’s literature and what we now see as literature for adults were quite blurry: Frances Hodgson Burnett’s The Secret Garden first appeared in The American Magazine, whose other contributors included Upton Sinclair, F. Scott Fitzgerald, and Sir Arthur Conan Doyle; E. Nesbit’s Adventures of the Treasure Seekers was first serialized in the similarly eclectic Strand. When Little Women came out in 1869, lawyers, merchants, and office clerks happily chatted at work about the tribulations of the March girls.
People who are shocked by the fact that The Fault In Our Stars has mixed-age market appeal, even after witnessing the sales of Harry Potter and The Hunger Games, will be equally shocked by a list of turn-of-the-century American best-sellers. Heidi, Alice's Adventures in Wonderland, Little Women, and Little Lord Fauntleroy topped the charts between 1865 and 1914. Ever since T.S. Eliot, critics have tended to draw a bright line between Adventures of Tom Sawyer, which Eliot called a "boys' book," and Adventures of Huckleberry Finn, which Eliot insisted "does not fall into the category of juvenile literature." However, through the first decades of the 20th century, both books were praised as equally fabulous for "boys of all ages" — which meant that they were good for a male of any age whatsoever. Gender, not age, was the criterion for identifying appropriate readers. The "great works of American fiction," the prominent literary critic Leslie Fiedler wrote in Love and Death in the American Novel (1960), "are notoriously at home in the children's section of the library."
So what is “adult fiction” and why do we now use it as our standard? Adult fiction has never been a description of what most adults actually read, but rather a fairly new aesthetic and psychological standard riding the coattails of a trendy political ideal. In principle, adulthood is an egalitarian idea. The claim that everyone has the right to vote when they turn 18, for example, implies a more leveling view of the world than does claiming this right as the exclusive property of a few people who own a lot of property. Coming into adulthood is now supposed to mean coming into power, in your personal life and in the wider world. But until quite recently, most people, most of the time, did not expect age alone to bring them much power over much of anything.
Late 19th- and 20th-century public policy and the emerging discipline of psychology charted a new path through life. In particular, universal state-sponsored education structured lives according to a new sense of age. Everyone became part of a cohort: we now read — as we reason, play, and love — at, above, or below grade level. From kindergarten eligibility to child-labor restrictions, from voter registration to old-age pensions, this path created newly precise and standardized age distinctions and invested them with meaning. And the apex of all these developmental schemes is adulthood, which in the 20th century became not only a key legal status, but also an always-out-of-reach personal aspiration — the golden moment when we transcend our lousy judgment, sexual confusion, self-centeredness, and other woes. And in this sense, far from being an egalitarian, leveling sort of idea, adulthood becomes a deliciously elite one: most adults, it turns out, are not adult at all. Modernist novelists of this era like Henry James, D. H. Lawrence, and James Joyce, and the critics who valued them, were the first writers to rely on “adult” as a synonym for “good.”
In playing down plots that reward good deeds and punish bad ones (and in playing up ambiguity, formal complexity, and explicit sex), the Modernists were not writing for an existing adult audience. They were calling it into being. They were fighting to demolish the mixed-age audience they had inherited: "Nothing is so striking in a survey of this field, and nothing is so much to be borne in mind, as that the larger part of the great multitude that sustains the teller and the publisher of tales is constituted by boys and girls," lamented Henry James. Eventually, the Modernists prevailed — in ideals about reading if not actual practice. Their new idea of the adult novel meant that other kinds of novels became suddenly and conspicuously non-adult, enabling all the guilty pleasures of the self-consciously crossover reader that flourish so vigorously today.
Modernist ideas about adult reading were especially appealing to mid-20th-century English departments. They took root in part because they were helpful in establishing the profession of literary criticism as an adult affair, a proper part of the intellectual life of the university, in contrast to the poorly rewarded child-centered work of primary and secondary education.
So, as a market phenomenon that cashes in on a high-stakes, intensively calibrated sense of age, YA is indeed a late-20th- and 21st-century thing. But there is nothing new at all about great numbers of fully grown people reading fiction that was not written for adults. What is fairly new is the value we place on a particular sense of adulthood. There are lots of interesting arguments to have about what makes any novel bad, good, or great. Using age as shorthand for aesthetic quality is not the best way to frame these arguments. Since the ideal of adulthood is now so important, whenever another YA book tops the best-seller lists, the opinion pieces on mixed-age readership will continue to fly. But awareness of the complex history of age and reading may help to deepen the discussions we have about the place of YA in an English department’s curriculum.
Ask anyone professing the humanities today and you come to understand that a medieval dimness looms. If this is the end-times for the ice sheets at our poles — and it is — many of us also understand that the melt can be found closer to home, in the elimination of language and classics departments, for instance, and in the philistinism represented by governors such as Rick Scott of Florida and Patrick McCrory of North Carolina, who apparently see in the humanities a waste of time and taxpayer subsidies. In the name of efficiency and job creation, according to their logic, taxpayers can no longer afford to support bleary-eyed poets, Latin history radicals, and brie-nibbling Francophiles.
That there is a general and widespread acceptance in the United States that what is good for corporate America is good for the country is perhaps inarguable, and this is why men like Governors Scott and McCrory are dangerous. They merely invoke a longstanding and not-so-subtle stereotype: the pointy-headed humanist whose work, if you can call it that, is irrelevant. Among the many easy targets, English departments and their ilk are convenient and mostly defenseless. Few will rise to rush the barricades with us, least of all the hard-headed realists who understand the difficulties of running a business, which is what the university is, anyway.
I wish, therefore, to propose a solution that will save money, save the humanities, and perhaps make the world a better place: Close the business schools.
The Market Argument
We are told that something called “the market” is responsible for the great disparities in pay between humanities professors and business professors. To a humanist, however, this market is the great mystifier; we find no evidence of an “invisible hand” that magically allocates resources within the university. The market argument for pay differentials between business professors and historians (average pay in 2014 for full professors at all institutions: $123,233 and $86,636, respectively, a difference of almost 30 percent; average at research institutions is $160,705 and $102,981, a difference of 36 percent), for instance, fails to convince that a market is operating. This is because administrators and trustees who set salaries based upon what the market can bear, or what it calls for, or what it demands, are actually subsidizing those of us who are manifestly out of the market.
Your average finance professor, for instance, is not a part of this market; indeed, she is a member of the artificial market created by colleges and universities themselves, the same institutions that tout the importance of critical thinking and of creating the well-rounded individual whose liberal arts study will ostensibly make her into a productive member of our democracy. But the administrators who buy the argument that the market allocates upward of 20, 30, or 40 percent more for the business professor than it does for her colleague in the humanities have failed to be the example they tout: they are not thinking.
The higher education market for business professors and legal scholars, for instance, is one in which the professor is paid as if she took her services and sold them on what is commonly called the market. Which is where she, and her talents, manifestly are not. She is here, in the building next to ours, teaching our students and doing the same work we are. If my daughter cuts our lawn, she does not get paid as if she were cutting the neighbor’s lawn.
The business professor has sacrificed the blandishments of the other market for those of the university, where she can work softer hours, have her December/January vacation, go to London during the summer on a fellowship or university grant, and generally live something approaching the good life — which is what being employed by a college or university allows the lucky who earn tenure. She avoids the other market — eschews the long hours in the office, the demands of travel, the oppressive corporate state — so that she can pick up her kids from school on occasion, sleep in on a Saturday, and turn off her smartphone. She may be part of a machine, but it is a university machine, and as machines go she could do worse. This “market” is better than the other one.
But does she bring more value to the university? Does she generate more student hours? These are questions that administrators and business professors do not ask. Why? Because they wouldn’t like the answers. They would find that she is an expensive acquisition. Unless she is one of the Wharton superstars and appears on CNN Money and is quoted in The Wall Street Journal, there’s a good chance that the university isn’t getting its money’s worth.
The Moral Argument
There is another argument for wishing our business professor adieu. She is ostensibly training the next crop of financiers and M.B.A.s whose machinations have arguably had no salutary effects on this democracy. I understand that I am casting a wide net here, grouping the good with the bad, blaming the recent implosion of the world economy on business schools. One could, perhaps, lay equal blame on the mathematicians and quantitative analysts who created the derivative algorithms and mortgage packages that even the M.B.A.s themselves don’t understand, though there’s a good chance that business school graduates hired these alpha number crunchers.
Our investment bankers and their ilk will have to take the fall because, well, they should have known better. If only because, at bottom, they are responsible — with their easy cash and credit, their drive-through mortgages, and, worst of all, their betting against the very system they knew was hopelessly constructed. And they were trained at our universities, many of them, probably at our best universities, the Harvards and Princetons and Dartmouths, where — it is increasingly apparent — the brightest students go to learn how to destroy the world.
I am not arguing that students shouldn’t take classes in accounting, marketing, and economics. An understanding of these subjects holds value. They are honorable subjects often horribly applied. In the wrong hands they become tools less of enlightenment and liberation than ruthless self-interest. And when you have groups of like-minded economic pirates banding together in the name of self-interest, they form a corporation, that is, a person. That person, it is now apparent, cannot be relied upon to do the right thing; that person cannot be held accountable.
It’s not as if this is news. Over 150 years ago, Charles Dickens saw this problem, and he wrote A Christmas Carol to address it. The hero of Dickens’s novella is Jacob Marley, who returns from the grave to warn his tightfisted partner Ebenezer Scrooge that he might want to change his ways. When Scrooge tells Marley that he was always a “good man of business,” Marley brings down the thunder: “Mankind was my business. The common welfare was my business; charity, mercy, forbearance, and benevolence, were, all, my business. The dealings of my trade were but a drop of water in the comprehensive ocean of my business!”
In closing the business schools, may the former professors of finance bring to the market a more human side (or, apropos of Dickens, a more ghostly side). Whether or not they do, though, closing the business schools is a necessary first step in righting the social and economic injustices perpetuated not by capitalism but by those who have used it to rend the very social fabric that nourishes them. By planting the seeds of corporate and financial tyranny, our business schools, operating as so many of them do in collusion with a too-big-to-fail mentality, have become the enemy of democracy. They must be closed, since, as Jacob Marley reminds us, we all live in the business world.
II. Save the Humanities
Closing the business schools will allow us to turn our attention more fully to the state of the humanities and their apparent demise. The 2013 report released by the American Academy of Arts and Sciences asserts that “the humanities and social sciences are not merely elective, nor are they elite or elitist. They go beyond the immediate and instrumental to help us understand the past and the future.” As if that’s going to sell.
In the wake of the academy’s report, The New York Times dutifully ran three columns on the humanities — by David Brooks, Verlyn Klinkenborg, and Stanley Fish — which dove into the wreck and surveyed the damage in fairly predictable ways (excepting Fish, whose unpredictability is predictable). Brooks remembers when they used to teach Seneca and Catullus, and Klinkenborg looks back on the good old days when everyone treasured literature and literary study. Those days are gone, he argues, because “the humanities often do a bad job of teaching the humanities,” and because “writing well used to be a fundamental principle of the humanities,” though it apparently is not anymore. Why writing well isn’t a fundamental principle of life is perhaps a better question.
We might therefore ask: Aside from the typical obeisance to something called “critical thinking,” what are the humanities supposed to do?
I propose that one of the beauties of the liberal arts degree is that it is meant to do nothing. I would like to think, therefore, that the typical humanities major reads because she is interested in knowledge for purposes outside of the pervasive instrumentalism now fouling higher education. She does not read philosophy because she wants, necessarily, to become a philosopher; she does not read poetry to become a poet, though she may dream of it; she does not study art history, usually, to become an art historian, though she may one day take this road.
She may be in the minority, but she studies these subjects because of the pleasure they give her. Reading literature, or studying philosophy, or viewing art, or watching films — and thinking about them — are pleasurable things. What a delight to subsidize something that gives her immediate and future joy instead of spending capital on a course of study that might someday allow her to make more money so that she can do the things she wants to do at some distant time. Henry David Thoreau said it best: “This spending of the best part of one's life earning money in order to enjoy a questionable liberty during the least valuable part of it reminds me of the Englishman who went to India to make a fortune first, in order that he might return to England and live the life of a poet. He should have gone up garret at once.” If you want to be a poet, be done with it.
Does she suffer for this pleasure?
It is an unfortunate fact of our political and cultural economy that she probably does. Her parents wonder helplessly what she is up to and they threaten to cut off her tuition unless she comes to her senses. The governor and legislature of her state tell her that she is wasting her time and that she is unemployable. She goes to her advisers, who, if they are in the humanities, tell her that the companies her parents revere love to hire our kind, that we know how to think critically and write clearly and solve problems.
And it isn’t that they are lying, exactly (except to themselves). They simply aren’t telling her the whole truth: that she will almost surely never have the kind of financial success that her peers in business or engineering or medicine will have; that she will have enormous regrets barely ameliorated by the thought that she carries the fire; that the digital humanities will not save her, either, though they may help make her life slightly more interesting.
It is with this problem in mind that I argue for a vision of the university as a place where the humanities are more than tolerated, where they are celebrated as intrinsic to something other than vocationalism, as a place in which the ideology that inheres in the industrial model in all things can and ought to be dismantled and its various parts put back together into something resembling a university and not a factory floor.
Instead of making the case that the humanities give students the skills to “succeed in a rapidly changing world,” I want to invoke the wisdom of Walt Whitman, one of the great philosophers of seeming inactivity, who wrote: “I lean and loafe at my ease observing a spear of summer grass.”
What does it mean to loafe? Whitman is reclining and relaxing, but he is also active: he “invites” his soul and “observes” the world around him. This conjunction of observation and contemplation with an invitation to the soul is the key here; using our time, energy, and intellectual faculties to attend to our world is the root of successful living. A world of contemplative loafers is one that can potentially make clear-eyed moral and ethical judgments of the sort that we need, judgments that deny the conflation of economic value with other notions of value.
Whitman would rather hang out with the men who brought in the catch than listen to the disputations of science or catch the fish himself: “You should have been with us that day round the chowder-kettle.” While I am not necessarily advocating a life of sloth, I’m not arguing against it, either. I respect the art of study for its own sake and revere the thinker who does nothing worthwhile, if by worthwhile we mean something like growing the economy. Making a living rather than living is the sign of desperation.
William Major is professor of English at Hillyer College of the University of Hartford. He is author of Grounded Vision: New Agrarianism and the Academy (University of Alabama Press, 2011).