In 1869, Charles W. Eliot, a professor at the Massachusetts Institute of Technology, wrote an essay in The Atlantic Monthly entitled “The New Education.” He began with a question on the mind of many American parents: “What can I do with my boy?” Parents who were able to afford the best available training and did not think their sons suited for the ministry or a learned profession, Eliot indicated, sought a practical education, suitable for business “or any other active calling”; they did not believe that the traditional course of study adopted by colleges and universities 50 years earlier was still relevant. Less than a year later, Eliot became president of Harvard. Among the reforms he initiated were an expansion of the undergraduate curriculum and substantial improvement in the quality and methods of instruction in the law school and the medical school.
The debate between advocates of traditional liberal learning and partisans of a more “useful” education, Michael Roth, the president of Wesleyan University, reminds us, has deep roots in American soil. In Beyond the University (Yale University Press) he provides an elegant and informative survey of the work of important thinkers, including Benjamin Franklin, Thomas Jefferson, Ralph Waldo Emerson, W.E.B. DuBois, Jane Addams, William James, John Dewey, and Richard Rorty, who, despite significant differences, embraced liberal education because it “fit so well with the pragmatic ethos that linked inquiry, innovation, and self-discovery.” At a time in which liberal learning is under assault, Roth draws on the authority of these heavyweights to argue that “it is more crucial than ever that we not abandon the humanistic frameworks of education in favor of narrow, technical forms of teaching intended to give quick, utilitarian results.”
Most of Beyond the University is devoted to claims by iconic intellectuals about the practical virtues of liberal learning, which Roth endorses (with occasional qualifications). Exhibiting a “capacious and open-ended” understanding of educational “usefulness,” Roth indicates, Thomas Jefferson opted for free inquiry at his university in Charlottesville, Va., to equip citizens in the new republic to think for themselves and take responsibility for their actions. Ralph Waldo Emerson resisted education as mere job training; education, he indicated, should instead impart knowledge that develops individuals willing and able to use what we now call “critical thinking” to challenge the status quo.
Acknowledging that different people need different kinds of educational opportunities, W.E.B. DuBois nonetheless insisted that the final product of training “must be neither a psychologist nor a brick mason, but a man.” Liberal learning, Jane Addams emphasized, inculcates “affectionate interpretation,” which prepares individuals not only to defend themselves against those with different points of view, but to empathize with others and act in concert with them. And John Dewey, the most influential philosopher of education in the 20th century, looked to a liberal education, according to Roth, to help students learn the lessons of experiment and experience, by trying things out and assessing the results, by themselves and with others, and, then, if appropriate, revising their behavior.
Roth’s approach – a reliance on the authority of seminal thinkers – is not without problems. As he knows, the nature of higher education – and its perceived roles and responsibilities – has changed dramatically since colleges focused on liberal learning. In 1910, only 9 percent of students received a high school diploma; few of them went on to college. These days, about 40 percent of young men and women get a postsecondary degree. Undergraduate, master’s, and doctoral degrees, moreover, are now required, far more than they were in the days of Emerson and Eliot, for entry into the most prestigious, and high-paying, professions. Jamie Merisotis, president of the Lumina Foundation, is surely right when he asserts that “to deny that job skills development is one of the key purposes of higher education is increasingly untenable” – and that integration of specific skills into the curriculum can help graduates get work and perform their assigned tasks well.
Roth does not specify how liberal learning might “pull different skills together in project-oriented classes.” Nor does he adequately address “the new sort of criticism” directed at liberal learning. A liberal arts education, many critics now claim, does not really prepare students to love virtue, be good citizens, or recognize competence in any field. As Roth acknowledges, general education, distribution requirements, and free electives are not effective antidotes to specialization; they have failed to help establish common academic goals for students. And, perhaps most disturbingly, doubt has now been cast on the proposition that the liberal arts are the best, and perhaps the only, pathway to “critical thinking” (the disciplined practice of analyzing, synthesizing, applying, and evaluating information).
President Roth may well be right that liberal learning “will continue to be a fundamental part of higher education” if (and, he implies, only if) it rebalances critical thinking and practical exploration. The key question, it seems to me, is how to rebalance, while preserving the essence of liberal learning, at a time in which higher education in general and, most especially, the humanities are under a sustained attack by cost-conscious advocates of an increasingly narrow vocationalism, who are certain to be unpersuaded by the testimony of long-dead intellectuals. The task is all the more daunting, moreover, because it will have to be carried out by proponents and practitioners of the liberal arts, many of whom, unlike Michael Roth, are now in despair, in denial, or have lost faith.
Glenn C. Altschuler is the Thomas and Dorothy Litwin Professor of American Studies at Cornell University.
Reporting on the Senate's confirmation of Theodore Mitchell as the U.S. Department of Education's chief higher education official, Inside Higher Ed quoted a statement from Secretary of Education Arne Duncan: “He will lead us through this important time in higher education as we continue to work toward the President’s goal to produce the best-educated, most competitive workforce in the world by 2020.” While this brief remark is hardly a major policy statement, its tone and focus are typical of the way Secretary Duncan, President Obama, and many others in politics these days talk about higher education.
This typical rhetoric, in Duncan’s statement and beyond, makes a good point, but it doesn't say enough. To explain why, I will take a leaf from Thucydides. In History of the Peloponnesian War, he explained that his apparent verbatim accounts of speeches by other figures really articulated what he thought they should have said. With due respect for Secretary Duncan and President Obama, here is what the Secretary of Education should have said, on behalf of the President's aims, on the confirmation of a new Under Secretary of Education in charge of higher education affairs:
He will lead us through this important time in higher education as we continue to work toward the president’s goals for higher education in making America a more productive economy, a more just society, a more flourishing democracy, and a richer environment for what the Founders called, in the Declaration of Independence, "the pursuit of happiness," and in the Preamble to the Constitution, "the general welfare."
A part of that economic goal is to produce the best-educated, most competitive workforce in the world by 2020. Another part is to ensure that higher education extends broadly the opportunity to develop the ingenuity and creativity that will drive American innovation in the years ahead.
That means working to ensure that higher education regains its function as an engine of socioeconomic advancement, both for the individual and for society as a whole. This means resisting the increasing stratification of curriculums and opportunities, making sure that the advantages of arts and sciences education are extended as far throughout higher education as possible. This is both prudent, to cultivate the nation's human capital, and also just, to mitigate disadvantages of less-privileged starting points.
Everyone knows that democracy depends on America's capacity to maintain a deliberative electorate, capable of making well-informed choices in a political system they understand and in which they actively participate. It is a responsibility of higher education to enhance this investment in America by helping maintain that electorate. It is a responsibility of government to promote that role.
Finally, when the Founders embraced such goals as “the pursuit of happiness” and securing “the general welfare” of the people, they acknowledged that the well-being of individuals and of society as a whole — difficult as these concepts are to define — are legitimate objects of government interest. Higher education has crucial responsibilities of exploration and discovery in this broad field of human well-being. It is here that the perennial American question concerning the scope and limits of government itself is to be explored, and handed on for inquiry to succeeding generations of Americans.
So on the appointment of a new Under Secretary with responsibilities toward higher education, we celebrate the many contributions of higher education to American flourishing: its role in contributing to a vibrant economy, certainly; and also its role in sustaining and advancing the broad aims of justice and improvement to which the country has always been committed.
That would have been good to hear from Secretary Duncan, and would be good to hear in any of the administration's speeches about higher education. None of us who are committed to this broader vision of higher education can ever, I emphasize, lose sight of its role in propelling the economy forward. But we cannot permit the purposes of higher education in America to be narrowed solely into the goal of workforce production. More is at stake: access to opportunity, cultivation of ingenuity and innovation, and broad contributions to the future of the country. Phi Beta Kappa joins many voices in advocacy of that vision. We invite Theodore Mitchell, Secretary Duncan, and President Obama to join, as well.
John Churchill is secretary of the Phi Beta Kappa Society.
Most of my faculty colleagues agree that Writing Across the Curriculum (WAC), in which the task of teaching writing is one assigned to all professors, not just those who teach English or composition, is an important academic concept. If we had a WAC playbook, it would sound something like this: students need to write clear, organized, persuasive prose, not only in the liberal arts, but in the sciences and professional disciplines as well. Conventional wisdom and practical experience tell us that students’ ability to secure jobs and advance in their careers depends, to a great extent, on their communication skills, including polished, professional writing.
Writing is thinking made manifest. If students cannot think clearly, they will not write well. So in this respect, writing is tangible evidence of critical thinking — or the lack of it — and is a helpful indicator of how students construct knowledge out of information.
The WAC playbook recognizes that writing can take many forms: research papers, journals, in-class papers, reports, reviews, reflections, summaries, essay exams, creative writing, business plans, letters, etc. It also affirms that writing is not separate from content in our courses, but can be used as a practical tool to apply and reinforce learning.
More controversial — and not in everyone’s playbook -- is the idea that teaching writing skills cannot be delegated to a few courses, e.g., first-year composition courses, literature courses, and designated “W” (writing-intensive) courses. Many faculty agree with the proposition that writing should be embedded throughout the curriculum in order to broaden, deepen and reinforce writing skills, but many also take the “not in my back yard” approach to WAC.
We often hear the following refrains when faculty discuss students and writing. Together they compose a familiar song (sung as the blues):
1. “I’m not an English teacher; I can’t be expected to correct spelling and grammar.”
2. “I don’t have time in class to teach writing — I barely have enough time to teach content.”
3. “Why should students be penalized for bad writing if they get the correct answer?”
4. “Mine isn’t supposed to be a ‘W’ course, so I’ll leave the writing to others.”
5. “There is no way to work writing into the subject matter of my course.”
6. “They hate to read and write and won’t take the time to revise their work.”
7. “I don’t have a teaching assistant and don’t want to do a lot of extra correcting—I have enough to do.”
8. “Our students come to college with such poor writing skills that we can’t make up for years of bad writing.”
9. “They never make the corrections I suggest; I see the same mistakes over and over again, so why bother?”
10. “They’re seniors, and they still can’t write!”
Much has been written about WAC, and I add my voice to the multitudes because I recently came to a realization, watching my students texting before class began: students spend hours every day reading and practicing writing — bad writing. How many hours are spent sending and reading tweets, texts and other messages in fractured language? It made me wonder: is it even possible to swim against this unstoppable tide of bad writing? One of my colleagues argues that students cannot write well because they don’t read. I think that students do read, but what they spend their time reading is not helpful in learning how to write. (That, however, is a discussion for another day.)
I’m not sure that all students can be taught to improve their writing, but I am sure that it is one of the most important things we can attempt to teach. What difference does it make if students know their subject matter and have excellent ideas if no one can get past their sloppy and disorganized writing?
Let us consider (with annoying optimism) those sad faculty refrains.
“I’m not an English teacher; I can’t be expected to correct spelling and grammar.”
But we are college professors; we know more about writing than our students do. What you could do, if you don’t want to make corrections yourself or are stymied by the magnitude of a particular writing problem (where to begin?), is circle areas for revision and require the student to submit the work to the tutoring or writing center before a grade will be given. (You can even allow several opportunities for revision, depending on your tolerance for pain.) You can designate a certain number of points in your rubric to writing mechanics, letting students know that their grades will be affected by their writing; human nature being what it is, students pay more attention when they know they will be graded.
Most important, we can all emphasize that writing is important in our disciplines and that students will be judged in the workplace on the basis of their writing skills. We can all convey the message that polished prose matters to us and to professionals in our field — so much so that we are taking points off for sloppy work.
“I don’t have time in class to teach writing — I barely have enough time to teach content.”
Do you have time to assign minute papers at the beginning or end of each class, asking students to summarize three things they learned, or pose a question related to the day’s work, or answer one question based on the previous reading assignment? These papers are short and easily graded; they help students internalize and reinforce content. They each can be worth a few points, based on quality. If assigned on a regular or irregular basis (like a pop quiz), you may even get students to keep up with the reading and pay more attention in class. Minute papers encourage students to organize their thoughts; I discovered that students who could not speak coherently in class sometimes produced thoughtful short essays. Writing can be used in many ways to learn content and improve fluency and writing proficiency.
“Why should students be penalized for bad writing if they get the correct answer?”
Bcuz omg in the workplace they will be penalized for it. Ignoring student errors is like ignoring the piece of spinach in someone’s teeth; it may seem kind not to say anything, but no one really benefits. We can assign more writing in our courses, but if it is never graded, it may improve fluency but not accuracy — and confirm bad writing habits. Take a guess: over four years, what percentage of written assignments at your institution is graded for writing mechanics as well as content?
“Mine isn’t supposed to be a ‘W’ course, so I’ll leave the writing to others.”
Leaving WAC to others is like leaving voting to others. If WAC is viewed as an institutional playbook, it implies that everyone is part of the team and plays a position. All courses should be writing courses with a small w if not a big W; that is the only way to convey the message that what students learn in Composition 101 is relevant to success in their upper-level psychology course or business minor. Furthermore, since each discipline has its own rhetoric, it is particularly important for students to practice the specific types of writing they will be asked to produce in their careers. They will not be exposed to professional writing in their first-year seminars and English composition courses.
“There is no way to work writing into the subject matter of my course.”
Physicists, pathologists, geologists, mathematicians, dentists, lab technicians, engineers, architects, web designers, curators, forensic anthropologists and others have to explain things in writing; in an algebra course, for example, students could explain their reasoning on a given problem. No matter what the field, the ability to organize information in writing is a key professional asset, whether writing is used in a patient history, business contract or gallery brochure. We can invent ways to bring theory into practice by creating opportunities for students to write in the language of their careers.
“They hate to read and write and won’t take the time to revise their work.”
Yes, for many of our students, academic reading and writing seem to be unnatural acts. Some students, for example, seem much more themselves, much more authentic and engaged, on the soccer or football field.
One day in late autumn, on a perfect, still, golden afternoon, I stopped to watch the football team practice. The camaraderie, the sense of purpose, the sheer joy were poignant, as I pictured these young men paying mortgages and sitting in cubicles. Our job is to coach them safely into their futures, into different green pastures. Part of the playbook for that is to insist that they improve their writing skills so that their writing does not undercut their potential — even if they are not there yet, not fully ready to commit to academic work.
My other thought that afternoon was, can we make learning as engaging and authentic as sport? We each have to answer this question in our own way. In my law classes, for example, I ask students to write legal memorandums using the IRAC method: “You are a junior associate in the firm of Flake, Moss and Marbles, and your senior partner wants you to research and write a memo on the case of Madame X, who… .” The IRAC method not only structures the memo for students (they summarize the facts of the case, Identify the legal issues, cite the relevant Rules of law, Analyze the problem based on the facts and law, and draw a Conclusion on the likely outcome of the case), but allows them to role-play a real-world situation. They complete a series of these short writing exercises, with a rubric to guide them, and have several opportunities to revise their work.
For a formal or high-stakes writing assignment, scaffolding is essential; students will perform better when the structure of the writing assignment is broken down into components, which, when assembled, produce a coherent whole. The IRAC method has a built-in scaffold, but other writing assignments can be structured into a series of elements or steps. It is a mistake to assume that students know how to organize a paper or report; let them know what you are looking for, break down the structure into elements, and if you have a good sample of what you expect, hand it out. (Save your students’ work for this purpose.)
In my mediation class, students are asked to draft an agreement based on a mediation role-play they have participated in. The agreements follow a structured blueprint. They are peer-edited, revised by the student (with a writing tutor, if necessary) and then corrected by me. Students are given model agreements from past years and have three opportunities to revise their work prior to grading. Last semester, 18 of 19 students revised their work and received As on the agreements. The agreements were polished and professional and reinforced the content taught in the course.
I believe that we can devise meaningful and engaging ways for students to write in all courses; the challenge is to explain to students why they are doing it. Writing should be like driver’s ed in students’ minds -- a practical skill that is essential to their future success. Without that connection, writing will seem more like juggling: nice if you can do it, but not an essential life skill.
“I don’t have a teaching assistant and don’t want to do a lot of extra correcting — I have enough to do.”
Most of us don’t have teaching assistants, but we do have students for peer editing, and writing or tutoring centers with support staff. Some degree programs have upper-class peer mentors who can help students with writing in the discipline. Consider ways to form a writing partnership, using the resources available to you. Personally, I prefer that students take responsibility for their revisions by seeking out support services. Somehow, it doesn’t seem kosher to make all these corrections, have students incorporate them into their next draft, and then grade my own language, saying “good word choice,” “nicely written,” or “well organized!” I like to circle areas for improvement, making general comments, not specific corrections.
“Our students come to college with such poor writing skills that we can’t make up for years of bad writing.”

Some students will make little progress in improving their writing, for a variety of reasons. But if we accept students into our institutions, we should provide opportunities for them to improve their writing skills, even if some students are the proverbial horses who won’t drink. If students practice and are graded on their writing in only a few courses, they learn: 1) that in most courses they can get a decent grade without decent writing, and 2) that writing is relevant only in a few contexts. If we insist that career preparation includes the process of writing and revision, and we all assign meaningful writing exercises that students can revise and improve, the rest is up to them.
“They never make the corrections I suggest; I see the same mistakes over and over again, so why bother?”
When students start losing points, they tend to sit up and take notice. I’ve found that many mistakes are careless ones — what I call a document dump, turning in a first draft with no proofreading. If you hand back a draft and deduct points for writing errors, you will see more effort to correct those mistakes. Why should students devote time to an ungraded exercise when they can spend their time on something that will affect their grades? If sloppy writing has no impact on their grades, it makes sense for students not to internalize your corrections or prioritize revisions.
“They’re seniors, and they still can’t write!”
If we can agree about the value of a WAC playbook, not just in theory but in our daily practice; find ways to weave writing into all of our courses, not as busywork but as a meaningful part of the content we teach; assess student writing and promote it as an essential career skill; and allow students to revise their work, since revision is the heart and soul of the writing process, we are less likely to encounter seniors who have not practiced or improved their writing skills over four years. Our playbook should read that all courses, from now on, are writing courses with a small w.
Ellen Goldberger is director of the Honor Scholars Program and teaches law, leadership and conflict resolution courses at Mount Ida College.
Bryan Cranston’s recitation of “Ozymandias” in last year’s memorable video clip for the final season of Breaking Bad may have elided some of the finer points of Shelley's poem. But it did the job it was meant to do -- evoking the swagger of a grandiose ego, as well as time’s shattering disregard for even the most awe-inspiring claim to fame, whether by an ancient emperor or a meth kingpin of the American Southwest.
But time has, in a way, been generous to the figure Shelley calls Ozymandias, who was not a purely fictional character, like Walter White, but rather the pharaoh Ramses II, also called User-maat-re Setep-en-re. (The poet knew of him through a less exact, albeit more euphonious, transcription of the name.) He ruled about one generation before the period that Eric H. Cline, a professor of classics and archeology at George Washington University, recounts in 1177 B.C.: The Year Civilization Collapsed (Princeton University Press).
Today the average person is reasonably likely to know that Ramses was the name of an Egyptian ruler. But very few people will have the faintest idea that anything of interest happened in 1177 B.C. It wasn't one of the 5,000 “essential names, phrases, dates, and concepts” constituting the “shared knowledge of literate American culture” that E.D. Hirsch identified in his best-seller Cultural Literacy (1988), nor did it make it onto the revised edition Hirsch issued in 2002. Just over 3,000 years ago, a series of catastrophic events demolished whole cities, destroying the commercial and diplomatic connections among distinct societies that had linked up to form an emerging world order. It seems like this would come up in conversation from time to time. I suspect it may do so more often in the future.
So what happened in 1177 B.C.? Well, if the account attributed to Ramses III is reliable, that was the date of a final, juggernaut-like offensive by what he called the Sea Peoples. By then, skirmishes between Egypt and the seafaring barbarians had been under way, off and on, for some 30 years. But 1177 was the climactic year when, in the pharaoh’s words, “They laid their hands upon the lands as far as the circuit of the earth, their hearts confident…. ” The six tribes of Sea Peoples came from what Ramses vaguely calls “the islands.” Cline indicates that one group, the Peleset, are "generally accepted” by contemporary scholars "as the Philistines, who are identified in the Bible as coming from Crete.” The origins of the other five remain in question. Their rampage did not literally take the Sea Peoples around “the circuit of the earth,” but it was an ambitious military campaign by any standard.
They attacked cities throughout the Mediterranean, in places now called Syria, Turkey, and Lebanon, among others. About one metropolis, Ramses says, the Sea Peoples “desolated” the population, “and its land was like that which has never come into being.”
Cline reproduces an inscription that shows the Sea Peoples invading Egypt by boat. You need a magnifying glass to see the details, but the battle scene is astounding even without one. Imagine D-Day depicted exclusively with two-dimensional figures. The images are flat, but they swarm with such density that the effect is claustrophobic. It evokes a sense of terrifying chaos, of mayhem pressing in on all sides, so thick that nobody can push through it. Some interpretations of the battle scene, Cline notes, contend that it shows an Egyptian ambush of the would-be occupiers.
Given that the Egyptians ultimately prevailed over the Sea Peoples, it seems plausible: they would have had reason to record and celebrate such a maneuver. Ramses himself boasts of leading combat so effectively that the Sea Peoples who weren't killed or enslaved went home wishing they’d never even heard of Egypt: “When they pronounce my name in their land, then they are burned up.”
Other societies were not so fortunate. One of them, the Hittite empire, at its peak covered much of Turkey and Syria. (If the name seems mildly familiar, that may be because the Hittites, like the Philistines, make a number of appearances in the Bible.) One zone under Hittite control was the harbor city of Ugarit, a mercantile center for the entire region. You name it, Ugarit had it, or at least someone there could order it for you: linen garments, alabaster jars, wine, wheat, olive oil, anything in metal…. In exchange for paying tribute, a vassal city like Ugarit enjoyed the protection of the Hittite armed forces. Four hundred years before the Sea Peoples came on the scene, the king of the Hittites could march troops into Mesopotamia, burn down the city, then march them back home — a thousand miles each way — without bothering to occupy the country, “thus,” writes Cline, “effectively conducting the longest drive-by shooting in history.”
But by the early 12th century, Ugarit had fallen. Archeologists have found, in Cline’s words, "that the city was burned, with a destruction level reaching two meters high in some places.” Buried in the ruins are “a number of hoards … [that] contained precious gold and bronze items, including figurines, weapons and tools, some of them inscribed.” They "appear to have been hidden just before the destruction took place,” but "their owners never returned to retrieve them.” Nor was Ugarit ever rebuilt, which raises the distinct possibility that there were no survivors.
Other Hittite populations survived the ordeal but declined in power, wealth, and security. One of the maps in The Year Civilization Collapsed marks the cities around the Mediterranean that were destroyed during the early decades of the 12th century B.C. — about 40 of them in all.
The overview of what happened in 1177 B.C. that we’ve just taken is streamlined and dramatic — and way too much so not to merit skepticism. It’s monocausal. The Sea Peoples storm the beaches, one city after another collapses, but Ramses III survives to tell the tale…. One value of making a serious study of history, as somebody once said, is that you learn how things don’t happen.
Exactly what did happen becomes a serious challenge to determine, after a millennium or three. Cline’s book is a detailed but accessible synthesis of the findings and hypotheses of researchers concerned with the societies that developed around the Mediterranean throughout the second millennium B.C., with a special focus on the late Bronze Age, which came to an end in the decades just before and after the high drama of 1177. The last 20 years or so have been an especially productive and exciting time in scholarship concerning that region and era, with important work being done in fields such as archeoseismology and Ugaritic studies. A number of landmark conferences have fostered exchanges across micro-specialist boundaries, and 1177 B.C.: The Year Civilization Collapsed offers students and the interested lay antiquarian a sense of the rich picture that is emerging from debates among the ruins.
Cline devotes more than half of the book to surveying the world that was lost in or around the year in his title — with particular emphasis on the exchanges of goods that brought the Egyptian and Hittite empires, and the Mycenean civilization over in what we now call Greece, into closer contact. Whole libraries of official documents show the kings exchanging goods and pleasantries, calling each other “brother,” and marrying off their children to one another in the interest of diplomatic comity. When a ship conveying luxury items and correspondence from one sovereign to another pulled in to dock, it would also carry products for sale to people lower on the social scale. It then returned with whatever tokens of good will the second king was sending back to the first — and also, chances are, commercial goods from that king’s empire, for sale back home.
The author refers to this process as “globalization,” which seems a bit misleading given that the circuits of communication and exchange were regional, not worldwide. In any case, it had effects that can be traced in the layers of scattered archeological digs: commodities and artwork characteristic of one society catch on in another, and by the start of the 12th century a real cosmopolitanism is in effect. At the same time, the economic networks encouraged a market in foodstuffs as well as tin — the major precious resource of the day, something like petroleum became in the 20th century.
But evidence from the digs also shows two other developments during this period: a number of devastating earthquakes and droughts. Some of the cities that collapsed circa 1177 may have been destroyed by natural disaster, or so weakened that they succumbed far more quickly to the marauding Sea Peoples than they would have otherwise. For that matter, it is entirely possible that the Sea Peoples themselves were fleeing from such catastrophes. “In my opinion,” writes Cline, “… none of these individual factors would have been cataclysmic enough on their own to bring down even one of these civilizations, let alone all of them. However, they could have combined to produce a scenario in which the repercussions of each factor were magnified, in what some scholars have called a ‘multiplier effect.’ … The ensuing ‘systems collapse’ could have led to the disintegration of one society after another, in part because of the fragmentation of the global economy and the breakdown of the interconnections upon which each civilization was dependent."
Referring to 1177 B.C. will, at present, only get you blank looks, most of the time. But given how the 21st century is shaping up, it may yet become a common reference point -- and one of more than antiquarian relevance.