It was too prolonged for there to be any specific date, or dates, to mark it. But perhaps this is as good a time as any to mark the 25th anniversary of a process that started with the fall of the Berlin Wall in early November 1989 and reached a kind of peak with the events in Romania late that December.
The scale and pace of change were hard to process then, and difficult to remember now. Ceausescu had barely recovered from the shock of being heckled before he and his wife faced a firing squad. It was not how anyone expected the Cold War to end; insofar as we ever imagined it could end, the images that came to mind involved mutually assured destruction and nuclear winter.
A few years ago, Daniel T. Rodgers characterized the intellectual history of the final decades of the 20th century as an “age of fracture” – an era in which the grand narratives and overarching conceptual schemata were constantly displaced by “piecemeal, context-driven, occasional, and… instrumental” ideas and perspectives in the humanities, social sciences, and public life. Fair enough; just try finding a vintage, unshattered paradigm these days. But a system of bipolar geopolitical hostilities prevailed throughout most of that period, and the contradictory structure of conflict-and-stasis seemed very durable, if not permanent.
Until, suddenly, it wasn’t. One smart and well-executed treatment of the world that came to an end a quarter-century ago is a recent television series called "The Americans," set in the early 1980s. The first season is now available on DVD and streaming video, and the second will be available in two weeks, just in time for binge-viewing over the holidays.
"The Americans" is a Cold War spy drama as framed by the “secret life amidst suburban normality” subgenre, the basic tropes of which were inaugurated by "The Sopranos." In it, the Jenningses, a married couple, run a travel agency in Washington, where they live with their two early-adolescent kids. But they are actually KGB agents who entered the United States some 20 years earlier. They have operated from behind cover identities for so long that they blend right in, which makes them very effective in their covert work. While gathering information on the Strategic Defense Initiative, for example, they even get access to the Advanced Research Projects Agency Network -- aka ARPANET -- which allows communication between computers, or something.
The comparison shouldn’t be pushed too hard, but the paradox of the deep-cover agent is right out of John Le Carré: A divided identity makes for divided loyalties. At the very least it puts considerable strain on whatever commitment the couple started out with, back in the late Khrushchev era. We get occasional flashbacks to their life as young Soviet citizens. With the onset of “Cold War II,” the motherland is imperiled once again (not only by the American arms buildup but also by the reflexes of KGB leadership at “the Center”) and the Jenningses have decidedly mixed feelings about raising kids under rampant consumerism, even if they’ve grown accustomed to it themselves.
The moral ambiguities and mixed motives build up nicely. Life as a couple, or in a family, proves to be more than a layer of the agents’ disguise: love is another demand on their already precarious balance of loyalties. Yet the real menace of thermonuclear showdown is always there, underneath it all. Some viewers will know that things came very close to the point of no return at least once during this period, during NATO’s “Able Archer” exercise in November 1983. Whatever sympathy the audience may develop toward the Jenningses (played with real chemistry by Keri Russell and Matthew Rhys) is regularly tested as they perform their KGB assignments with perfect ruthlessness. They are soldiers behind enemy lines, after all, and war always has innocent casualties.
The conflict has gone on so long, and with no end in sight, that the characters on screen don’t even feel the need to justify their actions. The spycraft that the show portrays is historically accurate, and it gets the anxious ground-tone of the period right, or as I remember it anyway. But very seldom does "The Americans" hint at the impending collapse of almost every motive driving its core story -- something the viewer cannot not know. (Pardon the double negative. But it seems to fit, given the slightly askew way it keeps the audience from taking for granted either the Cold War or the fact that it ended.)
The focus on the family in "The Americans" takes on added meaning in the light of Margaret Peacock’s Innocent Weapons: The Soviet and American Politics of Childhood in the Cold War, recently published by the University of North Carolina Press. The scriptwriters really ought to spend some time with the book. At the very least, it would be a gold mine of nuances and points of character development. More generally, Innocent Weapons is a reminder of just how much ideological freight can be packed into a few messages rendered familiar through mass media, advertising, and propaganda.
Peacock, an assistant professor of history at the University of Alabama, in Tuscaloosa, examines the hopes and fears about youngsters reflected in images from the mid-1940s through the late 1960s. The U.S. and the USSR each experienced a baby boom following World War II. But the outpouring of articles, books, movies, and magazine illustrations focusing on children was not solely a response to the concerns of new parents. It might be more accurate to say the imagery and arguments were a way to point the public’s attention in the right direction, as determined by the authorities in their respective countries.
Children are the future, as no politician can afford to tire of saying, and the images from just after the defeat of fascism were tinged with plenty of optimism. The standard of living was rising on both sides of the Iron Curtain. In 1950 President Truman promised parents “the most peaceful times the world has ever seen.” Around the same time, the Soviet slogan of the day was “Thank You Comrade Stalin for Our Happy Childhood!”, illustrated with a painting of exuberant kids delivering an armful of roses to the General Secretary, whose eyes fairly twinkle with hearty good nature.
But vows of peace and plenty on either side were only as good as the leaders’ ability to hold their ground in the Cold War. That, in turn, required that young citizens be imbued with the values of patriotism, hard work, and strong character. Sadly enough, children on the other side were denied the benefits of growing up in the best of societies.
The Soviet media portrayed American youth as aimless, cynical jazz enthusiasts facing Dickensian work conditions after years of a school system with courses in such topics as “home economics” and “driver’s education.” The Americans, in turn, depicted Soviet youth as brainwashed, stultified, and intimidated by the state. (And that was on a good day.)
By the late 1950s, the authorities and media on each side were looking at their own young people with a more critical eye (alarmed at “juvenile delinquency,” for example, or “hooliganism,” as the Soviets preferred to call it) -- while also grudgingly admitting that the other side was somehow bringing up a generation that possessed certain alarming virtues. Khrushchev-era educational reformers worried that their students had endured so much rote instruction that they lacked the creativity needed for scientific and technological progress, while American leaders were alarmed that so many young Soviets were successfully tackling subjects their own students could never pass -- especially in science and math. (The news that 8 million Soviet students were learning English, while just 8,000 Americans were taking Russian, was also cause for concern.)
The arc of Cold War discourse and imagery concerning childhood, as Peacock traces it, starts out with a fairly simplistic identification of youth’s well-being with the values of those in charge, then goes through a number of shifts in emphasis. By the late 1960s, the hard realities facing children on either side were increasingly understood as failures of the social system they had grown up in. In the U.S., a famous television commercial showed a little girl plucking the leaves of a daisy as a nuclear missile counted down to launch; while the ad was intended to sway voters against Barry Goldwater, it drew on imagery that the Committee for a Sane Nuclear Policy (better known as SANE) and Women Strike for Peace first used to oppose nuclear testing a few years earlier. Nothing quite so emblematic emerged in the Soviet bloc, but the sarcastic use of a slogan from the Komsomol (Young Communist Union) became a sort of inside joke about the government’s self-delusion.
“To varying degrees,” writes Peacock, “both countries found themselves over the course of these years betraying their ideals to win the [Cold] war, maintain power, and defend the status quo…. Even images like that of the innocent child can become volatile when the people who profess to defend the young become the ones who imperil them.”
Adjunct faculty members at St. Michael’s College in Vermont voted to form a union affiliated with Service Employees International Union, they announced Monday. They’re the third group of adjuncts to vote to form unions under SEIU’s Adjunct Action campaign in recent weeks, after those at Burlington and Champlain colleges. About 75 percent of St. Michael’s eligible faculty participated in the vote, and the tally was 46 in favor and 26 opposed.
Anne Tewksbury-Frye, an adjunct faculty member at St. Michael’s College and Champlain College, said in a statement that the St. Michael’s union “will serve to improve best practices, and help us learn as educators and teachers in a way that will benefit our students directly.” Jeffrey Ayres, dean of the college, said St. Michael’s remained neutral throughout the process and encouraged all adjuncts to vote. “Adjuncts are an important part of the college in providing an excellent educational experience,” he said. Adjuncts teach about 20 percent of classes there.
The National Endowment for the Humanities on Monday announced a new grant program to promote the publication of serious nonfiction, based on scholarly research, on subjects of general interest and appeal. Winners of the grants will receive stipends of $4,200 per month for 6-12 months. A statement from NEH Chairman William D. Adams said: “In announcing the new Public Scholar program we hope to challenge humanities scholars to think creatively about how specialized research can benefit a wider public.”
Attention is how the mind prioritizes. The brain’s attention circuits stay busy throughout our waking hours, directing on a millisecond-by-millisecond basis where our limited cognitive resources are going to go, monitoring the information that floods into our senses, and shaping conscious experience. Attention is one of the most mysterious and compelling topics in cognitive science. Years of research on the subject are now paying off handsomely in the form of recent advances in our understanding of how these mechanisms work, on both theoretical and physiological levels. And the more we learn, the more we realize that these findings aren’t just important for theory-building -- they offer myriad practical applications that can help people function more effectively across all aspects of life. Teaching and learning is one area where attention research is especially useful for helping us get better at what we do.
In my book Minds Online: Teaching Effectively with Technology, I foreground attention as the starting point for everything designers of college-level online learning experiences should know about human cognition. Without attention, much of what we want students to accomplish -- taking in new information, making new connections, acquiring and practicing new skills -- simply doesn’t happen. And thus, gaining students’ focus is a necessary first step in any well-designed learning activity, whether online or face-to-face.
But how does this principle play out in a contemporary learning environment littered with tempting distractions -- the smartphones that accompany students to class, social networks that let us reach out to friends around the clock, the sites for games, media, and shopping that beckon every time we open our browsers? It’s especially concerning given how overly optimistic people tend to be about their ability to juggle different tasks. As psychologists Christopher Chabris and Daniel Simons eloquently explain in The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us, human beings are notoriously bad at knowing what we can handle, attention-wise. Essentially, we lie to ourselves about what we notice and what we know, believing that we take in much more than we actually do.
For our students, this adds up to a serious drain on learning. And as learning environments become more complex, it is a drain they can’t afford. Consider, for example, some forms of blended learning in which students master foundational knowledge outside of class, usually through online work, then spend class time on focused application and interaction with instructors and classmates. A tightly scheduled and synchronized system like this can work beautifully, but doesn’t allow much margin of error for wasted time and scattered focus.
So what can we do about this situation? One strategy is to educate students about the limits of attention and just how much they miss when they choose to multitask. This, however, is easier said than done. Incorporating a learning module on attention is straightforward enough, but what would it take for such a module to be effective? First, it would need to be brief and to the point, reinforcing just a few crucial take-home messages without a great deal of history, theory or other background more appropriate to a full-length course in cognition. At the same time, quality control would be a major concern, especially for the module to be usable by an instructor without academic training in cognitive psychology. Just Googling for materials on attention brings up at least as much pseudoscience as reputable work, and without this solid scientific grounding, a module on the dangers of multitasking could easily devolve into a “Reefer Madness”-style experience, more laughable than persuasive.
Keeping these caveats firmly in mind, I’ve worked with my instructional designer colleague John Doherty to create a free-standing, one-shot online learning module called Attention Matters that instructors can drop into existing courses as an extra credit or required assignment. Besides being scrupulous about the science, John and I prioritized interactivity and use of the multimedia capabilities of online learning -- enabling us to show students, not just tell them, what distraction can do to performance in different contexts. Too many online learning activities consist essentially of glorified PowerPoint slides, so although there is a certain amount of text within our module, we put most of the emphasis on media, demonstrations, self-assessment and discussion.
As an example, we used a demonstration we called the “Selective Reading Challenge” to show students how attention mechanisms constantly filter incoming information -- and how little we remember of information we don’t attend to. The demonstration consists of a page of text with alternating lines of bold and regular typefaces. Students are instructed to pay attention to only the bold lines, ignoring everything else, then proceed immediately to the next page. In the “to be ignored” text, we hide a few stimuli that may break through to awareness: a couple of common names (Michael, Emily, Stephen, Christina) that, if they belong to you, will probably pop out, as well as a few attention-grabbing emotional terms (911, murder). After completing the “selective reading,” students are invited to go back and review the entire page of text -- bold and regular -- to see what they missed, and what they (likely) don’t remember at all even though it was well within the field of vision.
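For readers curious about the mechanics, the stimulus construction behind a demonstration like this is simple enough to sketch in a few lines of Python. The function name, probe list, and data layout below are our own illustrative assumptions -- a minimal sketch of the idea, not code from the actual module:

```python
import random

# Hypothetical probe words hidden in the "to be ignored" stream:
# familiar names and emotionally loaded terms that may break through.
PROBES = ["Michael", "Emily", "murder", "911"]

def build_selective_reading_page(attend_lines, ignore_lines, probes=PROBES, seed=0):
    """Interleave to-be-attended and to-be-ignored lines, hiding probe
    words at random positions inside the ignored stream.

    Returns a list of (text, attended) pairs; a renderer would set the
    attended lines in bold and the ignored lines in regular type.
    """
    rng = random.Random(seed)
    ignored = list(ignore_lines)
    for probe in probes:
        i = rng.randrange(len(ignored))
        words = ignored[i].split()
        words.insert(rng.randrange(len(words) + 1), probe)
        ignored[i] = " ".join(words)
    page = []
    for attend, ignore in zip(attend_lines, ignored):
        page.append((attend, True))    # bold, to be read
        page.append((ignore, False))   # regular, to be ignored
    return page
```

The point of the sketch is that the probes live only in the ignored lines, so anything a student later remembers from them must have slipped past the attentional filter.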
Other demonstrations illustrate the dramatic slowdown in processing that takes place when we multitask among competing activities. We present an online version of the classic “Stroop effect” to illustrate how distraction -- even from other mental processes going on at the same time -- can make a simple activity slow and inaccurate. The task involves naming the colors of a sequence of multicolored words -- not a difficult task, except when the words are themselves color names (red, green, blue, and so on) that contradict the colors they are printed in. Lastly, we pulled in several video clips from around the Web to drive home the multitasking point. One shows a prank “driving test” in which unsuspecting students were told to text while navigating a practice course, with predictably disastrous results. Another classic clip called “The Amazing Color Changing Card Trick,” created by psychologist Richard Wiseman, dramatically illustrates how attending to one part of a scene causes us to miss major developments going on in practically the same location.
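Generating the stimuli for a Stroop task is likewise straightforward to sketch. The function name and color set here are illustrative choices of ours, not the module's implementation; the key move is pairing each word with either its own ink color (congruent) or a contradictory one (incongruent):

```python
import random

COLORS = ["red", "green", "blue", "yellow"]

def make_stroop_trials(n, congruent_ratio=0.5, seed=0):
    """Build a list of (word, ink_color, is_congruent) Stroop trials.

    In a congruent trial the word names its own ink color; in an
    incongruent trial the ink contradicts the word, which is what
    slows color-naming down and produces errors.
    """
    rng = random.Random(seed)
    trials = []
    for _ in range(n):
        word = rng.choice(COLORS)
        if rng.random() < congruent_ratio:
            trials.append((word, word, True))
        else:
            ink = rng.choice([c for c in COLORS if c != word])
            trials.append((word, ink, False))
    return trials
```

An online version of the demonstration would render each word in its assigned ink color and time the student's naming response, comparing average latency on congruent versus incongruent trials.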
These videos, activities and demonstrations form the anchor for brief, impactful student learning activities throughout the module. Students respond to discussion prompts asking them whether the demonstrations worked on them as predicted, and what they may mean for everyday attention. They also complete self-quizzes with feedback that target the different learning outcomes for each part of the module. At the end, they revisit what they have learned in a brief self-reflection and survey on attitudes and beliefs about attention and its importance for learning.
Attention Matters is an exciting project, offering us the opportunity to apply cognitive science in a novel and – we hope – useful way. The project also has a research component, through which we will be gathering data on student attitudes and beliefs about their own attentional capabilities, as well as on the frequency of different multitasking behaviors in their own lives.
There’s another important side to Attention Matters, and that has to do with the collaboration between an instructional design expert and a Subject Matter Expert, or SME. Much has been written about the virtues of instructional design experts’ pairing up with SMEs, and yet, such collaborations remain fairly rare within higher education. We hope that this project demonstrates the real benefits to be gained – perhaps motivating others to take the plunge.
It’s still too early to know what the long-term impacts of Attention Matters are going to be, or to predict exactly what we might discover about student attitudes and behaviors around multitasking. But I do foresee that as seismic change continues to occur in higher education, we will see more educators entering similar new territory – collaboratively creating focused, technologically delivered learning modules that live outside of traditional courses and use learning theory and cognitive science as the basis for design. And in our case, we may be able to add to our arsenal of strategies for getting students to become better stewards of their own attention.
The University of Illinois at Chicago is at risk of losing $4.5 million if the University of Illinois at Urbana-Champaign rehires James Kilgore as an adjunct, The Chicago Tribune reported. Kilgore has a strong record as an adjunct but was dropped from teaching last year amid reports about his criminal past with the Symbionese Liberation Army. While Kilgore was open about that history when he was hired, some questioned his suitability to teach, while many faculty groups said that he should be judged on his performance as an adjunct, not his past. The Illinois board last month cleared the way for Kilgore to be rehired, and the Tribune reported that departments are in fact starting the process to employ him.
But the Tribune reported that Richard Hill, a Chicago businessman who last year pledged $6.5 million to the Illinois-Chicago bioengineering department, has informed the university that if it proceeds with Kilgore's rehiring, he will not give the $4.5 million that remains on his pledge. "I no longer wish to be associated with University of Illinois," he wrote to the university. "The academy at the University of Illinois has clearly lost its moral compass." In an email to the Tribune explaining his views, he said, "I will not contribute neither time nor money to such a morally debased enterprise.... If they stand up and police their own organization to assure they are of the highest standards, I will stand with them till my dying days."
“Would you mind telling me what those four years of college were for?”
So asks the father of Benjamin Braddock, the protagonist of "The Graduate." Nearly a half-century after Mike Nichols made this film, it remains popular at "senior week" events and other end-of-college rituals. And that's because we still haven't answered its central question: what are we doing here, and why?
When Nichols died in November, obituaries inevitably depicted "The Graduate" as an emblem of youth alienation in postwar America. In the 1967 film’s most iconic line, a family friend gives young Braddock a single word of advice: “plastics.” The term became an ironic rallying cry for a rising generation of rebellious Americans, who rejected their elders’ bland conformity and empty consumerism.
But Braddock simply repeats the phrase — “plastics” — in a glassy-eyed stupor. As Nichols told an interviewer after the film’s release, Braddock is “a kid drowning among objects and things, committing moral suicide by allowing himself to be used finally like an object or thing.” Young Benjamin knows what he doesn’t like, but he has no idea how — or even whether — to change it.
That’s why Nichols decided to give the role to an unknown actor named Dustin Hoffman instead of to an established star like Robert Redford, who also campaigned for the part. When Hoffman read the book on which the film was based, he told Nichols that Braddock should be played by Redford or by another classically handsome white Anglo-Saxon Protestant.
But Nichols had something very different in mind. He saw Braddock as an anti-hero, a loser who sleepwalks through life instead of awakening to its challenges. So the director chose a Jewish actor — with dark, ungainly features — instead of the “walking surfboards” (as Nichols mockingly called them) who usually won the big Hollywood roles.
Braddock has an ambivalent and depressingly passionless affair with one of his parents’ friends, Mrs. Robinson, whose name would be immortalized in the song that Paul Simon wrote for the film. (The other Simon and Garfunkel songs on the soundtrack, including “Sounds of Silence,” predated the movie.) Then Braddock falls in love with Mrs. Robinson’s daughter, Elaine, an undergraduate at the University of California at Berkeley.
Conventional to his core, Braddock resolves to win Elaine in the most predictable, socially acceptable fashion: by marrying her. He drives his sports car up to the Bay Area, where Nichols treats us to the famous shot of Hoffman speeding across the Bay Bridge (but in the wrong direction, as film buffs often note). The budget-conscious Nichols shot most of his college scenes at the University of Southern California, which was much closer to his studio, although we do get a few glimpses of the neighborhood abutting Cal-Berkeley.
What we do not get is a sense of the Free Speech Movement, demonstrations against the Vietnam War, or any of the other political passions that enveloped Berkeley in the late 1960s. The only hint is an exchange with a hostile boardinghouse manager, who inquires whether Braddock is an “agitator”; a few scenes later, a young tenant (played by Richard Dreyfuss, in one of his first roles) asks the manager if he should call the police to arrest Braddock.
On what charge? Braddock isn’t a threat to anyone at the university, where he follows Elaine through the humdrum rhythms of college life — to a class, to the library — while a clock chimes from the tower overhead. There’s nothing here to engage either of them, except the fact that Elaine is herself engaged to be married — and not to Braddock. So he has to win the girl from his rival, who looks very much like Mike Nichols’ walking-surfboard stereotype.
The film’s courtship rituals feel altogether dated in today’s era of student hook-ups and delayed marriage. But the aimless ennui of college should be familiar to anyone who works or studies at one. We have millions of students who are simply drifting through college, just like Benjamin Braddock does in his parents’ pool. As my colleague Richard Arum and his co-author Josipa Roksa have shown, the average undergraduate studies 12 hours per week, and more than a third report studying less than 5 hours a week.
On the other end of the spectrum are the so-called Organization Kids, who have been programmed to climb the social ladder at all costs. They do hit the books, early and often, but there’s something soulless and depressing about their grim quest for grades, connections, and jobs. They’re “excellent sheep,” to quote the title of William Deresiewicz’s recent book, going along in order to get ahead.
In the years since Mike Nichols made "The Graduate," we have transformed our universities into truly mass institutions. Soon, we are told, we'll have "college for all." But college for what? Asked that by his befuddled father, Benjamin Braddock replies simply, “You got me.” We've got to come up with a better answer than that.