If you can remember the 1960s, the old quip goes, you weren’t really part of them. By that standard, the most authentic participants ended up as what used to be called “acid casualties”: those who took spiritual guidance from Timothy Leary’s injunction to “turn on, tune in and drop out” and ended up stranded in some psychedelic heaven or hell. Not that they’ve forgotten everything, of course. But the memories aren’t linear, nor are they necessarily limited to the speaker’s current incarnation on this particular planet.
Fortunately Stephen Siff can draw on a more stable and reliable stratum of cultural memory in Acid Hype: American News Media and the Psychedelic Experience (University of Illinois Press). At the same time, communicating about the world as experienced through LSD or magic mushrooms was ultimately as difficult for a sober newspaper reporter, magazine editor or video documentarian as conversation tends to be for someone whose mind has been completely blown. The author, an assistant professor of journalism at Miami University in Ohio, is never less than shrewd and readable in his assessment of how various news media differed in method and attitude when covering the psychedelic beat. The slow and steady buildup of hype (a word Siff uses in a precise sense) precipitated an early phase of the culture wars -- sometimes in ways that partisans now might not expect.
Papers on experimentation with LSD were published in American medical journals as early as 1950, and reports on its effects from newspaper wire services began tickling the public interest by 1954. The following year, mass-circulation magazines were devoting articles to LSD research, followed in short order by a syndicated TV show’s broadcast of film footage showing someone under the influence. The program, Confidential File, sounds moderately sleazy (the episode in question was described as featuring “an insane man in a sensual trance”) but much of the early coverage was perfectly respectable, treating LSD as a potential source of insight into schizophrenia, or a potential expressway to the unconscious for psychoanalysts.
But the difference between rank sensationalism and science-boosting optimism may count for less, in Siff’s interpretation, than how sharply coverage of LSD broke with prevailing media trends that began coming into force in the 1920s.
After the First World War, with wounded soldiers coming back with a morphine habit, newspapers carried on panic-stricken anti-drug crusades (“The diligent dealer in deadly drugs is at your door!”) and any publication encouraging recreational drug use, or treating it as a fact of life, was sure to fall before J. Edgar Hoover’s watchful eye. Early movie audiences enjoyed the comic antics of Douglas Fairbanks Sr.’s detective character Coke Ennyday (always on the case, syringe at the ready), or in a more serious mood they could go to For His Son, D. W. Griffith’s touching story of a man’s addiction to Dopokoke, the cocaine-fueled soft drink that made his father rich. But by the time the talkies came around, the Motion Picture Production Code categorically prohibited any depiction of drug use or trafficking, even as a criminal enterprise. Siff notes that in the 20 years following the code’s establishment in 1930, “not a single major Hollywood film dealing with drug use was distributed to the public.”
Not that depictions of substance abuse were a forbidden fruit the public was craving, exactly. But the relative openness of the mid-1950s (emphasis on “relative”) allowed editors to risk publishing stories on what was, after all, serious research on a potential new wonder drug. Siff points out that general-assignment newspaper reporters attending a scientific or medical conference, unable to tell what sessions were worth covering, could feel reasonably confident that a title mentioning LSD would probably yield a story.
At the same time, writers for major newsmagazines and opinion journals were following the lead of Aldous Huxley, the novelist and late-life religious searcher, who wrote about mystical experiences he had while taking mescaline. In 1955, when the editors of Life magazine decided to commission a feature on hallucinogenic mushrooms, they turned to Wall Street banker and amateur mycologist R. Gordon Wasson. He traveled to Mexico and became, in his own words, one of “the first white men in recorded history to eat the divine mushroom” -- and if not, then surely the first to give an eyewitness report on “the archetypes, the Platonic ideals, that underlie the imperfect images of everyday life” in the pages of a major newsweekly.
Suffice it to say that by the time Timothy Leary and associates come on the scene (wandering around Harvard University in the early 1960s, with continuously dilated pupils and only the thinnest pretense of scientific research) it is rather late in Siff’s narrative. And Leary’s legendary status as psychedelic shaman/guru/huckster seems much diminished by contrast with the less exhibitionistic advocacy of LSD by Henry and Clare Boothe Luce. Beatniks and nonconformists of any type were mocked regularly in the pages of Time or Life, but the Luce publications were for many years very enthusiastic about the potential benefits of LSD. The power couple tripped frequently, and hard. (Some years ago, when I helped organize Mrs. Luce’s papers at the Library of Congress, the LSD notes were a confidence not to be breached, but now the experiments are a matter of public record.)
The hippies, in effect, seem like a late and entirely unintentional byproduct of industrial-strength hype. “During an episode of media hype,” Siff writes, “news coverage feeds on itself, as different news outlets follow and expand on one another’s stories, reacting among themselves and to real-world developments. Influence seems to flow from the larger news organizations to smaller ones, as editors at smaller or more marginal media operations look toward the decisions made by major outlets for ideas and confirmation of their own judgment.”
That is the process, broadly conceived. In Acid Hype, Siff charts the details -- especially how the feedback bounced around between news organizations, not just of different sizes, but with different journalistic cultures. Newspaper coverage initially stuck to the major talking points of LSD researchers; it tended to stress the potential wonder-drug angle, even when the evidence for it was weak. Major magazines wanted to cover the phenomenon in greater depth -- among other things, with firsthand reports on the psychedelic universe by people who’d gone there on assignment. Meanwhile, the art directors tried to figure out how to convey far-out experiences through imagery and layout -- as, in time, did TV producers. (Especially on Dragnet, if memory serves.)
Some magazine editors seem to have been put off by the religious undercurrents of psychedelic discourse. Siff exhibits a passage in a review that quotes Huxley’s The Doors of Perception but carefully removes any biblical or mystical references. But someone like Leary, who proselytized about psychedelic revolution, was eminently quotable -- plus he looked good on TV because (per the advice of Marshall McLuhan) he smiled constantly.
The same hype-induction processes that made hallucinogens seem like the next step toward improving the American way of life (or, conversely, the escape route for an alternative to it) also went into effect when the tide turned: just as dubious claims about LSD’s healing properties were reported without question (it’ll cure autism!), so were horror stories about side effects (it’ll make you stare at the sun until you go blind!).
The reaction seems to have been much faster and more intense than the gradual pro-psychedelic buildup. Siff ends his account of the period in 1969 -- oddly enough, without ever mentioning the figure who emerged into public view that year as the embodiment of LSD's presumed demons: Charles Manson. You didn't hear much about the drug's spiritual benefits after Charlie began explaining them. That was probably for the best.
What happens in Wisconsin will not stay in Wisconsin. Lawmakers here are moving quickly to hollow out the definition of tenure and strip away due process rights for faculty members and academic staff. Legislators in other states who want to dismantle public higher education might look here to find new plays for their playbooks.
It is not uncommon for legislators to threaten tenure or criticize public education -- many do it for sport. But what’s unique in Wisconsin is that the proposed tenure changes are not coming from a fringe coalition: they are coming from the Joint Finance Committee, the most powerful body in the Legislature.
I am a tenure-track faculty member in the School of Education at the University of Wisconsin at Madison and have been in the state for only two years. I have a lot to learn and am naively optimistic that cooler heads will prevail and the tenure threats will wash over in time. But I cannot bring myself to a place of comfort; I am truly worried. And I am not just worried for Wisconsin, but for other states that will follow suit if this change actually happens.
Wisconsin is unique in that we are the only state (to my knowledge) to have enshrined tenure into state law. Moving tenure from state statute to University of Wisconsin Board of Regents policy would not be entirely uncommon in the national context. What is uncommon is how political our board is compared to other states’ -- the governor appoints 16 of the 18 members, and colleges don’t have their own campus boards to interact with the system.
But even less common -- and far more egregious -- is Section 39 of the Joint Finance Committee’s omnibus motion. It allows the board to “terminate any faculty or academic staff appointment… due to a budget or program decision…” So instead of using widely accepted processes, faculty and staff can be terminated for “…program discontinuance, curtailment, modification or redirection, instead of when a financial emergency exists under current law.”
This undermines the core principles of shared governance, strips away due process rights and is an obvious assault on academic freedom. The board says its members will “adopt policies that reflect existing statutory language” and ensure faculty and staff will retain the same due process protections currently under state law.
If Section 39 of the budget bill redefines tenure, then the board must comply with the new state law.
This new definition extends far beyond the standard financial exigency criteria for termination of appointments and is out of line with the American Association of University Professors’ academic freedom guidelines. And the proposed change is happening without consulting the very stakeholders the law was designed to protect -- university faculty and staff members.
I know these tensions aren’t new; we are constantly justifying our existence and under financial stress. I get that. But this is a bridge too far. It doesn’t matter if the regents use the existing statutory language, because this omnibus motion would kill it all: state law trumps regents policy.
If this policy change happens, it will set a precedent for other states to follow, so watch Wisconsin closely. Keeping Section 39 could set in motion a series of events that would undermine the university’s ability to recruit and retain faculty and generate revenue, and could even threaten our accreditation status.
As much as I wish this were all political theater or a simple misunderstanding, it is not. It is a very real threat and one that has been years in the making.
Instituting the $250 million budget cut will create the conditions where the Board of Regents can exercise their new authority to fire at will. The long-term academic and financial costs will far outweigh the short-term political benefits, and I hope our elected officials have the ability to see that far down the road.
Nicholas Hillman is an assistant professor of educational leadership and policy analysis at the University of Wisconsin at Madison.
There are important issues around diversity -- notably in terms of ethnicity/race, socioeconomic class, sexual orientation and gender -- that have been of concern to institutions of higher education for a while now. The progress made in these areas may be less than impressive, but the issues themselves have a conspicuous place on our radar screens.
There is another dimension of diversity that has yet to attract the attention it deserves: the diversity of contributions that can be made by different members of an institution’s tenured and tenure-track faculty. Faculty members in these positions are pivotal to fostering the kind of change needed in our colleges and universities if we are to better serve our students. Such change would involve how faculty members judge one another, how departments view their responsibilities, how those responsibilities can best be fulfilled and how the work of faculty members is viewed by academic administrators.
Different institutions have different missions, which should be reflected in what is reasonably expected of their respective faculties. These differences have unfortunately been eroded by status-seeking mission creep. So, for openers, there is the famous advice of Polonius (who has received insufficient respect for his wisdom, probably because he conveyed it in a way that was boring to a younger person): “To thine own self be true.”
While it may seem obvious that a one-size-fits-all approach is inappropriate and undesirable for institutions with different missions and constituencies, it may also be undesirable within a single institution, even a research university. While the holy trinity of research, teaching and service on the face of it provides room for flexibility, differences in how each is valued and assessed yield a generally hierarchical structure, with publication and attracting grant funds being the coin of the realm and relatively easy to quantify.
But even in research universities, not all members of a department need to balance their research and teaching contributions in exactly the same proportions. Moreover, one faculty member in his/her time plays many roles -- there may be times in between research projects when a faculty member might wish to focus more on teaching. (As an aside: the pressure to publish as much and as quickly as possible seems clearly linked to the level of retractions we have been seeing on the part of major scientific and scholarly journals when serious research flaws are revealed postpublication.)
A better solution would be an understanding -- reflected in the reward structure -- that not every member of a department needs to make precisely the same contribution to the department in meeting its goals and responsibilities. Crafting such a reward structure is something that the New American Colleges and Universities consortium, for example, has been working on with funding from the Teagle Foundation.
To be sure, one expects that departments in research universities would have a sufficiently strong complement of truly distinguished scholars and scientists who are making significant contributions to the knowledge base in their fields, including some who may not be God’s gift to teaching. Fortunately, many highly distinguished scientists and scholars are also superb teachers. But there should also be room for faculty members whose teaching outdistances their research. If research universities presume to educate undergraduates, they need to consider how well they are fulfilling that responsibility. They should also feel an obligation to prepare their graduate students for occupying positions at a wide range of institutions of higher education; that is, they should be preparing graduate students seeking an academic career for their work not only as researchers, but as teachers.
There have been proposals for a separate track for faculty members who would focus on teaching, as opposed to research. This, however, is a solution that is part of the problem, since it will almost certainly perpetuate a culture of relative disdain for teaching, along with a tendency for teaching-focused appointments to be non-tenure-track. While there is a place for continuing appointments off the tenure track, viewing teaching in general as something unworthy of tenure would be unfortunate both in terms of institutional culture and how universities are viewed by the public.
It would also be desirable to recognize and reward those faculty members who have a special flair for sharing significant results of science and scholarship with a wide audience of readers -- beyond even The New York Review of Books. We already have an admirable complement of public intellectuals who earn their high position in the academic food chain by the traditional measure of research excellence -- though we could always use more of them. In addition, there are those whose contributions to public enlightenment might in and of themselves merit reward beyond what the current system offers.
Barriers to achieving a more informed citizenry may seem daunting, even at times insurmountable, especially when one figures in efforts at deliberate deception by powerful figures and opinion leaders. Indeed, we may feel the need to modify Abraham Lincoln’s famous observation that you can fool all the people some of the time and some of the people all the time by observing that those have turned out to be pretty good odds. But we should reward those who give the advancement of public knowledge their best effort -- and sometimes manage to make a difference.
Judith Shapiro is president of the Teagle Foundation and a former president of Barnard College.
Five years ago, this column looked into the scholarly potential of the Twitter archive the Library of Congress had recently acquired. That potential was by no means self-evident. The incensed “my tax dollars are being used for this?” comments practically wrote themselves, even without the help of Twitter bots.
For what -- after all -- is the value of a dead tweet? Why would anyone study 140-character messages, for the most part concerning mundane and hyperephemeral topics, with many of them written as if to document the lowest possible levels of functional literacy?
As I wrote at the time, papers by those actually doing the research treated Twitter as one more form of human communication and interaction. The focus was not on the content of any specific message, but on the patterns that emerged when they were analyzed in the aggregate. Gather enough raw data, apply suitable methods, and the results could be interesting. (For more detail, see the original discussion.)
The key thing was to have enough tweets on hand to grind up and analyze. So, yes, an archive. In the meantime, the case for tweet preservation seems easier to make now that elected officials, religious leaders and major media outlets use Twitter. A recent volume called Twitter and Society (Peter Lang, 2014) collects papers on how politics, journalism, the marketplace and (of course) academe itself have absorbed the impact of this high-volume, low-word-count medium.
One of the book’s co-editors is Katrin Weller, an information scientist at the GESIS Leibniz Institute for the Social Sciences, in Cologne, Germany. At present she is in the final month of a Kluge Fellowship at the Library of Congress, which seems like an obvious place to conduct her research into the use of Twitter to study historical events. Or it would have been, if the archive of tweets were open to scholars, which it still isn’t, and won’t be any time soon.
Unable to pursue her original project, Weller used the Kluge Fellowship to broaden her focus -- which, she told me in an email exchange, “has been pretty much on working with Twitter data [over] the last years.” She spent her time catching up with the scholarship on other forms of social media and investigating various web-archiving projects at the library.
As for the digital collection that made her want to go to Washington, DC, in the first place… well, the last official statement from the library was issued in January 2013. It reported that Twitter’s output from 2006 to 2010 -- consisting of “approximately 21 billion tweets, each with more than 50 accompanying metadata fields, such as place and description” -- had finally been organized, by hour. The process was to be completed that month, even as another half billion or so tweets per day were added to the collection.
The Library of Congress finds itself in the position of someone who has agreed to store the Atlantic Ocean in his basement. The embarrassment is palpable. No report on the status of the archive has been issued in more than two years, and my effort to extract one elicited nothing but a statement of facts that were never in doubt.
“The library continues to collect and preserve tweets,” said Gayle Osterberg, the library’s director of communications, in reply to my inquiry. “It was very important for the library to focus initially on those first two aspects -- collection and preservation. If you don’t get those two right, the question of access is a moot point. So that’s where our efforts were initially focused and we are pleased with where we are in that regard.”
As of early 2013, the library reported it had received more than 400 requests to use the archive. Since then, members of the public have asked for updates on the library’s blog, with no response forthcoming. At this point no date has been set for the archive to be opened to researchers. The leadership of the Library of Congress may be “pleased [by] where we are,” but their delight is not likely to be contagious.
No grumbling from Katrin Weller, though. She sent me a number of her recent and forthcoming papers on what might be called second-order social-media research. That is, they take up the problems and concerns that face scholars trying to study social media.
Apart from the difficulties involved in archiving -- enough on that, for now -- there are methodological and ethical problems galore, as becomes clear from a paper Weller co-authored with her colleague Katharina E. Kinder-Kurlanda, a cultural anthropologist also at the Leibniz Institute. In 2013 and 2014, they conducted 42 interviews with social-media researchers at international conferences. The subjects were from various fields and parts of the world. What they had in common was the use of data gathered from a variety of social-media venues -- not just Twitter and Facebook but “many other platforms such as Foursquare, Tumblr, 4chan and Reddit.”
All of which makes establishing methodological standards -- how material from social media platforms is collected, documented and handled -- extremely difficult, if not impossible. A research team might find it necessary to invent a program to harvest raw data from a site, but if the overall focus of the project is sociological or linguistic, the details will probably not be discussed in the resulting publication. There is also the issue of “data cleaning,” i.e., filtering out messages from spam accounts, bots and the like, in order to create a data set consisting of only human-generated material (as much as that is possible). It is a time- and labor-intensive process, and the thoroughness of the job will in part be a function of the budget.
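The “data cleaning” step described above can be sketched in code. What follows is a purely illustrative toy, not any research team’s actual pipeline: the field names, the heuristics and the thresholds are all assumptions made up for the example.

```python
# Toy sketch of social-media "data cleaning": filtering messages from
# likely bot/spam accounts out of a collected data set. All field names
# and thresholds here are illustrative assumptions.

def looks_automated(msg):
    """Flag a message whose account shows bot-like traits."""
    acct = msg["account"]
    # Implausibly high posting volume is a common bot signature.
    if acct["posts_per_day"] > 500:
        return True
    # Following thousands while having almost no followers suggests spam.
    if acct["following"] > 1000 and acct["followers"] < 10:
        return True
    return False

def clean(messages):
    """Keep only messages that pass the (rough) human-generated filter."""
    return [m for m in messages if not looks_automated(m)]

sample = [
    {"text": "great talk at the conference",
     "account": {"posts_per_day": 12, "following": 300, "followers": 250}},
    {"text": "WIN A FREE PHONE click here",
     "account": {"posts_per_day": 900, "following": 5000, "followers": 3}},
]
print(len(clean(sample)))  # 1
```

Even this crude version shows why thoroughness is a budget question: every additional heuristic must be designed, tuned and validated against hand-checked samples.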
So the size, quality and reliability of the raw material itself are going to vary widely from researcher to researcher. Weller and Kinder-Kurlanda note the case of the same data being collected from a single social-media website using the same tools, but run in parallel on two different servers. The result was different data sets. And all of this, mind you, before the serious analytical crunching even gets started.
One partial solution, or at least stopgap measure, is to share data sets -- certainly easing the strain on some researchers’ purses. The authors mention finding researchers “who felt an ethical obligation to share their data sets, either with other researchers or with the public.” About a third of the researchers Weller and Kinder-Kurlanda interviewed “had experience in working with data collected by others.” But the practice raises ethical problems about privacy, and it sounds like some of the exchanges take place sub rosa. And in any event, sharing the data sets probably won't change the drift toward some social-media platforms being over- or underresearched because their data are easier to collect or clean.
Weller indicates that she intends to write more about the epistemological issues raised by social media. That sounds like an interesting topic, and a perplexing one. Besides, it will clearly be a long, long time before anyone gets to use Twitter as a tool for historical research.
When Rowland Hussey Macy opened his namesake store in 1858, understanding consumer behavior was largely a matter of guessing. Retailers had little data to assess what customers wanted or how variables like store hours, assortment or pricing might impact sales. Decision making was slow: managers relied on manual sales tallies, compiled weekly or annually. Dozens of stores failed, including several of Macy’s original stores.
Predictive analytics, in the early days of retail, were rudimentary. Forward-thinking retailers combined transactional data with other types of information -- the weather, for example -- to understand the drivers of consumer behavior. In the 1970s, everything changed. Digital cash registers took hold, allowing companies to capture data and spot trends more quickly. They began A/B testing, piloting ideas in a test vs. control model, at the store level to understand the impact of strategy in near real time.
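The store-level test-vs-control comparison mentioned above reduces to a simple calculation: pilot a change in some stores, leave comparable stores unchanged, and measure the lift. A minimal sketch, with entirely hypothetical sales figures:

```python
# Illustrative test-vs-control (A/B) comparison at the store level.
# The sales numbers are made up for the example.

def average(xs):
    return sum(xs) / len(xs)

# Weekly sales per store (hypothetical data), in thousands of dollars.
test_stores = [105.0, 98.0, 112.0]     # stores piloting the new layout
control_stores = [100.0, 95.0, 102.0]  # comparable stores, unchanged

# Relative lift of the test group over the control group.
lift = average(test_stores) / average(control_stores) - 1.0
print(f"estimated lift: {lift:.1%}")
```

In practice retailers also control for store size, region and seasonality before attributing any lift to the change being tested.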
In the early days of AOL, where I worked in the 1990s and early 2000s, we were quick to recognize the risk to brick-and-mortar stores, as online retailers gathered unprecedented data on consumer behavior. Companies like Amazon could track a customer’s movements on their site using click-stream data to understand which products a customer was considering, or how long they spent comparing products before purchasing. Their brick-and-mortar counterparts, meanwhile, were stuck in the 1800s.
Unexpected innovations, however, have a funny way of leveling the playing field. Today, broadband ubiquity and the proliferation of mobile devices are enabling brick-and-mortar stores to track cell phone signals or use video surveillance to understand the way consumers navigate a store, or how much time they spend in a particular aisle. Sophisticated multichannel retailers now merge online behavior with in-person information to piece together a more holistic picture of their consumers, generating powerful data that drive changes in layout, staffing, assortment and pricing. A recent study found that 36 percent of in-store retail purchases -- worth a whopping $1.1 trillion -- are now influenced by the use of digital devices. Retailers who leverage online research to drive brick-and-mortar sales are gaining a competitive advantage.
The use of big data and predictive analytics in higher education is nascent. So-called disrupters often claim that the lecture hasn’t changed in 150 years, and that only online learning can drive transformative, game-changing outcomes for students. Of course, these claims ring hollow among today’s tech-savvy professors.
Since my transition into higher education, I have been struck by the parallel journey retailers and educators face. Both have been proclaimed obsolete at various points, but the reality is that the lecture, like the retail experience, has and will continue to evolve to meet the new demands of 21st-century users.
Like brick-and-mortar stores, lectures were once a black box -- but smart faculty members are beginning to harness the presence of mobile devices to capture unprecedented levels of data in traditional classrooms. And smart institutions are combining real-time engagement data with historic information to spot challenges early and change the academic trajectory for students.
Historical sources of student data (FAFSA, GPA, SAT, etc.) have predictive validity, but they are a bit like the year-over-year data retailers used: limited in depth and timeliness. The heart of a higher education institution is its professors -- and its classes. In addition to professors being experts in their fields, providing unique learning opportunities to their students, studies have shown that when professors have positive relationships with students, it leads to greater student success.
Some of the most interesting early data are coming from the big, first-year lecture courses. While most students experience these as a rite of passage, they also hold great potential as models of how behavioral data can improve engagement and completion rates for students. Faculty are no longer powerless in the face of larger classes and limited insight into their students' learning behavior. They can track how well students are engaging in traditional lecture classes and intervene with students who aren’t engaged in the behaviors (note taking, asking questions and attendance) that correlate with success.
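The early-warning idea described above can be sketched as a simple scoring-and-flagging routine. This is a hypothetical illustration, not any vendor’s actual product: the behavior names, weights and threshold are all assumptions.

```python
# Minimal sketch of flagging students for outreach based on tracked
# engagement behaviors. Weights and threshold are illustrative assumptions.

WEIGHTS = {"attendance": 0.5, "notes": 0.3, "questions": 0.2}

def engagement_score(record):
    """Weighted engagement score in [0, 1] from per-behavior rates in [0, 1]."""
    return sum(record[behavior] * w for behavior, w in WEIGHTS.items())

def flag_for_outreach(roster, threshold=0.5):
    """Return students whose engagement score falls below the threshold."""
    return [name for name, rec in roster.items()
            if engagement_score(rec) < threshold]

roster = {
    "student_a": {"attendance": 0.9, "notes": 0.8, "questions": 0.5},
    "student_b": {"attendance": 0.4, "notes": 0.1, "questions": 0.0},
}
print(flag_for_outreach(roster))  # ['student_b']
```

The point of such a sketch is timing: a score computed weekly from in-class behavior can trigger intervention long before a midterm grade would.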
Historically, professors have relied on piecemeal solutions to gather insights on student behavior. So-called student-response systems and learning management software, like digital cash registers in the ’70s, provide useful data -- but they don’t provide the sort of real-time analytics that can inform an instructor’s practice or identify students in need of additional support and coaching.
A more recent brand of solutions -- in full disclosure, including ours at Echo360 -- is designed to work in conjunction with great teaching, while providing instructors with the tools to track and measure student engagement: Are students taking notes? Are they asking questions? These tools give administrators and instructors insight into how students are interacting and participating both in class and with content or readings before and after class. No more waiting for summative tests to demonstrate that a student misunderstood a concept weeks or months earlier.
The analogy between retail and education has its limitations. The mission and objectives in education are more nuanced, and frankly, more important. However, education, like every sector, has what we call a moment of truth.
For retailers, that moment of truth is centered around the purchase decision. Sophisticated marketers and retailers have used behavioral data to become incredibly skilled at understanding and shaping that purchase decision to achieve extraordinary results.
It’s time to use those learnings for a higher calling. The explosion of digital devices in the classroom allows us to understand the learning process wherever it is happening on campus, and to support education’s vital moment of truth -- a transaction of knowledge between professors and students.
Frederick Singer is CEO and founder of Echo360, which provides active learning and lecture capture services to more than 650 higher ed clients in 30 countries.