Five years ago, this column looked into the scholarly potential of the Twitter archive the Library of Congress had recently acquired. That potential was by no means self-evident. The incensed “my tax dollars are being used for this?” comments practically wrote themselves, even without the help of Twitter bots.
For what -- after all -- is the value of a dead tweet? Why would anyone study 140-character messages, for the most part concerning mundane and hyperephemeral topics, with many of them written as if to document the lowest possible levels of functional literacy?
As I wrote at the time, papers by those actually doing the research treated Twitter as one more form of human communication and interaction. The focus was not on the content of any specific message, but on the patterns that emerged when they were analyzed in the aggregate. Gather enough raw data, apply suitable methods, and the results could be interesting. (For more detail, see the original discussion.)
The key thing was to have enough tweets on hand to grind up and analyze. So, yes, an archive. In the meantime, the case for tweet preservation seems easier to make now that elected officials, religious leaders and major media outlets use Twitter. A recent volume called Twitter and Society (Peter Lang, 2014) collects papers on how politics, journalism, the marketplace and (of course) academe itself have absorbed the impact of this high-volume, low-word-count medium.
One of the book’s co-editors is Katrin Weller, an information scientist at the GESIS Leibniz Institute for the Social Sciences in Cologne, Germany. At present she is in the final month of a Kluge Fellowship at the Library of Congress, which seems like an obvious place to conduct her research into the use of Twitter to study historical events. Or it would have been, if the archive of tweets were open to scholars, which it still isn’t, and won’t be any time soon.
Unable to pursue her original project, Weller used the Kluge Fellowship to broaden her focus -- which, she told me in an email exchange, “has been pretty much on working with Twitter data [over] the last years.” She spent her time catching up with the scholarship on other forms of social media and investigating various web-archiving projects at the library.
As for the digital collection that made her want to go to Washington, DC, in the first place… well, the last official statement from the library was issued in January 2013. It reported that Twitter’s output from 2006 to 2010 -- consisting of “approximately 21 billion tweets, each with more than 50 accompanying metadata fields, such as place and description” -- had finally been organized, by hour. The process was to be completed that month, even as another half billion or so tweets per day were added to the collection.
The Library of Congress finds itself in the position of someone who has agreed to store the Atlantic Ocean in his basement. The embarrassment is palpable. No report on the status of the archive has been issued in more than two years, and my effort to extract one elicited nothing but a statement of facts that were never in doubt.
“The library continues to collect and preserve tweets,” said Gayle Osterberg, the library’s director of communications, in reply to my inquiry. “It was very important for the library to focus initially on those first two aspects -- collection and preservation. If you don’t get those two right, the question of access is a moot point. So that’s where our efforts were initially focused and we are pleased with where we are in that regard.”
As of early 2013, the library reported it had received more than 400 requests to use the archive. Since then, members of the public have asked for updates on the library’s blog, with no response forthcoming. At this point no date has been set for the archive to be opened to researchers. The leadership of the Library of Congress may be “pleased with where we are,” but their delight is not likely to be contagious.
No grumbling from Katrin Weller, though. She sent me a number of her recent and forthcoming papers on what might be called second-order social-media research. That is, they take up the problems and concerns that face scholars trying to study social media.
Apart from the difficulties involved in archiving -- enough on that, for now -- there are methodological and ethical problems galore, as becomes clear from a paper Weller co-authored with her colleague Katharina E. Kinder-Kurlanda, a cultural anthropologist also at the Leibniz Institute. In 2013 and 2014, they conducted 42 interviews with social-media researchers at international conferences. The subjects were from various fields and parts of the world. What they had in common was the use of data gathered from a variety of social-media venues -- not just Twitter and Facebook but “many other platforms such as Foursquare, Tumblr, 4chan and Reddit.”
All of which makes establishing methodological standards -- how material from social media platforms is collected, documented and handled -- extremely difficult, if not impossible. A research team might find it necessary to invent a program to harvest raw data from a site, but if the overall focus of the project is sociological or linguistic, the details will probably not be discussed in the resulting publication. There is also the issue of “data cleaning,” i.e., filtering out messages from spam accounts, bots and the like, in order to create a data set consisting of only human-generated material (as much as that is possible). It is a time- and labor-intensive process, and the thoroughness of the job will in part be a function of the budget.
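To make the “data cleaning” step concrete, here is a minimal sketch of the kind of heuristic filtering such projects perform. The field names and thresholds are invented for illustration; real research teams use far more elaborate (and often undocumented) rules, which is exactly the reproducibility problem Weller and Kinder-Kurlanda describe.

```python
# A hedged sketch of heuristic data cleaning for a tweet-like data set.
# "posts_per_day" and the threshold of 200 are illustrative assumptions,
# not any actual project's criteria.

def clean(messages, max_posts_per_day=200):
    """Drop messages from hyperactive (likely bot) accounts and verbatim duplicates."""
    seen_texts = set()
    kept = []
    for m in messages:
        if m["posts_per_day"] > max_posts_per_day:  # hyperactive account: likely a bot
            continue
        if m["text"] in seen_texts:                 # verbatim duplicate: likely spam
            continue
        seen_texts.add(m["text"])
        kept.append(m)
    return kept

sample = [
    {"user": "alice",   "text": "good morning", "posts_per_day": 12},
    {"user": "spambot", "text": "BUY NOW!!!",   "posts_per_day": 5000},
    {"user": "bob",     "text": "good morning", "posts_per_day": 8},
]
print(len(clean(sample)))  # prints 1: the bot and the duplicate text are both dropped
```

Note that the duplicate rule also discards a genuine message from a second user -- a small illustration of how cleaning choices quietly shape the resulting data set.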
So the size, quality and reliability of the raw material itself are going to vary widely from researcher to researcher. Weller and Kinder-Kurlanda note the case of the same data being collected from a single social-media website using the same tools, but run in parallel on two different servers. The result was different data sets. And all of this, mind you, before the serious analytical crunching even gets started.
One partial solution, or at least stopgap measure, is to share data sets -- certainly easing the strain on some researchers’ purses. The authors mention finding researchers “who felt an ethical obligation to share their data sets, either with other researchers or with the public.” About a third of the researchers Weller and Kinder-Kurlanda interviewed “had experience in working with data collected by others.” But the practice raises ethical problems about privacy, and it sounds like some of the exchanges take place sub rosa. And in any event, sharing the data sets probably won't change the drift toward some social-media platforms being over- or under-researched because their data are easier to collect or clean.
Weller indicates that she intends to write more about the epistemological issues raised by social media. That sounds like an interesting topic, and a perplexing one. Besides, it will clearly be a long, long time before anyone gets to use Twitter as a tool for historical research.
When Rowland Hussey Macy opened his namesake store in 1858, understanding consumer behavior was largely a matter of guessing. Retailers had little data to assess what customers wanted or how variables like store hours, assortment or pricing might impact sales. Decision making was slow: managers relied on manual sales tallies, compiled weekly or annually. Dozens of stores failed, including several of Macy’s original stores.
Predictive analytics, in the early days of retail, were rudimentary. Forward-thinking retailers combined transactional data with other types of information -- the weather, for example -- to understand the drivers of consumer behavior. In the 1970s, everything changed. Digital cash registers took hold, allowing companies to capture data and spot trends more quickly. They began A/B testing, piloting ideas in a test vs. control model, at the store level to understand the impact of strategy in near real time.
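The test-vs.-control model mentioned above boils down to a simple comparison: run a change in a set of pilot stores, hold a comparable set of stores unchanged, and measure the lift. A minimal sketch, with invented sales figures:

```python
# Hedged sketch of store-level test-vs-control analysis.
# The weekly sales figures below are made up for illustration.

def lift(test_sales, control_sales):
    """Percentage lift of the test-store mean over the control-store mean."""
    test_mean = sum(test_sales) / len(test_sales)
    control_mean = sum(control_sales) / len(control_sales)
    return 100.0 * (test_mean - control_mean) / control_mean

test_stores = [105.0, 112.0, 98.0]     # stores piloting the new strategy
control_stores = [100.0, 101.0, 99.0]  # comparable stores, unchanged
print(round(lift(test_stores, control_stores), 1))  # prints 5.0 (percent)
```

In practice retailers also check whether such a lift is statistically distinguishable from noise before rolling a change out chain-wide; this sketch shows only the core comparison.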
In the early days of AOL, where I worked in the 1990s and early 2000s, we were quick to recognize the risk to brick-and-mortar stores, as online retailers gathered unprecedented data on consumer behavior. Companies like Amazon could track a customer’s movements on their site using click-stream data to understand which products a customer was considering, or how long they spent comparing products before purchasing. Their brick-and-mortar counterparts, meanwhile, were stuck in the 1800s.
Unexpected innovations, however, have a funny way of leveling the playing field. Today, broadband ubiquity and the proliferation of mobile devices are enabling brick-and-mortar stores to track cell phone signals or use video surveillance to understand the way consumers navigate a store, or how much time they spend in a particular aisle. Sophisticated multichannel retailers now merge online behavior with in-person information to piece together a more holistic picture of their consumers, generating powerful data that drive changes in layout, staffing, assortment and pricing. A recent study found that 36 percent of in-store retail purchases -- worth a whopping $1.1 trillion -- are now influenced by the use of digital devices. Retailers who leverage online research to drive brick-and-mortar sales are gaining a competitive advantage.
The use of big data and predictive analytics in higher education is nascent. So-called disrupters often claim that the lecture hasn’t changed in 150 years, and that only online learning can drive transformative, game-changing outcomes for students. Of course, these claims ring hollow among today’s tech-savvy professors.
Since my transition into higher education, I have been struck by the parallel journey retailers and educators face. Both have been proclaimed obsolete at various points, but the reality is that the lecture, like the retail experience, has and will continue to evolve to meet the new demands of 21st-century users.
Like brick-and-mortar stores, lectures were once a black box -- but smart faculty members are beginning to harness the presence of mobile devices to capture unprecedented levels of data in traditional classrooms. And smart institutions are combining real-time engagement data with historic information to spot challenges early and change the academic trajectory for students.
Historical sources of student data (FAFSA, GPA, SAT, etc.) have predictive validity, but they are a bit like the year-over-year data retailers used: limited in depth and timeliness. The heart of a higher education institution is its professors -- and its classes. Professors are not only experts in their fields who provide unique learning opportunities to their students; studies have shown that when professors have positive relationships with students, those relationships lead to greater student success.
Some of the most interesting early data are coming from the big, first-year lecture courses. While most students experience these as a rite of passage, they also hold great potential as models of how behavioral data can improve engagement and completion rates for students. Faculty are no longer powerless in the face of larger classes and limited insight into their students' learning behavior. They can track how well students are engaging in traditional lecture classes and intervene with students who aren’t engaged in the behaviors (note taking, asking questions and attendance) that correlate with success.
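The early-intervention logic described above can be sketched very simply: flag students whose engagement signals fall below a threshold so an instructor can reach out before a summative exam reveals the problem. The field names and cutoffs here are hypothetical, not those of any actual product or study:

```python
# Hedged sketch: flagging students for early intervention based on
# simple engagement signals. Thresholds and field names are invented.

def at_risk(students, min_attendance=0.8, min_notes=5):
    """Return names of students whose engagement falls below either threshold."""
    return [s["name"] for s in students
            if s["attendance_rate"] < min_attendance
            or s["notes_taken"] < min_notes]

roster = [
    {"name": "Dana", "attendance_rate": 0.95, "notes_taken": 12},
    {"name": "Eli",  "attendance_rate": 0.60, "notes_taken": 2},
]
print(at_risk(roster))  # prints ['Eli']
```

Real systems would weigh many more signals (question-asking, time on readings, quiz attempts) and combine them with historical data, but the shape of the intervention trigger is the same.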
Historically, professors have relied on piecemeal solutions to gather insights on student behavior. So-called student-response systems and learning management software, like digital cash registers in the ’70s, provide useful data -- but they don’t provide the sort of real-time analytics that can inform an instructor’s practice or identify students in need of additional support and coaching.
A more recent brand of solutions -- in full disclosure, including ours at Echo360 -- are designed to work in conjunction with great teaching, while providing instructors with the tools to track and measure student engagement: Are students taking notes? Are they asking questions? These tools give administrators and instructors insight into how students are interacting and participating, both in class and with content or readings before and after class. No more waiting for summative tests to demonstrate that a student misunderstood a concept weeks or months earlier.
The analogy between retail and education has its limitations. The mission and objectives in education are more nuanced, and frankly, more important. However, education, like every sector, has what we call a moment of truth.
For retailers, that moment of truth is centered around the purchase decision. Sophisticated marketers and retailers have used behavioral data to become incredibly skilled at understanding and shaping that purchase decision to achieve extraordinary results.
It’s time to use those learnings for a higher calling. The explosion of digital devices in the classroom allows us to understand the learning process wherever it is happening on campus, and to support education’s vital moment of truth -- a transaction of knowledge between professors and students.
Frederick Singer is CEO and founder of Echo360, which provides active learning and lecture capture services to more than 650 higher ed clients in 30 countries.
You can’t judge a book by its neologisms, but the coinages appearing in the first chapter or two of Carl Cederström and André Spicer’s The Wellness Syndrome (Polity) serve as pretty reliable landmarks for the ground its argument covers. We might start with “orthorexia,” which spell-check regards with suspicion, unlike “anorexia,” its older and better-established cousin.
Where the anorexic person avoids food as much as possible, the orthorexic is fixated on eating correctly -- that is, in accord with a strict and punitive understanding of what’s healthy to eat, and in what quantities, as well as what must be avoided as the culinary equivalent of a toxic landfill. It is a sensible attitude turned pathological by anxiety. And in the authors’ interpretation, that anxiety is socially driven: the product of “biomorality,” meaning “the moral demand to be happy and healthy,” as expressed in countless ways in a culture that makes chefs celebrities while stigmatizing the poor for eating junk food.
But diet is only one bailiwick for “wantologists,” somewhat better known as “life coaches,” whose mission it is to “help you figure out what you really want” in life. Cederström is an assistant professor of organizational theory at Stockholm University, while Spicer is a professor of organizational behavior at City University, London. I take it from their account that the wantological professions (there are certification programs) extend beyond one-on-one consulting to include the market in self-improvement and motivational goods and services such as books, workshops and so on. The goal in each case is the combination of physical fitness and positive mental attitude that amounts to an “ideal performance state” for the contemporary employee.
“A recent survey by RAND,” we learn, “found that just over half of U.S. employers with more than 50 staff offer some kind of workplace wellness program,” while 70 percent of companies in the Fortune 200 do so. “In total, U.S. employers spend about $6 billion a year on such programs,” which “are often tied up with employees’ health insurance.”
“Know Yourself, Control Yourself, Improve Yourself” reads one of the chapter subheads, as if to list the slogans from some Orwellian Ministry of Wellness. But where Big Brother ruled through the repression of desire and personal identity, the cultural regime defined by what the authors call “the wellness command” makes every possible concession to individuality and contentment. Indeed, it demands them. Every aspect of life becomes “an opportunity to optimize pleasure and become more productive,” and the experts warn that faking it won’t help: the satisfaction and self-realization must be authentic. We are all the captains of our fates and masters of our souls. Failure to stay healthy and happy -- and flexible enough to adapt to whatever circumstances the labor market may throw at you -- is ultimately a personal and moral failure. So you’d better get some life coaching if you know what’s good for you, and maybe especially if you don’t.
“What is crucial is not what you have achieved,” write Cederström and Spicer, “but what you can become. What counts is your potential self, not your actual self.” The titular syndrome refers to the cumulative strain of trying to respond to all the wellness commands, which are numerous, conflicting and changeable -- a perfect recipe for chronic anxiety, of which an obsession with eating correctly seems like an exemplary symptom. On first reading, I took “orthorexia” to be the authors’ own addition to the language (like “the insourcing of responsibility” and “authenticrat,” per the tendencies described a moment ago) but in fact it turns out to be an unofficial diagnosis in the running for future lists of psychiatric disorders.
The Wellness Syndrome offers, by turns, both a recognizable survey of recent cultural trends and a collage of insights drawn from more original works of social analysis and theory. Much of it will seem more than a little familiar to readers already acquainted with Christopher Lasch’s The Culture of Narcissism, Eve Chiapello and Luc Boltanski’s The New Spirit of Capitalism, Slavoj Zizek’s sundry discussions of the contemporary superego, or any given book by Zygmunt Bauman or Barbara Ehrenreich published in the past twenty years. These works are duly cited, but the ideas are not pushed in any new direction. The common principle subtending them all is that cynicism about institutions or the possibility of large-scale social change creates a privatized, moralistic ideology that traps people into punitive introspection or the fine-tuning of lifestyles. Unfortunately much of The Wellness Syndrome reads as if such trends began under the administrations of Bill Clinton and Tony Blair.
Alas, no. They were already visible 40 years ago as baby boomers began signing up for weekend explorations in self-discovery with unlicensed therapists who yelled insults at them and wouldn’t let them use the bathroom. Nothing in the new book points to any means or agency capable of changing things in any fundamental way, or even of imagining such a change. Social scientists aren't obliged to be prophets and, of course, they seldom do a very good job when they try; at best they describe and analyze change once it's discernible, not before. But after seven or eight years of shocks and aftershocks from a global financial crisis, it's time for books that do more than put new labels on decades-old problems.
This is not the best of times for faculty members. Many of the problems they face are beyond their control. And yet there are some they can address, especially if they are fortunate enough not to belong to the growing numbers of non-tenure-track, part-time, contingent faculty, but to those who can reasonably expect a secure future in the academy.
First and foremost is how they can transcend the barriers dividing them and find the best way to serve their students -- coming together not just as scholars in the same field and comrades in arms against administrators they perceive as soulless, but as a community of teachers. How can they achieve this by expanding their concept of what is, in fact, “their department”?
For one thing, how might they expand their thinking about the goals of their disciplinary departments themselves? For another, how can they go beyond a focus on their respective departments to contribute to the mission of the wider institution of which they are a part (and which, by the way, pays their salaries)?
We might begin by asking: Are faculty members taking an overly provincial approach, both intellectually and professionally, to their respective departmental programs? Insofar as an undergraduate major is focused on what a student will need to enter a graduate program, it is more properly seen as vocational training than as an integral part of a liberal arts education. Majors with relatively heavy requirements lead to a level of specialization that may be desirable for some students, but unnecessary and premature for others, many of whom will never seek a graduate degree in the field of their major. It is always possible to serve the interests of those heading to graduate school in the field by providing special curricular enhancements.
Faculty members should also consider how undergraduate departmental majors can connect more organically with one another and with the wider curriculum of the institution. This interest is not served simply by creating new interdisciplinary programs, since too often these have simply resulted in a proliferation of departmentlike entities and have failed to create greater intellectual coherence in the undergraduate experience as a whole. So, for example, in the place of separate ethnic studies programs and departments, one might instead see greater multicultural sophistication in the United States history curriculum, not to mention stronger collegial ties among faculty -- and hence students -- in the departments of history, anthropology, sociology and literature. The outcome might also yield a course or courses deemed desirable for all undergraduates.
If, in the spirit of John Donne, we wish to believe that no department is an island entire of itself, that every department is a piece of the main, we are no longer in a position to follow Donne’s next move and argue that if a single program be washed away (presumably, by the administration), the institution is less. As an institution continues to add programs without ever subtracting any, the curriculum comes to take on the aspect of a zombie movie in which the living cohabit with the undead and much frantic bumping into one another ensues.
On occasions when faculty come together for the lengthy, intensive process of an institution-wide “curriculum review,” the outcome too rarely justifies the time and energy expended. (I believe comparative research would show that, in general, the more elite the institution, the more modest the results.) Aside from their ritual dimension, such processes commonly involve the kind of logrolling especially familiar to political scientists, in which faculty members approach “general” or “distributional” requirements in terms of how their respective departmental interests are being served.
And yet, there have been some curriculum reviews that actually aim to make the student experience intellectually coherent, providing room for varying interests and passions while creating a student community that reflects the mission and identity of the institution -- and that apparently succeed in doing so. Some of us in the foundation world have been in a position to encourage this process, supporting those who are doing the real work.
How might graduate programs also better serve their students’ interests? Leaving aside the question of preparing graduate students for careers outside the academy altogether, graduate programs need to consider preparing them for the range of institutions within the universe of higher education in which they may find themselves. This means focusing on preparing students as teachers and not just as researchers, especially since their students’ chances of getting positions in research universities are clearly shrinking (though, even in such universities, better preparation as teachers would stand them in good stead).
Given that teaching assistantships are an important way of financially supporting graduate students, departmental faculty must decide whether they are viewing those students as junior colleagues or as cheap labor. This choice clearly influences how graduate students see themselves, as well as how well equipped they are for their working lives after graduation. Is responsibility for helping them develop as teachers being farmed out to teaching and learning “centers,” which are all too often teaching and learning “peripheries”? Or are there the strong collaborative ties between such centers and departmental faculty that are essential to the professional development of graduate students?
Some graduate programs are stepping up to this particular plate; more need to do so. Perhaps one way of getting their attention is to present them with the following choice: either (1) broaden the graduate program to properly prepare admitted students for a wider range of careers in higher education and beyond, or (2) limit the number of admitted students to those who are either likely to find jobs in research universities or who are interested in graduate education for its own sake and harbor no expectations about how the program will advance their future careers. Departments choosing the second option would have to find other ways for senior faculty members to occupy their time, which might possibly involve teaching undergraduates.
To put these two options in terms of reproductive biology, some species follow what is termed the r-selection strategy, in which a large number of offspring are produced and few are expected to survive. On the other hand, species that pursue the K-selection strategy produce fewer offspring but invest in them heavily, which results in their relatively high survival rate. Graduate departments, being (generally) composed of human beings, should presumably follow the strategy characteristic of our species.
And if, to continue the biological metaphor, we take note that evolutionary theory in general has come to emphasize cooperation as well as competition, we want to be sure that academics, as a population, are not so focused on departmental rivalries and individual career ambitions that they fail to have a sufficient regard for the common good.
A final point: the case for tenure is most commonly made in terms of academic freedom, which is certainly important. But the argument for tenure would be further strengthened if tenure were seen to reflect a deep mutual commitment between a faculty member and an institution -- a mutual commitment that truly serves them both.
Judith Shapiro is president of the Teagle Foundation and a former president of Barnard College.