When the teacher and poet Taylor Mali declares, “I can make a C+ feel like a Congressional Medal of Honor and an A- feel like a slap in the face,” he testifies to the powerful ways teachers can use emotions to help students learn and grow. Students -- and their parents -- put a great deal of trust in college educators to use these powers wisely and cautiously. This is why the unfolding debacle of the Facebook emotional contagion experiment should give educators great pause.
In 2012, for one week, Facebook changed an algorithm in its News Feed function so that certain users saw more messages with words associated with positive sentiment and others saw more words associated with negative sentiment. Researchers from Facebook and Cornell then analyzed the results and found that the experiment had a small but statistically significant effect on the emotional valence of the kinds of messages that News Feed readers subsequently went on to write. People who saw more positive messages wrote more positive ones, and people who saw more negative messages wrote more negative ones. The researchers published a study in the Proceedings of the National Academy of Sciences, and they claimed the study provides evidence of the possibility of large-scale emotional contagion.
The debate immediately following the study’s release has been fierce. There has been widespread public outcry that Facebook manipulated people’s emotions without following widely accepted research guidelines that require participant consent. Social scientists who have come to the study’s defense note that Facebook experiments on the News Feed algorithm constantly, as do virtually all other online platforms, so users should expect to be subject to such experiments. Regardless of how merit and harm are ultimately determined in the Facebook case, however, the implications of its precedent for learning research are potentially very large.
All good teachers observe their students and use what they learn from those observations to improve instruction. Good teachers assess and probe their students, experiment with different approaches to instruction and coaching, and make changes to their practice and pedagogy based on the results of those experiments. In physical classrooms, these experiments are usually ad hoc and the data analysis informal.
But as more college instruction moves online, it becomes ever easier for instructors to observe their students systematically and continuously. Digital observation of college instruction promises huge advances in the science of learning. It also raises ethical questions that higher education leaders have only begun to address.
What does it mean to give consent in an age of pages-long terms-of-service documents that can be changed at any time? In a world where online users should expect to be constantly studied, what conditions should require additional consent? What bedrock ethical principles of the research enterprise need to be rethought or reinforced as technology reshapes the frontiers of research? How do we ensure that corporate providers of online learning tools adhere to the same ethical standards for research as universities?
If the ultimate aim of research is beneficence -- to do maximum good with minimum harm -- how do we weigh new risks and new opportunities that cannot be fully understood without research?
Educational researchers must immediately engage these questions. The public has enormous trust in academic researchers to conduct their inquiries responsibly, but this trust may be fragile. Educational researchers have not yet had a Facebook moment, but the conditions for concern are rising, and online learning research is expanding.
Proactively addressing these concerns means revisiting the principles and regulatory structures that have guided academic research for generations. The Belmont Report, a keystone document of modern research ethics, was crafted to guide biomedical science in an analog world. Some of the principles of that report should undoubtedly continue to guide research ethics, but we may also need new thinking to wisely advance the science of learning in a digital age.
In June 2014, a group of 50 educational researchers, computer scientists, and privacy experts from a variety of universities, as well as observers from government and allied philanthropies, gathered at Asilomar Conference Grounds in California to draft first principles for learning research in the digital era. We released a document, the Asilomar Convention for Learning Research in Higher Education, which recognizes the importance of changing technology and public expectations for scientific practice.
The document embraces three principles from the Belmont Report: respect for persons, justice, and beneficence. It also specifies three new ones: the importance of openness of data use practices and research findings, the fundamental humanity of learning regardless of the technical sophistication of learning media, and the need for continuous consideration of research ethics in the context of rapidly changing technology.
We hope the Asilomar Convention begins a broader conversation about the future of learning research in higher education. This conversation should happen at all levels of higher education: in institutional review boards, departments and ministries of education, journal editorial boards, and scholarly societies. It should draw upon new research about student privacy and technology emerging from law schools, computer science departments, and many other disciplines.
And it should specifically consider the ethical implications of the fact that much online instruction takes the form of joint ventures between nonprofit universities and for-profit businesses. We encourage organizers of meetings and conferences to make consideration of the ethics of educational data use an immediate and ongoing priority. Preservation of public trust in higher education requires a proactive research ethics in the era of big data.