Unanswered Questions

June 23, 2014

Almost 40 years ago, a commission met to establish ethical guidelines governing research with human subjects. Earlier this month, a similar group met to discuss a different kind of guinea pig: the online learner.

The Asilomar Convention for Learning Research in Higher Education, a gathering of faculty members, researchers and legal scholars, on Monday produced a two-page report affirming that higher education research should be guided by transparent data collection processes, shared results and respect for the learner.

Asilomar Report Principles

  • Researchers and institutions should be especially vigilant in collecting and using identifiable learner data.
  • Results of research should be made publicly available.
  • Research should enable the use of learning data in the service of providing benefit for all learners.
  • Data, analytic techniques and research results should be open and accessible.
  • Digital technologies should enhance -- but never be allowed to erode -- the relationships that make learning a humane enterprise.
  • Ethically responsible learner research requires ongoing and broadly inclusive discussion.

The purpose, the group said, is to advance research that could improve higher education outcomes -- work they fear may be held back by confusion over what data belongs to whom and which regulations apply where.

“I frankly think that we need a new starting place for thinking about online learners, and that’s what the convention seeks to do,” said Mitchell L. Stevens, associate professor of sociology and director of digital research and planning at Stanford University. “Is the learner a student? Is the learner a human subject? Is the learner a consumer? Or is the learner a new kind of thing -- a new kind of creature -- that deserves special ethical protections?”

Between federal regulations such as the Family Educational Rights and Privacy Act, consumer laws and institutional review board protocols, there is no shortage of rules that could protect the rights of online learners. But when Stevens was tasked last spring with reviewing how Stanford’s protocols might apply to the booming field of learning analytics and higher education research, he said, he realized that cobbling together existing regulations wouldn’t solve the problem.

“There are just as of yet too many unanswered regulatory questions,” Stevens said. “There are too many guidelines that were designed for a different world of data production. It’s not that the ethics have changed -- it’s the conditions under which ethical decisions have to be made.”

The uncertainty expressed by Stevens extends beyond the nebulous group of online learners. As colleges and universities ink deals with companies to provide services hosted in the cloud, many university attorneys are growing increasingly concerned that valuable data is leaking to third parties whenever a residential student opens an app or watches a recorded lecture.

The Asilomar Report follows what has been a tumultuous year for privacy concerns. One year ago, news organizations first began to document the National Security Agency’s vast surveillance capabilities. This spring, the collapses of the student data repository inBloom and the college and career planning startup ConnectEDU raised questions about what happens to student data when stewards go out of business. Days later, Google said it would stop automatically scanning student email for ad keywords, drawing astounded reactions from many unaware of the practice.

To cap things off, a White House working group on May 1 released an expansive report on big data, warning that its use in education “raises serious questions about how best to protect student privacy.”

Together, these and other events represent a “perfect storm” that has contributed to a greater level of awareness about privacy issues in higher education, legal scholars and industry representatives say.

“I connect inBloom with Google Apps for Education retreating,” said Joel R. Reidenberg, the Stanley D. and Nikki Waxberg chair and professor of law at Fordham University who founded the Center on Law and Information Policy. “What you have are two big data programs involving secondary uses of student information, and they each melted down over privacy. What the higher ed space will learn from this is that privacy is a fault line, and you have to be really, really careful.”

As the placement of general counsel offices at high levels of administration shows, Reidenberg said, colleges and universities are more accustomed than K-12 schools to dealing with privacy issues. Institutions must comply not only with FERPA and its health care equivalent, the Health Insurance Portability and Accountability Act, but also with regulations such as the Gramm-Leach-Bliley Act, which covers financial information.

“Higher ed is more regulated than K-12, and they have the legal resources to really confront it,” Reidenberg said. “That said, they’re still grappling, because they don’t know what to do.”

One way to instill some confidence in universities and their researchers, Stevens said, involves getting companies and the institutions that license their services to agree on how data can and can’t be used. For example, if a student uses the digital companion that came with a physical textbook, is it ethical for a publisher to use the data to tailor future products? And should residential students using content from a massive open online course in a class receive more protection than adult learners taking the MOOC for their own enjoyment?

“Just because I clicked a button somewhere that says ‘I agree’ does not make that process ethically reasonable,” Stevens said. “Clickstream data -- data from online learning environments -- happens continuously and often invisibly to the user. What it means to consent to a research study is different if the research study is simultaneous [to] the learning.”

Steve Mutkoski, worldwide policy director for Microsoft, said recent front-page stories about privacy have changed how institutions negotiate with the company.

"Over the past year, we have seen student privacy go from being an issue raised only occasionally and at a superficial level, to one that is increasingly a top-three issue for most potential customers in the education space," Mutkoski said. "Last year, a vendor could take privacy concerns off the table by saying they would handle student data consistent with FERPA requirements -- now the resolution of the issue often involves probing questions about how the vendor will use student data and more detailed contractual terms restricting a vendor’s rights to use student data."

When they sit down at the negotiating table, however, institutions and vendors are often far apart. One exasperated university attorney said the institution had spent “unbelievable amounts of time negotiating with companies” to ensure it complies with FERPA. “It’s like Sisyphus pushing boulders up the hill,” the attorney said.

Madelyn F. Wessel, associate general counsel at the University of Virginia, suggested that universities wouldn’t be the only party to benefit from contract negotiations where a conversation about privacy is a given.

“Companies that want to work with institutions of higher education around the development of learning analytics and other technology tools simply have to be cognizant of the privacy and stewardship responsibilities that the institutions have [to] the students -- and the federal regulatory overview of FERPA and other laws that have to be complied with,” Wessel said, adding, “Companies that are willing to engage appropriately with institutions of higher education can be extremely valuable partners.”

And that's where the debate appears to have stalled. Without any sweeping new regulations, faculty members and students will continue to use tools without first reading the terms of service, vendors will continue to benefit from the data they can collect, and university counsel offices will continue to plug whatever gaps they can find.

“The conversation that we convened at Asilomar was among research institutions, but our explicit hope is that proprietary providers would recognize and honor the convention that we developed," Stevens said. "Our hope is that people will see the value of our convention and participate in the enterprise of doing science in the field."
