Privacy Opt-Out May Lead to Inequities

Rushing to provide opt-out mechanisms may be hurting rather than protecting students, writes Christopher Brooks, who teaches in a school of information.

September 20, 2021
 
 

Institutions considering allowing students to opt out of data sharing should weigh very carefully whether doing so may create or amplify inequities among learners. The problem, known as consent bias, is that students who choose to opt out (or decline to opt in) may differ systematically from those who don't, so that conclusions or actions based on the data unfairly disadvantage some groups of students.

At the moment, students generally don’t feel they can control access to the data their college collects about them. According to the Student Voice survey conducted this summer by Inside Higher Ed and College Pulse, with support from Kaplan, only 22 percent of students believed they could restrict access to that data, 9 percent believed they could not, and the vast majority -- 69 percent -- weren’t sure.

Student Voice explores higher education from the perspective of students, providing unique insights on their attitudes and opinions. Kaplan provides funding and insights to support Inside Higher Ed’s coverage of student polling data from College Pulse. Inside Higher Ed maintains editorial independence and full discretion over its coverage.

But rushing to provide opt-out mechanisms may in fact be hurting students, not protecting them. I’m confident that this is a real problem we face.

We emailed 4,000 students at the University of Michigan in 2019 to see if they would opt out (or in) to data sharing to support learning analytics tools. The responses, detailed in this recent International Journal of Artificial Intelligence in Education article, demonstrated clear consent bias: women were more likely to respond than men, and students who identified as Black were less likely to respond, while students who identified as white were more likely to respond. And these response differences were on top of the already skewed demographics of the student population.

Trust that the institution will use the data responsibly is the largest factor guiding students’ decisions to consent.

Women were significantly more trusting of the institution or the instructor, despite also having more concerns about the practice of personal data collection. Black students, however, voiced less trust in the institution. This points to a clear opportunity for institutions to build bridges of shared understanding and transparency with students around data collection, and supports initiatives like the U-M ViziBlue dashboard, which gives students information on the types of data that are collected on them and how they are used and shared.

This concern around opting out goes beyond an individual student’s own success. Machine learning methods are increasingly being used to power educational support tools, and the data these algorithms rely upon affect how they function across students. If a student opts out of sharing data, or declines to opt in, we should expect that support tools will become less accurate for similar students at the institution.
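The effect described above can be illustrated with a toy simulation. This is a hypothetical sketch, not the study's actual analysis: the group labels, success rates, and consent rates are invented, and the "model" is deliberately trivial (it just predicts the average outcome of its training data) to isolate the consent-bias effect.

```python
import random

random.seed(0)

# Two hypothetical student groups with different underlying success rates.
# Group "B" is both smaller and, in the biased scenario, far less likely
# to consent to data sharing. All numbers here are invented for illustration.
def make_students(n, success_rate, group):
    return [(group, 1 if random.random() < success_rate else 0)
            for _ in range(n)]

population = make_students(900, 0.80, "A") + make_students(100, 0.60, "B")

# A trivial "model": predict the average success rate of its training data.
def fit_mean(data):
    outcomes = [y for _, y in data]
    return sum(outcomes) / len(outcomes)

# Error for a group: gap between the model's prediction and that
# group's true success rate in the full population.
def group_error(model_rate, data, group):
    rows = [y for g, y in data if g == group]
    true_rate = sum(rows) / len(rows)
    return abs(model_rate - true_rate)

# Case 1: everyone shares data.
full_model = fit_mean(population)

# Case 2: group B largely opts out (only 10 percent consent),
# so the training data skews toward group A.
consented = [row for row in population
             if row[0] == "A" or random.random() < 0.10]
biased_model = fit_mean(consented)

print("error for group B, full data:  ",
      round(group_error(full_model, population, "B"), 3))
print("error for group B, biased data:",
      round(group_error(biased_model, population, "B"), 3))
```

Because the opted-out group barely appears in the training data, the model's prediction drifts toward the majority group's rate, and its error for the underrepresented group grows -- the same dynamic, in miniature, that the article warns about for real educational support tools.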

Given the consent bias we see, this should give institutions real concern that individual choice alone will have a disproportionate negative impact on groups of learners that may be historically disadvantaged.

Thus, the choice to opt out of data sharing needs to be weighed against the obligation that the student -- and the institution -- has to support the broader learning environment. In light of this, the ethical choice is to proactively engage students with the data, showing them how it is used to support their own learning and that of others.

Indeed, this is a teaching opportunity for us as educators and sits at the nexus of privacy, data literacy and civic engagement. Instead of encouraging opting out, we should be educating on the positive impacts of the data through transparency, while listening to the concerns and ideas students bring to the discussion. Not only will this allow us to support the equitable impact of data-driven educational supports, but it will also allow us to strengthen our relationship with students by building trust in the institution and our educational mission.

Bio

Christopher Brooks is an assistant professor of information in the School of Information at the University of Michigan.
