I have a hard time understanding why anyone would want to place a surveillance device in their home so they can play music, check the weather, or look up a Wikipedia article without having to use the phone that is probably already in their hand. These gadgets (such as Amazon’s Echo or Google’s Home) make lots of sense for people with sight impairments, but for the rest of us? I don’t get it, particularly since they relay information about our private lives to the mothership and, when hooked up to smart devices, make our houses extremely hackable.
So I was none too thrilled to read that three universities have decided to show how innovative they are by giving students Echo Dots that will be trained to answer questions students might have, like “when is finals week?” or “is the library open?” Eventually, proponents hope they will be able to answer highly personalized questions – “what grade did I get on my chemistry test?” – and even become personal tutors. Because going to college is all about spending time in your room talking to a sentient hockey puck.
A funny thing: in 2013, when Google began to roll out Google Glass, a wearable computer you could put on your head like glasses, there was a backlash that effectively tanked the product for the consumer market. People were unhappy about the notion of being recorded by wearers of these gadgets, who were also rich enough to buy the expensive prototype. Surveillance cameras are one thing; in this case you could see who was violating your privacy, and see it actually happening. This is similar to the reaction you get when someone says “I’m not worried about privacy because I have nothing to hide” but balks when you ask for their phone so you can read their messages. When you actually see a privacy violation as it happens and see the person doing it, it’s much more upsetting than if it's invisibly in the background and no humans appear to be involved, though of course, they are. Inviting Alexa into your bedroom is kind of like inviting a bunch of strangers to sit beside your bed and take notes.
Another thing that’s not so funny: these devices invite you to ask questions, but their answers aren’t always factually correct. How do they respond when the question is complex or there isn’t a singular answer? (The second link is to a paywalled WSJ article, worth looking up through your library if you don't subscribe.) How hard would it be to game Google’s algorithms to give out the answers you prefer? Google search is not a library; it’s a sort-of-library where people bring their stuff to be shelved by a secret classification system and half the books are trying to sell you something. Search engine optimization works the same for selling hate as for selling hats. Apart from its fallibility, in a higher ed setting, do we want to convey that questions have a single answer that's easily found?
How much is the university spending on these things? The devices are inexpensive because you provide a service to the company (a stream of data about your life and a frictionless way to make an impulse purchase), but it’s not just the cost of the devices. Arizona State has several FTEs devoted to programming the fleet of hockey pucks to do things like play the school’s fight song. If you want these devices to do anything that isn’t out-of-the-box, you have to teach them “skills” first. This is how big tech makes so much money with relatively few employees: other people create the content or drive personally-owned cars or whatever it is you're disrupting. You might remember that a few years ago Arizona State decided to fix a budget shortfall by upping the course load for first-year writing instructors from four courses per semester to five without additional pay. Negotiations improved the deal – instructors teaching ten writing courses in a year would earn $40,000. They should be teaching hockey pucks instead of students.
We need to ask “cui bono?” when we look at technological ways to “improve student learning” (which is often another way of saying “to improve our retention stats”) and “personalize learning” (which is often another way of saying “this product will help us trim staff, or cope with the problems we caused when we trimmed staff”). When we outsource the human touch to algorithms to solve institutional problems, we’re also giving our students’ lives and experiences to other people, and we don’t have the right to do that. In this case, I suspect giving out Alexas is a Shiny New Thing, not a trend, but it's emblematic of how little we value students' autonomy when we don't value their privacy.
PS: For librarians thinking about privacy issues in your own workplaces, here are a couple of useful things to read, one a classic, one new: Kyle Jones and Dorothea Salo’s article “Learning Analytics and the Academic Library: Professional Ethics Commitments at a Crossroads” and a just-released report, Library Values & Privacy in Our National Digital Strategies: Field Guides, Convenings, and Conversations.