When downloading an app or approving a software update, I now usually hesitate for a moment to consider something the comedian John Oliver said early this summer: a software company could include the entire text of Mein Kampf in the user agreement and people would still click the “agree” button.
“Hesitates” is the wrong word for something that happens in a fraction of a second. It’s not as if I ever scrolled back through to make sure that, say, Microsoft is not declaring that it owns the copyright to everything written in OneNote or Word. The fine print goes on for miles, and anyway, a user agreement is typically an all-or-nothing proposition. Clicking “agree” is less a matter of trust than of resignation.
But then, that’s true about far more of life in the contemporary digital surround than most of us would ever want to consider. Every time you buy something online, place a cell phone call, send or receive a text message or email, or use a search engine (to make the list no longer, nor more embarrassing, than that), it is with a likelihood, verging on certainty, that the activity has been logged somewhere -- with varying degrees of detail and in ways that might render the information traceable directly back to you. The motives for gathering this data are diverse; so are the companies and agencies making use of it. An online bookseller tracks sales of The Anarchist Cookbook in order to remind customers that they might also want a copy of The Minimanual of the Urban Guerrilla, while the National Security Agency will presumably track the purchase with an eye to making correlations of a different sort.
At some level we all know such things are happening, probably without thinking about it any more often than strictly necessary. Harder to grasp is the sheer quantity and variety of the data we generate throughout the day -- much of it trivial, but providing, in aggregate, an unusually detailed map of what we do, who we know and what’s on our minds. Some sites and applications have “privacy settings,” of course, which affect the totality of the digital environment about as much as a thermostat does the weather.
To be a full-fledged participant in 21st-century society means existing perpetually in a state of information asymmetry, in the sense described by Finn Brunton and Helen Nissenbaum in Obfuscation: A User’s Guide for Privacy and Protest (MIT Press). You don’t have to like it, but you do have to live with it. The authors (who teach media culture and communications at New York University, where they are assistant professor and professor, respectively) use the term “obfuscation” to identify various means of leveling the playing field, but first it’s necessary to get a handle on information asymmetry itself.
For one thing, it is distinct from the economic concept of asymmetrical information. The latter applies to “a situation in which one party in a transaction has more or superior information compared to another.” (So I find it explained on a number of websites ranging from the scholarly to the very sketchy indeed.) The informed party has an advantage, however temporary; the best the uninformed can do is to end up poorer but wiser.
By contrast, what Brunton and Nissenbaum call information asymmetry is something much more entrenched, persistent and particular to life in the era of Big Data. It occurs, they explain, “when data about us are collected in circumstances we may not understand, for purposes we may not understand, and are used in ways we may not understand.” It has an economic aspect, but the implications of information asymmetry are much broader.
“Our data will be shared, bought, sold, analyzed and applied, all of which will have consequences for our lives,” the authors write. “Will you get a loan, or an apartment, for which you applied? How much of an insurance risk or a credit risk are you? What guides the advertising you receive? How do so many companies and services know that you’re pregnant, or struggling with an addiction, or planning to change jobs? Why do different cohorts, different populations and different neighborhoods receive different allocations of resources? Are you going to be, as the sinister phrase of our current moment of data-driven antiterrorism has it, ‘on a list’?”
Furthermore (and here Brunton and Nissenbaum’s calm, sober manner can just barely keep things from looking like one of Philip K. Dick’s dystopian novels), we have no way to anticipate the possible future uses of the galaxies of personal data accumulating by the terabyte per millisecond. The recent series Mr. Robot imagined a hacker revolution in which all the information related to personal debt was encrypted so thoroughly that no creditor would ever have access to it again. Short of that happening, obfuscation may be the most practical response to an asymmetry that’s only bound to deepen with time.
A more appealing word for it will probably catch on at some point, but for now “obfuscation” names a range of techniques and principles created to make personal data harder to collect, less revealing and more difficult to analyze. The crudest forms involve deception -- providing false information when signing up with a social media site, for example. A more involved and prank-like approach would be to generate a flood of “personal” information, some of it true and some of it expressing one’s sense of humor, as with the guy who loaded up his Facebook profile with so many jobs, marriages, relocations, interests and so on that the interest-targeting algorithms must have had nervous breakdowns.
There are programs that will click through on every advertisement that appears as you browse a site (without, of course, bothering you with the details) or enter search engine terms on topics that you have no interest in, thereby clouding your real searches in a fog of irrelevancies.
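The decoy-query idea is simple enough to sketch in a few lines. The snippet below is a minimal illustration of the principle, not any of the actual tools Brunton and Nissenbaum discuss: it hides one genuine search query in a shuffled batch of randomly chosen decoys, so that an observer logging the batch cannot tell which query reflects real interest. (The decoy list and function names here are invented for the example; a real obfuscation tool would draw decoys from a large, continually refreshed corpus and actually issue the queries.)

```python
import random

# A small pool of innocuous decoy topics; a real tool would use
# a much larger, continually refreshed corpus of plausible queries.
DECOY_TOPICS = [
    "weather forecast", "banana bread recipe", "local movie times",
    "how to tie a bowline", "history of the metric system",
    "best hiking trails", "used car prices", "spanish verb conjugation",
]

def obfuscated_batch(real_query, n_decoys=5, rng=None):
    """Return a shuffled list containing the real query hidden
    among n_decoys randomly chosen decoy queries."""
    rng = rng or random.Random()
    batch = rng.sample(DECOY_TOPICS, n_decoys) + [real_query]
    rng.shuffle(batch)
    return batch

if __name__ == "__main__":
    # The real query is indistinguishable, to a logger, from the noise.
    for query in obfuscated_batch("divorce lawyer reviews"):
        print(query)
```

The point, as the book stresses, is not secrecy but plausible deniability: the genuine query is still logged, just buried in noise that degrades the value of the record.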
The cumulative effect would be to pollute the data enough to make tracking and scrutiny more difficult, if not impossible. Obfuscation raises a host of ethical and political issues (in fact the authors devote most of their book to encouraging potential obfuscators to think about them) as well as any number of questions about how effective the strategy might be. We’ll come back to this stimulating and possibly disruptive little volume in weeks to come, since the issues it engages appear in other new and recent titles. In the meantime, here is a link to an earlier column on a book by one of the co-authors that still strikes me as very interesting and, alas, all too pertinent.