As I read the paper (on paper!) on Sunday, I learned The New York Times has plans to make their online site more personalized. Their current public editor, Liz Spayd, urges the Gray Lady to think about how to avoid creating mini-filter bubbles and suggests that readers be allowed to give feedback – something like a thumbs up or down button – to be “an active part of the process.”
Here’s the thing I realized as I folded back the broadsheet and read that article: I already personalize my reading of the paper. I’m in complete control of whether I read this story or that; whether I put the Travel and Style sections aside while I pore over the national and international news; whether I reach for the Review or page quickly through the business section to see if there’s anything of interest. What makes people think news isn’t already personalized? It always has been. By us, the readers.
Sure, the editor decides which stories appear on A1 above the fold, but that doesn’t mean I have to read what’s in that prime spot (though this past Sunday it was well worth it). When I’m reading on my phone, I can decide which sections to browse and which stories to read all the way through, which to skim or skip. I don’t want someone presuming they can do it better than I can, especially if it requires following me around and reading over my shoulder, taking notes on my reading habits.
Of course, the choices I make are captive to a news editor’s judgment. Somebody has to decide where to put journalistic resources and decide when a story isn’t sufficiently sourced to run or whether there are gaps in coverage that need filling. While newspapers are still printed on paper, choosing what goes where and how many column inches it gets is a judgment call that I don’t get to make. Okay, understood. But please, don’t presume to make the decisions for me about what I should read based on my location and what I’ve read in the past. It’s not only creepy, it’s curtailing my freedom to read what I want, how I want, when I want. I want my own freedom of the press.
The more our information worlds are sculpted for us by algorithms we can’t inspect or control, the more our future is a narrowing of focus determined by what we’ve seen before, the less freedom we have to explore ideas. Exploration will always be shaped by human intervention – people adding metadata to records, for example, or making sure titles and abstracts convey key information, or deciding where to put a book on a library shelf using outdated categories created decades ago. Some of it will be shaped by the platform or by past practice. But I would consider it problematic if I couldn’t find a piece of research because someone else decided it wouldn’t interest me and removed it from my search results. It would bother me if two scholars searching the same database in exactly the same way found different results based on assumptions they didn’t make themselves. I know Google does this, but it’s an advertising business; search is just the lure. News organizations and research databases don’t have to follow its irritating lead.
Back in 1945, when Vannevar Bush imagined a machine for the management of information, he envisioned people creating “trails of association” among texts, like instant footnotes or hyperlinks created by the reader. These paths, he thought, could be shared, and there could be “professional trail blazers” whose work was creating paths for others to follow. His vision was based on mechanizing much of the work of research: making indexing better, making access faster and more ubiquitous, and using a lot of microfilm to make it all more compact. Yet “the creative aspect of thinking” still involved people making connections, not machines.
No doubt, and possibly very soon, artificial intelligence will be smart enough to do this kind of work well. But until the algorithms are a lot smarter than they are now, I don’t want some other person hiding things from me in the name of personalization. I’d rather make my own choices.