You — and You and You — and the Algorithm

In explaining your personal Facebook feed, Nick Clegg forgets about everyone else

“Although Facebook’s critics often talk about sensational content dominating News Feed… many of the most popular posts on News Feed are lighthearted. They’re feel-good stories. We want to show people that the overwhelming majority of the posts people see on News Feed are about pets, babies, vacations, and similar. Not incendiary topics,” Nick Clegg, Facebook’s head of global affairs, told Casey Newton last week, as they discussed his recent lengthy defence of Facebook’s algorithm and its role in social and political polarization. “In fact,” Clegg went on, “I think on Monday, one [of the] most popular posts in the US was a mother bear with three or four baby cubs crossing a road.”

The bulk of Clegg’s post outlines a simplified perspective on Facebook’s News Feed ranking technology — why you see the stuff you see when you sign into Facebook — placing, as Facebook frequently does, much of the onus on the user. “Thousands of signals are assessed for these posts, like who posted it, when, whether it’s a photo, video or link, how popular it is on the platform, or the type of device you are using,” Clegg writes. “From there, the algorithm uses these signals to predict how likely it is to be relevant and meaningful to you: for example, how likely you might be to ‘like’ it or find that viewing it was worth your time.”
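To make the quoted description a bit more concrete, here is a heavily simplified, hypothetical sketch of that kind of signal-based ranking. The signals, weights, and scoring function below are invented for illustration; Facebook’s actual models are machine-learned and vastly more complex.

```python
# A hypothetical, heavily simplified sketch of the kind of ranking Clegg
# describes: per-post signals feed predictions ("how likely you are to
# 'like' it"), which are combined into a single relevance score used to
# order the Feed. Signals, weights, and example posts are invented.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    kind: str            # "photo", "video", or "link"
    popularity: float    # engagement on the platform, normalized 0..1
    p_like: float        # predicted probability you will "like" it
    p_dwell: float       # predicted probability you find it worth your time

def relevance(post: Post) -> float:
    """Combine predicted engagement into one score; the weights are made up."""
    return 0.6 * post.p_like + 0.4 * post.p_dwell + 0.1 * post.popularity

feed = [
    Post("friend_a", "photo", 0.2, 0.7, 0.6),
    Post("page_b", "video", 0.9, 0.3, 0.8),
]
# The Feed is simply the eligible posts, sorted by their predicted relevance.
feed.sort(key=relevance, reverse=True)
```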

It all sounds a bit like the cartoon DNA strand in Jurassic Park, explaining the process of creating a monster and deliberately leaving out some of the crucial details about how it’s actually made before setting it upon a group of unsuspecting humans.

For instance, Clegg skips briskly past the genesis details. He asks that we accept there are thousands of signals, but doesn’t fully explain how they’re generated in the first place. And while Clegg is intent on focusing on the individual user’s Feed, he conveniently neglects to mention that much of the information feeding the algorithm, deciding what you might be interested in or find meaningful, is generated by people other than you and your friends.

Facebook, like other web giants such as Google, uses a process called collaborative filtering to determine what content might interest users, beyond what they might immediately like, comment on, or search for. Collaborative filtering is a “recommender systems technique that helps people discover items that are most relevant to them… this might include pages, groups, events, games, and more,” Facebook’s engineering team wrote in a blog post in 2015. Collaborative filtering “is based on the idea that the best recommendations come from people who have similar tastes.” In other words, you may be shown content that interests you based on choices someone else — whose profile is a bit like yours — made.
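To illustrate the idea, here is a minimal, hypothetical sketch of user-based collaborative filtering in Python. It is not Facebook’s implementation; the users, items, and scores are invented, and production systems operate at a vastly larger scale with far richer signals. But it shows the core point: an item can be recommended to you purely because someone with a similar interaction history engaged with it.

```python
# A minimal sketch of user-based collaborative filtering, not Facebook's
# production system. Users, items, and scores are invented for illustration.
from math import sqrt

# Rows: users; columns: items (e.g. Pages or posts); values: interaction strength.
interactions = {
    "alice": {"bears": 5, "gardening": 3, "local_news": 1},
    "bob":   {"bears": 4, "gardening": 4, "militia_group": 2},
    "carol": {"local_news": 5, "politics": 4},
}

def cosine(u: dict, v: dict) -> float:
    """Cosine similarity between two sparse interaction vectors."""
    shared = set(u) & set(v)
    dot = sum(u[i] * v[i] for i in shared)
    norm_u = sqrt(sum(x * x for x in u.values()))
    norm_v = sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def recommend(target: str, k: int = 3) -> list[str]:
    """Score items the target hasn't interacted with, weighted by how
    similar each other user is to the target (the "people like you" idea)."""
    seen = interactions[target]
    scores: dict[str, float] = {}
    for other, items in interactions.items():
        if other == target:
            continue
        sim = cosine(seen, items)
        for item, rating in items.items():
            if item not in seen:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)[:k]

if __name__ == "__main__":
    print(recommend("alice"))
```

Run on this toy data, the recommendation for Alice is “militia_group”: not because she asked for it, but because Bob, whose interaction history overlaps with hers, engaged with it.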

That influence is missing even from the diagram Clegg includes in his post, in which the “inventory” the ranking algorithms use to generate content for the News Feed is: “posts from Friends, Pages and Groups that you are eligible to see, minus posts that are removed under our Community Standards.” Other people, Facebook’s billion-or-so other users beyond those who are either your Friends or part of the same Groups you’ve joined, aren’t mentioned anywhere.

Instead of offering users less algorithmic influence on their Feed — which, presumably, might limit the impact of collaborative filtering and indeed only show users posts from their friends or Pages they deliberately choose to follow — Clegg offers more of it. He introduces a new product called Favorites, which will let users “train” the algorithm by allowing you “to see the top friends and Pages that Facebook predicts are the most meaningful to you — and, importantly, you can adopt those suggestions or simply add other friends and Pages if you want,” Clegg writes.

But while Clegg argues that “Facebook’s systems are not designed to reward provocative content” and that the platform actively “reduces the distribution” of that content by demoting things like sensational health claims or spam, he doesn’t explain what might happen if someone were to train their algorithm using the new tools Clegg unveils. Would the algorithm still actively demote content someone deems personally meaningful, even if it’s sensational (or worse)? If so, on what grounds? What’s ultimately more important, the personalized experience or Facebook’s community standards? Recent history shows the former usually wins out.

Left unsaid, also, is whether or how those selections — and similar selections made by other users — ultimately impact everyone else. It’s difficult to believe that Facebook would create even greater user personalization without seeing the value in the resulting data. This option will now allow Facebook to determine (at least) what kinds of Pages are deemed meaningful and important to certain kinds of people — and that’s saying nothing of the friend ranking. This is Facebook getting you to do its work for it, laboring to train an algorithm for Facebook’s ultimate financial gain, while you — and everyone else — get… what, again?


After all, isn’t it everyone else we should ultimately worry about — not just how good our algorithm is, but how the choices we make to perfect it will impact other people?

Clegg goes to great lengths to dispel the idea that so-called filter bubbles have the power some claim, particularly in dividing society. Citing studies from Stanford, Harvard, and Pew, Clegg makes a decent case to support the idea that social media filters don’t work quite the way we once thought they might. In short, Clegg argues, they don’t tend to keep content — stuff we might find disagreeable or simply from a different perspective — away from us. And fair enough. But maybe that’s not actually the problem. The problem with social platform filters isn’t what they keep out, it’s what they let in.

Which is precisely why Clegg’s anecdote regarding the most popular post on Facebook being about a bear and its cubs crossing a road is pretty disingenuous — or at least erroneous. As Clegg surely knows, the problem with Facebook is not its most popular post of the day; the problem is much deeper, much more central to its business model than that.

The problem is Facebook cannot stop recommending things, period — whether that’s cute videos of bears or militia groups — and that it operates on a foundational assumption, built into not only its algorithms but its ethos of community-building and connection, that the best recommendations are from people who share common features. This is why it’s been so successful, but also why it will never escape its tendency to spread things like misinformation or hate. What people love about Facebook is that it always gives them what they want. The problem is that it’s designed to give it to everyone else, too.
