Social Media ‘Engagement’ Is Morally Bankrupt

How social media’s business model might be edging out a more important governmental model: democracy

In August, Facebook and Twitter revealed their efforts to shut down more foreign attempts to spread online discord and misinformation. In both cases, Iran was implicated. Facebook said it removed “652 Pages, groups and accounts for coordinated inauthentic behavior that originated in Iran and targeted people across multiple internet services in the Middle East, Latin America, UK and US.”

In one case, Facebook shut down an account called “Liberty Front Press,” which it says was linked to Iranian state media. The accounts associated with Liberty Front Press were set up in 2013 but, by 2017, were increasingly focused on the U.K. and the U.S. Those accounts and pages “posed as news and civil society organizations sharing information in multiple countries without revealing their true identity.”

Facebook reported, “About 155,000 accounts followed at least one of these Pages, 2,300 accounts joined at least one of these groups, and more than 48,000 accounts followed at least one of these Instagram accounts.” The company also found that, since 2015, the accounts had spent more than $6,000 in advertising.

In the fall of 2017, Facebook was finally forced to reveal posts and images created and posted by Russia’s so-called troll factory, the Internet Research Agency. What was most telling was how few of those posts overtly sought to inflame hatred of one political party or another; instead, they focused on fringe ideas that spoke to a larger ideological framework, ideas with which supporters of Donald Trump or Hillary Clinton might also align. Many of them were simply strange.

It was the same this past week. The examples Facebook provided of the latest misinformation campaign, probably one of many more, include mockups of purported post-Brexit stamps and a poster for The Notebook altered to feature Kim Jong Un and Trump in place of Rachel McAdams and Ryan Gosling.

In a post explaining why it took action to ban groups associated with the latest misinformation source on its platform, Facebook said it acted “because we want people to be able to trust the connections they make on Facebook.”

But, as we’ve known for a while, Facebook and Twitter are only partly about connecting with people. The other part of both platforms, the more profitable and addictive part, is what you do once you’re connected: you engage.

For a while now, and increasingly of late, engagement has been a controversial metric for social media. Engagement (that is, the number of times and the rate at which something is seen and shared on a social media platform) is at the heart of how the companies that own these platforms make money.

Engagement is more than simply measuring how many people liked, shared, retweeted, viewed, or commented on content; it’s about ensuring those people become addicted to the content. The continual dopamine hit that social media platforms deliver to their users is critical to their bottom line. The more people are addicted to a platform and using it regularly, the more that platform can get to know its users (what they like, who they know, what they’re saying), and the more it can attract advertisers looking to reach a specific audience.

What keeps people most engaged with, and addicted to, social media is content that makes them emotional; more specifically, the stuff that makes them mad. Which brings us to the current problem: social engagement has run amok, and it is currently being used to tear apart, and render moot, all of the positive social connections the platforms purport to make.

In a recent conversation with Vox, tech philosopher Jaron Lanier, most recently the author of Ten Arguments For Deleting Your Social Media Accounts Right Now, discussed how misinformation spreads and why the result is so toxic. The way social media platforms game the system to ensure maximum engagement is by deploying algorithms that continually look for who else might like the same stuff. He cited, as an example, the Black Lives Matter movement, which was used to target both left- and right-leaning Facebook users in order to undermine faith in American institutions, politics, and society.

People who hated the Black Lives Matter movement, for instance, “were not only identified by the algorithm, but introduced to each other,” Lanier told Vox. “And their annoyance was reinforced, and reinforced, and reinforced, not out of any ideological bent on the part of a company like Facebook, but rather just through the algorithmic seeking of engagement.”

Even people who are trying to do things that are “very positive and attractive and worthwhile,” Lanier explained, end up having their energy “inverted by this machine in the background, into something that’s the opposite; something horrible and destructive of society.”

It’s time to call off engagement. By now it is obvious, and it must be obvious to the tech platforms, that engagement, as a business model, is morally bankrupt. Engagement, driven by algorithms and reliant on human addiction, is setting us on a ruinous path. So far, that path has helped drive deeper wedges between us politically and ideologically. It has prompted people to take to the streets in the name of causes that were all but fabricated.

Two days after Twitter and Facebook announced their proactive measures to cut down on misinformation, details emerged of a new study about what the Internet Research Agency has been up to: a widespread misinformation campaign in the months leading up to the 2016 presidential election, during which time its divisive posts were seen by as many as 10 million people.

The study, headed by David Broniatowski, an assistant professor of engineering at George Washington University, suggests the IRA hasn’t always been focused on politics. Rather, it has also sought to push people to extremes on health matters, specifically vaccines. The researchers reportedly found that Russian-linked accounts were not only 22 times more likely to tweet about vaccines than the average account, but that the majority sought to spread anti-vaccine messages. One account tweeted: “Did you know there was a secret government database of #Vaccine-damaged child? #VaccinateUS”.

What’s engagement worth to us? In his latest book, Lanier gives us ten reasons to quit social media altogether, arguing that only a mass exodus will force positive change or spawn alternative networking tools, ones that are not inherently designed to allow profit, or outsized influence, via manipulation.

Whether or not that is possible, it’s time to see things for what they are. No matter whether the platform is Facebook, Twitter, or YouTube, the gargantuan task of keeping misinformation from spreading is not going to be solved ad hoc. The problem will never be solved so long as engagement remains the driving force behind social platforms. Forget regulating. What’s more urgently required is dismantling.

Seeing the news of the anti-vaccine campaign, Tom Nichols, a professor at the U.S. Naval War College, and author of The Death of Expertise: The Campaign Against Established Knowledge and Why it Matters, tweeted simply: “War”.

Nichols didn’t clarify whether he meant the anti-vaccine misinformation campaign should be considered grounds for launching a war, or whether we are already engaged in one, but perhaps it’s not important.

The important thing is that, even in denouncing the campaign and drawing attention to its destructive and potentially life-threatening fallout, Nichols, along with many, many others, engaged.
