And so it begins.
Facebook is apparently hiring a reporter for something called its “Facebook stories project”. They’re searching for someone with “expertise in research, storytelling, news judgment” and who shows “a deep knowledge and passion for Facebook.”
A few weeks back I discussed the very real possibility that Facebook, rather than merely being a repository for news gathered and written by other outlets, might become one itself. Given the latest, that reality seems closer at hand than perhaps we had suspected.
In some ways, Facebook and other social media sites like Instagram or Snapchat are already well on their way to becoming, if not news providers, at least conveyors of exclusive “newsy” content. As John Herrman pointed out earlier this month, Instagram, Snapchat, and Twitter all had “coverage” of the X Games this winter — including material on each that one couldn’t get on the others.
And, again, advertising likely plays a role here. Via Instant Articles, under the right contract conditions, Facebook already stands to reap the majority of advertising revenue associated with news stories. Those advertising revenues are likely key. Facebook's biggest rival for advertising revenue is Google, which still generates far more, but Facebook, as it learns more and more about its users, is gaining ground.
Interestingly, yesterday’s launch of a more diverse “like” function on Facebook could play a crucial role in expanding that base of knowledge. Before its introduction, Facebook could see what people clicked on, what they liked, and what they shared or posted. Now, as Wired outlines this morning, there is potentially so much more to know. While Facebook hasn’t said exactly how reactions might affect advertisers, “just seeing how you react… to their ads offers greater insight to brands that they can use to inform future campaigns.”
Wired quotes Facebook product manager Sammi Krug as saying that “over time, we hope to learn how the different Reactions should be weighted differently by News Feed to do a better job of showing everyone the stories they most want to see.”
Advertising is a key battleground, and Facebook might be looking to bank on a future where Google doesn’t control everything. Right now, Google is effectively in charge of nearly all advertising online, via its DoubleClick for Publishers and AdX. However, as The Verge noted back in September, a war is brewing between Apple, which holds a large stake in mobile traffic (thanks to the overwhelming success of the iPhone) and Google, which gets most of its revenue from the web.
Crucially: “iOS9 includes a refined search that auto-suggests content and that can search inside apps, pulling content away from Google and users away from the web, it allows users to block ads, and it offers publishers salvation in the form of Apple News, inside of which Apple will happily display (unblockable!) ads, and even sell them on publishers’ behalf for just a 30 percent cut.”
Which might mean two things: 1. Apple may also soon start hiring reporters (perhaps a privacy beat?), and 2. Facebook, knowing people are moving to mobile, is trying to get ahead of that shift and effectively build a news service inside the app we already have, so that searching for stories on your iPhone doesn't drive you to the web, but instead to Facebook. In turn, it can gradually cut into Google's hold on online advertising. Facebook and Apple are not necessarily allies in this version of events, but they happen to share a goal: destroy Google.
This is the fight that’s happening: not necessarily between companies, but between platforms — the web as we once knew it versus the mobile internet of today.
But what does this all mean for news?
Given all the above, we might wonder what reading news on Facebook might be like. One might suspect it will be ad-filled. But what if it's actually the opposite? We might picture a future where searching for news on Google means sitting through pop-up ads or scrolling past them mid-story, whereas Facebook — happy to gather ever richer data on its users so it can put advertisers' content in front of the right people in its News Feed — offers actual reported news without any ads at all, making for a nicer (and faster!) reading experience. Who knows?
As far as news content is concerned, Facebook already has a News Wire, where it collects important or viral stories from around the world, and even has a daily roundup of need-to-know news. Original content would fit into that structure quite easily.
Facebook's job posting says the company is seeking a candidate with "a passion for reporting on cultural digital trends and connectivity, as well as the impact these issues have on individual people," someone who will "know how to tell stories about data and major world events." It's a bit of a kitchen-sink position, but it is likely only the first of many. One suspects, given the trend in place, we'll see an emphasis on exclusive content: breaking stories available only on Facebook, whether hard news or (likely at first) not. No doubt a video news component will arise, as will foreign bureaus. There is no reason to think they won't, or at least can't, exist.
Does it matter?
We could speculate forever about what kind of news Facebook might deliver, but before getting to that, we should first consider how it will deliver it.
This very morning in Ottawa, lawmakers were again discussing the future of Canadian media. They're doing so because it's widely accepted that, for a democracy to work properly, there must be a robust media to act as a check on power — in other words, the health of the press is a matter of public policy. Media outlets have long been controlled by massive corporations or, a bit further back, by titans of industry or just really rich guys, but those outlets are now collapsing. What happens to democracy then? Would having Facebook (or any other large tech company) as a news-generating organization make any difference?
Recent data from the U.S. is suggestive: "about six-in-ten online Millennials (61%) report getting political news on Facebook in a given week, a much larger percentage than turn to any other news source," according to the Pew Research Center. Further, "roughly a quarter (24%) of Millennials who use Facebook say at least half of the posts on the site relate to government and politics." We might assume that a similar, though not identical, result would be found in Canada. Taking that assumption a bit further, on a very surface level (for argument's sake), we might attribute some part of the self-reported rise in youth voter turnout in 2015 to greater exposure to news via Facebook. Maybe! (There is, of course, also the important element of what politicians were saying.)
In any case, let’s assume that at the very least, exposure to news on social media like Facebook plays some role in voter turnout. Does that mean a Facebook news division might be good for democracy?
You can probably already guess.
What's strange about Facebook becoming a news organization is its ability to filter news far more effectively than any newspaper owner ever could. And the important difference is in perception.
Whereas in the past a newspaper might have a reputation for carrying the flag of a particular ideology, Facebook's News Feed is, at least according to one study, trusted as unbiased — so much so that many people aren't even aware it is filtered at all. A study from the University of Illinois found that 62.5 percent of participants did not know an algorithm edited their news feed. "In their opinion, missing a public story was due to their own actions, rather than to those of Facebook," the study noted. In other words, people assumed whatever was posted to Facebook was shown to them without any filtering — that the feed was merely a passive conduit. That's not the case.
Even more alarming might be some data revealed in a lengthy Aeon piece by research psychologist Robert Epstein about Google, that other online news dissemination centre. (It should be noted that Epstein is a long-time vocal Google critic.)
In 2013, researchers, including Epstein, split 102 people in San Diego into three groups, each shown mock internet search results for an Australian election. One group saw results favouring one candidate, another saw results favouring a rival, and the final group saw "a mix of rankings that favoured neither candidate." The results, which he says the researchers went on to replicate with bigger samples, are surprising.
“We predicted that the opinions and voting preferences of 2 or 3 per cent of the people in the two bias groups — the groups in which people were seeing rankings favouring one candidate — would shift toward that candidate. What we actually found was astonishing. The proportion of people favouring the search engine’s top-ranked candidate increased by 48.5 per cent (emphasis his), and all five of our measures shifted toward that candidate. What’s more, 75 per cent of the people in the bias groups seemed to have been completely unaware that they were viewing biased search rankings. In the control group, opinions did not shift significantly.”
Moreover, like the people who didn’t know that Facebook news feed results had been selected for them, Epstein and his research team found that “when people… are looking at biased search rankings, they look just fine. So if right now you Google ‘US presidential candidates’, the search results you see will probably look fairly random, even if they happen to favour one candidate.” This, Epstein says, is worrisome, given Google’s near-total monopoly on internet queries in the U.S.
Google and Facebook differ somewhat in how they present news or search results, but the overall point stands: we may be unaware of the effect algorithms have on our sense of the world around us. Whether it's via Facebook or Google, it seems significant that many of us apparently remain oblivious to potential, subtle manipulation.
Epstein goes even further into the dark future when he notes that not only have tech companies contributed to presidential campaigns in the past, but Google's Eric Schmidt established The Groundwork "for the specific purposes of putting Clinton in office." The implication is clear: Google could become an unseen vote-changer, directly working for a political candidate. Cue the requisite Orwellian nightmares, etc.
The fear that Facebook or Google could shift election results merely by changing the rankings of news stories or search results might still be a bit far-fetched (there are, of course, individual human actions to consider — we may love our computers, but we are not yet entirely controlled by them). But the implications even at a less extreme level are worth considering, especially if those companies aren't just ranking political news stories but acting as news organizations. If there were an editorial bias, would we ever be able to tell?
What now, then?
Yikes! Right? Whatever happened to the nice social networks and internet that let us connect with old friends and watch cat videos? When did everything get so… scary? Let us say for now that a cautious optimism is perhaps the best course of action. It’s good that reporters are getting hired! It’s good that more stories can get told! Right? Right…? Oh, boy.
Updated @ 13:15ET in spots for clarity