
Facebook makes the news

According to a statement from Tom Stocky, who is in charge of the trending topics list, Facebook has policies "for the review team to ensure consistency and neutrality" of the items that appear in the trending list.

But Facebook declined to discuss whether any editorial guidelines governed its algorithms, including the system that determines what people see in News Feed. Those algorithms could have profound implications for society. For instance, one persistent worry about algorithmically selected news is that it might reinforce people's previously held points of view. If News Feed shows news that we're each likely to Like, it could trap us in echo chambers and contribute to rising political polarization. In a study last year, Facebook's scientists asserted that the echo chamber effect was muted.

But when Facebook changes its algorithm -- which it does routinely -- does it have guidelines to make sure the changes aren't furthering an echo chamber? Or that the changes aren't inadvertently favoring one candidate or ideology over another? In other words, are Facebook's engineering decisions subject to ethical review? Nobody knows.

The other reason to be wary of Facebook's bias has to do with sheer size. Ms. Caplan notes that when studying bias in traditional media, scholars try to make comparisons across different news outlets. To determine if The Times is ignoring a certain story unfairly, look at competitors like The Washington Post and The Wall Street Journal. If those outlets are covering a story and The Times isn't, there could be something amiss about The Times's news judgment.

Such comparative studies are nearly impossible for Facebook. Facebook is personalized, in that what you see on your News Feed is different from what I see on mine, so the only entity in a position to look for systemic bias across all of Facebook is Facebook itself. Even if you could determine the spread of stories across all of Facebook's readers, what would you compare it to?

"Facebook has achieved saturation," Ms. Caplan said. No other social network is as large, popular, or used in the same way, so there's really no good rival for comparing Facebook's algorithmic output in order to look for bias.

What we're left with is a very powerful black box. In a 2010 study, Facebook's data scientists proved that simply by showing some users that their friends had voted, Facebook could encourage people to go to the polls. That study was randomized -- Facebook wasn't selectively showing messages to supporters of a particular candidate.
