
Facebook: data-mined objective truth, unmolested by subjective attitudes

Facebook has also acquired a more subtle power to shape the wider news business. Across the industry, reporters, editors and media executives now look to Facebook the same way nesting baby chicks look to their engorged mother -- as the source of all knowledge and nourishment, the model for how to behave in this scary new-media world. Case in point: The New York Times, among others, recently began an initiative to broadcast live video. Why do you suppose that might be? Yup, the F word. The deal includes payments from Facebook to news outlets, including The Times.

Yet few Americans think of Facebook as a powerful media organization, one that can alter events in the real world. When blowhards rant about the mainstream media, they do not usually mean Facebook, the mainstreamiest of all social networks. That's because Facebook operates under a veneer of empiricism. Many people believe that what you see on Facebook represents some kind of data-mined objective truth unmolested by the subjective attitudes of fair-and-balanced human beings.

Even if you believe that Facebook isn't monkeying with the trending list or actively trying to swing the vote, the reports serve as a timely reminder of the growing dangers of Facebook's hold on the news. They also drew the attention of Senator John Thune, the South Dakota Republican who heads the Senate Commerce Committee, who sent a letter on Tuesday asking Mr. Zuckerberg to explain how Facebook polices bias.

The question isn't whether Facebook has outsize power to shape the world -- of course it does, and of course you should worry about that power. If it wanted to, Facebook could try to sway elections, favor certain policies, or just make you feel a certain way about the world, as it once proved it could do in an experiment devised to measure how emotions spread online.

There is no evidence Facebook is doing anything so alarming now. The danger is nevertheless real. The biggest worry is that Facebook doesn't seem to recognize its own power, and doesn't think of itself as a news organization with a well-developed sense of institutional ethics and responsibility, or even a potential for bias. Neither does its audience, which might believe that Facebook is immune to bias because it is run by computers.

That myth should die. It's true that beyond the Trending box, most of the stories Facebook presents to you are selected by its algorithms, but those algorithms are as infused with bias as any other human editorial decision.

"Algorithms equal editors," said Robyn Caplan, a research analyst at Data & Society, a research group that studies digital communications systems. "With Facebook, humans are never not involved. Humans are in every step of the process -- in terms of what we're clicking on, who's shifting the algorithms behind the scenes, what kind of user testing is being done, and the initial training data provided by humans."

Everything you see on Facebook is therefore the product of these people's expertise and considered judgment, as well as their conscious and unconscious biases, quite apart from any possible malfeasance or corruption. It's often hard to know which, because Facebook's editorial sensibilities are secret. So are its personalities: Most of the engineers, designers and others who decide what people see on Facebook will remain forever unknown to its audience.
