Facebook, from aggregator to news editor
Facebook doesn't seem to recognize its own power: it doesn't think of itself as a news organization, with a news organization's well-developed sense of institutional ethics and responsibility, or even an awareness of its own potential for bias. Neither does its audience, which might believe that Facebook is immune to bias because it is run by computers.
That myth should die. It's true that beyond the Trending box, most of the stories Facebook presents to you are selected by its algorithms, but those algorithms are as infused with bias as any other human editorial decision.
"Algorithms equal editors," said Robyn Caplan, a research analyst at Data & Society, a research group that studies digital communications systems. "With Facebook, humans are never not involved. Humans are in every step of the process -- in terms of what we're clicking on, who's shifting the algorithms behind the scenes, what kind of user testing is being done, and the initial training data provided by humans."
Everything you see on Facebook is therefore the product of these people's expertise and considered judgment, as well as their conscious and unconscious biases, to say nothing of possible malfeasance or corruption. It's often hard to know which, because Facebook's editorial sensibilities are secret. So are its personnel: most of the engineers, designers and others who decide what people see on Facebook will remain forever unknown to its audience.