
Facebook editorializing

[Image: FacebookEditorializingThuneNYT.png]

What most people don't realize is that not everything they like or share necessarily gets a prominent place in their friends' newsfeeds: The Facebook algorithm sends it to those it determines will find it most engaging.
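The distribution mechanism described above can be sketched as a toy ranking function. This is purely illustrative: Facebook's actual model is unpublished, and the names here (`rank_feed`, `predicted_engagement`) are hypothetical, standing in for whatever learned engagement predictor a real feed system would use.

```python
def rank_feed(posts, predicted_engagement, viewer, feed_size=2):
    """Return the posts most likely to engage `viewer`, best first.

    `predicted_engagement` maps (viewer, post_id) pairs to a score;
    in a real system this score would come from a learned model, not
    a hand-built table.
    """
    scored = sorted(
        posts,
        key=lambda post: predicted_engagement.get((viewer, post["id"]), 0.0),
        reverse=True,
    )
    # Only the top-scoring posts reach this viewer's feed; everything
    # a friend shared below the cutoff is simply never shown.
    return scored[:feed_size]


posts = [{"id": "news"}, {"id": "meme"}, {"id": "update"}]
scores = {
    ("alice", "news"): 0.2,
    ("alice", "meme"): 0.9,
    ("alice", "update"): 0.5,
}
print([p["id"] for p in rank_feed(posts, scores, "alice")])  # -> ['meme', 'update']
```

The point the column is making falls out of the sketch: the "news" post exists and was shared, but because the scoring function ranks it last, Alice never sees it, and a small change to that scoring function changes which publishers get traffic.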

For outlets like The Daily Caller, The Huffington Post, The Washington Post or The New York Times -- for whom Facebook's audience is vital to growth -- any algorithmic change can affect how many people see their journalism.

This gives Facebook enormous influence over how newsrooms, almost universally eager for Facebook exposure, make decisions and money. Alan Rusbridger, a former editor of The Guardian, called this a "profound and alarming" development in a column in The New Statesman last week.

For all that sway, Facebook declines to talk in great detail about its algorithms, noting that it does not want to make it easy to game its system. That system, don't forget, is devised to keep people on Facebook by giving them what they want, not necessarily what the politicos or news organizations may want them to see. There can be a mismatch in priorities.

But Facebook's opacity can leave big slippery-slope questions to linger. For instance, if Facebook can tweak its algorithm to reduce click bait, then, "Can they put a campaign out of business?" asked John Cook, the executive editor of Gawker Media. (Gawker owns Gizmodo, the site that broke the Trending story.)

Throughout the media, a regular guessing game takes place in which editors seek to divine how the Facebook formula may have changed, and what it might mean for them. Facebook will often give general guidance, such as announcing last month that it had adjusted its programming to favor news articles that readers engage with deeply -- rather than shallow quick hits -- or saying that it would give priority to Facebook Live videos while they are broadcasting, a format it is also paying media companies, including The New York Times, to experiment with.


A cautionary tale came in 2014. The news site Upworthy was successfully surfing the Facebook formula with click bait headlines that won many eyeballs. Then a change in the Facebook algorithm punished click bait, which tends to overpromise on what it links to. Steep traffic drops followed. (Upworthy has since recovered, in part by relying more on video.)

As his staff prepared answers to pointed questions from Senator John Thune of South Dakota, Mr. Zuckerberg took another step into the sunshine last week by holding a grievance session at Facebook's campus with conservative commentators and media executives, including the Fox host Dana Perino, the Daily Caller editor Tucker Carlson and the Blaze founder and commentator Glenn Beck, who wrote a defense of Facebook afterward.

Many of Mr. Zuckerberg's visitors seemed at least temporarily placated by his explanation: that Facebook had so far found no systemic attempt to excise conservative thought from the Trending list, and that any such move would harm Facebook's primary imperative (which is, in lay terms, to get every single person on earth to spend every waking moment on Facebook and monetize the living expletive out of it).

But a more important issue emerged during the meeting, one that has been lying beneath the surface for a while now: the power of the algorithms that determine what goes into individual Facebook pages.

"What they have is a disproportionate amount of power, and that's the real story," Mr. Carlson told me. "It's just concentrated in a way you've never seen before in media."
