Billions of people around the world use Facebook and its other apps, seeing everything from cute baby pictures to vaccine misinformation, and everything in between. What appears in their feeds is surfaced with the help of the company's algorithms.
Facebook whistleblower Frances Haugen has provided hours of testimony and pages of documents about the impact of Facebook and its algorithms on teens, democracy, and society at large. The fallout has led some to wonder how much Facebook, and possibly other platforms, can or should reconsider their use of algorithms to determine which pictures, videos, and news users see.
Haugen, a former Facebook product manager with a background in “algorithmic product management,” takes particular issue with the company’s algorithm, which is designed to show users the content they’re most likely to engage with. She considers herself partly responsible for the problems that design fuels, including polarization, misinformation, and other toxic content.
Facebook understands that if it makes the algorithm safer, “people will spend less time on the site, they’ll click on fewer ads, they’ll make less money,” she said during her 60 Minutes appearance. (Facebook CEO Mark Zuckerberg has pushed back on the idea that the company prioritizes profit over users’ safety and well-being.)
After Haugen’s Senate hearing on Tuesday, Facebook’s head of global policy management, Monika Bickert, told CNN that it’s “not true” that the company’s algorithms are designed to promote inflammatory content, and that the company actually does “the opposite” by demoting so-called click-bait.
At times during her testimony, Haugen appeared to suggest a radical rethinking of how the news feed should operate in order to address the issues she documented so extensively from inside the company. “I’m a strong proponent of chronological ranking, ordering by time,” she told a Senate subcommittee last week, “because I don’t believe we should let computers decide what we should focus on.”
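The distinction Haugen draws can be illustrated with a minimal sketch. The code below is purely hypothetical (the post data, field names, and scores are invented for illustration and do not reflect Facebook's actual systems): an engagement-based feed sorts posts by a model's predicted engagement score, while a chronological feed simply orders them by timestamp.

```python
# Hypothetical posts: each has a timestamp and a predicted engagement score.
# None of this data or naming reflects any real platform's internals.
posts = [
    {"id": 1, "timestamp": 300, "predicted_engagement": 0.2},
    {"id": 2, "timestamp": 100, "predicted_engagement": 0.9},
    {"id": 3, "timestamp": 200, "predicted_engagement": 0.5},
]

# Engagement-based ranking: a model's score decides what users see first.
engagement_feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

# Chronological ranking: newest first, with no model in the loop.
chronological_feed = sorted(posts, key=lambda p: p["timestamp"], reverse=True)

print([p["id"] for p in engagement_feed])     # → [2, 3, 1]
print([p["id"] for p in chronological_feed])  # → [1, 3, 2]
```

Note how the two orderings diverge: the engagement feed promotes the post the model scores highest regardless of when it was published, which is precisely the behavior Haugen argues should be replaced by simple time ordering.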