Facebook Whistleblower Exposes Company’s Problematic Business Model
Former Exec Alleges the Company Prioritizes Profits Over Users’ Safety.
November 4, 2021
On Sunday, October 3, a former Facebook employee came forward in a 60 Minutes interview with shocking revelations about the company’s decision-making.
Former employee Frances Haugen copied tens of thousands of pages of internal Facebook research, revealing the motives behind the company’s actions. According to that research, the social media giant prioritizes profits over public safety.
Haugen alleged that Facebook’s algorithm feeds users content related to topics they have previously engaged with. Topics prone to misinformation often provoke strong emotional reactions, leading users to spend more time on Facebook and consume still more content. The leaked research reveals that this creates a cycle in which misinformation keeps spreading while Facebook reaps higher engagement from its users.
According to an internal Facebook study from this year, the company takes action against “… as little as 3-5% of hate…despite being the best in the world at it.” Haugen explained that social media platforms are places where hate spreads more quickly because people are less afraid to say things from behind a screen than they would be face to face. She alleged that Facebook exercises limited control over this content because it drives engagement, even though the company could curb hate far more thoroughly and efficiently.
Haugen went on to explain that after the 2020 presidential election, Facebook dissolved the Civic Integrity team it had put in place to stop misinformation from spreading. After these precautions were rolled back, Facebook was used to plan the January 6th insurrection. In exchange for increased engagement, the company had removed the very safeguards designed to prevent such events from occurring.
Haugen joined Facebook in 2019 after fifteen years working at companies such as Google and Pinterest. She said on 60 Minutes, “I’ve seen a bunch of social networks, and it was substantially worse at Facebook than anything I had seen before.”
When asked why she decided to leave the company and come forward, Haugen said, “I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves to the ground.” Despite those internal efforts, Facebook’s practices remained unchanged.
Alongside the public backlash, Facebook could be facing legal trouble as well. In the 60 Minutes interview, one of Haugen’s lawyers said that “Facebook is required to not lie to its investors, or even withhold material information.” Last March, CEO Mark Zuckerberg testified to Congress that Facebook’s system was “the best approach we’ve found to address misinformation…” Yet Facebook has made decisions that could negatively impact its investors, who were not given all of the information.
It is possible that impending legal issues with its investors could push Facebook to amend its algorithms and take action against more hate across its platform. If the company hopes to repair its public image, it may begin to prioritize the safety of its users over profits alone.