Whistleblower: Facebook uses algorithms to inflame hatred for profit

Facebook’s algorithms fuel the spread of disinformation, hate speech and ethnic violence for the sake of profit, former product manager Frances Haugen told the U.S. Senate, ABC News reports.

“The leadership of the company knows how to make Facebook and Instagram safer, but does not implement the necessary changes because they prioritise their astronomical profits above people,” Haugen said.

She also said that Facebook’s feed-ranking algorithm drives the spread of disinformation and hate speech. According to leaked documents, the company knew this but took no meaningful action, since high user engagement with such content fuelled the platform’s growth.

“Facebook knows that the engagement rate on Instagram can lead children from very innocuous topics, such as recipes … healthy eating, to content about anorexia in a very short period of time,” Haugen said.

She says she does not want to harm the social network, but merely wishes to make it safer for users.

Shortly after her testimony, Facebook said Haugen had worked at the company for “less than two years, had no direct reports, and never attended decision-making meetings with senior leadership”.

Mark Zuckerberg also issued a statement. He said the idea that Facebook prioritises profit over safety “simply does not reflect reality.”

“The argument that we deliberately promote harmful content for profit is deeply illogical. We make money from advertising […]. And I don’t know of a single technology company that would deliberately create products designed to provoke anger or depression in people,” he wrote.

Before the Senate testimony, Haugen told CBS News in an interview that Facebook’s algorithms pushed misinformation to users. Ahead of the 2020 U.S. presidential election, the social network recognised the risks of misinformation and added safety systems to mitigate them.

“As soon as the elections ended, they turned them off or rolled back the settings to what they were before, to prioritise growth over safety,” she said.

Until May 2021, Haugen worked on Facebook’s civic misinformation team. Before leaving, she secretly copied tens of thousands of pages of internal company research, which she handed to journalists, members of the Senate Committee on Consumer Protection and the Securities and Exchange Commission.

Based on these documents, The Wall Street Journal published a series of articles alleging that the company ignored content that harmed children and teenagers.

On the evening of October 4, users around the world reported outages of Facebook, Instagram and WhatsApp. Services were restored after six hours.

Facebook’s team said that the cause was a change in the configuration of backbone routers, and user data were not affected.

In August, South Korea fined Facebook for illegal collection of biometric data.
