Facebook’s advertising algorithms use race, gender and age recognition to target the delivery of sponsored messages, according to a study from the Khoury College of Computer Sciences at Northeastern University.
The researchers found that the same ad, paired with different images, is delivered to different audiences.
“If you use an image of a Black person in an advertisement, the likelihood of it being shown to Black users increases significantly,” said Alan Mislove, one of the study’s authors.
He added that an ad featuring a woman is more likely to appear in the feeds of a female audience. Meanwhile, an advertisement featuring a young woman is likely to be targeted at older men.
According to Mislove, this distribution of messages occurs regardless of the ad-serving settings. The algorithm decides which subgroup of people to show the ad to by assessing relevance and prior interaction history, he added.
“The system must understand: ‘What can I do to make someone click on an ad?’ In this case, race and gender can predict the probability of a click, so the algorithm uses them,” says Mislove.
According to co-author Piotr Sapiezynski, the algorithm itself is indifferent to ethnicity, gender and age; advertisers, however, could exploit its behavior to boost campaign effectiveness.
The researchers say the problem stems partly from a lack of transparency in the algorithms. The team spent tens of thousands of dollars and hundreds of hours running test ads to determine how the system operates — resources the average user of ad-promotion tools does not have, the scientists note.
The researchers also said that anti-discrimination laws in advertising should be extended to digital ads.
“We need to make clearer how the algorithms work, and then give advertisers the option to say: ‘I don’t want the ad delivery system to potentially violate civil rights law,’” Mislove said.
Earlier, Meta was accused of using a biased algorithm for housing ads. The company said it disabled the algorithm as part of a settlement with the U.S. Department of Justice.
Earlier in September, Facebook algorithms were accused of ‘abetting’ genocide of the Rohingya in Myanmar.
In August, reports emerged that Meta had used AI to dismiss 60 moderators of the social network.
In October 2021, Facebook algorithms were accused of spreading misinformation, inciting hatred and ethnic violence for profit.
