Facebook disables ethnicity advert targeting system
Facebook has temporarily turned off a system that let advertisers choose which ethnic and minority groups saw their ads.
It said it would investigate how the feature was being used by advertisers.
News organisation ProPublica discovered that the system could be abused by posting discriminatory ads on the social network.
Facebook said it would look for a way to change the system so it could not be used "inappropriately".
Legal action
Last year, ProPublica first discovered that ethnic discrimination via advertising was possible on the platform.
US laws prohibit discrimination of the kind ProPublica demonstrated - in adverts relating to housing, for example. Despite this, the news organisation found it was possible to post discriminatory ads that were not shown to people who were:
- African-American
- Jewish
- Hispanic
- interested in Islam
- part of other ethnic or minority groups
All the ads it submitted were approved.
Facebook does not explicitly ask its users to declare their ethnicity, but it typically infers someone's ethnic group from their activity on the social network.
When the targeting was first uncovered, Facebook said it would find a way to spot and block attempts to post discriminatory ads.
Facebook's failure to do this raised questions about "its ability and commitment to police discriminatory advertising", said ProPublica.
On Thursday, Facebook chief operating officer Sheryl Sandberg said the company had now turned off the tools that let advertisers choose which "multicultural affinity segments" they wanted to reach.
Ms Sandberg said Facebook would also look into how these tools were used, especially in respect of "potentially sensitive segments" such as those covering people with disabilities.
But she also defended ads that were targeted on the basis of ethnicity or culture - saying the practice was common and legitimate in the industry.
In an earlier statement, Facebook said the ads placed by ProPublica had been approved because of a "technical failure" in its enforcement system.
"We're disappointed that we fell short of our commitments," Ami Vora, vice-president of product management, told the news organisation.
Ms Vora said the discrimination-spotting system Facebook had created after ProPublica's first investigation had managed to spot millions of other ads that had broken its guidelines.
"Our systems continue to improve, but we can do better," she said.