In February 2017, the Congressional Black Caucus raised concerns that Facebook's ad-targeting tools allowed advertisers to exclude specific ethnic groups from seeing their ads. Later that month, Facebook responded to these concerns with a new algorithm intended to identify housing, credit, or employment advertisements that discriminated on the basis of ethnicity and to reject those ads.
In November 2017, ProPublica tested Facebook's new algorithm by intentionally purchasing ads that discriminated on the basis of race and sex to see whether Facebook would detect them. As Julia Angwin, Ariana Tobin, and Madeleine Varner reported, Facebook failed to reject or take down any of the discriminatory ads. The algorithm did not flag the test advertisements, allowing ProPublica to tailor each ad to an audience that excluded protected groups.
In response, Facebook once again discontinued the option for advertisers to exclude specific ethnic groups from their ads' audiences while it reviews its ad-approval system. “Until we can better ensure that our tools will not be used inappropriately, we are disabling the option that permits advertisers to exclude multicultural affinity segments from the audiences for their ads,” Facebook COO Sheryl Sandberg wrote.
As Angwin reported in a subsequent article, these findings indicate that Facebook continues to violate the Fair Housing Act of 1968, which made it illegal to “make, print, publish, or cause to be made, printed or published any notice, statement, or advertisement, with respect to the sale or rental of a dwelling that indicates any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, or national origin.”
This story, along with other Facebook-centered news, has received some coverage in publications such as USA Today and the New York Times. A November 2016 New York Times article covered Facebook's initial response to the criticism that it was violating anti-discrimination laws by allowing marketers to post ads that targeted Facebook users by ethnicity. The Times reported, “This is hardly the first time Facebook has received intense scrutiny over its ad-targeting practices,” noting a 2013 settlement of $20 million in a class-action lawsuit against the company for sharing data with advertisers about users’ “likes” without asking permission. In a November 2017 article, USA Today reported that “Washington scrutiny of Facebook has intensified in recent months after hundreds of fake accounts from a Kremlin-linked organization in Russia injected inflammatory ads on politically divisive issues from race to religion into unsuspecting Facebook users’ news feeds in the tense political climate surrounding the U.S. election.”
Julia Angwin, Ariana Tobin and Madeleine Varner, “Facebook (Still) Letting Housing Advertisers Exclude Users by Race,” ProPublica, November 21, 2017, https://www.propublica.org/article/facebook-advertising-discrimination-housing-race-sex-national-origin.
Julia Angwin, “Facebook to Temporarily Block Advertisers from Excluding Audiences by Race,” ProPublica, November 27, 2017, https://www.propublica.org/article/facebook-to-temporarily-block-advertisers-from-excluding-audiences-by-race.
Student Researcher: Grace Phillippe (North Central College)
Faculty Evaluator: Steve Macek (North Central College)
Editor’s Note: For previous Project Censored coverage of concerns about Facebook’s advertising practices, see “Facebook Buys Sensitive User Data to Offer Marketers Targeted Advertising,” story #23 in Censored 2018: https://www.projectcensored.org/23-facebook-buys-sensitive-user-data-offer-marketers-targeted-advertising/.