The US Department of Justice announced today that it has entered into an agreement with Meta, Facebook’s parent company, to resolve a lawsuit alleging that Meta engaged in discriminatory advertising in violation of the Fair Housing Act (FHA). The proposed settlement must be reviewed and approved by a judge in the Southern District of New York, where the lawsuit was originally filed. Assuming it moves forward, Meta has agreed to develop a new housing ad delivery system and pay a penalty of about $115,000, the maximum allowed under the FHA.
“When a company develops and deploys technology that deprives users of housing opportunities based in whole or in part on protected characteristics, it has violated the Fair Housing Act, just as when companies engage in discriminatory advertising using more traditional methods,” US Attorney Damian Williams said in a statement. “Because of this groundbreaking lawsuit, Meta will — for the first time — change its ad delivery system to address algorithmic discrimination. But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”
The lawsuit was the Justice Department’s first to challenge algorithmic bias under the FHA. It alleged that the algorithms Meta uses to determine which Facebook users receive housing ads relied in part on characteristics such as race, color, religion, sex, disability, familial status, and national origin — all of which are protected under the FHA. External investigations have produced evidence supporting the Justice Department’s claims, including a 2020 paper by Carnegie Mellon researchers that found bias in Facebook’s advertising tools that exacerbated socioeconomic inequalities.
As part of its settlement with the Justice Department, Meta said it will stop using Special Ad Audience, a housing advertising tool that allegedly relied on a discriminatory algorithm to find users who “resemble” other users based on FHA-protected characteristics. Meta will also develop a new system over the next six months to “address racial and other disparities caused by the use of personalization algorithms in its housing ad delivery system,” according to a press release, and implement that system by December 31, 2022.
An independent, third-party reviewer will continuously investigate and verify that Meta’s new system meets the standards agreed upon by the company and the Department of Justice. If the Justice Department concludes that the new system does not sufficiently address discrimination, the settlement will be terminated.
Meta must notify the Department of Justice if it plans to add targeting options.
In a blog post, Meta said the new system will apply to employment and credit ads as well as housing — likely in response to criticism of the company’s ad targeting practices in those areas, too. The aforementioned Carnegie Mellon study found that Facebook ads related to credit cards, loans, and insurance were disproportionately shown to men, while employment ads were shown to a higher proportion of women. Users who declined to identify their gender, or who identified as non-binary or transgender, were rarely if ever shown credit ads of any type, the study’s authors found.
US officials have long accused Meta of discriminatory ad targeting practices. In 2018, Ben Carson, then secretary of the US Department of Housing and Urban Development, filed a formal complaint against Facebook over ad systems that “unlawfully discriminated against” users based on race, religion, and other protected categories. A separate lawsuit, filed by the National Fair Housing Alliance in 2019, alleged that Meta offered advertisers discriminatory targeting options.