The Department of Justice (DOJ) has entered into a Settlement Agreement resolving allegations that Meta Platforms, Inc., formerly known as Facebook, Inc., engaged in discriminatory advertising in violation of the Fair Housing Act (FHA). The proposed agreement resolves a federal lawsuit alleging that Meta's housing advertising system discriminates against Facebook users based on their race, color, religion, sex, disability, familial status, and national origin.
Meta uses algorithms to determine which Facebook users receive housing ads, and those algorithms rely, in part, on characteristics protected under the FHA.
Under the settlement, which still must be approved by the federal court, Meta will stop using an advertising tool for housing ads (known as the “Special Ad Audience” tool) that relies on a discriminatory algorithm. Meta will also develop a new system to address racial and other disparities caused by its use of personalization algorithms in its housing ad delivery system.
The settlement marks the first time that Meta will be subject to court oversight for its ad targeting and delivery system.
The DOJ lawsuit alleged both disparate treatment and disparate impact discrimination: disparate treatment because Meta intentionally classifies users on the basis of FHA-protected characteristics, and disparate impact because its algorithms affect Facebook users differently on the basis of their membership in protected classes.