Meta has agreed to settle 2019 charges that it enabled housing discrimination through ad targeting. Under the deal with the Justice Department, the company will stop using its "Special Ad Audiences" tool, which allegedly relied on a discrimination-prone algorithm to widen the reach of housing ads on Facebook. Meta is instead developing a new method, the "variance reduction system," to ensure the audience that actually sees a housing ad matches the eligible targeted audience.
Meta added that the decision to retire Special Ad Audiences also applies to credit and employment ads. The company said discrimination in all three categories is part of a "deep-rooted problem" in the US.
According to officials, this is the first time the DOJ has used a case to tackle algorithmic bias under the Fair Housing Act. Meta said it worked with the Department of Housing and Urban Development for over a year to target ads more accurately and avoid discrimination. The new system will also be subject to DOJ approval and monitoring.
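The settlement describes the goal of the variance reduction system (the audience an ad actually reaches should line up with the eligible audience) but not how it works internally. As a rough illustration of that idea only, the sketch below measures how far the demographic mix of people who saw a housing ad drifts from the eligible audience and nudges delivery toward under-served groups; the function names, group labels, and shares are all hypothetical, not Meta's actual method.

```python
# Illustrative sketch only: the settlement doesn't publish Meta's actual algorithm.
# Demographic groups and audience shares below are made-up inputs for this example.

def delivery_skew(eligible_share: dict[str, float], delivered_share: dict[str, float]) -> float:
    """Total variation distance between the eligible audience's demographic mix
    and the mix of people who actually saw the ad (0 = perfect match)."""
    groups = set(eligible_share) | set(delivered_share)
    return 0.5 * sum(abs(eligible_share.get(g, 0.0) - delivered_share.get(g, 0.0)) for g in groups)

def reweight(eligible_share: dict[str, float], delivered_share: dict[str, float], boost: float = 0.5) -> dict[str, float]:
    """Raise per-group delivery weights for groups that saw the ad less often
    than their share of the eligible audience; a real system would feed
    adjustments like this back into ad delivery."""
    groups = set(eligible_share) | set(delivered_share)
    return {
        g: 1.0 + boost * max(eligible_share.get(g, 0.0) - delivered_share.get(g, 0.0), 0.0)
        for g in groups
    }

if __name__ == "__main__":
    eligible = {"group_a": 0.40, "group_b": 0.35, "group_c": 0.25}   # who could see the ad
    delivered = {"group_a": 0.55, "group_b": 0.30, "group_c": 0.15}  # who actually saw it
    print(f"skew: {delivery_skew(eligible, delivered):.2f}")  # 0.15
    print(reweight(eligible, delivered))  # boosts group_b and group_c
```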
The company had already limited ad targeting in 2019 after settling another lawsuit that accused it of violating housing discrimination law. Since then, advertisers haven't been allowed to target those campaigns based on age, gender or ZIP code. Special Ad Audiences launched at the same time to address problems with an earlier system, but Meta said the algorithms behind it needed to change to ensure fairness.
A settlement isn't shocking. Meta has faced other accusations of allowing problematic ad targeting in areas like politics, and other tech heavyweights have faced penalties too: the Federal Trade Commission slapped Twitter with a $150 million fine over reportedly "deceptive" ad targeting that relied on sensitive contact information. The agreement could help Meta avoid similar punishment, and it suggests the company is willing to cooperate when its ad systems come under scrutiny.