Meta Platforms has agreed to change its advertising technology and pay a $115,054 fine to settle a Justice Department claim of race and gender discrimination by the algorithm used to display its housing ads. “Meta will — for the first time — change its ad delivery system to address algorithmic discrimination,” Damian Williams, U.S. attorney for the Southern District of New York, said in a statement. “But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”
The Justice Department alleged that Meta’s Facebook restricted which users could see housing ads based on factors such as their current address. Meta says Facebook will adjust its ad technology and use machine learning to regularly verify that eligible users are shown all relevant housing ads.
“The new method, which is referred to as a ‘variance reduction system,’ relies on machine learning to ensure that advertisers are delivering ads related to housing to specific protected classes of people,” The New York Times explains.
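The Times’ description suggests a system that compares the demographic makeup of the users who actually see an ad against the makeup of the eligible audience, then flags delivery skew. Purely as an illustrative sketch (this is not Meta’s implementation; the function names, the total-variation metric, and the threshold are all assumptions for the example), such a check might look like:

```python
from collections import Counter

def demographic_share(users):
    """Fraction of users falling into each demographic group."""
    counts = Counter(users)
    total = len(users)
    return {group: n / total for group, n in counts.items()}

def delivery_gap(eligible, delivered):
    """Total-variation distance between the eligible audience's
    demographic mix and the delivered audience's mix.
    0.0 means identical distributions; larger values mean more skew."""
    e = demographic_share(eligible)
    d = demographic_share(delivered)
    groups = set(e) | set(d)
    return 0.5 * sum(abs(e.get(g, 0.0) - d.get(g, 0.0)) for g in groups)

# Eligible audience is split 50/50, but the ad was delivered 80/20.
eligible = ["group_a"] * 50 + ["group_b"] * 50
delivered = ["group_a"] * 80 + ["group_b"] * 20

gap = delivery_gap(eligible, delivered)  # 0.3 for this example
if gap > 0.1:  # hypothetical tolerance threshold
    print("skew detected: re-balance delivery toward the eligible mix")
```

In this toy version, a gap above the threshold would trigger a correction step that nudges future delivery back toward the eligible audience’s proportions, which is the general idea the settlement describes at a much larger scale.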
“When a company develops and deploys technology that deprives users of housing opportunities based in whole or in part on protected characteristics, it has violated the Fair Housing Act, just as when companies engage in discriminatory advertising using more traditional advertising methods,” Williams said in the DOJ statement.
While the $115,054 penalty is a mosquito bite for Facebook (which admits no wrongdoing), it is the maximum amount permitted by the Fair Housing Act for this type of infraction, according to the DOJ. The suit stems from a March 2019 Housing and Urban Development (HUD) investigation in which the agency said it found a pattern of discrimination.
“While the settlement pertains specifically to housing ads, Meta said it also planned to apply its new system to check the targeting of ads related to employment and credit,” NYT writes, noting that “the company has previously faced blowback for allowing bias against women in job ads and excluding certain groups of people from seeing credit card ads.”
TechCrunch says this lawsuit marks the first Justice Department challenge of “algorithmic bias under the FHA,” and reports Meta’s settlement includes an agreement to “stop using an advertising tool for housing ads, Special Ad Audience, which allegedly relied on a discriminatory algorithm to find users who ‘look like’ other users based on FHA-protected characteristics.”