In a Settlement, Meta Agrees to Modify Ad-Targeting Technology

Meta was accused of housing discrimination by the U.S. Department of Housing and Urban Development (HUD) for allowing marketers to restrict who could view housing advertisements on Facebook based on criteria such as race, religion, and nationality.
On Tuesday, Meta agreed in a settlement with the Justice Department to modify its ad-targeting technology and pay a $115,054 penalty for allegedly engaging in housing discrimination by allowing advertisers to restrict who could see ads on the platform based on race, gender, and ZIP code.
Under the terms of the agreement, Meta, the company formerly known as Facebook, said it would change its technology and deploy a new computer-assisted method designed to regularly verify whether the audiences who are targeted and eligible to receive housing advertisements are actually seeing those advertisements. The new method, known as a “variance reduction system,” relies on machine learning to help ensure that housing-related advertisements are delivered to specific protected classes of people.
Meta also said it would discontinue the use of “special ad audiences,” a tool it had created to help advertisers reach a larger audience. The company said the tool was an early attempt to combat bias and that its new methods would be more effective.
Roy L. Austin, vice president of civil rights and deputy general counsel of Meta, stated in an interview, “We will periodically take a snapshot of marketers’ audiences, determine who they target, and eliminate as much variation as possible from that population.” He referred to it as “a huge technological leap in the use of machine learning to provide targeted advertisements.”
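Meta has not published the internals of the variance reduction system, but the idea in Mr. Austin’s description can be sketched in a few lines of code. The snippet below is purely illustrative and uses hypothetical data and function names: it compares the demographic makeup of the audience eligible to see a housing ad with the makeup of the audience that actually saw it, and reports the total gap that such a system would aim to shrink.

```python
# Illustrative sketch only: a simplified way to quantify the "variation" the
# quote describes, comparing who was eligible to see a housing ad with who
# actually saw it. All names and data are hypothetical, not Meta's code.
from collections import Counter

def demographic_shares(audience, attribute):
    """Return each group's share of the audience for one attribute (e.g. age band)."""
    counts = Counter(person[attribute] for person in audience)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def delivery_variance(eligible, delivered, attribute):
    """Total absolute gap between eligible and delivered shares, summed over groups."""
    eligible_shares = demographic_shares(eligible, attribute)
    delivered_shares = demographic_shares(delivered, attribute)
    groups = set(eligible_shares) | set(delivered_shares)
    return sum(abs(eligible_shares.get(g, 0.0) - delivered_shares.get(g, 0.0))
               for g in groups)

# Example: ad delivery skews toward one age band relative to the eligible pool.
eligible = [{"age_band": "18-34"}] * 50 + [{"age_band": "35-64"}] * 50
delivered = [{"age_band": "18-34"}] * 80 + [{"age_band": "35-64"}] * 20
print(delivery_variance(eligible, delivered, "age_band"))  # ~0.6, a large skew
```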
Facebook, which became a corporate titan by gathering user data and allowing advertisers to target ads based on the characteristics of an audience, has faced criticism for years that some of these methods are discriminatory and unfair. The company’s ad technologies have allowed advertisers to determine who sees their ads based on hundreds of distinct attributes, enabling them to exclude members of a variety of protected categories.
While Tuesday’s settlement applies to housing advertisements, Meta said it also plans to apply its new system to verify the targeting of employment and credit advertisements. The company has drawn criticism in the past for allowing discrimination against women in job advertisements and for excluding certain groups of people from seeing credit card advertisements.
“Because of this precedent-setting case, Meta will for the first time alter its ad delivery system to address algorithmic discrimination,” Damian Williams, a U.S. attorney, said in a statement. He added that the office would proceed with the lawsuit if Meta failed to demonstrate that it had sufficiently modified its delivery system to prevent algorithmic bias.
The problem of skewed ad targeting has been debated in the context of housing advertisements in particular. In 2018, Ben Carson, who was then the secretary of Housing and Urban Development, filed a formal complaint against Facebook, accusing the company of having ad systems that “illegally discriminated” on the basis of race, religion, and disability. Facebook’s potential for ad discrimination was also revealed in a 2016 ProPublica investigation, which showed that the company made it easy for advertisers to exclude specific ethnic groups for advertising purposes.
In 2019, HUD sued Facebook for housing discrimination and violations of the Fair Housing Act. The agency said Facebook’s systems did not deliver advertisements to “a diverse audience,” even when an advertiser wanted the ad to be seen broadly.
Mr. Carson stated at the time, “Facebook discriminates against people based on who they are and where they reside.” “Using a computer to restrict a person’s housing options is just as discriminatory as slamming a door in their face.”
Civil rights groups have asserted that the enormous and complex advertising systems that underpin some of the largest online platforms contain inherent biases, and that tech companies such as Meta, Google, and others should do more to combat those biases.
The field of study known as “algorithmic fairness” has attracted a great deal of attention from computer scientists working in artificial intelligence. For years, prominent researchers, including the former Google scientists Timnit Gebru and Margaret Mitchell, have raised the alarm about such biases.
In the years since, Facebook has restricted the number of categories that marketers can choose from when purchasing housing advertisements, reducing the number from thousands to hundreds and removing targeting options based on ethnicity, age, and ZIP code.
Meta’s new system, which is still in development, will periodically check that the audiences being shown advertisements for housing, jobs, and credit match the audiences that marketers intend to target. If, for example, the advertisements being delivered began to skew heavily toward white men in their 20s, the new system would in theory recognize that and shift the ads so they are shown more equitably to broader, more diverse audiences.
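As a rough illustration of that periodic check, and not a description of Meta’s actual algorithm, the sketch below (with made-up group names, shares, and thresholds) turns the gap between the eligible audience and the delivered audience into per-group weights that would nudge the next round of ad delivery back toward the eligible mix.

```python
# Hypothetical sketch of the kind of periodic correction described above; it is
# not Meta's implementation. If a group's delivered share falls below its share
# of the eligible audience, its weight for future delivery is boosted; if it is
# over-served, its weight is reduced.
def rebalancing_weights(eligible_shares, delivered_shares, tolerance=0.05):
    """Return per-group multipliers for the next round of ad delivery."""
    weights = {}
    for group, target in eligible_shares.items():
        observed = delivered_shares.get(group, 0.0)
        if observed == 0.0:
            weights[group] = 2.0          # group saw no ads: strong boost (arbitrary choice)
        elif abs(observed - target) <= tolerance:
            weights[group] = 1.0          # close enough: leave delivery unchanged
        else:
            weights[group] = target / observed  # >1 boosts under-served groups
    return weights

# Example: delivery has drifted toward men in their 20s relative to eligibility.
eligible = {"men 18-29": 0.25, "women 18-29": 0.25, "men 30+": 0.25, "women 30+": 0.25}
delivered = {"men 18-29": 0.55, "women 18-29": 0.15, "men 30+": 0.20, "women 30+": 0.10}
print(rebalancing_weights(eligible, delivered))
# men 18-29 get weight ~0.45 and women 30+ get weight 2.5, pulling delivery
# back toward the eligible audience's makeup.
```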
Meta stated that it will collaborate with HUD over the next few months to incorporate the technology into its ad targeting systems, and agreed to a third-party examination of the effectiveness of the new system.
According to the Justice Department, Meta’s settlement penalty is the maximum allowed under the Fair Housing Act.