Facebook Accused of Allowing Advertising to Exclude Ethnic Communities
Social media behemoth Facebook is facing a lawsuit alleging that its targeted advertising feature allows advertisers to illegally exclude individuals based on race from housing and employment ads. The Fair Housing Act and Title VII of the Civil Rights Act prohibit discrimination in housing and employment, respectively, on the basis of race, sex, religion, and other protected characteristics. According to the suit, “Facebook’s advertising platform allows advertisers to target and exclude specific Facebook users to see their advertisements. This targeting and exclusion is based on Facebook users’ ‘affinity’ groups, which Facebook uses to identify a person’s ethnic, gender and other affinities based on their Facebook activity. A user’s affinity may be determined by their Facebook profile and interactions with organizations and other users on Facebook.” To be clear, complaints filed in federal and state court merely lay out allegations that must be proven later in litigation. Here, however, the complaint included screenshots of Facebook’s advertising platform as it existed at the time of filing.
As quoted in the suit, Facebook describes affinities as: “a relationship like a marriage, as a natural liking, and as a similarity of characteristics. We are using the term ‘Multicultural Affinity’ to describe the quality of people who are interested in and likely to respond well to multicultural content. What we are referring to in these affinity groups is not their genetic makeup, but their affinity to the cultures they are interested in. The Facebook multicultural targeting solution is based on affinity, not ethnicity. This provides advertisers with an opportunity to serve highly relevant ad content to affinity-based audiences.” However, these affinities are represented as: “‘African American (US),’ ‘Asian American (US),’ and four categories of ‘Hispanic (US).’” The suit claims that these affinities are proxies for race and ethnicity, as well as for other protected classes such as sex, religion, and national origin. Additionally, Facebook has not explained why “White” or “Caucasian American” does not appear among the demographics that advertisers can choose to exclude from their targeted ads.
Safiya U. Noble, Ph.D., and Sarah T. Roberts, Ph.D., both assistant professors in the Department of Information Studies at UCLA, point out that “Facebook, in its capacity as an online advertising company — its primary business model for revenue generation — attempted to automate a nuanced, contextual practice with deeply problematic historical roots and contemporary implications, with seemingly little foresight or regard for the potentially disastrous, racist and even illegal outcomes it would foster.”
Facebook has defended its actions by noting that targeted advertising by ethnicity is common practice in the advertising world, and that each ad placed on its site is reviewed by a human to ensure its legality. Steve Satterfield, privacy and public policy manager at Facebook, has remarked that Facebook “take[s] a strong stand against advertisers misusing our platform: Our policies prohibit using our targeting options to discriminate, and they require compliance with the law. We take prompt enforcement action when we determine that ads violate our policies.” However, two ProPublica journalists were able to place housing ads that excluded users by race within five minutes. While targeting by race is not illegal in advertising generally, recent years have seen a resurgence of scrutiny of potential redlining practices, that is, denying services to individuals based on race or income. The director of advertising acceptability at the New York Times (another media company that has been accused of discriminatory advertising in the past) notes that advertisers generally no longer go the overt route: “I haven’t seen an ad with ‘whites only’ for a long time.” Instead of verbalizing such preferences, some institutions instead “have quietly institutionalized bias in their operations.” Due to concerns about its legality, Facebook has since discontinued the affinity targeting feature for ads regarding housing, employment, and credit.
Allegations of bias within the infrastructure of tech companies are nothing new. Building massive structures of information in an equitable way is a learning process that requires companies and consumers to pay close attention. Though some proponents of big data believe collecting race-specific information can actually help reduce inequality, it is important to note that algorithms propagate the values of the humans who make them and are themselves ethics-agnostic. Programmers, information architects, and advertisers are not uniquely susceptible to bias: the tendency for cognitive biases to influence work is part of the human condition, and one that requires proactive education and management.
LawRoom powered by EverFi provides online compliance training on topics like ethics, unconscious bias, and data security to thousands of companies and universities. To learn more, visit us at LawRoom.com.