New research by the University of Southern California, Northeastern, and a public interest group called Upturn draws an interesting conclusion about Facebook's ad delivery algorithms: they skew along racial and gender stereotypes. Because the algorithms are a Facebook trade secret, auditing their design directly is not possible. The researchers instead ran experiments on the algorithms' output, focusing on where ads end up being displayed.
Facebook’s Ad Delivery Skews Along Race and Gender Lines
Facebook allows an advertiser to target razor-sharp demographics. You can target people who like a specific page or even a particular book. You can target groups of people based on a broad range of categories. But for the research, published yesterday, the researchers made no demographic selections. The goal was to determine what Facebook would do with advertisements if it weren't told a specific demographic to target.
Advertisers may hope that their message reaches the broadest possible audience when they select a region. For the study, the researchers chose only certain zip codes as the targeting criterion, rather than age, gender, interests, or anything else. This was the best possible way to ensure that ad delivery was left up to Facebook.
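To make that concrete, here is a minimal sketch of what a region-only targeting spec for Facebook's Marketing API might look like; the zip codes are hypothetical placeholders, not the ones used in the study.

```python
# Minimal, illustrative targeting spec for Facebook's Marketing API.
# Only geography is specified -- no age, gender, or interests -- so
# audience selection is left entirely to Facebook's delivery system.
# The zip codes below are hypothetical placeholders.
targeting_spec = {
    "geo_locations": {
        "zips": [
            {"key": "US:27501"},  # hypothetical zip code
            {"key": "US:27601"},  # hypothetical zip code
        ],
    },
    # Deliberately omitted: "age_min", "age_max", "genders", "interests"
}
```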
Facebook doesn’t allow advertisers to target based on “race,” but the researchers were able to measure ad delivery along racial lines by using Designated Market Area regions “as a proxy.” By using publicly available voter records, they could determine the racial densities of various zip codes or municipalities. The report mentions some cities in North Carolina as a specific area studied.
Determining the Race of Facebook Users by Proxy
They found two regions which were majority white and two which were majority black. They then tested to see how many of their ads were targeted to people in each area.
When we run ads where we want to examine the ad delivery along racial lines, we run the ads to one audience from the first grouping and the other race’s audience from the second grouping. We then request that Facebook’s Marketing API deliver us results broken down by DMA region. Because we selected DMA regions to be a proxy for race, we can use the results to infer which custom audience they were originally in, allowing us to determine the racial makeup of the audience who saw (and clicked on) the ad.
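A minimal sketch of how such a proxy could be built from public records, assuming a hypothetical voter file with "zip_code" and "race" columns (the file name, column names, and 90% threshold are illustrative, not taken from the paper):

```python
import pandas as pd

# Build a race-by-geography proxy from a public voter file. The file
# name, column names, and 90% threshold here are all illustrative.
voters = pd.read_csv("nc_voter_records.csv")

# Fraction of registered voters of each race within each zip code.
composition = (
    voters.groupby("zip_code")["race"]
    .value_counts(normalize=True)
    .unstack(fill_value=0.0)
)

# Keep only heavily skewed zip codes, so that delivery to a region can
# stand in for delivery to a racial group.
majority_white = composition[composition["white"] > 0.9].index.tolist()
majority_black = composition[composition["black"] > 0.9].index.tolist()

print(f"{len(majority_white)} heavily white zips, "
      f"{len(majority_black)} heavily black zips")
```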
The researchers then tested an even more intriguing hypothesis, one that invites speculation.
Using equal sets of images considered stereotypically appealing to men and to women, they then altered the pictures so that people would not be able to see them. Some image formats, like PNG, include an “alpha” channel that can be modified to control transparency and other cosmetic properties of an image.
Machine Learning Probably Determines Stereotypical Racial and Gender Attractiveness of Images
The researchers made the images 98% invisible to the human eye. An algorithm would still be able to gather information about the pixels, mind you. Facebook accepted the ads with these images and skewed their delivery: 42% of the images normally attractive to men were delivered to men, and 39% of the images deemed normally attractive to women were delivered to women. The result of this test in particular informs the hypothesis that Facebook uses machine learning to analyze the content of advertising images.
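A minimal sketch of the transparency trick, using the Pillow library; the 98% figure comes from the paper, while the file names are placeholders:

```python
from PIL import Image

# Scale the alpha channel down to roughly 2% opacity: the image becomes
# about 98% invisible to a person, but the underlying RGB pixel values
# an algorithm could analyze are left untouched.
img = Image.open("stock_photo.png").convert("RGBA")
alpha = img.getchannel("A").point(lambda a: int(a * 0.02))
img.putalpha(alpha)
img.save("stock_photo_transparent.png")
```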
Thus, the researchers conclude:
We have observed that differences in the ad headline, text, and image can lead to dramatic difference in ad delivery, despite the bidding strategy and target audience of the advertiser remaining the same.
Another interesting finding: even though women have higher click-through rates than men, and the ads were not targeted at a “male,” “female,” or “neutral” audience, the researchers found that ads that would stereotypically appeal to men (especially based on image content alone) were delivered to men more often.
On page 11 of the report, we begin to reach the damning evidence. The researchers ran three types of entertainment ads linking to the top 30 albums in three categories: country, hip hop, and general. They found that the hip hop ads were delivered to black people more often, while the country top list was delivered overwhelmingly to white people.
Hip Hop Ads Delivered to 87% Black People
The ad targeting and budgeting strategy was exactly the same for each ad. In fact, only 13% of the people who saw the hip hop list were white. Again, Facebook does not allow targeting based on race, but a stereotypical understanding would assume that more black people than white people like hip hop. Actual album sales might contradict such a belief: hip hop is, in fact, a widely popular genre, with a high number of white people buying its products.
We find that Facebook ad delivery follows the stereotypical distribution, despite all ads being targeted in the same manner and using the same bidding strategy. […] Assuming significant population level differences of preferences, it can be argued that this experiment highlights the “relevance” measures embedded in ad delivery working as intended.
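One way to gauge whether a split like 87%/13% could arise from the regions' demographics alone is a simple binomial test; in this sketch the impression count and the baseline fraction are hypothetical, not figures from the paper.

```python
from scipy.stats import binomtest

# Hypothetical illustration: suppose the targeted regions are about
# 50% black overall, and 870 of 1,000 hip-hop ad impressions went to
# black users. How surprising is that under the regional baseline?
impressions = 1000
black_impressions = 870
baseline_black_fraction = 0.50  # assumed regional makeup, not from the paper

result = binomtest(black_impressions, impressions, baseline_black_fraction)
# A tiny p-value means the skew exceeds what demographics alone explain.
print(f"p-value: {result.pvalue:.3g}")
```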
So, perhaps more black people on Facebook are looking at hip hop pages and sharing hip hop news; that might explain it. To address this possibility, the researchers turned to other types of advertising: employment and housing.
Less Trivial: Housing and Employment Targeting When Advertiser Indicates No Preferences
The researchers found that the images used in the advertisements made a big difference in where they were delivered. They take care to scale back any conclusions, pointing out that their study was not exhaustive: they ran eleven different types of ads, each with five different images. Yet the results are certainly interesting because, again, at no point did they target beyond region.
When selecting the ad image for each job type, we selected five different stock photo images: one that has a white male, one that has a white female, one that has a black male, one that has a black female, and one that is appropriate for the job type but has no people in it. We run each of these five independently to test a representative set of ads for each job type, looking to see how they are delivered along gender and racial lines. Thus, the target audiences that we use for these experiments are the North Carolina audiences described […] We can immediately observe drastic differences in ad delivery across our ads along both racial and gender lines: our five ads for positions in the lumber industry deliver to over 90% men and to over 70% white users in aggregate, while our five ads for janitors deliver to over 65% women and over 75% black users in aggregate.
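To make the aggregation concrete, here is a small sketch of rolling up delivery breakdowns across the five image variants of one job type; every number is an invented placeholder, chosen only to echo the aggregate percentages the paper reports.

```python
# Invented per-variant delivery counts for the five lumber-industry ads:
# (impressions to men, to women, to white users, to black users).
lumber_variants = [
    (950, 50, 720, 280),
    (930, 70, 760, 240),
    (920, 80, 700, 300),
    (940, 60, 730, 270),
    (910, 90, 710, 290),
]

men, women, white, black = (sum(col) for col in zip(*lumber_variants))

print(f"Men:   {men / (men + women):.0%}")      # ~93% in this toy example
print(f"White: {white / (white + black):.0%}")  # ~72% in this toy example
```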
Similar results were found for housing ads, varying the wording that indicated the value of the housing offer and whether the image showed a black family or a white family.
The researchers want you to know that their results are not conclusive. They are merely trying to raise awareness in hopes of getting these issues explored further by the public and public interest groups, especially when it comes to how employment and housing advertisements are delivered.
We demonstrate that, during the ad delivery phase, advertising platforms can play an independent, central role in creating skewed, and potentially discriminatory, outcomes.
Advertisers may be interested to know that it could be impossible to simply target everyone on Facebook. Based on the text, image, and nature of an advertisement, Facebook may discriminate for you.
Facebook’s HUD Case Just Got More Interesting
It’s not evident whether this approach is more profitable than letting ads reach whomever advertisers wish, but it does belie any claim that Facebook isn’t using its vast resources to categorize people based on things like race.
If the researchers get what they want, it seems likely that further studies on Facebook’s advertising delivery mechanisms will be conducted. Facebook is such a large social network at this point that some people want to regulate it as a public utility. Its impact on wider society is undeniable, and as such, the results of studies like this one should not be taken lightly.
As a side note, Facebook is currently under investigation by the Department of Housing and Urban Development for enabling advertisers to illegally discriminate. HUD’s charge that Facebook’s advertising platform violates the Fair Housing Act is likely to be bolstered by research of this kind, and, as we said, more is likely to follow.