When AI Meets PC


Google has spent many years and billions of dollars making its algorithms really smart. They detect our behavior and the kind of content we look at, and Google then targets us with the ads its algorithms “think” fit us best. The recent calamity where it categorized African Americans as “Gorillas,” and the more recent one where it categorized a Chinese man as a “horse,” may be just the tip of a problem that most of us probably didn’t see coming.

The problem with making your algorithms super smart is that they will behave exactly as trained and work really hard to send the right ads to the best match, irrespective of societal norms. In a recent study, researchers at Carnegie Mellon created fake male and female users and had them browse sites with a strong gender skew (think caranddriver.com vs. cosmopolitan.com). It’s not an exact science, but it’s an interesting approach. They then set these fake identities browsing gender-neutral sites and tracked which ads Google presented to each.
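For the flavor of it, here is a minimal sketch of that experimental design in Python. The `browse` and `collect_ads` helpers are hypothetical stand-ins (the researchers used their own browser-automation tooling), and the ad names are invented:

```python
import random
from collections import Counter

# Hypothetical stand-ins for the study's browser automation; a real run
# would drive fresh browser instances and scrape the ads actually served.
def browse(profile, url):
    profile.setdefault("history", []).append(url)

def collect_ads(profile, url):
    # Placeholder ad inventory; a real run would record live ads.
    pool = ["exec_job_ad", "midlevel_job_ad", "retail_ad", "travel_ad"]
    return random.choices(pool, k=5)

def run_experiment(n_agents=100, neutral_site="cnn.com"):
    """Train fresh profiles with gendered browsing, then compare the ads
    each group is shown on the same gender-neutral site."""
    training = {"male": ["caranddriver.com"], "female": ["cosmopolitan.com"]}
    ads_by_group = {group: Counter() for group in training}

    for group, sites in training.items():
        for _ in range(n_agents):
            profile = {}            # a brand-new cookie jar / history
            for site in sites:      # phase 1: establish a gendered profile
                browse(profile, site)
            # phase 2: visit a neutral site and record the ads served
            ads_by_group[group].update(collect_ads(profile, neutral_site))

    return ads_by_group             # compare the two distributions

print(run_experiment(n_agents=10))
```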

The results were striking. On news sites like CNN, the male users were shown ads for higher-salary jobs more often than the female users on the same sites. The researchers also found that an image search for “CEO” returned only 11% female results, and that users who frequently browsed addiction-related sites were sent ads for rehab.

Some of these results can be explained empirically. There are, in fact, far fewer female CEOs, so the number of images indexed is likely proportionately lower. Likewise the addiction result: Google tracks your browsing history, so if you spend a lot of time looking at fly-fishing sites, that’s likely what you’ll get ads for on CNN. That it also works for addiction feels different from fly fishing, but it isn’t really.

The jobs-and-gender question is much weirder. It suggests that Google is divining your gender from your behavior, then making a value judgment. If we can assume that job ads don’t, and in the vast majority of cases can’t, specify gender, then it’s odd and perhaps worrying that Google is doing the math for us. It seems unlikely that someone at Google sat down and came up with this as a neat strategy. More likely, it’s the Google ‘brain’ watching the kinds of jobs people of each gender apply for and learning to prefer those kinds of jobs for that gender, because they are more likely to earn the click. Google only gets paid when ads are clicked, so every time it tries really hard to fit the best ad to what it knows about the user. That’s fine for fly fishing, but weird for jobs.
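A toy sketch makes that mechanism concrete. Every number and segment name below is invented, but the ranking rule, bid times predicted click-through rate, is the standard shape of a pay-per-click auction, and it skews by gender without any explicit rule about gender:

```python
# Hypothetical CTR predictions a click model might learn from history.
# Nobody wrote "prefer men for executive ads"; the numbers just reflect
# who clicked what in the past.
predicted_ctr = {
    ("exec_job_ad", "inferred_male"): 0.031,
    ("exec_job_ad", "inferred_female"): 0.008,
    ("midlevel_job_ad", "inferred_male"): 0.020,
    ("midlevel_job_ad", "inferred_female"): 0.024,
}
bids = {"exec_job_ad": 2.50, "midlevel_job_ad": 1.00}  # paid per click

def pick_ad(user_segment):
    """Rank by expected revenue: bid x predicted click-through rate."""
    return max(bids, key=lambda ad: bids[ad] * predicted_ctr[(ad, user_segment)])

print(pick_ad("inferred_male"))    # exec_job_ad    (2.50 * 0.031 = 0.078)
print(pick_ad("inferred_female"))  # midlevel_job_ad (1.00 * 0.024 > 2.50 * 0.008)
```

The skew comes entirely from the learned click-through table; there is no gender clause anywhere in the ranking code.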

I suspect that this is simply the unintended consequence of the AI running the ad platform getting a little too good at its job. In society, we adopt behaviors and constructs to address what we perceive to be intrinsic societal problems: we make extra efforts to be inclusive; we go the extra mile. AIs don’t have the same PC conscience unless it’s programmed into them. I imagine a bunch of changes will be made in the near future to “level the playing field” on this topic. Ironically, that will probably reduce the overall effectiveness of Google’s ad product in some areas. But that might be a small price to pay to avoid the discrimination lawsuits.
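What “programming it in” might look like is anyone’s guess, but one plausible shape is a post-hoc audit. The sketch below is purely hypothetical, not anything Google has described: it flags a sensitive ad category when its serving rate diverges too far between inferred groups.

```python
def serving_rate_gap(served, impressions, group_a, group_b):
    """Demographic-parity-style audit: how differently is a sensitive
    ad category served to two inferred groups?"""
    rate_a = served[group_a] / impressions[group_a]
    rate_b = served[group_b] / impressions[group_b]
    return abs(rate_a - rate_b)

# Hypothetical serving logs for a high-salary job ad.
served = {"inferred_male": 1800, "inferred_female": 300}
impressions = {"inferred_male": 10_000, "inferred_female": 10_000}

gap = serving_rate_gap(served, impressions, "inferred_male", "inferred_female")
if gap > 0.05:  # the tolerance is a policy choice, not a technical one
    print(f"flag for review: serving-rate gap of {gap:.0%}")
```

Any rebalancing a check like this triggers trades click revenue for parity, which is exactly the reduced effectiveness described above.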
