A year or so ago, I spent an entire afternoon watching the YouTube channel of a young British teenager who usually spent his time on-camera reviewing various classics of literature. He was funny and engaging and not at all like typical YouTubers. He kept my attention for that afternoon, and then I forgot about him. Until last Friday when, in a discussion of another young YouTuber, the literary critic from last year popped back into my head.
But how to find him? How to find anything on the internet is a very interesting question, and seasoned searchers like myself will often tell you that the most literal route — “Google ‘my ankle has been dry for two months’” I say to myself — is the best. This is all to tell you how I came, last Friday, to do a Google image search for the word “teen.”
The results were shocking, mostly because they were all girls. White girls. Many of them wearing almost nothing. In isolation, each image was innocent enough: just youth, natural beauty, and a lot of white skin. Taken as a whole, though, the results were upsetting. Am I a pervert, or is something wrong here? It didn’t help that the person I was looking for was, in fact, a boy. The word “teen,” at least to me, wasn’t gendered. I googled “tween.” It was even worse. I opened an incognito browser window to be sure. The results were still just as bad. The next suggestion, an image search for “selfie,” is a nightmare.
The related search terms for these searches speak volumes: “curvy,” “latina,” “hot,” “group,” “bra,” “skinny,” “bed.” I know where this is going.
The first image of a boy was of a teen with braces behind the wheel of a car. The second was a mugshot. The boys were all clothed.
Search Engine Land's Danny Sullivan, who has written for years about how search works, says that Google “tends to reflect the web and the web, when it comes to posting pictures about teens, likely goes with stereotypical white teenagers.” But the results are heavily gendered and, if you're being honest with yourself, highly sexualized too. Sullivan points out that other engines like Bing and Baidu don't differ much, but concedes that Google, which “has made a big deal about using machine learning as part of image search over the past year or so,” needs to “do a better job.” He also notes that making the word plural, as in "teens," helps a lot.
But Google can and does override what we get back when we search for things. You can't search for "child pornography" and get back child pornography. And in fact, the company routinely tweaks entire categories of “suggested” search strings with negative connotations. So even though a lot of people might be googling “Sharon Tate murdered,” Google never actually “suggests” that string of words: autocomplete simply stops once “murd” has been typed. What you get instead is a suggestion for "Sharon Tate Murdered Innocence," a film. Hate speech, generally, is not suggested either. The point is that Google can and does tinker where it feels it’s necessary. But suggesting searches and tinkering with actual results are two very different things.
And search suggestions are also usually based on a string of words. My search was one word, and my issue isn’t with suggestions but with what I actually got back from the search. Without context: teen. If an alien landed here and googled “teen,” it would come away with the idea that all of Earth’s teens were white girls. And as far as I know, that’s not accurate.
Are gross men fucking up my Google image searches? The short answer is “yes.” Google's boilerplate response to my inquiry about this issue was this: "Our image search results are a reflection of content from across the web, including the frequency with which types of images appear and the way they’re described online."
While I'm not “offended” by the results, I do take issue with the assumptions that Google, and by extension “my computer,” is making about me: that I am a straight man looking for images of young, white women. And none of that reflects reality.