Best Guess for this Image: Brassiere (The sexist, commercialised gaze of image recognition algorithms)
Did you know the iPhone will search your photos for brassieres and breasts, but not for shoulders, knees and toes? Or boxers and underpants either, for that matter. “Brassiere” seems to be a codeword for cleavage and tits.
I discovered this not by suddenly wanting to see photos of bras, but because I did a reverse image search for a sketch of a pregnant woman’s belly selfie to see if the sketch had anonymised it sufficiently that Google wouldn’t find the original. Lo and behold, all the “related images” were of porn stars with huge tits and ads of busty women trying to sell bras, which surprised me, given that the original was of a woman with a singlet pulled up to show her pregnant belly. I would have expected person, woman, pregnant, selfie, belly, bare arms to show up as descriptions, but brassiere? Was that really the most salient feature of the image? Apparently so, to Google.
Usefully, one of the text hits for the image was this article, which explains with horror that Apple has “brassiere” as a search term for your photos. Well, clearly Google does too.
I promptly did a search in the photos on my iPhone, and was appalled to see a selfie I took one morning show up: I wasn’t wearing a bra, but a singlet, and the image is mostly of my face, neck and upper chest.
Seriously? I suppose you might think that the main point of that image was the triangle of my singlet that could have been a bra, but really?
The other images are a screenshot of some Kim Kardashian thing I saved for some reason I don’t remember, and fittingly enough, in the middle there, is a video of part of Erica Scourti’s excellent video poem, Body Scan, which is precisely about how commercial image recognition commodifies the human body. (Here is a paper I wrote about Body Scan.)
The app Erica Scourti was using, CamFind, is in some ways more nuanced than the iPhone’s image recognition, which has no idea how to look for a human hand or a knee or a foot. That’s because those categories weren’t among the labels the system was trained to recognise.
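I’m only guessing at Apple’s internals, of course, but the basic point is easy to see with any off-the-shelf classifier: it can only ever answer with labels from the fixed vocabulary somebody trained it on. Here’s a minimal sketch using the public ImageNet label list that ships with torchvision’s pretrained models, purely as a stand-in, not as Apple’s or Google’s actual system:

```python
# A sketch, not Apple's or Google's real pipeline: inspect the fixed
# label vocabulary bundled with torchvision's pretrained ImageNet models.
from torchvision.models import ResNet18_Weights

labels = ResNet18_Weights.DEFAULT.meta["categories"]  # 1,000 class names

for query in ["brassiere", "bikini", "mitten", "shoulder", "knee", "toe"]:
    hits = [name for name in labels if query in name.lower()]
    print(f"{query!r}: {hits if hits else 'not a trained category'}")

# Whatever isn't in that list simply cannot show up as a search result,
# no matter what is actually in the photo. Somebody chose the list.
```

If a word isn’t in the vocabulary, the model literally cannot tag a photo with it.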
Yeah. Somebody decided to program the systems to look for breasts and bras, but not for knees or toes or backs. I wonder why.
In her book on racist search engine results, Safiya Umoja Noble argues that one reason why a Google Search for “black girls” a few years ago only gave you porn results was that Google prioritised sales and profit above all, and so prioritised the results people would pay for rather than, say, showing web sites that black ten-year-old girls might like to visit.
Presumably that’s why “brassiere” is a search term for my photos, too. Some people will pay to see photos of tits and some people want to sell bras. Other people want to sell socks and mittens, but that just isn’t as lucrative as bras and tits.
Actually, my iPhone can find photos of mittens. Or at least, it can find photos I took that it thinks show mittens. I guess they must have fed the machine learning algorithm more photos of breasts and brassieres than of mittens, because the mitten recognition is far less accurate.
Two feet. My theatrical daughter’s efforts at terrifying me by sending a photo of her hand gorily made-up in stage makeup. A picture of a kid in a Snapchat filter drinking juice. None of those are mittens.
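That imbalance idea is only a guess, but it is the kind of effect you can reproduce with toy data: train a classifier on thousands of examples of one category and only a handful of another, and it does noticeably worse on the rare one. A quick sketch with made-up numbers, not anyone’s real training set:

```python
# A toy illustration of imbalanced training data, not any real system:
# class A gets 2,000 training examples, class B only 20.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)

def sample(centre, n):
    """Draw n five-dimensional points scattered around a class centre."""
    return rng.normal(loc=centre, scale=2.0, size=(n, 5))

X_train = np.vstack([sample(0.0, 2000), sample(1.0, 20)])
y_train = np.array([0] * 2000 + [1] * 20)

X_test = np.vstack([sample(0.0, 500), sample(1.0, 500)])  # balanced test set
y_test = np.array([0] * 500 + [1] * 500)

pred = LogisticRegression().fit(X_train, y_train).predict(X_test)

print("recall, common class:", recall_score(y_test, pred, pos_label=0))
print("recall, rare class:  ", recall_score(y_test, pred, pos_label=1))
```

The rare class gets found far less reliably, which is more or less what my mitten search looks like.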
It’s entirely probable that the image recognition algorithms were trained on pornography and ads for bras. There’d be a precedent for it: the Lena image, which is the most commonly used test image for image compression algorithms, is a scan of a Playboy centrefold, so that naked shoulder actually leads to a naked body in a fake Wild West setting. (This image is one of the main cases discussed in Dylan Mulvin’s forthcoming book Proxies: The Cultural Work of Standing In.)
So why does this matter? It matters because these algorithms are organising our personal photos, memories we have captured of ourselves and of our loved ones. The algorithms that create those cute video compilations of my photos, showing the kids growing up over the years, or all the smiley photos from our family holiday, are also scanning my private photos for breasts and cleavage.
I really don’t like that my phone thinks the best way to describe my selfie is “brassiere”. I hate that. Image recognition needs to do something more than simply replicate a commercialised version of the male gaze.
Rob Gehl
Hello, Jill —
I have a co-authored piece on something similar, called “Training Computers to See Internet Pornography” (http://journals.sagepub.com/doi/full/10.1177/1527476416680453). We looked at computer science papers on training computers to identify pornographic images. The papers certainly look for particular body parts and, as you say, not others. The missing body part in this literature, however, is not knees but male sex organs.