I kept writing and reading instead of doing the more DH-specific data visualization I was intending to do. But it’s so interesting! I’m writing about filters, you know, Instagram style filters, but I’m extending the notion of filter by seeing technological and cultural filters as basically the same thing. I talk a bit about that in the presentation I did a couple of weeks ago, and am working on writing that out properly so it makes sense.

But one example of how cultural and technological filters mix is the skin-tone bias in photography, especially 20th century photography, although today’s digital cameras aren’t perfect either. Early photographic colour film was designed to show white people’s skin in detail, but did a terrible job of representing a person with darker skin – especially when people with very different skin tones were in the same image. There were complaints from parents – for instance about poorly lit children with darker skin in diverse school groups – from the 1950s on, but it wasn’t until the 1970s that Kodak actually did anything about the problem, and then only because companies wanting to sell dark woods and chocolates needed film that would show the detail better.

An older “Shirley card” for skin tone and colour calibration of cameras.
Recent, international calibration image.

Lorna Roth, who wrote an interesting article about this history (“Looking at Shirley, the Ultimate Norm: Colour Balance, Image Technologies, and Cognitive Equity”, 2009), notes that an important reason there were no real campaigns for film producers to create film that did a better job of representing darker skin tones was our general assumption that technology and science are objective – we thought photographs just were that way. But “[h]ad NASA, the U.S. intelligence service, or meteorological scientists already completed their research on photography of low-light areas at the time of the popular development of still photography, the evolution of film chemistry might have unfolded quite differently,” and it could certainly have changed once those developments were in place.

Syreeta McFadden’s article in Buzzfeed, “Teaching the Camera to See My Skin”, is a very interesting personal take on this. And the comments on the Jezebel article about the same topic are wonderful explanations of how disgust at how badly conventional photography represents darker skin is a pretty common motivation for taking selfies:

Her piece is beautiful and I struggle with this thought ALL the time. Growing up all of my girlfriends (and immediate female relatives) were white. I would watch them effortlessly take a photo or get their photo taken and in return get an image that looked just like them. I never really felt that way. I still don’t – unless I take my own photo. And people call it vanity but really I just want to be able to see myself in a picture. I don’t see myself in other people’s photos, I just don’t. 

And another:

Ive alwaaays felt this way too. Some people laugh at me for wanting to take selfies rather than have someone take the photo but I’ve always felt kind of shitty pre smartphone era when the photos would come back developed and I just woudnt look like me. I think there is a features proportions issue as well as skin colour issue. – Ive felt like its not just me but other black people I see in photos too .

McFadden writes the same thing at Buzzfeed:

I couldn’t help but feel that what that photographer saw was so wildly different from how I saw myself. Is that how you see me? Could you not see blackness? Its varying tones and textures? And do you see all of us that way? […] I started taking pictures to self protect. I just couldn’t bear seeing anymore shitty pictures of me. I didn’t know what I wanted these images to say, but I knew I could make something beautiful.

I’m going to be using this case when talking with students about technological determinism, cultural co-construction, and how technology encodes cultural biases, that’s for sure.
