You've probably seen it already, but I wanted to make a note of the autotune remix of Charles Ramsey, the neighbour who rescued three women who were kidnapped ten years ago and held captive in a suburban house. He heard one of the women screaming for help and let her out, and he has become famous for the colourful interview he gave afterwards.

The autotune remix highlights the racial bias that Ramsey himself pointed out in the interview, at which point the interviewer, rapidly and somewhat embarrassedly, cuts him off:

I knew something was wrong, when a little pretty white girl ran into a black man’s arms. (..) That’s the only reason she’d run into a black man’s arms: either she’s homeless, or she’s got problems!

Will Stabley has an interesting analysis of this, although I'm not sure I agree that the autotune treatment is in itself racially charged.

The fact that the autotune highlights this uncomfortable description of racial near-segregation is an interesting example of how remixes can politicize existing comments simply by reordering, repeating and autotuning them. The blander parts of the interview are left un-autotuned in this remix, and that contrast, together with the repetition of the black man/white girl statement, certainly serves to emphasise it.

Update: also see The Guardian on the race issues here, and another story reporting that perhaps Ramsey wasn't the main hero after all, but rather a Hispanic neighbour whose English was too poor for the news reporters to bother interviewing him. There are certainly many levels to this story.

1 Comment

  1. Wibe

    Slate has a similar angle. They go for the racism track here. I feel it misses the mark, since it's the journalist's "gut feeling" being used, but there may be a point. Anyway, here's the link
    http://www.slate.com/blogs/browbeat/2013/05/07/charles_ramsey_amanda_berry_rescuer_becomes_internet_meme_video.html

