Death, disease, disaster and despair. For a long time, these words have defined how the rest of the world sees Africa. How much has that changed? Going by the evidence of a recent experiment conducted by This Is Africa, not much.
For our experiment, we ran three different Google searches: “African children”, “American children” and “European children.” Here’s a sampling of the results we got.
African children
American children
European children
Africa’s “sad” children
We know you noticed it too. Google spits out far more unflattering results for “African children” than it does for the other two searches. Going by the image results, Africa’s children are sad-faced, poverty-stricken and so undernourished they are at death’s door. But that’s not all: if you’re not happy with your results, Google has some suggestions for alternative searches for all three. See if you notice anything “off” in the options provided.
“African children”
“American children”
“European children”
Yes, the alternative searches Google asks you to try after googling “African children” are “starving”, “happy”, “sad”, “in need” and “poverty”. The alternatives for the other two are noticeably more upbeat and positive. The way the Google image results tell it, there’s certainly a lot of gloom to go around for African children.
Is this Google’s fault?
Who is to blame for the stark differences in the search results and the negative options provided as alternatives? Google is an attractive target for our anger, but we would be wrong to direct our fury at the company. A recent controversy in the US helps explain why. A black teenager there noticed that he got far more unflattering results when he searched “three black teenagers” on Google Images than when he searched “three white teenagers”.
Many called out the search giant for being “prejudiced”. Here’s how Google responded to the online pile-on:
“Our image search results are a reflection of content from across the web, including the frequency with which types of images appear and the way they’re described online. This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what image search results appear for a given query. These results don’t reflect Google’s own opinions or beliefs – as a company, we strongly value a diversity of perspectives, ideas and cultures.”
The blame is on “us”
It’s hard not to agree with Google here. Google search results are more a reflection of our society than of any wrongdoing on the part of the company. Search engines like Google don’t think for themselves; they respond to what we feed them. When Google Images spits out unflattering pictures of African children, it’s merely reflecting the conscious and subconscious prejudices of those who use it.
So, just to be clear: this is not on Google, it’s on us.