South Africans clashed with the powerful Google algorithm last week when it was revealed that a search for South African squatter camps throws up white squatters in a majority of results.
It is a gross distortion of the national data on overall racial demographics in informal settlements, and paints a false picture of South Africa's housing and poverty crisis — but it is the main way that ordinary searchers around the world will be informed of how race is lived in South Africa.
As such it is an act of misinformation, but who is responsible?
In its simplest form, an algorithm is a set of steps by which a computer accomplishes a task, but in the 21st century, amid what is called the fourth industrial revolution, algorithms increasingly make judgments about us and decisions for us.
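To make that definition concrete, here is a minimal sketch of an algorithm in exactly that sense: a fixed set of steps a computer follows to accomplish a task. This toy "search ranking" orders pages by a crude relevance score; all page titles and the scoring rule are invented for illustration and bear no relation to how Google actually ranks results.

```python
def rank_results(pages, query):
    """Score each page by how many query words its title contains,
    then return the pages sorted from most to least relevant."""
    query_words = set(query.lower().split())

    def score(page):
        # Step 1: break the page title into words.
        title_words = set(page.lower().split())
        # Step 2: count how many query words the title shares.
        return len(query_words & title_words)

    # Step 3: order pages by that count, highest first.
    return sorted(pages, key=score, reverse=True)

pages = [
    "housing crisis in South Africa",
    "recipes for winter soup",
    "informal settlements and squatter camps in South Africa",
]
print(rank_results(pages, "squatter camps South Africa"))
```

Even this trivial ranker embodies choices: which words count, how ties break, what "relevant" means. Real search algorithms bake in thousands of such choices, which is why their outputs are never simply neutral.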
Google runs by far the most powerful search engine in the world, and its name has become a verb. If you look for something online, chances are you will not "search" for it, you will "Google it".
Local commentators have largely let Google off the hook on the distortions of its squatter camp search, and argued that the algorithm is neutral tech that only throws up "what is searched", but is it that simple?
Not quite. The director of research at the Tow Center For Digital Journalism at Columbia University, Jonathan Albright, calls the algorithm a "black box" that needs breaking.
It is important to understand how algorithms are shaped, and what role relative power plays in shaping them.
Google is notoriously secretive about how the black box works, but Albright has begun to excavate. The ability to disrupt searches and sow disinformation is a form of media manipulation, said Albright, who was speaking at the Global Editors Network summit in Lisbon earlier this month: "It shows how search results distort and expose people to disinformation."
In his research, Albright typed in the searches "Black culture is..." and "How to stop fake news..." Both turned up racism and misinformation in the top results, each as troubling as the search for South African squatter camps.
"Transparency is only one part of understanding the algorithm. We have to understand why it matters and what it says about power dynamics." In the South African squatter camp case, it displays the power of a lobby that holds up white people living in squatter camps as evidence of a new post-apartheid impoverishment.
"That thing about Google images representing 'SA squatter camps' as a poor white nation experience. It's really dope and telling about the nature of the internet platform..."
— Mpumelelo Mfula 🌱 (@frypanmfula), June 15, 2018
Albright said it is also important to understand what personalisation — by which internet companies try to sculpt your online experience according to your previous online history — does to search results.
His research revealed that two people sitting next to each other are likely to get different results for the same query.
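The personalisation effect Albright describes can be sketched in a few lines. This hypothetical model nudges a page's relevance score with words from each user's browsing history, so the same query returns different orderings for different people. The pages, histories, and scoring rule are all invented for illustration, not a description of Google's actual system.

```python
def personalised_rank(pages, query, history):
    """Rank pages for one user: query relevance plus a bias
    toward pages resembling the user's past browsing."""
    query_words = set(query.lower().split())
    history_words = set(" ".join(history).lower().split())

    def score(page):
        words = set(page.lower().split())
        base = len(words & query_words)      # relevance to the query
        boost = len(words & history_words)   # bias from past browsing
        return base + boost

    return sorted(pages, key=score, reverse=True)

pages = ["squatter camps poverty statistics",
         "squatter camps photo gallery"]
query = "squatter camps"

# Two users side by side, same query, different histories:
print(personalised_rank(pages, query, ["poverty statistics report"]))
print(personalised_rank(pages, query, ["photo gallery art"]))
```

Both users type the identical query, yet each sees the page closest to their own history ranked first, which is the core of Albright's point about two neighbouring searchers getting different results.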
Albright believes journalists have an important role to play in pushing for algorithmic accountability — a new front in digital rights.