75 Books - Algorithms of Oppression by Safiya Umoja Noble

For his 75 Books interview, Jon Joey Telebrico chose to talk about “Algorithms of Oppression” by Safiya Umoja Noble. Noble argues that the common belief that search engines like Google are objective and neutral is misguided. Instead, Google’s search results are oriented toward maximizing its profits as an advertising business, and the company is incentivized to surface misinformation that is especially harmful to groups that are already stereotyped and marginalized.

Noble explains the process by which statistics-based algorithms can turn into identity-biased and socially harmful weapons, and offers case studies highlighting the immediate damage done by inequitable algorithms. She argues that fixing the immorality and inefficiency of our current search system requires government intervention: stronger regulations, anti-monopoly action, and potentially a new public search engine independent of market incentives.

Algorithms are created by people, and, while they hide behind a facade of statistics and other seemingly objective metrics, they can easily reflect the biases of their creators and the market incentives of their parent company. People often defend search algorithms by claiming that statistics- and math-based software is factual and untainted by bias or external incentives. In reality, Noble explains, even algorithms built with the best intentions come from fallible humans, who unintentionally encode their biases and social misunderstandings into what they create. These inner workings are also invisible to most of the public, and therefore difficult to critique, leading many to simply take them on faith.

Noble points to a recent example of this in the controversy over COMPAS, a courtroom risk-assessment program developed by the company Northpointe. The software was designed to help judges set sentence lengths by predicting a defendant’s likelihood of reoffending. While it never directly asked defendants about race, it did ask about factors closely correlated with race and with the socioeconomic status of a defendant’s home neighborhood. For example, it asked defendants about their social environment: whether friends or family had been victims of crime, and whether there was gang-related activity nearby. The result was that the software predicted post-release criminal activity poorly and showed an obvious racial bias in the sentences it informed: Black defendants were over-incarcerated while white defendants were under-incarcerated.
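To make that mechanism concrete, here is a small hypothetical sketch in Python (not Northpointe’s actual model or data; every feature name and number is invented for illustration) of how a risk score that never sees race can still become racially skewed: the proxy questions track segregated social conditions, and the training labels (re-arrests) are measured unevenly across groups.

```python
# Toy illustration of proxy bias: a "race-blind" risk model trained on
# proxy features and unevenly measured labels still scores groups apart.
# All data and numbers here are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Protected attribute (0 or 1). The model never sees this column.
group = rng.integers(0, 2, n)

# Proxy questions: "were friends/family victims of crime?" and
# "is there gang activity nearby?" -- shifted higher for group 1
# because of segregated social conditions, not individual behavior.
victim_exposure = rng.normal(loc=1.0 * group, scale=1.0)
gang_proximity = rng.normal(loc=1.0 * group, scale=1.0)

# The true reoffense rate is identical (30%) in both groups.
reoffend = rng.random(n) < 0.30

# But the training label is *re-arrest*, and heavier policing of
# group 1's neighborhoods means their reoffenses are caught more often.
p_caught = 0.45 + 0.30 * group
arrested = reoffend & (rng.random(n) < p_caught)

# Fit a plain logistic regression on the proxies alone.
X = np.column_stack([np.ones(n), victim_exposure, gang_proximity])
w = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.5 * (X.T @ (p - arrested)) / n

scores = 1.0 / (1.0 + np.exp(-X @ w))
flagged = scores > np.median(scores)  # label the top half "high risk"

# Among people who truly would NOT reoffend, who gets flagged anyway?
for g in (0, 1):
    innocent = (group == g) & ~reoffend
    print(f"group {g}: false positive rate = {flagged[innocent].mean():.2f}")
```

Run on these invented numbers, the model flags far more non-reoffending members of group 1 as high risk, even though the true reoffense rate is identical in both groups and race was never an input; the proxy features smuggle the group information in.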

The COMPAS scandal effectively highlights that, even with good intentions, algorithms are easily imbued with their creators’ biases, but it also shows how strongly algorithms influence people’s lives. The modern information network of the internet is a public good with an immense amount of influence over society’s perception of different groups and understanding of the world. It has become our primary avenue for gaining knowledge, and it has a responsibility to properly inform the public. But search over that network is controlled by a company whose goal is to maximize advertising profits, not to provide the best information. Noble says that this split, between the public’s expectation of quality information and Google’s incentive to make more money, is at the center of the marginalizing nature of its search algorithm. The results are not necessarily based on accurate popularity; instead, they allow our most demeaning beliefs to rise to the top. The circulation and normalization of these views affirms users and quietly radicalizes their implicit biases, keeping them on Google.

Noble argues that because users treat the internet as a public good, there is a moral responsibility to keep the information network fair and equitable. This position is controversial, because many people want to defend what they see as the integrity of the internet: they believe it is essential that search results accurately reflect what is most popular and most clicked on, even if that means displaying content that stereotypes marginalized groups.

CMC professor Rima Basu wrote an essay engaging with this exact question. She explains that there are situations where we might feel we have to sacrifice accuracy to stay fair and moral. The argument is relevant in many cases of racial profiling: someone might argue that forming a belief based on a stereotype is statistically supported, and thus the most epistemically rational choice. But this stereotyping is dehumanizing and reduces a person to a statistic. So how do we reconcile what we feel is fair with what seems most accurate?

This same issue is central to debates over search engine regulation. People argue that the more we regulate search engines, the more epistemic accuracy we lose. Basu suggests that this doesn’t need to be a dilemma at all. If we reorient our understanding of moral stakes, both the moral and the accuracy considerations can be taken into account. This is moral encroachment: the stakes of holding a belief help determine how much evidence we need before accepting it. Situations where we must choose what to believe can be high stakes or low stakes, depending on how bad it would be if we were wrong.

For example, Basu asks us to consider the following scenario. It’s a Friday, and you just received your paycheck. You have an automatic payment due on Monday, and you don’t know whether the bank is open Saturday. You ask your partner, and they say they are pretty sure the bank is open Saturday, but not certain. If you already have enough money in your account to cover Monday’s payment, the choice is simple: you can risk going on Saturday and clear up your Friday afternoon. If you instead need the money from this specific check for Monday’s payment, the stakes are much higher; you should probably not risk it and just go today.

Even though the evidence about whether the bank will be open is the same either way, our decision is not. The change in the repercussions of being wrong affects how much evidence we decide is enough. Noble argues that this is the same framework we should apply when evaluating search engines. Allowing content that further institutionalizes oppression and stereotyping carries enormous moral risk, and, as a result, companies like Google have an obligation to regulate their content.
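The structure of the bank case can be made explicit with a toy expected-cost calculation. Strictly speaking, Basu’s moral encroachment concerns belief rather than action, but the decision-theoretic analogy makes the stakes-sensitivity vivid; all numbers here are invented:

```python
# Toy expected-cost version of the bank case: identical evidence,
# different stakes, different rational choice. Numbers are invented.
P_OPEN_SATURDAY = 0.8    # "pretty sure, but not certain"
COST_FRIDAY_TRIP = 1.0   # mild inconvenience of going today

def expected_cost_of_waiting(cost_if_bank_closed: float) -> float:
    # If we wait and the bank turns out to be closed, we pay the penalty.
    return (1 - P_OPEN_SATURDAY) * cost_if_bank_closed

cases = {
    "low stakes (account already covers Monday)": 2.0,
    "high stakes (missed payment, fees, overdraft)": 50.0,
}
for label, penalty in cases.items():
    waiting = expected_cost_of_waiting(penalty)
    choice = "wait until Saturday" if waiting < COST_FRIDAY_TRIP else "go today"
    print(f"{label}: expected cost of waiting = {waiting:.1f} -> {choice}")
```

The 0.8 credence never changes; only the penalty for being wrong does, and that alone flips the rational choice. Noble’s point, on this framework, is that a search engine circulating stereotyping content is always the high-stakes case.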

But Google is a company with profit incentives, so it will not regulate its content if doing so hurts its profitability. This is why, Noble argues, we need government intervention. Governments need to take a more active role in moderating the internet and treat it as a public good that most people actively rely on for their beliefs and information. At the very least, Noble says, we need stronger regulations and anti-monopoly action to break up Google’s information monopoly; with more search engine competition, no single engine could dominate the entire market. Ultimately, though, Noble says we need a public search engine that separates the information network from market incentives and offers advanced search controls to keep inequitable content from rising to the top as often.


By Nic Burtson ‘24