Ningqi (Carina) Zhao's Passion Project - READING AND INTERVIEWING FOR THE 75 BOOKS PROJECT

For my passion project, I will read several books recommended by CMC community members through the Gould Center's 75 Books Project and interview the recommenders about the books. This winter break, I will travel back to China to reunite with my family. I need to quarantine in a hotel for two weeks, which will be a perfect time for me to read.

My winter passion project will be divided into two parts. For the first part, I will read six books selected from the 75 Books Project. I've watched all the book interviews posted on the website and chosen the following six: Night by Elie Wiesel, Tell the Wolves I'm Home by Carol Rifka Brunt, A Christmas Carol by Charles Dickens, Algorithms of Oppression by Safiya Noble, The Culture Map by Erin Meyer, and Maybe You Should Talk to Someone by Lori Gottlieb.

For the second part of the project, I will write book reviews for the selected titles and interview the recommenders. Each review will be approximately 400 to 600 words, in which I will share my takeaways and feelings after reading the book. Then, I will reach out to the book recommenders, members of the CMC community, to talk about the books. After interviewing them, I will convert our conversations into interview transcripts.


75 Books - The Picture of Dorian Gray by Oscar Wilde

For her interview, Emma Merk ‘24 talked about “The Picture of Dorian Gray” by Oscar Wilde, which tells the story of Dorian Gray’s descent into vice and corruption as he interacts with Basil and Lord Henry. As Dorian is drawn further and further into his pursuit of impulse and excess, the beautiful and pure portrait that Basil gifted him at the beginning of the book becomes contorted and disgusting, all while Dorian himself remains untouched. Dorian presents a pristine facade of who he wishes to be perceived as, while the painting transforms to represent who he actually is. The book shows us the ways in which we often hide our true selves to present a picture, or disguise, free from blame and venality. Dorian chooses to give in to impulsive pleasures while showing a vice-free self, which enables him to continue his descent into despondency.

These themes we see in Dorian are struggles we engage with daily. How we choose to present ourselves to others, and how that disguises the truth of our ubiquitous imperfection, is not an uncommon consideration, especially in college. We are in a small community where the constancy and familiarity make us feel like we are always being perceived. We all have those people we walk by every Tuesday, or awkwardly recognize in some absurdly long line for Collins, whose names and personalities we know nothing about. And they may feel the same way. Just as we look upon others, we know them to be looking upon us. This knowledge has a terrifying effect on us. We feel as if there is a sort of (as our friendly bud Sartre would describe it) “look” that is inflicted upon us by everyone who sees us. They look upon us and they surmise a being of us that is up to their choosing, separate from how we understand ourselves. It is alienating: our true understanding of self is pulled away from our being, and we are left simply as their best guess of who we are. We are the sum of their biases, their shortsighted assumptions, and what little they have perceived to give them these notions.

So in this inauthentic game, what do we do? Wilde suggests that we respond by reconstructing our external selves. We curate a facade to show others who we wish to be. Our clothes, our rhetoric, our social media, our physical appearance: all of these can be adjusted to alter the version of us that others view. I may dress differently or talk differently or even start working out (at least I would really like to) to take control of that little sliver of self that I show to others. This feels, to me, like the root of what we observe in Dorian. Dorian has severed his external self from his internal self through the painting. He leaves behind his sense of being, the part of us that makes us feel responsible for our authenticity, and is free to show whichever self he pleases. It is, in essence, our own attempt to curate an external self different from our internal self, amplified to an extreme.

What can we learn from Dorian Gray? On the surface, the book reads like a parable. It tells us of the immorality of giving in to our impulses and vices, and it shares a warning of the dangers we may encounter if we are deceptive about our true identities. Dorian, in hiding his authentic being from others, began to lose his grasp of it as well, and when he was finally forced to confront who he had become, he crumbled under the pressure and rage, acting out in violence. But I think that “The Picture of Dorian Gray” has more to offer than this upfront moral. More than communicating the dangers of separating our outward self from our authentic self, it gives an interpretation of how we conceive of “selves” in the first place. In showing how Dorian separates the sides of his self through the painting, Wilde shows that Dorian sees the self as having an essential, internal part and a presented, external part. Dorian views these two as completely distinct: the inner self has its sanctuary of preservation, while you are free to be as corrupt and viceful as you please with your external self. But, evidently, these two cannot be kept separate. Our sins of the external self are destined to seep in and taint anything internal.

This argument from Wilde has more strength than simply discouraging excessive self-curation on social media, personality, or whatever it is kids are doing these days. Wilde is giving us a reason why we should fundamentally change the way we conceive of our “selves.” Dorian’s story suggests that we likely ought not separate them at all, for any deceit in our external selves only stands to seep in and taint our entire self. In an almost Socratic sense, we can see this as an argument to pull the two closer together, to redefine being as non-compartmentalizable. We should be more authentic in our presentation of ourselves, not simply because of external standards where people tell us to “be ourselves” or something like that, but because it is more desirable even by our own standards of seeking pleasure and happiness. As we saw, we only alienate ourselves into madness with our facade.

By Nic Burtson ‘24

75 Books - Algorithms of Oppression by Safiya Umoja Noble

For his 75 Books interview, Jon Joey Telebrico chose to talk about “Algorithms of Oppression” by Safiya Umoja Noble. Noble argues that our common belief that search engines like Google are objective and neutral is misguided. Instead, Google’s search results are oriented toward maximizing its profits as an advertising business, and the company is incentivized to surface misinformation that is especially harmful to groups that are often stereotyped and marginalized.

Noble explains the process through which statistics-based algorithms can turn into identity-biased and socially harmful weapons, and gives case studies highlighting the immediate damage being done by inequitable algorithms. She claims that to fix the immorality and inefficiency of our current search system, we need more government intervention through regulation and anti-monopoly action, and potentially a new public search engine independent of market incentives.

Algorithms are created by people, and, while they hide under the facade of using statistics and other seemingly objective metrics, they can easily reflect the biases of their creators and the market incentives of their parent company. People often defend search algorithms by claiming that statistics- and math-based software is factual and untainted by bias or external incentives. In reality, Noble explains, even when an algorithm’s creators have the best intentions, they are fallible humans who unintentionally code their biases and social misunderstandings into the algorithms they create. These inner workings are also invisible to the majority of the public, and therefore difficult to critique, leading many to simply take them on faith.

Noble points to a recent example of this: the controversy over the courtroom risk-assessment software made by Northpointe. The program was designed to help judges pick sentence lengths by predicting defendants’ alleged potential for future crime. While the software never directly asked defendants questions about race, it did ask about factors that have traditionally been closely correlated with race and with the socioeconomic status of a defendant’s home region. For example, it asked defendants about their social environment: whether friends or family had been victims of crime, and whether there was gang-related activity nearby. As a result, the software predicted future criminal activity after release from prison poorly, and it showed an obvious racial bias in its recommendations: Black defendants were over-incarcerated while white defendants were under-incarcerated.
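To make the proxy-variable point concrete, here is a minimal sketch, written in Python with entirely made-up numbers and group labels; it is not Northpointe’s actual model or data. It shows how a “race-blind” score that only looks at a crime-exposure proxy can still produce sharply different outcomes across groups, simply because the proxy is distributed unevenly.

```python
import random

random.seed(0)

def make_defendant(group):
    # Hypothetical assumption for illustration: group "A" is more likely,
    # for historical and socioeconomic reasons, to report high exposure to
    # crime in their environment.
    exposure = random.random() < (0.7 if group == "A" else 0.3)
    return {"group": group, "high_exposure": exposure}

def risk_score(defendant):
    # The toy model is "race-blind": it only uses the proxy feature.
    return 0.8 if defendant["high_exposure"] else 0.2

population = [make_defendant("A") for _ in range(1000)] + \
             [make_defendant("B") for _ in range(1000)]

for group in ("A", "B"):
    scores = [risk_score(d) for d in population if d["group"] == group]
    flagged = sum(s > 0.5 for s in scores) / len(scores)
    print(f"group {group}: {flagged:.0%} flagged high-risk")

# Group A is flagged far more often than group B, even though race is never
# an input -- the proxy carries the disparity straight into the output.
```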

The scandal with Northpointe effectively highlights that, even with good intentions, algorithms are easily imbued with their creators’ biases, but it also shows how strongly algorithms influence people’s lives. The modern information network of the internet is a public good that has an immense amount of influence over society’s perception of different groups and understanding of the world. It has become our primary avenue for gaining knowledge, and it has a responsibility to properly inform the public. But access to that network is largely mediated by a company whose goal is to maximize its advertising profits, not to provide the best information. Noble says that this split between the public’s expectation of quality information and Google’s incentive to make more money is at the center of the marginalizing nature of its search algorithm. The results are not necessarily based on accurate popularity; instead, they allow our most demeaning beliefs to rise to the top. The circulation and normalization of these views affirm users and quietly radicalize their implicit biases, keeping them on Google.
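As a rough illustration of the incentive Noble describes, consider the toy ranking rule below. It is a deliberate caricature, not Google’s actual algorithm, and the result titles, relevance numbers, and click rates are invented. The point is only that once predicted clicks (and therefore ad revenue) carry enough weight in the score, a sensational, stereotyping result can outrank a better-sourced one.

```python
# Two hypothetical search results for the same query.
results = [
    {"title": "Well-sourced overview",       "relevance": 0.9, "click_rate": 0.05},
    {"title": "Sensational stereotype page", "relevance": 0.4, "click_rate": 0.30},
]

def rank_score(result, engagement_weight):
    # Invented scoring rule: blend topical relevance with how often users
    # click (a stand-in for the ad revenue a result drives).
    return result["relevance"] + engagement_weight * result["click_rate"]

for weight in (0.0, 5.0):
    ordered = sorted(results, key=lambda r: rank_score(r, weight), reverse=True)
    print(f"engagement weight {weight}: top result = {ordered[0]['title']}")

# With no engagement weight, the accurate page wins; with a heavy weight,
# the sensational page rises to the top.
```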

Noble argues that because users view the internet as a public good, there is a moral responsibility to keep the information network fair and equitable. This position is controversial because many people want to defend the integrity of the internet: they believe it is central to the internet that search results accurately portray what is most popular and most clicked on, even if that surfaces content that stereotypes marginalized groups.

CMC professor Rima Basu wrote an essay engaging with this exact question. She explains that there are situations where we might feel we have to sacrifice accuracy if we want to stay fair and moral. This argument is relevant in many cases of racial profiling. Someone might argue that forming a belief based on a stereotype is statistically supported, and thus is the most epistemically rational choice. But this stereotyping is dehumanizing and reduces a person to a statistic. So how do we reconcile the tension between what we feel is fair and what is most accurate?

This same issue is central to debates over search engine regulation. People argue that if we regulate search engines more, we lose epistemic accuracy. Basu suggests that this doesn’t need to be a dilemma at all. Instead, if we reorient our understanding of moral stakes, both the moral and the accuracy considerations are taken into account. This is moral encroachment. Situations where we need to choose between beliefs can be either high stakes or low stakes, depending on how bad it would be if we were wrong. For example, Basu asks us to consider the following scenario. It’s a Friday, and you just received your paycheck. You have an automatic payment due on Monday, and you don’t know whether the bank is open on Saturday. You ask your partner, and they say they are pretty sure the bank is open Saturday, but they aren’t certain. If you already have enough money in your account to cover Monday’s payment, the choice is simple: you can risk going on Saturday and clear up your Friday afternoon. If you instead need the money from this specific check for Monday’s payment, the stakes are much higher; you should probably not risk it and just go today.

Even though the evidence about whether the bank will be open is the same either way, our decision is not. The change in the repercussions of being wrong affects how much evidence we decide is enough. Noble argues that this is the same framework we should use when evaluating search engines. Allowing content that further institutionalizes oppression and stereotyping carries huge moral risk, and, as a result, there is an obligation for companies like Google to regulate their content.
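One way to see the structure of the bank case is as a stakes-sensitive decision rule. The sketch below is my own rough formalization for illustration, not Basu’s formal account: the probability (the partner’s “pretty sure”) stays fixed, while the cost of being wrong changes, and with it whether the same evidence is enough to act on.

```python
def should_risk_waiting(p_open_saturday, cost_if_wrong, cost_of_going_today=1.0):
    # Act on the belief "the bank is open Saturday" only if the expected
    # cost of being wrong is smaller than the small hassle of going today.
    expected_cost_of_waiting = (1 - p_open_saturday) * cost_if_wrong
    return expected_cost_of_waiting < cost_of_going_today

p = 0.8  # the partner is "pretty sure" the bank is open on Saturday

# Low stakes: the account already covers Monday's payment.
print(should_risk_waiting(p, cost_if_wrong=2.0))   # True  -> wait until Saturday

# High stakes: missing the deposit means a failed automatic payment.
print(should_risk_waiting(p, cost_if_wrong=50.0))  # False -> deposit today
```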

But Google is a company with profit incentives, so it will not regulate its content if doing so hurts its profitability. This is why, Noble argues, we need government intervention. Governments need to take a more active role in moderating the internet and view it as a public good that the majority of people actively rely on for their beliefs and information. At the very least, Noble says, we need stronger regulations and to break up Google’s information monopoly; with more search engine competition, no single engine like Google would dominate the entire market. Ultimately, though, Noble says we need a public search engine that separates the information network from market incentives and offers a high level of advanced search specification to keep inequitable content from rising to the top as often.


By Nic Burtson ‘24