Patricia Silva explores the impact of Google’s Search algorithm on BIPOC and queer cultures and highlights the iconoclastic work of the Feminist.AI collective, a community of academics, artists, and designers who seek to empower people with ethical ways to store, use, and search information.
According to its 2019 fourth-quarter earnings report, Google nets US$15 billion annually, and the foundation of its revenue is ad sales from Search. As the internet began to expand in the early 1990s, the need to search its uncatalogued environment became critical to digital interconnectedness. Two approaches to the logic of searching the internet emerged: American investor Bill Gross promoted search results as sites to be auctioned to the highest bidder, while Larry Page and Sergey Brin vehemently opposed advertising and developed an algorithm. These two search logics (algorithmic model or advertising model) informed how the internet’s most powerful tool, Search, would become a space where valuation, attention, and capitalization are in constant contestation with one another. The success of Google’s algorithm is due to its ability to constantly adapt and produce relevance that is topical, timely, geographic, and hyper-personalized (Rosenberg and Schmidt). What the algorithm delivers are results that are temporarily popular or already inscribed as dominant mores.
To interrogate search engine results as a mode of producing/reproducing social meaning is to challenge the foundations of what is dominant. LGBTQIA+ subcultures, as an example, have a history of being blocked or banned by search engines because of patterns of use that train algorithms to associate a particular orientation (lesbian, bisexual) with pornography. Queer subcultures have struggled against definitions that don’t come from within our communities, but are projected onto them/us. Without a strategy, online space isn’t any different. The third week of the Critical Code Studies Working Group introduced us to a Feminist Search project investigating the power dynamics of online searching and how technology replicates oppression.
Examining the Visual Languages of Searching
Dr. Safiya Umoja Noble’s Algorithms of Oppression: How Search Engines Reinforce Racism is a landmark work on the relationship between algorithms, bias, and how patterns of use amplify technology’s role in maintaining dominant hierarchies, which in turn depend on images to affirm control. The visual language of online searching has long interested me, an interest other artists have shared. At a diasporas conference in 2007, I presented a preview of my project examining online toxic tourism. Most poignantly, I remember a 2013 photo series by Qiana Mestrich titled Namesake. Based on Google Images search results that overwhelmingly featured mugshots of predominantly Black and Brown women named Qiana, Mestrich’s project showed how algorithms reflect and perpetuate society’s tendencies to criminalize women of color. Around the same time as Mestrich’s project,
Bitch Magazine published Noble’s essay “Missed Connections: What Search Engines Say About Women”: “Search engine results don’t only mask the unequal access to social, political, and economic life in the United States as broken down by race, gender, and sexuality – they also maintain it.” This is a very particular reading of our everyday technologies, but it’s not a literacy that traditionally accompanies the development of code. Coding practices still operate under an uncontested guise of neutrality. Scholars like Noble and artists like Mestrich are bringing their critical, social, and visual literacies to the act of searching.
Code Critique: Feminist Search
Noble’s book is a touchstone for the Feminist.AI collective, the American group developing Feminist Search in collaboration with Noble. What makes a search engine feminist? The collective “strives to redefine AI from its current development in private companies and academic settings to community and socially-driven futures” (Meinders). This approach not only disrupts the pyramid-style power structure of single authorship, it also critically confronts the discriminatory qualities and inherent inequities present in the temporal cloak of code.
Feminist.AI’s collaborative visual search engine is trained on images that individuals donate and assign a classification, such as “safe” or “danger,” which are the starting points for conditioning the engine’s algorithms. While Google harvests our online behavior, Feminist.AI’s Feminist Search only accepts consensually donated data. As Sarah Ciston commented on the main thread, most online “users are unaware they are inadvertently ‘tagging’ their data as they participate in these [corporate] digital systems.” In contrast, the Feminist Search team centers users as owners of their information, as contributors of context and meaning, and as co-developers of a searching ecosystem.

Donating data to Feminist Search is easy: in about fifteen seconds I uploaded one of my favorite illustrations of a cat exploring the Milky Way on a broom, and tagged it “safe.” This simple act makes one aware that subjectivity is inextricable from the creation of code. In Week 3’s code critique post, Christine Meinders wrote: “The interesting problems in data science and machine learning aren't in churning out mathematically good predictions. The outcome of an algorithm is only as good as the data given to it and how the person(s) constructing it use that data in the creation of a model.”
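The consent-first donation workflow described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not Feminist.AI’s actual code: the `Donation` schema, field names, and `donate` helper are my own assumptions about what such a record might hold. The point is that the donor, the self-assigned tag, and explicit consent are captured together at the moment of contribution, rather than inferred from behavior.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Donation:
    """One consensually donated training example (hypothetical schema)."""
    image_path: str
    tag: str        # donor-assigned classification, e.g. "safe" or "danger"
    donor: str      # donors are named contributors, not harvested "users"
    consented: bool = True  # consent is recorded explicitly, never assumed
    donated_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def donate(dataset: list, image_path: str, tag: str, donor: str) -> Donation:
    """Add an image to the training set as an explicit, attributed act."""
    record = Donation(image_path=image_path, tag=tag, donor=donor)
    dataset.append(record)
    return record

dataset: list = []
donate(dataset, "cat_on_broom_milky_way.png", "safe", "patricia")
print(dataset[0].tag)  # -> safe
```

In a corporate pipeline the equivalent record is assembled invisibly from clicks and dwell time; here every field is filled in by the person the data describes.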
Feminist Search introduces intersectional methodologies to artificial intelligence and, in doing so, re-animates images with new meanings. As an engine that prioritizes visualization, Feminist Search will eventually become a deliberate assembly of intentional, user-generated meanings that can be searched, found, and analyzed through the parameters of a far more engaged consciousness than the current defaults of online space. Images will acquire multiple and inevitably contradictory classifications, which challenges the common assumptions that photographs are self-evident, that images have one author, and that meanings fall within strict categories such as “homeless” or “refugee.” By assigning a multitude of classifications to images as the basis for a visual search engine, the act of searching is expanded. We are made aware that the image we are looking at may not represent what we expected, and as users we are challenged to consider that representation is a living context. Because of my own background in photography, I’m quite taken with the visual potential to decenter oppressive frameworks, but Christine Meinders, on the code critique thread, shared a potential I didn’t anticipate: “it would be interesting to explore not only what a Feminist Search pattern looks like, but possibly sounds like and feels like.”
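The idea that one image accumulates multiple, contradictory classifications can be sketched as a multi-label index. Again, this is my own hedged illustration, not the project’s implementation: the `classify` and `search` helpers are hypothetical. What matters is the design choice that a search result carries every contributor’s reading of an image, so contradiction stays visible instead of being collapsed into a single dominant label.

```python
from collections import defaultdict

# Hypothetical multi-label index: each image accumulates classifications
# from many contributors, so contradictory readings coexist.
index = defaultdict(list)  # image -> list of (contributor, tag) pairs

def classify(image: str, contributor: str, tag: str) -> None:
    """Record one contributor's reading of an image alongside the others."""
    index[image].append((contributor, tag))

def search(query_tag: str) -> dict:
    """Return every image anyone tagged with query_tag, together with ALL
    of its classifications, so no single reading erases the rest."""
    return {img: tags for img, tags in index.items()
            if any(t == query_tag for _, t in tags)}

classify("crowd.jpg", "contributor_a", "safe")
classify("crowd.jpg", "contributor_b", "danger")
results = search("safe")
# "crowd.jpg" surfaces with both readings attached
print(sorted(t for _, t in results["crowd.jpg"]))  # -> ['danger', 'safe']
```

A conventional engine would resolve the tie and return one label; here the disagreement itself is part of the result, which is what makes representation legible as a living context.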
The exchanges on the Feminist AI main thread touched not just on ethics and efficacy, but also on the speed at which human attention processes information. L Foster posed an inquiry about enabling “slower movement, sideways thinking (Puar), queer use (Ahmed), situated knowledges (Haraway), queer failure (Halberstam),” and Catherine Griffiths responded with the need to “slow computation down so that we can trace and test and communicate the possible paths it might take. [...] [T]hinking about tactics for slow computation can be a decentering strategy, considering that computation takes place at a beyond-human scale of perception.” The collective has identified a creative genealogy for their work: Ursula Damm’s generative video project Membrane; Rebecca Fiebrink’s Wekinator; Lauren McCarthy’s LAUREN; Anne Burdick’s Trina; Tactical Tech’s Gender and Tech resources project; Sarah Ciston’s ladymouth; Catherine Griffiths’ Visualizing Algorithms; and Caroline Sinders’ Feminist Data Collection, which details a path for building data collections and ontologies within a feminist framework. Feminist Search works to maintain the rich practices of feminist communities, including the algorithmic processes of weaving and knitting, and the legacies of Margaret Hamilton, who led the team that wrote the onboard software for the command and lunar modules used in the 1969 US Moon landing, and Augusta Ada Lovelace, who created the first complex program in England in the 1840s. Building upon these trajectories of resistance to social norms, Feminist Search is a grand leap toward a deeper engagement with the social inheritance of meaning and our individual agency to critically examine and undermine patterns of dispossession.
Ahmed, Sara. What’s the Use? Duke University Press, 2019.
Haraway, Donna. “Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective.” Feminist Studies, vol. 14, no. 3, 1988.
Halberstam, Jack. The Queer Art of Failure. Duke University Press, 2011.
Meinders, Christine. “Week 3: Feminist AI (Main Thread).” CCS Working Group 2020, http://wg20.criticalcodestudies.com/index.php?p=/discussion/87/week-3-feminist-ai-main-thread.
Noble, Safiya Umoja. “Missed Connections: What Search Engines Say About Women.” Bitch Magazine, Issue 54, Spring 2012.
Puar, Jasbir. Terrorist Assemblages: Homonationalism in Queer Times. Duke University Press, 2007.
Rosenberg, Jonathan, and Eric Schmidt. How Google Works. Grand Central Publishing, 2017.