Rosner, Daniela. “Bias Cuts and Data Dumps.” M/C Journal 26.6 (26 Nov. 2023). <http://dx.doi.org/10.5204/mcj.2938>.
Introduction

“Patterns are everywhere”, design researcher Anuradha Reddy told her virtual audience at the 2023 speaker series hosted by Brilliant Labs, a Canadian non-profit focussed on experiential digital learning and coding (Brilliant Labs / Labos Créatifs). Like other technology fora, this public-facing series offered designers an opportunity to highlight the accessibility of code. But unlike many such fora, Reddy’s code was worn on the body. Sitting at the now-standard webinar lectern, Reddy shared a flurry of images and contexts as she introduced a garment she called b00b, a bra that she created in 2021 to probe the encoding of more than aesthetic possibility. Her presentation included knotted motifs of Andean Quipus, symbolic arcs of Chinese Pan Chang knots, and geometric transformations of African American cornrow hairstyles (Eglash and Bennett; Brilliant Labs / Labos Créatifs). She followed the patterned imagery with questions of uncertainty that are often central for design researchers like her. Facing a possible swipe, tap, or other engagement, a technologist cannot fully determine what a user does. But they can “nudge”, a term popularised by behavioural economists Richard H. Thaler and Cass R. Sunstein in 2008 and later propagated within technoscientific discourses on risk (see Duffy and Thorson; Rossi and Yudell; Thaler and Sunstein). Adjacent bodies of scholarship frame the related concept of trust as a form of compliance (Adam et al.; Gass and Seiter). The more trustworthy an interface, the more likely a user is to comply. Rooted in social-psychological precepts, this line of scholarship frames trust less as a condition than a perception. When a user trusts an indicator light, for example, an app is more likely to see increased acceptance and engagement. Reddy approaches trust from and with b00b, an emphatically intimate (soft, pliable, textile) artifact.
“How do we use these … perspectives to deal with uncertainty and things we do not know yet in the future?”, Reddy asks her Brilliant Labs audience (Brilliant Labs / Labos Créatifs). To make this argument, I examine Reddy’s b00b in conversation with a legacy feminist textile performance that brings questions of embodiment (and embodied trust) to an ostensibly disembodied technocratic scene. b00b is a decorative bra that emulates two-factor authentication, or what Reddy calls “b00b factor authentication”. The bra uses its two cups to verify a user’s access to a Website describing the project. With this interaction, the bra is self-referential—asking users to unlock a link that brings them back to someone’s chest. In practice, b00b asks users to scan one bra cup and then scan the companion cup to retrieve a second passcode. Rather than sending users an initial passcode that triggers a second passcode delivered by text message, the engagement requires bodily proximity. The bra cups take the place of electronic media (such as the text message) so that a close encounter with the bra enlivens digital trust. Under these circumstances, a trusted user becomes a risk-taker—gaining access while transgressing personal boundaries. In the sections that follow, I thread conversations on digital and algorithmic trustworthiness with critiques of trust and compliance that pervade Reddy’s 2021 handmade experiment. To date, technology analysts tend to treat trust as a perception: feelings of confidence in a person or thing (Glikson and Woolley). As Natasha Schüll notes, a user might trust a slot machine but might miss its implications for further (and potentially excessive) gambling.
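Reddy has not published her implementation, but the interaction described above can be sketched as a small state machine: access is granted only after two separate, bodily proximate scans, one per cup, standing in for the two channels of conventional two-factor authentication. Everything in this sketch (the class and tag names, the reset behaviour, the URL) is an illustrative assumption rather than Reddy’s design:

```python
# A loose, hypothetical sketch of the "b00b factor authentication" flow:
# the protected link is released only after two physically separate scans
# (one per bra cup). All identifiers here are illustrative assumptions.

class TwoCupAuth:
    def __init__(self, first_tag: str, second_tag: str, secret_url: str):
        self.first_tag = first_tag      # NFC tag id behind one cup's mirror
        self.second_tag = second_tag    # NFC tag id behind the other cup's mirror
        self.secret_url = secret_url    # password-protected project page
        self.first_scanned = False      # tracks whether the first factor succeeded

    def scan(self, tag_id: str):
        """Return the protected URL only after both cups are scanned in order."""
        if tag_id == self.first_tag:
            self.first_scanned = True   # first factor: proximity to one cup
            return None
        if tag_id == self.second_tag and self.first_scanned:
            return self.secret_url      # second factor: proximity to the other cup
        self.first_scanned = False      # any other scan resets the exchange
        return None
```

In conventional two-factor authentication the second factor arrives over a separate channel (for instance, a text message); here both “channels” collapse into physical proximity to the garment, which is the inversion the essay goes on to describe.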
Additionally, media scholars such as Evgeny Morozov have since mapped this addiction principle within social media development, pointing to a familiar science of incentive structures, gamification dashboards, and behaviour-change techniques, each designed to raise user engagement and keep people in apps longer. Thinking with Reddy’s work, I argue that trust can reveal an embodied desire, something momentarily felt and differentially shared (see also Gregg; Sharma; Irani). Reddy frames the weft of woven material as code, the purl and knit stitches of knitting as binary, and the knots of rope as algorithms. She urges her audience to see fabric as a means of challenging common assumptions about technology. With needles and thread, she proffers algorithmic trust as a relational ethics.

In Technology We Trust

From a design perspective, trust grows from the strategic balancing of risk and uncertainty (Cheshire). Users who find a digital feature reliable or trustworthy are more likely to grow their engagement and convince others to join in (Hancock et al.). In a recent analysis of the overlapping dynamics of algorithmic trust and bias, communication and information scholars Jeff Hancock, Mor Naaman, and Karen Levy (95) argue that machine learning tools such as the Chrome extension Just Not Sorry often replicate bias within training data. The extension disproportionately alerts femme users when they use qualifying words like “sorry” and “I think”. In other contexts, Hancock and colleagues suggest, an AI-aided tool may help mitigate interpersonal biases, since if it “imparts signals of trustworthiness between peer-based social exchange partners, these countervailing cues may neutralise stereotypes that would otherwise impede the transaction” (ibid.). Here, the signal of trustworthiness holds the promise of accountability.
But because the signals focus on cognition (manipulating an individual’s perceptions), what they refer to and how they may alleviate harms caused by entrenched cultural bias remain less clear. Grounded in social-psychological tenets, technology analysts codify trust as the relationship between two primary concepts: risk and uncertainty. As information scholar Coye Cheshire (50) explains, “trust is not simply the absence of risk and uncertainty. More accurately, trust is a complex human response to situations that are rife with risk and uncertainty”. Through a range of controlled methods including observations, self-reports, survey questions, and the experimental conditions of a lab study, researchers measure the trustworthiness of user interface features as assessments of risk and uncertainty that explain differing motivations for use and disengagement. For example, design researchers Nick Merrill and Coye Cheshire’s study of heart rate monitors finds that listening to an acquaintance’s normal heart rate can lead to negative trust-related assessments in challenging contexts, such as waiting to meet the acquaintance about a legal dispute. Parallel work by Hancock and colleagues uses self-reports and large-scale experiments on platforms like Facebook to map the significance of AI-enabled curation features like news feeds (Hancock et al.). As a psychological state, trustworthiness tends to indicate a behavioural metric that can be numerically encoded and individually addressed. By measuring trust-infused dimensions of user activity, analysts seek to systematically identify new ways of scaffolding trust-building behaviour by manipulating perception (Hancock, Naaman, and Levy), ultimately convincing a user to comply. A core goal is to maximise participation. The US government applied these principles to mass data collection and dissemination efforts such as the national census and the COVID-19 response (Halpern).
But a secondary effect grows from the political-economic dimensions of user experience. Through compliance, users become easier to place, measure, count, and amend—a process Michelle Murphy names the economisation of life. When people’s certainty in interpersonal relationships grows, “the source of uncertainty then shifts to the assurance system, thereby making trustworthiness and reliability of the institution or organisation the salient relationship” (Cheshire 54). For instance, we may trust people in our text messages because we meet them face to face and put their numbers in our phones. But once we trust them, this assurance moves to our social media service or cellular phone provider. The service that manages our contacts also preserves the integrity of our contacts, such as when a messaging platform like WhatsApp automatically updates a cell phone number without our knowledge or explicit consent. Conversely, feelings of assurance in a digital interface feature may dwindle with decreased feelings of assurance in the platform itself. Until November 2022, users may have trusted someone with a blue checkmark on Twitter more than someone without one, even if they did not trust them at an interpersonal level. But with a chaotic acquisition that, according to a report in The Verge (Weatherbed), led to shifting check mark meanings and colours, this assurance grew more complicated. Murphy (24) might call these quantitative practices enriched with affect the “phantasmagrams” of rationalised assurance. Like a check mark that may or may not index a particular measure of confidence, excitement, or worry, these shifting dynamics reveal the “trust and belief that animates numbers” (52). A less considered outcome of this framing is how individuated expressions of distrust (situations that foster psychological and physiological concern, skepticism, or fear for a single person) overshadow its complement: non-unconditional expressions of care.
How might a user interface foster networks of connection for self and community? As Anna Lauren Hoffmann suggests, efforts to thwart algorithmic discrimination undergird this conundrum—“mirroring some of antidiscrimination discourse’s most problematic tendencies” (901). The particular value placed on trust often precedes quick-fix techniques such as multi-factor authentication and cryptography that reduce trust to a neutral transaction (see Ashoori and Weisz). In this discussion, design researchers have only begun to conceive trust (and distrust) as a deeply embodied process.

Looks, Cuts, and Scans

Reddy’s b00b invites audiences to explore embodied positioning. Sitting on a static mannequin, the garment invites audience members to engage the handiwork laid atop its breasts. In video documentation (Reddy), Reddy holds up a phone to a mannequin wearing the bra. She touches the phone to the mannequin’s right nipple, and the phone screen opens a Web browser with a password-protected field. As Reddy moves the phone to the mannequin’s left nipple, the phone shares the password ‘banjara’, a reference to the community from which the embroidery techniques derive. The password opens a Website full of descriptive text and imagery detailing this material reference. In this interaction, b00b joins a movement of artistic work that uses textile artifacts to frame boundaries of self and other as porous and shifting. Consider Nam June Paik’s 1969 TV Bra for Living Sculpture. Across the 1970s, Charlotte Moorman performed the work by playing cello while wearing a transparent brassiere with two miniature television screens mounted on her chest (Paik; Rothfuss). As Moorman played her cello, wires connecting the cello to the two television sets sent sonic signals that manipulated the video imagery.
Moorman’s instrumentation controlled the visuals displayed on the screens, inviting audience members to come closer to the electronic garment and her body—or, as Joan Rothfuss explains, “never mind that the bra actually encouraged prurience by compelling spectators to stare at [Moorman’s] breasts” (243). TV Bra invited its audience to breach conventional limits of closeness and contact much like users of b00b. Yoko Ono’s celebrated Cut Piece has sparked a similar prurience. During the work, Ono dresses in some of her finest clothes and invites audience members to walk on stage and shear away pieces of fabric. Notably documented in the Albert and David Maysles film of Ono’s 1965 Carnegie Hall performance, the audience leaves Ono’s body nearly fully exposed at the performance’s end, save for her arms holding remaining pieces of fabric. With scissors in hand, the performance threatens imminent danger—inspiring snickers, pause, and discomforting ease among audience members eager to participate. Cut Piece encourages the audience to disregard consent and expose a certain breach of trust, a practice mirrored by b00b. In this process of cutting cloth, often on the bias (or on a slanted angle; see Benabdallah et al.; Rosner), feminist performance works have long prompted audiences to trouble the intimate relationship between themselves and the performer. As Vivian Huang has deftly argued, Ono’s shredded fabrics are more than neutral inconveniences; they also hint at whatever racialised and gendered feelings of trust might or might not exist between Ono and her audience. “If Orientalist conflations of the East with femininity have in turn sexualized Asian women as simultaneously hypersexual and submissive”, Huang contends, “then how can we as viewers and readers performatively read Asian femininity in a different, and not anti-relational, orientation to hospitality?” (187). b00b asks a similar question with systems of verification.
Examining this possibility, Peggy Kyoungwon Lee recently puts Cut Piece in conversation with the contemporary media art of Lisa Park, and notes that “Ono’s signature composure both enacts and challenges archetypes of the feminized Asian body: cognitive efficiency, durability, calculative emotionality, docility, passivity” (54). For Lee, Cut Piece continues to open pathways for interpretation by diverting audience members from the compliance arguments above. Where algorithmic trust further complicates the making of trust with an added layer of uncertainty (is this made by an algorithm or is it not?), Cut Piece and TV Bra see in and through uncertainty to recentre a relational ethics. This concern for relationality endures in Reddy’s b00b. To fashion the near-field communication (NFC) cards, Reddy draws from Banjara embroidery, a heritage craft technique featured in her home city of Hyderabad (Telangana). Like Banjara, b00b incorporates varied accessories (mirrors, tassels, shells) with colourful patterning. She embellishes the bra with lively zig-zagging embroidery, fashioning each nipple with a mirror that expertly doubles as an NFC tag hidden behind the embroidery. Garments like Ono’s, Paik and Moorman’s, and now Reddy’s share an understanding that technology can and should reflect a certain felt complexity. At the Brilliant Labs event, Reddy presents b00b to conference-goers invested in shared hardware design specification standards. Across the 48-minute presentation, b00b interrupts the audience’s presumed intentions. As Elizabeth Goodman has argued, hackers and tech enthusiasts interested in schematics, wireframes, and other digital drawings often prioritise formats that anyone can examine, adapt, use, and circulate, while overlooking their situated social and political stakes.
In the theatrical setting of a tech forum, b00b’s fabric draws attention to the body—manoeuvring the (often white Western) gaze around femme Asian subjectivities and questioning proximities between one body and another. Through its embodied relationality, real or imagined, b00b shares a concern for reimagining trust within mechanisms of control. b00b is Reddy’s attempt at generative justice, a concept of inclusive making she calls part of “bringing the Open Hardware community closer to heritage craft communities” (Reddy). In documentation, she discusses the geopolitical conditions of NFC-based authentication that relies on intimate connection as a means of state-led coercion and control. Situating her work in contemporary trust politics, she describes the Aadhaar biometric identification system designed to compel Indian residents to record biometric data through iris scans, fingerprints, and photographs in exchange for a unique identity number (Dixon). She writes that systems like Aadhaar “make minority communities more vulnerable to being identified, classified, and policed by powerful social actors” (Dixon). Wearing b00b challenges efforts to root NFC transactions in similar carceral and colonial logics. With an intimate scan, a user or audience makes room for counter-expressions of dis/trust. Sitting across from Reddy during a recent Zoom conference, I felt the tug of this work. With the piece modelled on a mannequin in the background, I was reminded of the homegrown techno-armour worn throughout Friedrichshain, a lively neighbourhood in the former eastern part of Berlin. For the onlooker, the bra incites not only intrigue but also a careful engagement, or what Reddy names the “need to actively participate in conveying trust and intimacy with the bra’s wearer”. I couldn’t help but wonder what an attendee at the Open Hardware Summit might make of the work. Would they bristle at the intimacy, or would they—like Ono’s audiences—cut in?
On the surface, b00b presents a playful counterpoint to the dominant narrative of technology as slick, neutral, and disembodied. By foregrounding the tactile, handmade qualities of electronic media, Reddy’s work suggests we reconsider the boundaries between physical and digital worlds to complicate readings of computational risk. She takes a highly technical process typically used for practical applications like finance, online identity, or other well-defined authentication problems, and enlivens it. The garment invites her audience to appreciate two-factor authentication as something intimate—both in an abstract sense and in a resolutely embodied sense. By defamiliarising digital trust, Reddy calls attention to its absurdity. How can a term like “trust” (associated with intimacy and mutual concern) also denote the extractive politics of algorithmic control (the verification of a user, the assessment of risk, the escalating manipulation of use)? Look closer at b00b, and the focus on authentication offers something specific for our ideas of algorithmic trust. Reddy turns a computational process into an extension of the body, registering a distinctly affective intrusion within the digital codification of assurance and accountability. Working with interaction design in the tradition of feminist performance, b00b directs our digital gaze back toward the embodied.

Toward a Relational Ethics of Trust

Fabric artifacts like b00b have long challenged digital scholars to consider questions of uncertainty and accountability. From what counts as computational to whose labour gets recognised as innovative, woven material sparks a particular performance of risk. As Lisa Nakamura (933) shrewdly observes, gendered and racialised “traits” associated with textiles tend to fuel technological production, casting women of colour as the ideal digital workers.
Looking to transnational flows connected with making, Silvia Lindtner argues that these stereotypes bring strategic meanings to feminised Asian bodies that naturalise their role within digital economies. Whose bodies get associated with fabric (through making, repair, consumption, aesthetics) reflects deep-seated stratifications within the masculine history of computing—with seemingly few possibilities for circumvention. If trust works as a felt condition, digital developments might more fully honour that condition. Bringing textile possibilities to NFC suggests examining how authentication systems work on and through the body, even without touch. It is in this reciprocal encounter between content and user, audience and performer, textile and algorithm that something like a bra can hint at a profound ethics of connection. Reddy’s work reveals the consensual contact that can meaningfully shape who and how we digitally trust. While this essay has focussed on trust, I want to end with a brief consideration of the way a textile—in this case a conceptual and maybe even ontoepistemic (Ferreira da Silva) artifact—brings the status of users closer to that of audience members. It begins to weave an analytic thread between the orientations, capacities, and desires of performance and design. Across this connection, b00b’s design works as minoritarian performance, as Jasmine Mahmoud (after José Esteban Muñoz) describes: a practice that “centers performance—as an object of study, a method, and theoretical container—as a means of centering minoritized knowledge”. As minoritarian knowledge, the embroidered NFC tag expands Rozsika Parker’s profound insight into the subversive power of needlecraft. As Julia Bryan-Wilson (6) observes, “accounting for textiles—objects that are in close physical contact with us at virtually every minute of the day—demands alternative methodologies, ones that extend from shared bodily knowledge”.
For digital scholars, b00b opens a similar possibility under racial technocapitalism. It asks us to notice how an indicator light on an AI-trained surveillance camera, for instance, does not map to an engaged or disaffected condition for an over-monitored user. It registers the need for probing relationships that underlie those tools—relationships between workers and employers, between non-users and corporate platforms, between differentially marked bodies. It challenges the reduction of trust dynamics into individualised or universalised motivations. To trust and be trusted with thread opens the possibility of algorithmic re-embodiment.

Acknowledgements

I’m grateful for insightful comments and suggestions from Anuradha Reddy, Amanda Doxtater, Scott Magelssen, Jasmine Jamillah Mahmoud, Adair Rounthwaite, Anne Searcy, James Pierce, and the anonymous reviewers of the current M/C Journal issue.

References

Adam, Martin, Michael Wessel, and Alexander Benlian. “AI-Based Chatbots in Customer Service and Their Effects on User Compliance.” Electronic Markets 31.2 (2021): 427-445.

Ashoori, Maryam, and Justin D. Weisz. “In AI We Trust? Factors That Influence Trustworthiness of AI-Infused Decision-Making Processes.” arXiv 1912.02675 (2019).

Benabdallah, Gabrielle, et al. “Slanted Speculations: Material Encounters with Algorithmic Bias.” Designing Interactive Systems Conference (2022): 85-99.

Brilliant Labs / Labos Créatifs. “AlgoCraft: Remixing Craft, Culture, and Computation with Dr. Anuradha Reddy.” 2023. <https://www.youtube.com/watch?v=UweYVhsPMjc>.

Bryan-Wilson, Julia. Fray: Art and Textile Politics. Chicago: U of Chicago P, 2021.

Cheshire, Coye. “Online Trust, Trustworthiness, or Assurance?” Daedalus 140.4 (2011): 49-58.

Dixon, Pam. “A Failure to ‘Do No Harm’—India’s Aadhaar Biometric ID Program and Its Inability to Protect Privacy in Relation to Measures in Europe and the US.” Health and Technology 7.4 (2017): 539-567.

Duffy, Margaret, and Esther Thorson, eds. Persuasion Ethics Today. Routledge, 2015.

Eglash, Ron, and Audrey Bennett. “Teaching with Hidden Capital: Agency in Children’s Computational Explorations of Cornrow Hairstyles.” Children, Youth and Environments 19.1 (2009): 58-73.

Ferreira da Silva, Denise. Unpayable Debt. Sternberg Press / The Antipolitical, 2022.

Gass, Robert H., and John S. Seiter. Persuasion: Social Influence and Compliance Gaining. Routledge, 2022.

Glikson, Ella, and Anita Williams Woolley. “Human Trust in Artificial Intelligence: Review of Empirical Research.” Academy of Management Annals 14.2 (2020): 627-660.

Goodman, Elizabeth Sarah. Delivering Design: Performance and Materiality in Professional Interaction Design. Berkeley: U of California P, 2013.

Gregg, Melissa. Counterproductive: Time Management in the Knowledge Economy. Durham: Duke UP, 2018.

Halpern, Sue. “Can We Track COVID-19 and Protect Privacy at the Same Time?” New Yorker 27 Apr. 2020. <https://www.newyorker.com/tech/annals-of-technology/can-we-track-covid-19-and-protect-privacy-at-the-same-time>.

Hancock, Jeffrey T., Mor Naaman, and Karen Levy. “AI-Mediated Communication: Definition, Research Agenda, and Ethical Considerations.” Journal of Computer-Mediated Communication 25.1 (2020): 89-100.

Huang, Vivian L. “Inscrutably, Actually: Hospitality, Parasitism, and the Silent Work of Yoko Ono and Laurel Nakadate.” Women & Performance: A Journal of Feminist Theory 28.3 (2018): 187-203.

Irani, Lilly. “‘Design Thinking’: Defending Silicon Valley at the Apex of Global Labor Hierarchies.” Catalyst: Feminism, Theory, Technoscience 4.1 (2018): 1-19.

Lee, Peggy Kyoungwon. “The Alpha Orient: Lisa Park and Yoko Ono.” TDR 66.2 (2022): 45-59.

Mahmoud, Jasmine. “Minoritarian Performance.” Research Cluster, University of Washington, 2022. <https://simpsoncenter.org/projects/minoritarian-performance>.

Merrill, Nick, and Coye Cheshire. “Habits of the Heart(rate): Social Interpretation of Biosignals in Two Interaction Contexts.” Proceedings of the 19th International Conference on Supporting Group Work (2016): 31-38.

Morozov, Evgeny. “The Mindfulness Racket.” New Republic 23 Feb. 2014. 1 Sep. 2016 <https://newrepublic.com/article/116618/technologys-mindfulness-racket>.

Muñoz, José Esteban. Cruising Utopia. Tenth anniversary ed. New York: New York UP, 2019.

Murphy, Michelle. The Economization of Life. Duke UP, 2017.

Nakamura, Lisa. “Indigenous Circuits: Navajo Women and the Racialization of Early Electronic Manufacture.” American Quarterly 66.4 (2014): 919-941.

Oldenziel, Ruth. Making Technology Masculine: Men, Women and Modern Machines in America, 1870-1945. Amsterdam: Amsterdam UP, 1999.

Paik, Nam June, and Charlotte Moorman. “TV Bra for Living Sculpture.” 1969. 6 Mar. 2014 <http://www.eai.org/kinetic/ch1/creative/video/paik_tvbra.html>.

Parker, Rozsika. The Subversive Stitch: Embroidery and the Making of the Feminine. Chicago: U of Chicago P, 1984.

Reddy, Anuradha. “b00b-Factor Authentication.” 2022. 7 Nov. 2023 <https://www.youtube.com/watch?v=41kjOXtUrxw>.

———. “b00b-Factor Authentication in Banjara Embroidery.” 2023. 7 Nov. 2023 <https://anuradhareddy.com/B00B-Factor-Authentication-in-Banjara-Embroidery> (password: ‘banjara’).

Rossi, John, and Michael Yudell. “The Use of Persuasion in Public Health Communication: An Ethical Critique.” Public Health Ethics 5.2 (2012): 192-205.

Rothfuss, Joan. Topless Cellist: The Improbable Life of Charlotte Moorman. Cambridge: MIT P, 2014.

Schüll, Natasha Dow. Addiction by Design. Princeton: Princeton UP, 2012.

Sharma, Sarah. In the Meantime: Temporality and Cultural Politics. Durham: Duke UP, 2014.

Weatherbed, Jess. “Elon Musk Says Twitter Will Begin Manually Authenticating Blue, Grey, and Gold Accounts as Soon as Next Week.” The Verge 25 Nov. 2022. <https://www.theverge.com/2022/11/25/23477550/twitter-manual-verification-blue-checkmark-gold-grey>.