Salon with Caroline Sinders #
On 5 March 2025 we spoke with Caroline Sinders about AI, politics, and feminism.
Caroline Sinders is a machine-learning-design researcher and artist. For the past few years, she has been examining the intersections of technology’s impact on society, interface design, artificial intelligence, abuse, and politics in digital, conversational spaces. Sinders is the founder of Convocation Design + Research, an agency focusing on the intersections of machine learning, user research, designing for public good, and solving difficult communication problems. As a designer and researcher, she has worked with Amnesty International, Intel, IBM Watson, the Wikimedia Foundation, and others.
Caroline’s website: https://carolinesinders.com/
Watch the video #
Read the transcript #
Look at the participation #
Notes #
- Design as an organizing principle of digital infrastructure (cf. CS’s past work at the ICO, the UK privacy regulator)
  - Importance of the design/UX layer in empowering consumers/users
  - Digital rights as a subset of human rights (studying “dark patterns” and other invasive UX/design decisions; the difficulty of “opting out” or “in”)
- Feminist Data Set
  - Influences: critical design (Arte Útil?), feminist technology
- “Snake oil” narratives about ML seeping in?
- Art as necessary (reminds me of an Ethan Hawke quote)
- “What is research-driven art?” A step-by-step process
  - Verify experiential data with communities, part of decolonizing and problem-solving
  - Research can be (professional) activism-driven: meaningful engagement with legislation, platforms/developers
  - Next step, “advocacy”: what do we want to see changed, and why?
  - Last step: “art.” Illustrating for an audience in a way journals/literature (for example) cannot. A “Trojan horse” (per Nora Khan); providing a shield (activism is dangerous) as well as a
- Arte Útil, the perceived dichotomy between aesthetic appreciation and utility (Tania Bruguera); also American Artist, Francis Tseng, Mimi Ọnụọha, Dunne & Raby, Adam Harvey with Syrian Archive/Mnemonic (VFrame: on computer vision)
  - “Digital witnessing”
  - “Why is this ‘art’?” Because Adam is an artist and considers this art. (“Which I think is a fine answer.”)
  - The solutions offered by art/“digital witnessing” won’t be unilateral: “band-aids as art” provide a “necessary provocation”
- Feminist Data Set, a “participatory policy LARP”
  - Node: data collection; participants add notes to a wall
  - Node: data cleaning
  - Workshops are key: ways to think through what meaningful community participation and inclusive data collection look like
  - Interactive workshop with a group of participants attempting to sort intersectional data into “buckets,” in effect applying a framework or taxonomy to a diffuse data set
  - Participants’ “wants” are (necessarily) considered when sorting this data (which is to say, this is a very human process that reflects community values and priorities)
- “Community-driven datasets”
- Data quality: curated datasets are an important way of reducing biased, unfair, and even irresponsible AI, yet the pace at which models are fed data and released into social use is far faster than the emergence of curated datasets. How can we have protocols that rein in these unrestrained models while promoting a more reasoned way of producing the data that feeds them, given that such data takes more time to produce, validate, and collectively maintain?
- “All minor points are major points [in feminist theory]” (I really appreciate this)
- Data being “slow”: this contradicts the logic of perpetual, endless, ever more invasive capture in Big Data
- Thomas Thwaites’ Toaster Project as a distillation/explication of mass industrialization (“supply chain”?). CS: “when critiquing big tech, how possible is it for us to intervene?”
  - Accessibility and documentation: reproducibility vs. ownership/control?
  - Open source has its own strengths and weaknesses; see also analyses of the rhetoric of “openness”
- “We are interested in: process, friction, failure, harm reduction” (slide)
- TRK, “technically responsible knowledge”: problematizing tool design and UI/UX; references Datasheets for Datasets
- UX has a “politics” to it; design is not a neutral technology
- (Qs) 3 and 4, on activism: collaborating with/serving mutual aid orgs; reciprocal knowledge exchange. When doing human rights work […] “what is the friction we’ve found?” On barriers to secure messaging (regarding abortion access in Louisiana), CS worked with a local organization to do security training. On spyware in India, the same group provided security training and connected the targets to other organizations, e.g. for “digital forensics” (compromised phones being the concern). Links shared even over end-to-end encrypted messaging still tend to have identifying strings in the URL; the org recommended a “link sharing” toggle built into Signal. Part of a coalition with Fight for the Future and Mozilla on secure messaging; they pushed for new UI elements in Slack: a block button and end-to-end encrypted DMs. A lot of activist organizing is done through apps like Slack. “Coalition Against Online Violence.” Any campaign will partner with many other organizations, “galvanizing all these different communities.”
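The point about identifying strings in shared links can be sketched in code. A minimal illustration, assuming the “identifying strings” are common tracking query parameters such as `utm_*` and `fbclid` (an illustrative, non-exhaustive list; the notes do not specify what the recommended Signal toggle would strip):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Common tracking parameters (a hypothetical, non-exhaustive list).
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign",
                   "utm_term", "utm_content", "fbclid", "gclid"}

def strip_tracking(url: str) -> str:
    """Remove known tracking query parameters from a URL."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    # Keep only query parameters that are not in the tracking list.
    kept = [(k, v) for k, v in parse_qsl(query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(kept), fragment))
```

For example, `strip_tracking("https://example.com/read?utm_source=chat&id=7")` returns `"https://example.com/read?id=7"`. A real implementation would need a far longer parameter list, since many sites use their own identifiers.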
- 6a and 6b, on audiences: art as a space/interface where we can challenge ourselves; as a Trojan horse where folks can feel safe to engage as non-experts. “Collective data governance.” “Participatory policy LARP”! TRK as a way to contextualize wages for gig-economy data collection.
Links #
- More about Caroline’s work: https://carolinesinders.com/
- More on Arte Útil: https://taniabruguera.com/introduction-on-useful-art/
- Adam Harvey & SyrianArchive’s VFrame reminds me of Luciana Parisi’s writing on negative optics, which folds in themes of racial capitalism and Paul Virilio’s theory of a “sightless vision”
- TRK is reminiscent of David Widder and Dawn Nafus’ analysis of modularity in software engineering as a framework/metaphor that postpones confronting accountability/responsibility. “Barrier between client (CS) and data labeler”; “who are the workers, how can their tools truly support them in their work?” CS references Milagros Miceli (?)’s work on labor equity in data work.
Questions #
- You mentioned that art comes after, and research comes first because it’s practical. Could you tell us more about when “art” comes in? :)
- Curious how you feel about language in cultural narratives of machine learning technologies — in a (Western) academic framework that is entangled in industry-military agendas, how precise might users/consumers want to be? What is the role of literacy in self-advocacy, if at all?
- [Asked] Is activism a practice we need to get better at, to avoid technology like AI being driven by, or serving, others?
- [Asked] Activism is often a collective effort; are you also working in a collective? If so, who are they: other artists, researchers, journalists…?
- On ways to assert user/data rights: what’s your stance on decentralization as it’s practiced today? When we rethink digital, technological governance, how helpful is an economic framework such as cryptocurrency/the blockchain? (In other words, must governance models be financially driven/framed?)
- The idea of usefulness implies an audience, I suppose. Which audience do you have in mind in your artistic practice?
- Adjacent Q: Considering the modes of distribution/dissemination of this work, what are the limits/affordances of each? How do you define your (target) audiences?
- The concept of research-driven art as something practical is introduced, i.e. utilitarian artefacts are produced. How do you measure the usefulness of the work you create, specifically when you consider the communities that took part in your participatory research approach?
- Do you see agency over datasets as an effective way to regain power over AI?
- Who are the attendees of the workshops you are organizing as part of the Feminist Data Set project?
- The Feminist Data Set is a project that started in 2017. What are the challenges of carrying a project over such a long time period?
- Related to this: how do you see the project changing given the changes in AI technology (and data-driven technology in general)?
- In the example you showed, where parrots are annotated, the annotation remains a categorization. How do you see Feminist Data Set work as a way to think beyond categorization/classification? Or is it simply not compatible with ML/AI?