The Good Robot

Catherine D'Ignazio on Data Feminism

June 07, 2021 | University of Cambridge Centre for Gender Studies

In this episode, we chat to Catherine D’Ignazio, Assistant Professor of Urban Science and Planning in the Department of Urban Studies and Planning at MIT and Director of the Data + Feminism Lab, about data feminism, what that means, and why feminism matters in data science. We talk about applying programming skills to social justice work, the tension between corporate and social good, and how technology can be oriented towards the feminist project of shifting power. D’Ignazio explains what would be needed to reshape the model of accountability in AI and why ‘better’ technology might not be less harmful. She argues that data work can be most effective at producing better outcomes when grounded in feminist scholarship and practice. We hope you enjoy the show.

Content Warning: This episode contains a brief discussion of femicide. 

This episode includes an ad for the What Next|TBD podcast. 

KERRY MACKERETH (0:01)
Hi! We're Eleanor and Kerry. We're the hosts of The Good Robot podcast. Join us as we ask the experts: what is good technology? Is it even possible? And what does feminism have to bring to this conversation? If you wanna learn more about today's topic, head over to our website, where we've got a full transcript of the episode and a specially curated reading list with work by, or picked by, our experts. But until then, sit back, relax, and enjoy the episode.

ELEANOR DRAGE (0:32)
Today, we’re chatting to Catherine D’Ignazio about data feminism, what that means, and why feminism matters in data science. We discuss applying programming skills to social justice work, the tension between corporate and social good, and how technology can be oriented towards the feminist project of shifting power. D’Ignazio explains what would be needed to reshape the model of accountability in AI and why ‘better’ technology might not actually be less harmful. She argues that data work can be most effective at producing better outcomes when it's grounded in feminist scholarship and practice. We hope you enjoy the show.

KERRY MACKERETH (1:15)
Thank you so much for being here with us today. So just to kick us off, would you mind telling us a bit about what you do, who you are, and what brings you to the topic of feminism and technology?

CATHERINE D’IGNAZIO (1:25)
Sure, thank you for having me. My name is Catherine D’Ignazio, and I'm an Assistant Professor of Urban Science and Planning in the Department of Urban Studies and Planning at MIT, based here in Cambridge, in the Boston area. And I'm also the director of the Data + Feminism Lab here at MIT. And - how far back should I go?! [Laughs] Um, let's see, well, I grew up around technology my whole life. Even in a pre-internet time, I was going to computer camp and doing programming and things like that, primarily because my dad was a computer science dropout - he actually left a computer science PhD to write children's books about computers, and ended up being an educator, working in educational technology, around how computers and computation can support teachers and support learning in the classroom. And so that was sort of the environment I grew up in, around computers and robots and all of these things. So I grew up really thinking, this is my area, this belongs to me, this is something that's sort of native at some level. And so fast forward: when I graduated from undergrad, it was the first dot-com boom at the time. And I ended up going to work as a programmer, even though in fact I didn't have any formal computer science training. I just had this summertime training, where I had been working with my dad, teaching teachers how to make web pages and use the early internet, with website-building programmes, as part of their teaching and learning in the classroom. So I basically had web development skills but not actually programming skills, and ended up at a software company, because they were desperate for people and were like, we’ll just train you, you’re a Java programmer now! And that’s kind of how I fell into being a software programmer. And yet, at the same time, I think I carried with me a lot of my dad's influence, really wanting to see this emancipatory potential of technology. But at a certain point, I found myself building project management software for Fortune 500 companies, and I had this moment of, what am I doing with my life, and really feeling that there were so many possibilities for creative expression, for justice-oriented applications of technology, but where I was wasn't supporting me to do those things. And so I ended up following a longer path: went back to grad school for an MFA, went back to grad school again at the MIT Media Lab. And those were the educational places where I was able to start to fuse these interests in digital technology, databases, the creative representation of information, in forms like maps and visualisation, but also as applied towards social justice. And so, maybe that's a little too long for what you're looking for. But that's sort of how I found myself in this area of wanting to really say, well, we have this data, this technology. These are things that come from a particular realm, and were shaped by particular forces - we inherit technologies that come out of a very unequal context. And yet, there is still this potential for how they might be applied towards more feminist, more justice-oriented, more anti-racist ends.

ELEANOR DRAGE (5:16) 
Our podcast is called The Good Robot, so we're trying to conjure all those things into what we do. So our opening question is always: what is good technology? Can we even have good technology? And if so, how should we work towards it? What methods should we use? So can we ask you those questions, thinking about feminist methods as well and what they contribute?

CATHERINE D’IGNAZIO (5:39)
Yeah, I think that's great. And, you know, I think the first question around good is, of course, whose good, you know? Not ‘who is good’ in some absolute sense, but whose good are we talking about. Good for whom, and by whom? And so Lauren Klein and I, in our book Data Feminism, we talk about these kinds of questions as ‘who’ questions. These are questions that you ask when, you know, you're saying ‘data science for good’, for example: for whose good, and who's doing the good, and who gets the good done to them and may not think it's very good, you know what I mean! And so I think there's a lot to unpack when we make statements like ‘good’. And so in the book, what we would say from a feminist perspective is that when we are not specific about whose good we mean, in practice the good ends up being for the dominant social group. And in a lot of cases, we can see this in a good amount of the work out there that does do this, like data-science-for-good, design-for-good types of things, because it's almost like we can't go as far as to say - or I guess the question would be, why can't we say justice? What's precluding us from pointing to that as the truly good outcome? But I mean, in terms of barriers, I think there's that: we haven't yet really interrogated who's behind these technologies and who's on the receiving end of these technologies, and what outcomes they really want, and reconciling those. But then I think there are a lot of other barriers, particularly in the technology and data space. Because these are tools that usually require a fairly high amount of expertise to mobilise, and they require a lot of resources to undertake: it costs a lot of money to collect and store and maintain data, and it costs a lot of money to employ staff who know how to use that data, particularly if you’re talking about working at scale, making the things that global platforms make. And so what's happened is that the ‘who’ questions have ended up being answered by default - by the default way that the political economy is organised, meaning it's corporate actors, generally, who are answering these questions. And so ‘good’ in a lot of these cases basically just means what's good for the company, which means focusing all of those resources on extracting the most profit possible from the applications. And so I see what is good for corporations as being fundamentally in tension with what is good for people and communities, and also with what is good for justice. Those things are really not well aligned. And so that's where, again, these concepts of ‘whose good’, I think, come into play here.

KERRY MACKERETH (9:04)
Fantastic. That's really fascinating. We've emphasised, and you've certainly emphasised, the importance of feminist perspectives on AI and data science. But we know that feminism means really different things to different people. And it's one of the things we really love about the podcast, right, that we get to hear people's really different ideas of what a feminist project entails. So could you tell us a bit about what feminism means to you and how it plays out in your career and in your work?

CATHERINE D’IGNAZIO (9:29)
Sure, yeah. Thanks. Yeah, I totally agree. I think that there are many feminisms, and not all the feminisms are compatible with each other [laughs]. So Lauren and I, in the book, we draw specifically from intersectional feminism, in a way that reflects us: both of us are based in the US, and Lauren actually works in American Studies - her training is in the early American period. And intersectional feminism is a feminist framework that comes out of US Black feminisms specifically. So thinking about work by Kimberlé Crenshaw, of course, who coined the term intersectionality, but then also her many precursors, like the Combahee River Collective. And, you know, there are folks who have traced the history of intersectionality as a concept back into the 19th century. What an intersectional feminist approach says is that, when we think about social inequality, a feminist approach cannot look at only one dimension of social inequality, like sex or gender, as the sole vector of oppression. We have to be taking other forces into account. And specifically, Crenshaw was talking about the experiences of Black women, who sit at this intersection of combined racism and sexism, and the unique, nonlinear way in which sexism and racism combine to shape their life experiences, life possibilities, and so on. And so really the grounding of the book is saying that. And, you know, if you think about it, it's really visionary and really opens up ... it's a very powerful justice framework, because what it's saying is, this narrow focus on one dimension is really reductive. So how do we understand the interlocking nature of these structural forces? And there are so many of these structural forces, which can become complicated, right? So we think about sexism and racism frequently in the book - that's an ongoing theme. But of course there are also things like classism, heterosexism and cis-sexism, and colonialism - hugely important for us to be thinking about. So intersectional feminism is really asking us to take these structures into account, and, in a way, to make the differentials of power produced by these forces the object of study. And that, I think, resonates with a lot of other feminisms, but it has its own specific analytic. You know, one of the things I've been thinking about is that it was designed specifically to think about Black women's experiences, and that's the lineage, and it's really important to recognise that. At the same time, there's what the Combahee River Collective said: if Black women were free, then all of us would be free, because it would mean the destruction of all forms of domination and oppression. And so I think there's this way in which, by being very specific about the analytic of power, we are actually able to generalise, to talk about all sources of power, and to try to spotlight and examine the ways in which they operate in different contexts. I mean, that's one of the things that for me is very powerful about intersectionality - it starts with this specific thing and can then be applied everywhere, and I think that’s its power.

ELEANOR DRAGE (13:30)
That's why the book is so important, and we highly recommend it to everyone. We will include it in the reading list to go with this episode so that people can check it out. It was particularly important for Kerry and me because it directly addresses how bad data science can result in harmful technologies, and how feminist methods and values can make for better data science. It's one of the clearest examples, I think, of how iconic feminist work that is so precious to so many of us, like that of Gloria Anzaldúa and bell hooks, can be mobilised to help address problems in AI. So can you tell us, using perhaps some specific examples, because the book is very specific, what do you think these texts offer? And how do they respond to the harms that arise from what are understood to be technical problems?

CATHERINE D’IGNAZIO (14:23)
Yeah, so in Data Feminism we propose seven principles. Basically, the way that we developed the book is we looked across a lot of feminist literature across a lot of different fields. We always had intersectionality, as well as Patricia Hill Collins’s matrix of domination, as the undergirding theories of power for the work. But then we looked across a lot of other feminist scholarship, but also activism, writing and so on. And we came up with these seven principles that for us really encapsulate the most important aspects of feminism as they relate to data science and, really, to the practice of doing data science. And so those are things like examine power, challenge power, rethink binaries and hierarchies, elevate emotion and embodiment, and so on. These are, I would say, distillations of feminist thought that can help us understand how current practices in data science are sometimes subtly working against our goals, let's say of gender and racial equity, of liberation. And so, you know ... in the Elevate Emotion chapter we [do] talk a lot about data visualisation, but we also talk about how data visualisation as a field has tended to really value a kind of neutral style of communication, and has really pushed back on emotion entering the visual communication of data. And we push back on that narrative and say, in fact, setting up this false binary between reason and emotion is highly gendered. And it's also very reductive to imagine that humans are only creatures of reason. In fact, we need to think about what gets opened up when we say that we're going to embrace emotion in a kind of data communications toolkit: what kinds of possibilities get opened up? Who else might you reach as part of your data communication project, and so on? So we try to use these principles as a way to rethink some of the common wisdom around how data science is done and what some of the best practices are, and to show how they might be working against our goals. And we also use examples to build on and hopefully expand conversations about, you know, bias, discrimination, and so on. I mean, fundamentally, Data Feminism is responding to this issue that we keep producing racist, sexist and classist technology. So it's an attempt to say, well, let’s back up and examine the root cause, examine how this happens - because in fact it's a very logical outcome of a racist and sexist society that it would produce racist and sexist products of whatever sort, data and information products included. So, for example, in the Examine Power chapter we talk about the ways that not only are algorithms and datasets biased - which of course there's more work around, and that's fantastic, and we talk about some of that work looking at bias in datasets and training data and so on - but if we really want to understand the root causes, we have to back up a little bit. And so we talk about the work of Mimi Onuoha, who has a project from 2016, which is actually an art project. She set out to collect a list of what she calls missing datasets: aspects of the world that one might think of as being really socially important, but that nobody collects data about.
And so these include things like maternal mortality statistics at the federal level in the United States; COVID race and ethnicity data, which for a very long time in the pandemic was not being collected; trans* people killed or injured in instances of hate crime; things like this. In this project, Mimi actually displays the list as a filing cabinet with file folders, titled with the subject of the data that are missing, but you open up a folder and there's nothing inside, because we have no records about that particular thing. And one of the things this project helped us point out - we talk about this as her examining power, in the sense that she's interrogating, she's asking the question: why don't these data exist? Why do we have no information? And also, who is the responsible authority for collecting and stewarding such data, and why haven't they invested those resources in it? And all too often, if you go look at her dataset, these are data that have to do with people who are minoritised, issues that are stigmatised, things that are underreported, and so on and so forth. So often these have a raced and gendered dimension to them.
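
For readers who work with data and want to see what a "missing dataset" check might look like in code, here is a minimal illustrative sketch. It is not part of Onuoha's project or of Data Feminism; the expected fields and the example columns are hypothetical, chosen only to echo the examples above.

```python
# Illustrative sketch only, in the spirit of Mimi Onuoha's "missing datasets":
# list the socially important variables a published dataset does NOT record.
# All field names below are hypothetical.

# Variables one might expect a public health or crime dataset to collect.
EXPECTED_FIELDS = {
    "maternal_mortality": "maternal deaths, tracked at the federal level",
    "race_ethnicity": "race and ethnicity of the people affected",
    "trans_hate_crime_victims": "trans* people killed or injured in hate crimes",
}


def missing_fields(dataset_columns):
    """Return the expected fields absent from a dataset's columns.

    Like opening one of Onuoha's file folders: each entry returned here
    is a labelled folder with nothing inside it.
    """
    return {
        field: description
        for field, description in EXPECTED_FIELDS.items()
        if field not in dataset_columns
    }


if __name__ == "__main__":
    # A hypothetical dataset that logs cases but not who is affected.
    published_columns = {"case_id", "date", "location", "outcome"}
    for field, description in missing_fields(published_columns).items():
        print(f"MISSING: {field} ({description})")
```

The point of the sketch is the question it forces, not the code: every key that comes back from such a check is an occasion to ask who was responsible for collecting that data and why they did not.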

ELEANOR DRAGE (19:50)
Mmm, and I love those examples in the book. They're really beautifully illustrated and described - you're also wonderful writers. So, very jealous, you’re good at everything, all those different aspects!

The industry is struggling to find workable solutions to the harms produced by AI, as you very well know, and practitioners are still avidly seeking the right framework or toolkit to fix the problem. There's still this idea that there can be a kind of quick-fix solution, like applying an ethics toolkit after a system has already been built, and that that will be sufficient. I think your book explains why making AI less harmful is actually a complex political and social issue, not just a technical one, and how all these factors need to come together if we’re to create better AI. What's your thinking around that idea?

CATHERINE D’IGNAZIO (20:44) 
Yeah, absolutely. I mean, in many ways that was one of the reasons for writing the book: like I said, this desire to help people understand some of the root causes behind these things, specifically so that we can come up with better and more holistic - I don't want to say solutions [laughs] - you know, the whole language of technology around solutions is so frustrating, even though I'm a programmer and I do like fixing things. But, as a way of thinking about it: how are we going to take action to address these things? It's not going to be a mathematical formula. And I don't want to detract from that work. I actually do think some of the work, for example, on auditing algorithms and analysing disparate impacts - there are folks right now talking about doing algorithmic impact assessments - I think that work definitely has a place. But what I don't think has a place is thinking of that as the answer, right? It's part of a much more systemic shift in our whole conception of how we work with technologies. That includes DEI-type issues; it includes thinking about who is hired into and is building these systems, and who is pushed out from these fields, from these sort of lucrative, bro-guy software development firms. One really disheartening recent example is that of Dr. Timnit Gebru at Google, where she was fired over a paper she was writing about the harms of natural language processing models. And that, to me, demonstrates a huge lack of commitment on the part of industry to truly [realise] good or justice in product. It says a lot about where the company's allegiances lie - I would say in this case they definitely chose profit over justice. And that is very disheartening, because prior to that I was thinking, well, maybe there are these great people in industry who are doing this amazing work to transform and challenge, and then there’s this clear signal of, don't rock the boat, you know. And so I think that's a very disheartening thing, because who is doing the work is intimately tied to whether the work is harmful at the end of the day or not, you know? So, yeah. Yeah, I don't know. There's more to say there.

ELEANOR DRAGE (23:31)
Do you think then that tech companies should be thinking in terms of justice rather than equality?

CATHERINE D’IGNAZIO (23:37) 
Well, so here's the thing. I think this is where our systems come into tension with each other. Because fundamentally, the structure of companies, the way that they are organised, means that who they are accountable to is shareholders, right? And shareholders’ metrics of success are about profit. And that's it, basically. And so, you know, maybe there are some better models - you could probably go out and find some business scholars who could tell us there are good models that combine people's health and planetary health into some kind of profit measure. But the overarching model of accountability is accountability to shareholders and profit. And I think that is almost unrectifiable in this case, because profit is always going to be on the side of the status quo. I just don't necessarily see ways of changing that. And that's where I see a really strong role for government and regulation: we really need a way of reining in that profit motive that is extractive, exploitative, and exacerbating systemic inequality basically across the board. And it's not to say - I don’t want to say that there aren't good people working inside these companies who are working towards change. In fact, one of the more heartening things, kind of despite Dr Gebru’s firing, is the workers at Google unionising: folks who are really taking action from within large companies, large tech companies in particular, to really make change. And so I also think industry is not monolithic; it's not to paint everyone as an evildoer or something. But just to say, the system is set up so that justice will not win in this particular system. And so we need a kind of reconfiguration, a rebalancing, of who can do what, when and where, to whom. So the path that I see towards that is both through these different surges of activism that I think are coming from multiple spheres, as well as through, you know, government regulation - but proactive, not reactive, government regulation of these systems. One of the things I always come back to is this great quote from Frederick Douglass: power concedes nothing without a demand. So I personally am not going to sit around and wait for companies to do that, because I feel like, for the most part, they're fine. They're happy with the status quo. And of course they want more women, and they'll sort of work on that on the side or something, you know. So I'm not going to sit around and wait for that to happen. And that's why I'm heartened when demands come from the bottom, whether from community activism or from internal pressure within the organisation, to say, we cannot abide by this way of operating anymore, you have to change something, and so on. So I think there needs to be pressure on people in power to be accountable to different things and to different communities.

KERRY MACKERETH (27:00)
That's really fascinating, thank you. I want to pivot back to something Eleanor mentioned a bit earlier, which is the gorgeous visual presentation of your book Data Feminism, which I have sitting behind me, as I'm sure many other keen data feminists do as well. We were really interested in how the book expresses how seriously you take design as a key part of the AI production pipeline. So we want to ask you: why does design matter? And how can design itself be feminist?

CATHERINE D’IGNAZIO (31:34)
Design is so important. And I say that as an artist and designer, I guess - that's part of my own background and training. One of the things it relates to, that specifically has to do with a feminist, justice-oriented perspective, is, again: who can access the work, and how does it speak to them? By thinking about the mode, the form, or the aesthetics in which you communicate something, you are inviting certain people in and excluding other people. Just think of some highly technical and complex data visualisation that might be very appropriate for an expert scientific audience. When it's presented to a general public, it's going to have a really different effect: it's going to make people feel dumb, it's not going to bring them into the conversation, they're not going to know what it's talking about, and so on. So I think the design and the aesthetics of how we present information has a lot to do with that question of who - who are we accountable to, who are we speaking to, who are we inviting into the conversation? And so in Data Feminism, yeah, we emphasise ... in fact, I should say, the book started as Feminist Data Visualization - it was originally going to focus only on the visualisation side of things. But once we wrote our first chapter, Lauren and I were like, you know, you can't make a visualisation feminist if everything that came before it is super broken and exploitative. Right? So visualisation is part of the data exploration process; it's also a kind of end product or end state. But if everything in that process is, you know, bad, you're not going to be able to retroactively refit it into a feminist lens. So that's when we realised we had to go back and really think about the whole process and practice of data science. But then I think there are some other interesting insights that come from taking a feminist perspective on design as well. One of those is around form: we show lots of things in the book that we call visceralisations of data, which is actually a term from the artist Kelly Dobson. Those are things like quilts made from data, or walking data visualisations - there's even this fantastic project in Tanzania that did a fashion show. They held this data and design competition, made data-driven garments, and then did a fashion show where everybody wore the garments and they celebrated the winner of the competition. So these are very creative, embodied ways of working with data, interacting with data, and so on. And we show a lot of work by both artists and data journalists in the book precisely because they're really forwarding these new methods of making data accessible to larger and larger publics, not only to expert or narrowly professional audiences. But then I think another kind of feminist take on the design process would be really thinking about who's involved in that process. We write a lot about participatory processes and strategies of engaging with multiple publics in a design process along the way. And that's a very feminist [approach], as well as [aligning with] universal design principles - it's not exclusively a feminist approach.
But it's basically an approach that advocates for looking first at the edges and the margins of any given system that you're working with, rather than working with the big bulk of people in the middle - the average user with the average characteristics. It means really looking at the people who are most marginalised by whatever context you're working in, and really working hard to have the system work for them, potentially even centring their needs in the process of doing the work. So I think there's a feminist insight around aesthetics and form, but also around pluralism and process, for how a project unfolds.

KERRY MACKERETH (32:00) 
Fantastic, that's really, really fascinating. I was also really interested to hear about the original title of your book, because something I do love about it is the overt titling of it as Data Feminism. And I was wondering, how was that title, or the concept of data feminism itself, received when you were writing the book, when you were trying to get it published? And now, as you speak about it and have, you know, released it into the world, what is people's reception of it?

CATHERINE D’IGNAZIO (32:24)
Yeah, yes, it's been so interesting, really. We actually get this question of, why don't you call it data justice? - precisely because intersectional feminism, like I'm saying, is an analytic that tries to account for more than only gender, sexism and gender bias. However, you know, we felt really strongly ... well, first of all, there is already work on data justice - that was already a concept, and we didn't want to be co-opting other people's work. There's a Data Justice Lab in Cardiff, actually, so that's one reason. But then also, part of it was wanting to make a very specific intervention, to say: hey, data and technology people, feminism has so much to offer you - feminism specifically as an intellectual tradition. And that was one of the great pleasures of the book, I think: looking across all of the amazing feminist work that has been done in so many different fields, both technical fields but also, you know, the social sciences, arts and humanities and so on, and being able to really try to distil that, to say: here's what a feminist approach has to offer. So it's a kind of pushback, I would say, against certain conceptions of what feminism is and who is feminist - because often the images people bring to the table were never feminist in themselves; they're not actually characterising feminist work, they're products of a backlash against feminism, and the backlash against feminism has been very powerful and continues to be powerful. And so it's kind of an insertion to say: in fact, the ideas that many of us come to feminism with are incorrect, and actually there's this amazing liberatory body of work that has so much relevance for the conversations we're having today about data and technology. And I think it was also that I'd seen all the conversations about data and ethics, and how enriched they would be with an explicitly feminist perspective.

We still get questions - I mean, we try to answer those questions about ‘why not data justice?’ We still get the question: well, what makes this feminist versus just good data science? I actually get asked that question a good amount. And the thing that I come back to with that is, again, the feminist project really centres on power. That's why the first two chapters of our book are Examine Power and Challenge Power. That is the focus. We really think about: how do we use data science to challenge oppression and work towards the liberation of minoritised people? And that is not the central project of good data science right now [laughs]. I mean, maybe one day in the future that will be the focus. But right now, data science, as it's being taught and consumed in the majority of departments and industries, is not ... the main problem it’s trying to solve is not oppression. In fact, there's typically no talk about oppression in pedagogical situations around data science; it's imagined to be separate from the social and political realm, which of course - all the people who actually work with data science know that that's not the case. But that's in fact often how we imagine it to be, so it’s a way to push back on that misconception.

KERRY MACKERETH (36:08)
Absolutely, and it’s really lovely to hear you talk about the pleasure of introducing to so many more people the ways in which feminism can really revolutionise and help us in this project of trying to create, like, better technology. I guess for our last question we want to flip that around and ask: what do you think technology itself, or technological processes, can do for our understanding of feminism?

CATHERINE D’IGNAZIO (36:31)
So one of the interesting things that we actually do talk about in the book, too, is a good amount of projects that leverage data in a kind of activist way. We call them counter-data projects: they're either collecting data or combining data in new ways in order to advocate for specific social changes. And so I think one of the ways that technology and data science can contribute to feminism is, for me, around looking at what feminist groups are actually doing with digital data, with digital technologies and data-driven practices, to enact new digital feminisms. Let me give you a concrete example of this. In the book, we talk about the work of María Salguero, who works in Mexico to collect data about femicide, that is, gender-based killings, in the Mexican context. And I was living in Argentina when I was writing about her work, and I started talking to folks down there. And it turns out there are many groups across Latin America, and in many other countries around the world as well, who do the same work of counting and tabulating femicides, working from news reports about the deaths of women and girls. And so I've started a project where I'm working with them to design some participatory technologies to support the work that they're doing. And it's been really fascinating. We've done lots of interviews. And it's fascinating to see the ways in which, with relatively few resources and, I should mention, not much institutional infrastructure, these activists are piecing together basically a data pipeline for how they collect the information and what they do with the information - which is highly creative. It's not only about producing alternative statistics to government numbers, although in some cases it is about that. They take all sorts of creative actions with their data: they do art projects in public spaces, and so on. And so for me there's a really interesting way in which data and digital technologies are supporting these kinds of digital feminist practices, which then often get reinserted into physical spaces, because they'll use the data to go and do an intervention or protest, or paint the names of women who have been killed, in alphabetical order, in an enormous public square. And I think this is highly interesting. It’s a way that technology is sort of infrastructuring new activist counter-power practices. So I think, in that sense, these kinds of data technologies do have a lot to offer, but we need to understand the ways in which they can support these kinds of counter-data scientists, basically, so.
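
To give a concrete, if very simplified, sense of what the tabulation step of such a counter-data pipeline involves, here is a minimal sketch: turning case records logged from news reports into counts that can sit alongside, or challenge, official statistics. This is an editorial illustration under assumed record fields, not María Salguero's actual workflow.

```python
# Illustrative sketch of one step in a counter-data pipeline: aggregating
# femicide case records, logged by hand from news reports, into counts.
# This is NOT María Salguero's actual workflow; the fields are hypothetical.

from collections import Counter
from dataclasses import dataclass


@dataclass
class CaseRecord:
    """One case, manually logged by an activist from a news report."""
    date: str        # ISO date of the reported killing
    region: str      # state or province where it occurred
    source_url: str  # the news report the record is based on


def tabulate_by_region(records):
    """Aggregate individual case records into per-region counts."""
    return Counter(record.region for record in records)


if __name__ == "__main__":
    # Hypothetical example records.
    records = [
        CaseRecord("2020-03-01", "Region A", "https://example.com/report-1"),
        CaseRecord("2020-03-09", "Region A", "https://example.com/report-2"),
        CaseRecord("2020-04-17", "Region B", "https://example.com/report-3"),
    ]
    for region, count in tabulate_by_region(records).most_common():
        print(f"{region}: {count} recorded case(s)")
```

The deliberate design choice here is that each count stays traceable to its source report, which is what lets activist numbers stand as evidence against, or in the absence of, official figures.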

ELEANOR DRAGE (39:38)
Amazing - well, you’ve definitely responded to the question of why we should all be data feminists, so thank you; we are all included in this mission to become data feminists. At the very beginning of our project, Kerry and I were interested in what this kind of work can do for feminism itself, so it's incredible, it's delightful, to hear that there are these kinds of far-reaching possibilities for this work in technology. So thank you so much for joining us today!

CATHERINE D’IGNAZIO (40:05)
Thank you. No, it's a pleasure and thank you for doing the podcast. I'm so excited to listen to the other episodes as well.

Reading List:

By the Guest 

D'Ignazio, Catherine, and Lauren F. Klein (2020). Data Feminism. Cambridge, MA: MIT Press.

D'Ignazio, Catherine, et al. (2016). "A Feminist HCI Approach to Designing Postpartum Technologies: 'When I First Saw a Breast Pump I Was Wondering if It Was a Joke.'" In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. New York: ACM. Link

Thylstrup, Nanna Bonde, et al., eds. (2020). Uncertain Archives: Critical Keywords for Big Data. Cambridge, MA: MIT Press.

Data + Feminism Lab at MIT

What the Guest is Reading (from Data Feminism):

hooks, bell ([1984] 2015). Feminist Theory: From Margin to Center. New York: Routledge.

Hill Collins, Patricia (2008). Black Feminist Thought: Knowledge, Consciousness, and the Politics of Empowerment. New York: Routledge.

Data for Black Lives, https://d4bl.org/

“How We Collected Nearly 5,000 Stories of Maternal Harm.” ProPublica, March 20, 2018. https://www.propublica.org/article/how-we-collected-nearly-5-000-stories-of-maternal-harm

Moraga, Cherríe, and Gloria Anzaldúa, eds. (1983). This Bridge Called My Back: Writings by Radical Women of Color. New York: Kitchen Table Press.

Nash, Jennifer C. (2019) Black Feminism Reimagined: Life After Intersectionality. Durham: Duke University Press. 

“The Combahee River Collective Statement” (1978), Yale.edu American Studies. 

Sandra Johnson, “Interview with Gloria R. Champine,” May 1, 2008, NASA Headquarters NACA Oral History Project, Link

Other Resources

Data Justice Lab, Cardiff University 

Tanzania Data Lab