The Good Robot

Priya Goswami on Feminist App Design

June 01, 2021 University of Cambridge Centre for Gender Studies

In this episode, we chat with Priya Goswami, anti-FGC activist, award-winning filmmaker, and CEO and founder of the AI-driven app Mumkin, about feminist data practices and app design. We discuss designing and building apps that do good, why an app’s “users” are actually “participants”, and why you cannot compromise on the participants’ privacy and safety. Priya explains what it means to design an app as an activist, why feminism should be normalised, and the problem with running activist campaigns on social media.

Content Warning: This episode contains discussion of female genital cutting, or FGC, and gender-based violence. 


This episode includes an ad for the What Next|TBD podcast. 

KERRY MACKERETH (0:01):
Hi! We're Eleanor and Kerry. We're the hosts of The Good Robot podcast, and join us as we ask the experts: what is good technology? Is it even possible? And what does feminism have to bring to this conversation? If you wanna learn more about today's topic, head over to our website, where we've got a full transcript of the episode and a specially curated reading list with work by, or picked by, our experts. But until then, sit back, relax, and enjoy the episode.

ELEANOR DRAGE (0:32):
Today, we’re chatting with Priya Goswami, who's an anti-FGC activist, award-winning filmmaker, and CEO and founder of the AI-driven app Mumkin, about feminist data practices and app design. We discuss designing and building apps that do good, why an app’s “users” are actually “participants”, and why you can't compromise on the participants’ privacy and safety. Priya explains what it means to design an app as an activist, why feminism should be normalised, and the problem with running activist campaigns on social media. We hope you enjoy the show.
 
KERRY MACKERETH (1:09):
Warning: This episode contains discussion of female genital cutting, or FGC.

KERRY MACKERETH (1:14):
So, Priya, thank you so much for joining us! So just to kick us off, who are you, what do you do, and what brings you to this topic of feminism, gender and technology?  

PRIYA GOSWAMI (1:23):
Very happy to be here, thank you so much for inviting me. I’m Priya Goswami, I’m a filmmaker and communications designer from India, now based in Hong Kong. What brings us here? Well, let’s see. I’m a filmmaker who made an artificial intelligence driven app - a very exciting journey I can’t wait to share with you all, and, yeah.

ELEANOR DRAGE (1:46):
Our podcast is called ‘The Good Robot’, and so we ask all our guests what good technology means, and what it looks like to them. To say that your app is an example of good technology would be an understatement, it’s truly amazing technology that is improving the quality of people’s lives and offering this really unique, informative space of solidarity. So, when you were building the app, what did you think of as ‘good technology’? 

PRIYA GOSWAMI (2:14):
Thank you for that question, and also the amazing name Good Robot. I’m happy to be a good robot, or a part of it. So, when we started creating the app it was like a clean slate for me personally, because I come from a communication background, I went to a design school, so we were kind of aware of user-centered design practices and the impact of design on just about everything. Be it a cup, a brush, or even cars, my design school professors would always talk about how design can make an impact on just about everything, including, you know, a survey form. And all of that learning sort of rushed back when we started to create this app. I started to question myself, ‘What would I want to see in the kind of technology we see out there? And how can we do it differently?’ So a little bit about me: I’m also an activist, I have a background of working on female genital cutting in Asian communities for the past ten years, and oftentimes as a visual communicator, as a filmmaker, I would get frustrated with the use of blood, blades, really graphic imagery - think of black and red and those types of colours - imagery that emphasises the trauma of the practice without really thinking about how the survivors receive it. What kind of impact would such a visual have on someone who may have undergone the practice of female genital cutting? So, like I said, it was a clean slate. I wanted to re-look at practices in terms of UI/UX but also in terms of how an app can feel more human; how technology can feel more human through the words that it chooses to throw at the user. And at some point I’d also like to debate on your platform why we are calling our users ‘users’. Why not ‘participants’? So just about questioning everything, that’s how we began, starting from the UI/UX, the way it was created, the language that is used, and also the colour toning and visuals, especially to communicate heavy, sensitive subjects like gender-based violence.

KERRY MACKERETH (4:40):
That’s so fascinating, and thank you so much for sharing, you know, the journey behind this app and the design choices that you made; the, sort of, problems that you’re trying to address when it comes to the other design choices made in this field. And so I’d love to hear more about your choice to make an app in the first place. So, what opportunities did you think that the app offered you for addressing these forms of gender based violence and harm? 

PRIYA GOSWAMI (5:03):
That’s a great question, why an app in the first place? Why not an Instagram page, right? Why not a Facebook page? And we already did that - I’m very fortunate to be a co-founder of an international non-profit with four other women, and we do have a very successful storytelling platform inviting stories from people from the community, survivors, men, women, to share their own stories and, you know, basically place the narrative, the power of sharing their stories, in people’s own hands - you know, bringing about ground-up change. So why an app? Why not a social media page? It’s been a very interesting journey; for me personally the movement started with a documentary called A Pinch of Skin. I was very young when I made it, I was still in college, and I remember my professors told me that this film could not get made, because you’re a documentary filmmaker and you will not get visuals. If no one is willing to come on record and share their name in an article, forget asking anyone for a video bite or an audio bite. I found that challenge also a kind of design challenge, and particularly interesting. This was way back in 2011, when no one from the community was willing to speak about the subject openly. Any names, anything at all leaked out, could have had very real consequences for people’s lives. So when I made A Pinch of Skin it was shot entirely with anonymous narrators, with hands, feet, abstract visuals, the fluttering of curtains, you name it; full of abstractions. From there, in 2015, when we co-founded an international non-profit, it was such a significant leap, where men and women from the community were willing to post their selfies. Now we think of selfies as these really - I mean, what is a selfie, right? - such a flippant thing, for lack of a better word.
I cannot tell you how powerful selfies felt at that moment, when we were just showcasing somebody taking a selfie holding a placard saying, ‘I’m from the community, I don’t believe in the practice’. There it was. People coming out, using their phone cameras - I get goosebumps when I say this. You know, just really putting their voices out there, dissenting voices out there. But things started to shift. We often forget that when we put our voices out there on social media, the social media pages are monitored by other people, people who are watching you, maybe, and social media pages are watched by the platforms themselves. So, say you put out your selfie on Facebook or Instagram: Facebook or Instagram is watching you, and people are watching you and your behaviour on Facebook or Instagram. So that really hit home hard, that a lot of people from the community started to feel like they couldn’t comment, or even like, or even heart, a picture, because they couldn’t be seen overtly showing their support for the movement. But they would write to us. So although there’s a lot of power even in anonymous narratives, we started to think, how can we give power to the people and, you know, at the same time protect their privacy? That was, I would say, kind of the idea behind what an app can do that a social media page can’t. And then something fantastic happened. My co-founder Aarefa’s sister said, ‘Oh you know, these days things are really advancing, why not use AI?’ And Aarefa said, ‘Okay, I am not a technology person, but let’s maybe put something out there with AI.’ We got through, and to let you in on a secret, Aarefa gave me a panic call: ‘We’ve got the grant! It’s an AI-driven app! And we want you on board.’ And I was like, yes, okay, wow, we’ll do this.
And I think that’s when the penny sort of dropped: how important it is to have, rather to create, a private space for the users. What does it mean to have the privacy of a palmtop device? What could that afford our users that, say, a social media channel hasn’t already? And so on and so forth. So the app really came from the medium itself - somebody using it in their bathroom, for instance. You can really use an app in your bathroom and no one would know. So, pretty much from the medium itself.

KERRY MACKERETH (9:41):
Thank you so much for sharing that with us; it’s so fascinating to hear about the long history of activism that then informed the kind of product that you wanted to make, to allow people to have these really difficult conversations. I’m really interested in the question of privacy, and this private/public interface or relationship that you’ve outlined. So what kind of design decisions did you have to make in order to ensure that your app didn’t risk putting its users - or rather, participants, as you’ve quite rightly identified - in harm’s way, or producing unexpected or unintended negative outcomes?

PRIYA GOSWAMI (10:17):
I think this is where I had the most fun as a designer. It was just such a challenging project, on so many levels - I’m not even going to scratch the surface of what a communications designer is doing with technology, that is a whole other conversation, maybe we can touch on that later. But how do you approach technology from the perspective of an activist who deeply understands the pulse of the community, who deeply understands the ramifications it could have on the lives of actual people, people you know by their first names, whose entire families you know? So I would start by saying yes, there is a private/public discourse happening here. When we decided that we would have this app, the first challenge at hand was how do we invite the users - how should they be logging into the app - and this debate went on for six months. We had a pre-pilot, and then we improvised on our pre-pilot. There were no straight answers, right? On the one hand, we wanted our users to be able to log into Mumkin entirely anonymously. We didn’t want to have their email addresses, phone numbers, nothing. On the other hand, we were acutely aware of intimate partner surveillance. We were acutely aware of how phones are actually devices that can be monitored by intimate partners and extended family members, and we did not want anyone else to have access to the app, right? No one should be able to know what the conversations in the app look like. So we took the difficult decision of going with a username and password.

And at that step, we were appalled to find the very many choices we had, which were so implicit, right? So the tech contractors asked us, ‘Do you want the user to share their phone numbers?’ I remember feeling like, uhh, no? ‘Do you want access to their cameras, folders’, this, that? No. ‘Do you want to track?’ I was like, no. Why is that even a thing? And why is it so easy that just anyone can make that happen? And by anyone, I mean anyone - our level was fairly rudimentary in 2019, we were just beginners, and those were some of the choices in our hands: phone numbers, city location, you name it. These choices were appalling, and they warranted some studying and understanding and basically going back to the drawing board for both Aarefa and me, and we did just that, we informed ourselves. I dived into a lot of literature to realise, ‘No, this is not okay’. And we decided that the framework in itself is quite faulty, because why should these choices be laid out in the first place? Well, we took the hard call of having the username and password. What I didn’t know at that stage was that this login step would become a kind of hindrance to getting more users from the community who were interested in trying out the app, because they would think that their data was being collected. So then we started to put a whole lot of discourse out there that we don’t take your data, we don’t sell your data, no one has access to your data. And so that is the word out there. And we started to find ways to push our privacy policy out there more, so we started putting up Instagram posts and things like that. But more on that later. I think another thing that I would want to talk about here is the quick login access through Facebook or through Google. That’s such an easy step, right? You want someone to have a username and password, and just with a button you can have a Google login or a Facebook login.
I’m still trying to learn what happens when someone has a Google login or a Facebook login. And from what I know, and from what I have seen in the back end, I would not want anyone to use a quick login - even at the cost of losing users, the users who might find it extremely tedious to have a username, password and captcha. Let’s just put it that way.

KERRY MACKERETH (15:01):
Yes thank you, you raise such important questions around privacy and data and where people’s really sensitive personal data is actually going. I was also wondering a bit more about Mumkin; how did you embed people’s real life experiences into the app and into the design of the app? 

PRIYA GOSWAMI (15:17):
We obviously did not literally lift people’s experiences into the app, but a decade of work on a particular subject definitely prepares you well enough to know that no two people will experience the practice of female genital cutting in the same way. What I would say is in a way unique to Mumkin is that we allowed as many responses as possible on the practice itself. So for example, if a survivor feels she underwent the cut but that it did not have any impact on her, that option is also there for the users to choose from. Of course we want to make a very strong case that the practice is a form of gender-based violence. But we also want to represent as many subjective voices on the subject as possible, because ultimately, it's a human experience. What actually fed into it was information gathered in different forms: anecdotal evidence, stories, audio data, research, and so many, so many conversations with people, for us to know exactly how the practice impacts someone - impacts the survivor, impacts someone who's the husband of a survivor, impacts someone who is perhaps a family member who did not even know that this happened to their daughter. We've been through those. And it also prepared us to create a floor plan, right? It's an AI-driven app, and we want there to be a conversation. So who are we having this conversation with? In our, you know, understanding of the subject, we realised that most people want to take this conversation back to their mothers; they want to initiate this conversation with their mothers if they have any kind of unresolved issues, where the mom got it done on them and they think it was a big deal. And that's also kind of heartbreaking, right? Like, how do you tell your mom that you feel violated by her?
And we have so, so many cases like that, and also with an intimate partner, say, your husband, with whom you might not have the words to initiate a conversation.

ELEANOR DRAGE (17:37): 
You brought up this really crucial point of allowing for a variety of experiences in this context, and I think that this is an expression of how brave and wise this app is. And that idea of not dominating an app with an overarching message is really important in the feminist context, and something that we thought a little bit about when, for the project I worked for before, this European Commission project on the production of cultures of gender equality, we created a ‘feminist app’ (in inverted commas), because it's incredibly difficult to appeal to all kinds of feminism. So what we wanted to do was generate lots of different quotes that expressed how varied feminism is. So I wanted to know, before we talk to you about how feminism factors into your app, can you tell me what feminism means to you, as a self-identified South Asian feminist?

PRIYA GOSWAMI (18:37):
My heart just dropped [laughs]. I've lived, I don't know, I would say I've lived feminism all my life. For me, it's really not an exaggeration to say that I was born angry. I think it started with a really, really serious fight with my mother when she called me impure at the age of 11, when I got my first period. I was an 11-year-old kid, a very recalcitrant kid who wanted to play Holi, the Indian festival of colours. I just wanted to be dunked in a well by the boys, and my mom is like, ‘No, you're bleeding. You can't. You're impure now’, you know. And all I remember is I fought with her. I really, really fought with her - how dare she use the word impure? And then I realised very quickly that this is an attitude: she really does think that a girl is impure when she is menstruating. And that kind of defined my other inquiries. Of course, I didn't call it that back then. It defined everything else I did - I wrote a play, and…

Well, what does feminism mean to me? Feminism is just my way of life. To me personally, feminism is also the only way I know how to live. And however cliché it may sound - and the more I see, the less I know on this too - if it is not intersectional, it's not feminism.

ELEANOR DRAGE (20:12):
Yup, exactly. That's the premise of Kerry's and my work. We try to retain that intersectional focus in everything we do. We had a bit of a chuckle when you said that you were born angry. I think that's something that lots of us can understand. I think it's being born aware. And my mum, a child of the 50s, still writes ‘C’ for curse in her diary when she gets her period, which I always find really shocking. But of course, that's such a normal thing for women of that generation. So how do feminist ideas shape how you conceived of and designed Mumkin? And, to flip that question slightly, what do you think a study of apps like Mumkin can do for feminism?

PRIYA GOSWAMI (20:58):
If you don't mind, I'm going to take that as two different questions. And yeah, I hope I don't lose my train of thought. How did my feminism, our brand of feminism, define Mumkin? Well, to start off, we realised nothing about technology is feminist, or very little about technology is feminist. We realised this when somebody was first pitching some ideas to us, in a very, you know, male coding space. They brought a presentation to me which said, ‘Hey, Jane Doe, log in’, and I kind of flipped. I'm like, Jane Doe? Sorry? Jane Doe is, of course, you know, that obscure name persona - I associate it with Netflix documentaries on crime, when a girl goes missing, or something happens to her. And I was like, I don't want any of our users to be addressed as Jane Doe, not now, not ever. And that was day one of creating Mumkin. So I think it starts with the language. There isn't enough feminist representation in technology. And I'm trying to think of how else I can talk about it - that was one personal anecdote, yes. Just last week, we were uploading a post on Facebook from our Mumkin page, and we realised that there's a category that Facebook now has on tags: if you have a social project, tag your project. There was wellbeing, there was health, there was mental health, there was water, there was sanitation, there was just about everything - except for gender equality. I mean, as a tag in Facebook, how did Zuckerberg and his team miss it? After, you know, so much discourse, so much criticism against Facebook?

So, at the very systemic level, we strongly feel that technology is really written, really written from a male perspective. We wanted to change that - just these small, minor changes, right? This morning, I was designing a form for Mumkin. And I realised that I'm not going to give option A as ‘man’; option one would be ‘woman’, why not? Just those very, very basic layouts - A and B, they tell you something about what the default setting is, what option one is, you know. So let's flip that, let's do something else: option B, woman. And I was making this form for a bunch of hackers, and I knew they would be like, ‘Oh, where is my, you know…’ - like, ‘I don't think you got this’. It's option two, deal with it. You're gonna have to, yeah, deal with it. So yeah, structurally, user-interface wise, the way user interfaces are written at every step of the way, we had this acute feeling that it's not written from a female perspective. And we wanted to change that. Hopefully we represented that somewhat in the design; there's a long road ahead. Every day we learn something new, and we want to go back and make that change in Mumkin as it is now. And Eleanor, what was the second question?

ELEANOR DRAGE (24:17):
So we're also thinking about how we can develop feminism, what we can do for feminism through the study or the creation of these amazing apps. 

PRIYA GOSWAMI (24:28):
I think the best way Mumkin could contribute to feminism is to really, really challenge the idea that an app for women needs to be a period tracker or a pregnancy tracker - or a shopping, oh God, a shopping website, you know, app. Those are not the only kinds of apps women want. Women might want something else altogether. Like, I don't want another period-tracking app. I don't want another pregnancy-tracking app. I just wrote that somewhere, too. And I think that is one way - the first way, I would say - that studying apps like Mumkin could, you know, expand feminism.

KERRY MACKERETH (25:07):
Priya, thank you so much. This has been absolutely phenomenal. And it's really such a privilege on our part to get to listen to you talk about all the amazing design choices you made, the hugely important political project behind the app. And I so appreciate all the thought and the care that you've put into this whole process from the multiplicity of stories that you aim to share and honour in your app through to the design choices that you've made around the visualisation of the app, and making sure that this isn't something that entrenches or reproduces the trauma of survivors. So we both just want to say thank you so much. It's really been such a pleasure.

PRIYA GOSWAMI (25:41):
Likewise, thank you for having me. It's really special to talk about it here.