252: Designing For Safety with Eva PenzeyMoog

September 29th, 2021 · 1 hr 45 secs

About this Episode

TRIGGER WARNING: Domestic Violence, Abuse, Interpersonal Safety

01:26 - Eva’s Superpower: ADHD and Hyperfocus

08:19 - Design for Safety

12:45 - What Engineers Need to Know

  • Control/Shared Accounts
  • Surveillance
  • Location Data

15:02 - Expanding Our Understanding of What “User” Means

  • “User as an abstraction.”

20:43 - Parallels with Security

  • Personas / Archetypes
  • Adding Layers of Friction
  • Ongoing Arms Race

22:23 - Spreading Awareness Across Teams Focused on Feature Delivery

  • Safety Designers as a Specialized Role?
  • Generalists vs Specialists; Literacy vs Fluency
  • This Book Is For Everyone: Engineers, Designers, Product Managers, etc.

31:38 - Thinking Beyond The User

35:25 - Traditional Design Thinking Protects White Supremacy

40:21 - Putting Ergonomics, Safety, and Security Behind Paywalls

  • “Ergonomics is the marriage of design and ethics.”
  • The History of Seatbelts
  • Government Regulation
  • Worker Organizing

45:58 - Tech Workers and Privilege

  • Overpaid/Underpaid

Reflections:

Mandy: Inclusive and accessible technology includes people experiencing domestic abuse.

Damien: If a product can be used for harm, it will be.

Coraline: How systems are weaponized against marginalized and vulnerable folks.

The internet is good for connecting people with shared experiences but we’re breaking into smaller and smaller groups. Are we propping up systems by taking a narrow view based on our own experiences?

Eva: Who didn’t teach you about this?

It’s our job to keep ourselves safe in tech. Tech companies need to take more responsibility for user safety.

This episode was brought to you by @therubyrep of DevReps, LLC. To pledge your support and to join our awesome Slack community, visit patreon.com/greaterthancode

To make a one-time donation so that we can continue to bring you more content and transcripts like this, please do so at paypal.me/devreps. You will also get an invitation to our Slack community this way as well.

Transcript:

MANDY: Welcome to Greater Than Code, Episode number 252. My name is Mandy Moore and today, I'm here with Damien Burke.

DAMIEN: Hi, and I am here with Coraline Ada Ehmke.

CORALINE: Wow. I actually showed up for once. [laughs] I'm very happy to be with y'all today and I'm very excited about the guest that we have today.

Her name is Eva PenzeyMoog and Eva is a principal designer at 8th Light and the author of Design for Safety.

Before joining the tech field, she worked in the non-profit space and volunteered as a domestic violence educator and rape crisis counselor. At 8th Light, she specializes in user experience design as well as education and consulting in the realm of digital safety design. Her work brings together her expertise in domestic violence and technology, helping technologists understand how their creations facilitate interpersonal harm and how to prevent it through intentionally prioritizing the most vulnerable users.

Eva, I'm so happy to have you here today. Hi!

EVA: Hi, thanks so much for having me. I'm so excited to be here.

CORALINE: So if I recall correctly, and it has been a while so Mandy, correct me if I'm wrong, but I think we open with the same question that we've been opening with for 251 other episodes. Eva, that is: what is your superpower and how did you discover, or develop, it?

EVA: Yeah, so my superpower is my ADHD, actually [chuckles] and specifically my ability to hyperfocus, and I didn't really recognize it and start to harness it until the age of 25, which is when I was diagnosed.

For people who don't know, hyperfocus is basically exactly what it sounds like. It's a state of very intense focus that people with ADHD will sometimes go into. It's not something you really have control over, it's not something you can just turn on, or off, and it isn't necessarily good, or bad.

But for me, I'm really lucky because it often gets triggered when I start to code. As I was learning to code, and then when I switched over to focusing on design and frontend work like CSS and Sass, it got triggered all the time. So I can sit down and code and oftentimes, hours have gone past, and so long as I don't miss any meetings, or forget to eat, it's totally a superpower.

CORALINE: That's amazing.

I've talked about this before: I live with bipolar disorder and I tend to stay in a low-grade manic state as my resting place, and I experience very similar things with that hyperfocus, just losing hours on a task. Sometimes it's very positive and I get a lot done and sometimes, I'm like, “What the hell did I do?” [chuckles]

EVA: Right.

CORALINE: But I think it's great that—I've been talking to some other folks with ADHD, with bipolar—the judo moves we can do takes something that really negatively affects us in a lot of ways and finding a way to turn it around, like you said, and use it as a superpower. Those are the strategies we develop when we live with things like this and I'm always happy when people have figured out how to get something good out of that.

EVA: Yeah, totally, and realizing that you have this thing that happens. Because I'm sure it's been happening my whole life, but I didn't recognize it, or understand it, and then just being able to name it and see that it's happening is so powerful. And then to be like, “Oh, I can maybe do certain things to try to get into it,” or just being aware that it's a thing is very powerful.

CORALINE: I'm kind of curious, Eva, if you don't mind us talking about ADHD for a little while?

EVA: Sure. Yeah.

CORALINE: Okay. I have a friend who is – actually, a couple of friends who were very recently diagnosed with ADHD and they had so much trouble in the traditional tech workplace, especially working for companies that have productivity metrics like lines of code, or number of commits, or something like that. It was really difficult for both of these friends to operate in an environment where you're expected to have very consistent output day over day and not having accommodations, or not having the ability to design their work in such a way that maximizes the positives of how they work and minimizes the negatives of how they work.

Is that something you've struggled with as well?

EVA: Yeah, and that's so unfortunate for your friends because, like I said, I feel like it is a superpower and most workplaces should be trying to harness it and understand that you can have really, really awesome employees with ADHD. If you set them up for success, they can be so successful.

But it is something – so I've only ever worked at 8th Light, actually. When I was interviewing over 5 years ago now, trying to find my first job in tech after doing a bootcamp, I interviewed at a couple different places and none of them felt super great. But obviously, I was just really eager to get my first job.

But then I went into 8th Light, and 8th Light was one of the places where I really, really did want to work and was really excited for the interview. When I got to the office, it was very quiet; there was an open workspace, but people were working very quietly and there were lots of rooms.

I got into that and I was like, “Oh, thank God,” like, this is exactly the space I need. I can't handle too much activity. I can't handle offices where they're actually playing music; that type of thing is my nightmare and I don't actually like wearing headphones all day. That's not just an easy fix for me and for a lot of people with ADHD.

So I felt like right away, now I want to work here even more, and I've been really lucky that it's been a really good setup for someone like me to work in, and I have gotten some accommodations, which has been good. I feel like if a company doesn't give accommodations, they're breaking the law; they need to do that.

DAMIEN: This is really, really validating because I've had similar experiences of that. Even just this morning, I was in the code and I had no idea how much time was going by and I had no awareness of anything else. That's possible because of the environment I work in. Whereas at previous jobs I've had, with bullpens and just open office plans, I was incredibly miserable and I didn't understand how people could get any work done in those environments. So just this understanding of how people are different; some people thrive in certain environments and other people thrive in others.

EVA: Yeah. So have you always worked from home, or has this been a pandemic thing?

DAMIEN: This has been probably about 10 years. Yeah. [laughs] I went home and never left. [laughs]

EVA: Nice. [chuckles]

CORALINE: I've done something very similar. I started working from home, I think in 2015, and not for a great reason, but I found the exact same thing that you're talking about. Like, I am very sensitive to my environment. I use music to control my mood and like you, Eva, I hate headphones. So I do wonder, you mentioned accommodations and the legal perspective on that. In Illinois where – Eva, you live in Illinois, too. Are you local for 8th Light?

EVA: Yeah. I live in Chicago.

CORALINE: We have at-will employment here and it's really easy to discriminate against folks on multiple axes rather than providing accommodations. With at-will employment, they can just let you go and you have no proof that it was because they're ableist, or racist, or transphobic, or whatever.

EVA: Oh, yeah. That's so rough. Pritzker's got to get on that. Our governor. [chuckles]

CORALINE: So do you want to tell us a little bit about the book that you just wrote? I understand a lot of people are finding a lot of value in it and it's really opening their eyes to issues they maybe weren't aware of.

EVA: Yeah. So my book, Design for Safety, came out in early August and it's been really great to see people's reactions to it. I got my first formal book review, which was really cool and it was overall very positive, which has been very exciting.

I'm hopeful that it is helping people understand that this is a thing, because I feel like it's different than a lot of other problems. Someone else explained this to me recently and I had this light bulb moment: I'm not providing a solution to a problem people already know they have, like they know their tech is used for interpersonal harm and now I have a solution, here's this book that's going to tell you how to fix it. It's more that people don't even know that this is a problem.

So I'm educating on that as well as trying to give some of the solutions on how to fix it. It has been a lot of people just saying, “I had no idea about any of this. It's been so eye-opening and now I'm going to think about it more and do these different things.” So it's been really great to see that people's awareness is going up, basically.

MANDY: I really like, on the website, there's a sentence, a pullout quote, or I'm not sure if it's even a pullout quote, but it says, “If abuse is possible, it's only a matter of time until it happens. There's no might, so let's build better, safer digital products from the start.” I like that.

EVA: Yeah, thanks. I was very intentional and, well, this goes back to when I was doing a conference talk. Before I wrote the book, I did a conference talk called Designing Against Domestic Violence and I thought a lot about the type of language I should use: should I say might happen, or should I say will happen? I eventually settled on it's going to happen even if it hasn't happened yet, or oftentimes, I think we just don't know that it's happened.

People who have gone through domestic violence, some of them will talk openly about it. But most people just don't, which makes sense. It's this really intense, personal thing to go through and there's so much judgment and survivors get blamed for all these things. So it makes sense that people don't want to talk that much about it. I ended up thinking we just need to say that it will happen.

DAMIEN: That's amazing. So I really want to know everything about this book, [chuckles] but to start with, you said the book is Design for Safety and you mentioned this a little bit with domestic violence and abuse. Can you talk about what sort of things you mean to be safe from when you say safety there?

EVA: Yeah, for sure, because I know safety is a big word that can mean a lot of different things. But the way that I'm talking about it in my work is in terms of interpersonal safety. So it's: how is someone who has a relationship with you in an interpersonal way going to use technology, weaponize technology, in a way it was not meant to be used? We aren't designing tech with these use cases in mind, but how is it ultimately going to be weaponized for some type of abuse?

Domestic violence is really the emphasis and my big focus and, as was mentioned in the intro, I have some background in the domestic violence space. But there are also issues with child abuse and elder abuse, especially in terms of surveillance of those groups, as well as surveillance of workers, which is another thing that came up a lot as I was researching that I didn't get as much into in the book. But it's basically anytime there's an interpersonal relationship and someone has access to you in this personal way where you're not just an anonymous stranger: how is tech going to be used to exert some form of control, or abuse, over that person?

DAMIEN: Wow, that is a very important subject. So I'm an engineer who doesn't have a lot of knowledge about interpersonal violence, domestic abuse, anything of that nature, and I know you've written a whole book [laughs] and we only have an hour, or so, here, but what are the first things that people, or engineers, need to know about this?

EVA: Yeah, so I think the first thing is to understand that this is a problem and that it's happening, and to go through some different examples of how this happens, which is what the first couple chapters of the book are all about. There are different forms of this interpersonal abuse via technology: shared accounts is a really big one, with this question of who has control and nebulous issues of control. Surveillance is another really big one, and then location data as well.

So I guess, I don't want to say, “Oh, just read the book,” but learning a little bit about the different – there are so many different examples of how this works. Just to start to build that mental model of how this happens: someone taking advantage of certain affordances within shared bank account software, or someone using an Internet of Things device to gaslight someone, or torment them.

There are so many different examples. Location data shows up in all sorts of really sneaky ways in terms of stalking. It's not purely putting a tracker on someone's car; even Google Maps and sharing your location is a more straightforward thing. But it also shows up in other ways, like a grocery store app that has a timestamp and location. You can learn someone's grocery shopping habits, and maybe you're estranged from this person, or they've left you because you're abusive, but they don't know that their stuff is showing up in this app with their location data. So it shows up in all sorts of different ways.

This is a very long way to answer your question, but I think the first thing is to start to understand how this stuff works so that you're just aware of it, and then from there, I have a whole chapter about how to implement a practice of designing for safety at your company. It is a little more design focused, but I think engineers can absolutely be doing this work, too. Even if it's just quick research on how any product with any type of messaging feature is going to be used for abuse; there's lots of literature out there. So just looking at some articles, thinking about ways that aren't covered already, having a brainstorm about what are some new ways this might be used for abuse, and then thinking about how to prevent them.

CORALINE: One of the things that I was thinking about after reading your book, Eva, is at a meta level, or zooming out a bit: I think in a lot of the ways that we design software, we have this idealized and homogenous notion of a user. I think that in a lot of cases, especially if you're working on a project that's more, or less, one of those scratch-your-own-itch problems, you tend to think of yourself as the user.

It's great to have that empathy for the end user, but what we don't have, I don't think, as a field, is an understanding that user is an abstraction, and it is a useful abstraction. But sometimes you need to zoom down a little bit and understand the different ways that people want to use the software and will use the software, and what makes them different from this average idealized user.

That was one of the things that really struck me, especially from the process you were describing: expanding our understanding of what user means and anticipating the different use cases with hostile users, with actively abusive users. I think thinking of the abstraction is super helpful, but I feel like sometimes we need to zoom down and think differently about really who the people are and what their circumstances might be.

EVA: Yeah. Oh man, I just wrote down what you said: user is an abstraction. That's such a good way to think about it that I haven't heard before, but you're absolutely right that it's encapsulating such a big group of people. Even for a small product, something that's not like Twitter that's open to billions of people, even something that's a subscription, or something that's going to have a smaller user base, there's going to be such a diverse, different group within there, and to just think of the term user as a catchall is definitely problematic.

Sorry, I'm just processing that term, user is an abstraction, because we use it so much as designers, definitely.

CORALINE: Yeah.

EVA: And anyone in tech is always using this term, but problematizing that term in a new way is really interesting to me.

And I think my other thought about this is that we talk a lot about needing to think about more than just the happy path, and I feel like even that, at least in my experience, has been about other things that are also very important, like, let's think about someone who has a crappy Wi-Fi connection, or someone who's low vision. There are all these other very important things to think about in terms of accessibility and inclusivity.

I think I see what I'm doing as just adding another group into the mix: let's think about people who are currently surviving domestic violence, which is maybe a little bit harder to bring up than those other two that I mentioned because it's just so dark and it's something that we just don't want to have to think about, or talk about, during work. It's just such a bummer, but it is really important to have this new group added when we're thinking about inclusive and accessible tech.

DAMIEN: There's a really great parallel here, I think, with security-minded design and research. Again, that's another user who is not behaving in the happy path, who's not behaving the way your normal users are behaving, and you have to design your system in such a way as to be resilient to that.

So I love this user as an abstraction, then breaking it down in all these ways. And then also, there's a huge value to diversity on your team with this sort of thing.

CORALINE: Absolutely.

DAMIEN: You can understand the very different types of users by having people on the team who can understand blackhat users who are going to be trying to use your servers to mine Bitcoin, or [laughs] blind users, low vision users, or colorblind users, for goodness’ sake. And then in addition to that, people, again, who are experiencing domestic violence, or other forms of interpersonal abuse, and just being able to understand all those users and their experiences with the things you're building and designing.

EVA: Yeah, definitely those are all really good points.

Just going back to what you said about the parallels with security: that's something I've actually been thinking about a lot, because I think there are lots of parallels there, useful things about how security professionals think about their work and operate.

Especially, the big one for me right now is thinking about how a security professional is never going to be like, “Okay, we did it. Our system is secure. We're done. We have arrived.” That's not a thing and I feel like it's very similar with designing for safety, or even inclusion. I feel like we've had a mental model of “I can think about these things, I can check these boxes, and now, my product is inclusive, or my product is accessible.”

I feel like we should be thinking more like security professionals: there are always going to be more things, and we always have to be vigilant about what's the next way that someone's going to misuse tech, or the next group that's going to be identified that we've totally left out and is being harmed in some way. So I think that's just a useful shift that I'm thinking a lot about.

CORALINE: And Damien, I'm so glad you brought up the parallels with security. I was actually going there as well.

One of the things that I've been thinking about from an ethical source perspective is that, in security, I think there are two tools that would be super useful. First of all, personas and secondly—I guess, three things—understanding that safety can be a matter of adding layers of friction to disincentivize abusive behavior and, like you said, recognizing this is an ongoing arms race. Every new feature that you design opens up some kind of attack, or abuse, vector and if you're not planning for that from the outset, you're going to be caught later when harm has been done.

EVA: Yeah, absolutely. Since you brought up personas, there is something in the process that I created that's a similar tool, where I call them archetypes because they're a little different from personas. But it's identifying who is the abuser in this scenario, who is the survivor, and what are their goals, and that's basically it; we don't need to get into anything else, I don't think.

But just articulating those things and then even having a little printout, kind of similar to the idea with personas, like, oh, you can print them out for your sales team, or whoever it is, to keep these people in mind. A similar idea of just having them printed out on your wall so that it's something that you're thinking about, like, “Oh, we have this new feature. We probably need to think about how this abuser archetype that we've identified would want to use our product to find the location data of their former partner,” whatever it is.

CORALINE: Yeah.

EVA: Use this.

CORALINE: From a mechanical perspective, Eva, one of the challenges I had at GitHub when I was working on community and safety is that the other engineers and the other groups were creating so many new features. I felt like the knowledge about how a feature can be abused, or like you said, will be abused, wasn't spread very effectively throughout, especially in a large software organization, and it fell on a small team of folks who frankly were not consulted. A feature would go out and we'd be like, “Holy crap, you can't do that because of this, this, and this.”

So do you have any thoughts? I know you said print it out, or put it on the wall, but do you have any thoughts on how to spread that awareness and that mode of thinking across teams who frankly may be very, very focused just on feature delivery and will see any consideration like that as slowing them down, or having a negative impact on “productivity”?

EVA: Yes. I have many thoughts. [chuckles] So this is bringing up something for me that I've struggled with and thought about: should there be specialized teams in this area? I feel like yes, we want people with special knowledge, and experts, and that's really important, but also, I feel like the ideal scenario is that it's just everyone's job.

CORALINE: Yeah.

EVA: Where those teams are already doing these things and it isn't seen as, “Oh, Coraline’s team is going to come in and now we have to consult with those people because it's not our job, it's their job.”

CORALINE: Yeah.

EVA: Which, this maybe isn't a very satisfying answer to your question, because I feel like it involves a huge shift in the way that we think about this stuff, but it is something I've thought about in terms of: should I call myself a safety designer? Is that something I want to do? Do I want this to be a specialized role? Maybe is that a goal, where people start to see that? Because there are people who specialize in inclusive design, or accessible design.

But then the downside of that is, does that just give someone else even more leeway to be like, “Not my job, I don't have to worry about this”? And then we have the problems like what you just described. I don't know, I feel like it's such a big shift that needs to happen.

CORALINE: Yeah. One of the models I've been thinking about, and I was thinking of this in terms of generalists versus specialists, is generalists, or to map that to the domain that we're talking about now, the other engineers in your group, or in your company. I feel like there has to be a balance between specialization and general knowledge. The way I describe that is everyone should have literacy on a particular topic, the basic vocabulary for it, and a general knowledge of the concepts, augmented by a specialist who has fluency. So kind of a dynamic relationship between literacy and fluency. Do you have any thoughts on that?

EVA: I love that. I'm literally writing that down.

A generalist with literacy and a specialist with fluency is such a good way to think about it because I feel like I do say this: I don't want people who read my book, or see my talk, to think, “Oh, I have to be like her, I have to learn all this stuff. I have to really dig into how domestic violence works and what it means and the laws.” I don't want people to feel like they have to do that because it's just such a dark, heartbreaking thing to have to think and read about every day and I don't think that's a realistic goal. But I think being a generalist with literacy is realistic, augmented by a specialist with fluency; I'm basically just repeating what you just said.

[chuckles]

But that's just a really brilliant way to think about it.

DAMIEN: That pattern actually really matches something that I learned from another Greater Than Code guest. I'm sorry, I can't remember their name right now. I believe we were talking about inclusivity and what they said was, “It's not the expert's job to make the product, or the company, inclusive. [chuckles] It's everybody's job to make it inclusive; it's the expert's job to be an expert and to support them.” We also use, again, a metaphor from security. We don't have security experts whose job it is to make your app secure, we have security experts whose job it is to support everybody in keeping your app secure.

CORALINE: Yeah.

DAMIEN: So I feel like this matches really well. The job of the person with this expertise is to support, to educate, to guide, because they can't do all the work all themselves, like Coraline said. There are just too many features being added [laughs] for some team somewhere to go, “Oh no, this is fine,” or “That's not fine.”

EVA: Yeah, totally, and I feel like that just brought up something for me, Damien, about the speed at which we work, too many features being added, not enough time to actually do this work, and how – this is getting at just a way bigger critique of tech in general.

DAMIEN: Yeah.

EVA: But it's okay to slow down once in a while. I feel like the urgency thing causes so many problems outside of just what we're talking about. But this is another big one where I feel like it's okay to spend an afternoon thinking through what are the ways this is going to be not inclusive, or unsafe, and that's totally fine. But I fall into it, too, where I'm like, “I want to deliver things quickly for my client,” or if I'm doing something internal for 8th Light, I want to get it done quickly. I don't want to hold people up. So it is a really hard thing to break out of.

CORALINE: It seems to me, Eva, that this kind of knowledge, or this kind of literacy, or this kind of making it part of the process can't fall solely on engineers. Because in a lot of places, we have product managers who are setting deadlines for us. How do you communicate to them why this work is so important when they may only see it as, “Well, you're getting in the way of us hitting a release date and we have a press release ready,” or “We want to debut this feature at a particular time, or place”?

MANDY: And now we want to take a quick time out to recognize one of our sponsors: Kaspersky Labs:

Rarely does a day pass where a ransomware attack, data breach, or state sponsored espionage hits the news. It's hard to keep up, or know if you're protected. Don't worry, Kaspersky’s got you covered. Each week, their team discusses the latest news and trends that you may have missed during the week on the Transatlantic Cable Podcast mixing in humor, facts and experts from around the world. The Transatlantic Cable Podcast can be found on Apple Podcasts & Spotify, go check it out!

EVA: Yeah, totally. So I think ideally, this comes from everyone. My book is called Design for Safety, but I really hope that people are reading it who are also engineers and who are also project managers—basically anyone who has a say in how the product is actually going to function, I think, should be doing this work.

But specifically, if you have a project manager who is rushing everyone and saying, “We don't have time for this,” I do have a couple different strategies in my book about this, where we can use statistics to show that this is a thing that is impacting a lot of our users. 1 in 3 women and 1 in 4 men in the US have experienced severe physical domestic violence, and that's just severe physical domestic violence. There's so much domestic violence that doesn't have a physical component to it, so that could be like a third of our user base.

So bringing stuff up like that to try to get some buy-in, but then also, in my process, I have little time estimates.

CORALINE: Yeah.

EVA: So saying like, “We want to do research; it's going to be 6 hours.” “We want to do a brainstorm; it's going to be 2 hours.” Giving people very specific things that they can say yes to is always going to be better than just an open-ended, “We want to design for safety.”

CORALINE: Yeah.

EVA: And someone being like, “I don't know what that means, but we have a deadline.” Saying like, “We're going to do a brainstorm to identify ways that our product will be used for harm. We want to do it next week and we want to spend 4 hours on it” is going to be a lot better.

DAMIEN: And I want to call out how important and useful the language you used there was, because when you find something, when you do that brainstorm, or whatever analysis process, you go, “Oh, here's a way our product will be used for harm.” Because if you say to a product manager, “Here's a way our product might be used for harm,” they go, “Well, okay.” [laughs] “Might not be.” [laughs] If you say, “Here's a way our product will be used for harm,” well, now that leaves a lot less wiggle room.

EVA: Hmm, yeah. That's a really good point that I actually hadn't thought about.

I think the other thing is there are tangible outcomes from something like that brainstorm, or these different activities that I have outlined. You can actually show the person, “Here's what we did. Here's what we came up with.” I wish we didn't have to always do that, always have some type of very explicit outcome from everything we do, but I do think that's a reality that we have, and this process kind of helps with that.

CORALINE: I want to go back to the user thing. Again, one of the things that we're thinking about at Ethical Source is thinking beyond the user: thinking about not just who is using the technology that we're creating, but the people that the technology we're creating is being used on.

EVA: Yes. That's such a good point. I'm actually curious, have you come up with a term for that type of user? Like nonuser?

CORALINE: I have not yet, but that's a great call out. Language is so important so, yeah.

EVA: Yeah. I don't know that it exists and I've seen nonuser, but I don't know that that's agreed upon.

DAMIEN: As far as I've gotten, the best I've come up with is constituency.

CORALINE: That is very interesting, Damien, because one of the things we're developing is a governance tool. The W3C, when they were working on the HTML standard—this was a couple of years ago, I think—they mentioned something called a priority of constituencies and this was very much from a standards body perspective, but it was one sentence and I think it is such a powerful sentence. Just for their example, they said, “In times of conflict, we prioritize end users over developers, over browser manufacturers, over spec writers, over technical purity.”

[laughter]

EVA: Wow.

CORALINE: That’s one sentence, but writing that down, I think, can really help cut through a lot of the noise and a lot of the gray areas that we most often encounter. It's so simple and you can do it in a single sentence. So absolutely, the notion of constituencies and being explicit about whose safety, convenience, or what have you, you're optimizing for.

EVA: Yeah. That's really important and I have two thoughts.

One is that this comes up a lot in the surveillance space where it's like, what sort of rights, or priority, should we be giving someone who is walking on the sidewalk in front of a house that has a Ring camera that's facing out to capture the porch, but is ultimately capturing the sidewalk and the street? What are the rights of that person, that nonuser, who has not agreed to be filmed and isn't part of this product's ecosystem, but is still being impacted by it?

It's something I think about a lot, especially since there are so many in my neighborhood. Since I wrote the book, I see the Ring cameras everywhere, including in places where they're not really supposed to be, like on the outside of someone's gate, just facing the sidewalk. You're not even recording your own property at that point. It's just the gate, or it's just the sidewalk, which I feel is very problematic.

You also said that it's important to explicitly call out who you're prioritizing and that's something – I read this book called Design Justice by Sasha Costanza-Chock, which was very lifechanging and it's just such a good book. It's a little more theoretical and she explicitly says it's not a guide, but she talks about this, about how it's really important, if you are going to choose not to be inclusive, or safe, or justice focused, whatever it is, to explicitly say, “We are choosing to prioritize the comfort of this group over the safety of this group.”

CORALINE: Yeah.

EVA: Or whatever it is. Like, you need to actually just spell that out and be upfront about it.

DAMIEN: Yeah. It reminds me of something I think I learned from Marla Compton, although I don't know if she originated it. I guess she probably didn't, but the phrase she taught me was, “We prioritize the safety of marginalized people over the comfort of non-marginalized people.” It's such a powerful statement.

CORALINE: It really is.

DAMIEN: Yeah, and just making that explicit like, “These are the tradeoffs and these are where we side on them.”

CORALINE: Yeah.

EVA: Yeah. Oh, yeah. That's such a good one.

I did this workshop recently called How Traditional Design Thinking Protects White Supremacy, and they talked a lot about how feeling entitled to comfort is just such a white supremacist thing and, I feel, shows up in different forms of oppression as well, like men's comfort, et cetera. But that's something I've been thinking about a lot: the feeling of a right to comfort and how that also includes a right to not have to have any type of conflict, and a fear of conflict. How these things all play together, how it's all part of white supremacy, and how it shows up in our culture and in our workplaces. It was a great workshop. I would highly recommend it because it's also been a lifechanging thing as I digest all of the different things from it.

DAMIEN: It's so powerful to name that as comfort.

CORALINE: Yeah.

DAMIEN: Like, this is what we're protecting. We're protecting these people's comfort [chuckles] and this is what it will cost.

CORALINE: I think about what Kim Crayton has said for years: “Get comfortable with being uncomfortable.”

EVA: Yeah, that's such a good one. I love her.

CORALINE: Yeah.

EVA: I quoted her in my book about, oh, I forget what it is. It's something about not having strategy is chaos.

CORALINE: Oh my God.

EVA: Like, the need for strategy.

CORALINE: I learned so much from her from that one statement. That was literally lifechanging for me because I always had a negative feeling about strategy, like strategy is coercive, or insincere. And then another friend of mine I was talking to about it said strategy is good when it's not a zero-sum game.

EVA: Mm.

CORALINE: I think maybe we can think about personal safety and abuse vectors in that way.

EVA: Yeah, definitely. I think the full quote is “Intention without strategy is chaos.”

CORALINE: Yeah, that.

EVA: That has definitely been very influential for me, and I feel like that idea is a big part of the reason why I wrote my book and did my conference talk: because I was feeling frustrated with – it's a lot easier to raise awareness about an issue than it is to have actual strategies for fixing it. I felt like I would always get really fired up reading something, or listening to a talk, and be like, “Yeah, this is such a huge problem. We need to fix it,” and then didn't have a takeaway, or anything that I could really do at work other than just being told to think about this, or consider this, which I'm like, “When do I do that?”

CORALINE: And what does that look like?

EVA: Yeah, you can't think about all of the different things we need to think about from 9:00 to 5:00 while we're at work every day. We need a strategy to do that, which is why I made these different activities that I have in my process.

But going back to this white supremacy and design workshop that I did, I also learned there about some other ways that white supremacy shows up: having an action bias and a sense of urgency.

CORALINE: Yeah.

EVA: And how a lot of that can come from people, especially white people, not being able to sit with discomfort when we're faced with really uncomfortable topics, and a desire to jump into action before we fully understand the problem and have internalized it.

So now I'm feeling like I need to backtrack a little bit and be like, “Yes, provide action,” but also, it is good to do deep learning. I think we need both, but I feel like for a lot of people, it's one, or the other: let's do a ton of learning, or let's jump right into action. I have always been a jump-right-into-action person and now I'm realizing it's okay to take a beat, do some deep learning, and sit with all the discomfort of the heavy topic.

CORALINE: A friend of mine gave me a concept that I like a lot. He has a definition of ergonomics as the marriage of design and ethics. When I use the term ergonomics in that sense, what I mean is how easy it is to do a particular action. One of the things that I see quite a bit—something, I think, is a terrible consequence of the web, frankly—is putting ergonomics behind paywalls and asking people who use our software to yield some degree of agency, or digital autonomy, or security in exchange for features.

EVA: Hmm. So interesting.

CORALINE: So I'm curious how you would frame designing for safety, and some of the other axes of oppression that we've discussed on the show today, from the perspective of the ethical aspect of our design decisions. What workflows are we optimizing for? What workflows are we putting behind a paywall, or in exchange for, okay, you're signing up, the [inaudible] says you're buying into surveillance capitalism, and you just simply have to do that if you want an email account, a Twitter account, what have you?

EVA: Yeah. I do feel like there is a bit of an issue with putting safety and security sometimes behind a paywall where you can literally pay more to not get advertised to, for example.

CORALINE: Yeah.

EVA: Which, it's like, I get that products have to charge money; the flipside of that is, well, we can't just work for free. I see that a lot with journalism when people are criticizing paywalls and it's like, well, but journalists have to get paid. They can't work for free, just like everyone else.

But I do feel that with things like being able to opt out of advertising, and I feel like there are other things, nothing's coming to mind right now, there are different ways that you can ease some of the crappier parts of tech if you have enough money to buy into the paid versions of things, which is definitely problematic. Who are we keeping out when we do that and who are we saying doesn't deserve this privacy and safety? What should just be standard? The seatbelt; I'm obsessed with the history of the seatbelt.

CORALINE: [chuckles] I still have the [inaudible] that's been going around.

EVA: Yeah.

CORALINE: It’s amazing.

EVA: I've talked about this in many different places, but the seatbelt used to be something that you had to pay extra for. In today's dollars, it would've been like 300 extra dollars when you bought a car to get seat belts, and only 2% of people, in 1956 when they were introduced, actually paid for them, and probably even fewer were actually using them. And then there was a revolution in the auto industry led by activists and everyday people. It definitely did not come from the auto industry; they had to be forced into these different things. But now, the government basically passed a law and said, “You have to just include seat belts as a standard feature.”

I think about that a lot in tech. The things now that we're making people pay for, should some of those just be standard features and how are we going to get there? Probably government regulation after a lot of activism and everyday people rallying against these different things with big tech. But I think we're going to get there with a lot of things and we're going to see a lot of seatbelts, so to speak, become just standard features and not something you have to pay for.

CORALINE: And I wonder, you mentioned government regulation; I have literally zero faith in government doing anything effective in the online world at all because our government is powered by rich 65-year-old white men and there's no incentive for them to care about this even if they did have the basic literacy about how this stuff works.

It seems to me one of the things that we've seen really emphasized, especially during and post-lockdown, is worker organizing, and I wonder if there's a strategy here for empowering the engineers who, frankly, are being treated as rockstars right now. I hate that term rockstar, but we're overpaid, we're pampered—a lot of folks, obviously, not everyone.

So can we leverage our power? Can we leverage the privilege of being in such an in-demand profession to affect change in organizations that have no financial incentive to think about stuff like this at all?

EVA: Yeah. So many things I want to respond to. Definitely, I think worker power is such a strong point in all of this and I feel like we are the ones leading on this. A lot of it is coming from people who work in tech and understand the issues: writing, speaking, and doing these different things to help everyday people who don't work in tech understand, “Hey, actually, here's why Facebook is really terrible.” A lot of that is coming from people in tech, even former Facebook employees.

CORALINE: Yeah.

EVA: Which is different, I think, from the paradigm shift we had with the auto industry. I don't know, I would have to look, but I'm pretty sure it was not coming from car designers, and engineers weren't helping lead that charge the way that we are.

But I also want to respond to something you said about tech workers being overpaid and pampered, which, yes, I agree with you. But I also think there are privileges that everyone should have and that no one should have to go without, and I feel like everyone deserves to be well paid, to be comfortable, and to have all these perks and whatnot.

I had a career in nonprofit before this, so I have so much internalized baggage and guilt around my pay, my benefits, and all these things: the work I do now, compared to the work I was doing in the nonprofit, which was helping kids who were basically on a road to dropping out before graduating high school. That was really important work and I made so much less money and worked so much harder. But I feel like everyone deserves to be as well paid as we are and it is possible.

CORALINE: Yes.

EVA: So I just wanted to kind of throw that out there as well. [chuckles] I feel like I'm trying to absolve myself for being a well-paid tech worker. But I do think we deserve this and also, everyone else deserves similar treatment.

CORALINE: Absolutely.

DAMIEN: Yeah. I feel the same way, especially—to take an example within a tech company—as an engineer, I get paid a lot more than customer service people.

CORALINE: Yeah.

DAMIEN: And that doesn't mean I'm overpaid, [chuckles] it means they're underpaid.

CORALINE: Yeah.

DAMIEN: A lot. [laughs]

CORALINE: Yeah.

EVA: Yeah, and I feel like this whole conversation, honestly, this is a freaking tactic. This is how the people at the top want us to feel: pitting us against each other, feeling like it's not that the salespeople are underpaid, that's normal, and we're overpaid. It's like, no, actually, we're paid a livable amount where we can live comfortably and they're exploited even more than we are.

That's how I'm trying to think about things, because I do feel like this other way of looking at it is absolutely a tactic of, I don't know, the 1%, whatever you want to call them. The company leaders would rather us feel that we're overpaid and pampered than just fairly compensated for the labor we do.

MANDY: Have us feel the shame and guilt around it, too. Before I was in tech, I went from welfare to making a reasonable standard of living in a year and sometimes, I still feel guilty about it. It's a heck of a feeling.

EVA: Yeah, and I feel like that didn't just come out of nowhere. We've been taught that we should feel guilty for just surviving. Because I think even in tech, for a lot of people, there are still so many issues with burnout, with—I don't know about you all, but my body sometimes just hurts from not moving enough during the day—there are still all these different things that could be better. But the feeling that we should feel guilty for having some comfort and decent pay, I think that's definitely a strategy that has come from these different powerful groups. It didn't just come out of nowhere.

CORALINE: I appreciate y'all pushing back on that. I guess I'm speaking from an emotional place. Eva, you went from nonprofit to tech. In April, I went from tech to nonprofit and personally, I took a 30% pay cut and – [overtalk]

EVA: Oh, wow.

CORALINE: It just really made very visible and very personal what we value as a society and what we don't value as a society. I'm still comfortable; I still have a living wage and everything. But look at what happened during the lockdown with “frontline workers.” They're heroes, but we don't want to pay them more than minimum wage.

So I definitely agree with what you're saying about other people being underpaid and I definitely hear what you're saying about that guilt, but guilt is a form of discomfort. What are you going to do with that? What are you going to do with the privileges and the power that we have as a result of the way we're treated in this industry? I feel like that's the more important thing and what do you do with it? Are you giving back? Are you giving back in a substantive way, or are you giving back to assuage your guilt? It's nuanced. As y'all are pointing out, it is nuanced.

EVA: Yeah. It's very complicated, but I feel like those—sorry, Damien, I think you said support people—getting paid more, that's something we can agitate for.

I know someone, I'll call her an online friend of mine, in the infertility space, which I'm very involved in as I go through my journey. I hate that word, but I've made all these online friends who are going through it and one of them is a paralegal and she is hoping, although it's not going well, to get pregnant. But she was looking into the parental benefits and realized that the lawyers where she works had, I think, 18 weeks fully paid leave and then everyone else got this weird piecemeal of 6 weeks paid, then there's FMLA, and then there's PTO, and all this stuff that amounted to a lot less, and you had to use all of your PTO and all these different things. She actually was able to—with some of the lawyers' help, I believe—get that policy changed so that it was just the same for everyone because it was like, “I didn't go to law school. So therefore, I don't need as much time with my newborn? How does that make sense?”

CORALINE: [chuckles] Yeah.

EVA: So I feel there is a lot of potential to have more equality in our companies and, often as the most powerful people in the companies, to push for that change to happen.

CORALINE: Yeah.

EVA: There needs to be a lot of solidarity, I think, between these different types of workers.

CORALINE: Yeah, and that's a great example of that.

MANDY: Well, this has been an absolutely fantastic conversation and I feel so privileged just to be sitting here kicking back and just taking in the back and forth between the rest of you.

I wrote down a bunch of things, but one of the biggest takeaways that I have had from this episode, especially if you've been listening to the show the past couple episodes where we've been talking about a lot of accessibility things: Eva, you said something that was mind-blowing for me, and it shouldn't be mind-blowing, but it was, because I was like, I didn't even ever think of that and what the hell is wrong with me for not ever thinking about that? But inclusive and accessible includes people experiencing domestic abuse. I guess, because as you said, people don't talk about it. So just keeping that in mind was pretty pertinent to me.

I also liked what Coraline said about specialization and then the general knowledge and literacy versus fluency. That was really good as well.

So it's been an awesome conversation. Thank you.

Damien, what do you have?

DAMIEN: Oh, well, this has been really awesome and I want to first thank Eva for being our guest here and for the work you do and this book.

The thing that's going to stick with me, that I'll be reflecting on for a while, is this sentence: if the product can be used for harm, it will be. It's not only a really powerful thing to keep in mind when designing and building a thing, but also a powerful sentence that is really useful in communicating these issues. So thank you very much for that.

CORALINE: One of the things, and actually, Eva, this was a reaction I had when I first read your book: I think a lot of us, a growing number of us, have at least an awareness, if not a personal experience, of how systems are weaponized against marginalized, or vulnerable folks. So I think it's really important that in your book, you focus very specifically on a particular domain of abuse: abuse of power and loss of agency, loss of privacy, loss of physical safety.

One of the things I've been thinking about a lot is how the internet has been really good for connecting people with shared experiences and creating communities around those shared experiences. But I do worry that we're breaking into smaller and smaller and smaller groups, and I see that. I don't know if it's intentional, but it certainly is a way, I think, that we're being coerced into propping up these systems by taking a narrow view based on our own experiences.

I don't see that as a criticism. What I see it as is an opportunity to connect with other folks who experience that same kind of systemic damage, collaborating and trying to understand the different challenges that we all face. But recognizing that a lot of it is based on, frankly, white supremacy. We used to talk about patriarchy; I think the thinking broadly has evolved beyond that.

But I would love to see your publisher start putting books together on different particular axes, but also looking at ways that we can bridge the differences between these different experiences of intentional, or unintentional, harm. So that's something that I think I'm going to think about.

EVA: Nice. I can't give any spoilers, but I do think my publisher might have something in the works that's getting at some of this stuff.

Wonderful.

EVA: Which is exciting.

CORALINE: Yeah.

EVA: Yeah, okay. Man, those are all so good.

My reflection: I'm just thinking a lot about our conversation about the way that people in tech might feel like we're overpaid, or pampered, and how that feels like an intentional thing that has come from somewhere. Things like that don't just – it always comes from somewhere.

I'm thinking, Mandy, about what you said in your reflection. You said, “What's wrong with me for not thinking about this?” I always feel like when I hear people say things like that, it's like, well, when were you – I think more: who didn't teach you about this? Why wasn't this part of your education as you were learning to code and before you joined the industry? I feel like that's more where the blame lies than with individuals.

Something I was thinking about earlier today, before we started recording, is this idea of user safety: that it's our job to keep ourselves safe in tech, and there are so many resources out there, different articles, and different things. I've been thinking about that, and that's a marketing campaign. That's something that the leaders of big tech have done to intentionally shift responsibility from themselves onto the end user.

We're expected to be legal experts, read these agreements, and understand every single thing about a product. No one uses every single feature, but we're expected to understand it all. If we don't and something goes wrong, either interpersonal harm, which is what I focus on, or, oh, someone guessed your password, or whatever it was, it's your fault instead of it being the tech company's responsibility. I feel like that's another thing where I'm thinking: that didn't come from nowhere, that came from somewhere.

CORALINE: Yeah.

EVA: It feels like a very intentional strategy that big tech has used to blame us when things go wrong. Not to say that we get to be absolved of everything, people have responsibilities and whatnot, but I feel like a lot of times, this comes from somewhere and I'm trying to think more about that kind of stuff. This conversation was really awesome for helping me process some of those thoughts and expand my thinking a little bit more. So thank you all, this was just really awesome.

DAMIEN: Thank you. Thank you for being here.

MANDY: Thank you for coming.

CORALINE: Yeah. So happy to talk to you, Eva.

EVA: Yeah. You, too.

MANDY: All right, everyone. Well, with that, we will wrap up and I will put a plug in for our Slack community. You can join us and Eva will get an invitation as well to come visit us in Slack and keep these conversations going.

Our website to do that is patreon.com/greaterthancode. Patreon is a subscription-based thing where, if you want to, you can pledge to support the show. However, if you DM any one of us and you want to be let in but cannot afford, or simply don't want, to monetarily support us, we will let you in for free. So just reach out to one of the panelists and we'll get you in there.

So with that, I will say thank you again. Thank you, everybody and we'll see you next week!

Support Greater Than Code