Full interview with Lauren Lee McCarthy, visiting artist at Stanford’s Institute for Human-Centered Artificial Intelligence

Bhumikorn Kongtaveelert [BK]: Your work has been at the intersection of art and computer science, specifically ethics and surveillance. However, your new body of work focuses more on the biosecurity of reproductive futures. What created the shift from human-AI interactions to something more intimate but also distant?

Lauren Lee McCarthy [LLM]: I think about [my work] as less about technology and more about just what the experience of being a person is, like, right now. Depending on what culture and context [one] lives in, this often involves a lot of technology.

I was making a lot of works that dealt with surveillance and AI. I think one of the particular interests of mine is thinking about really intimate spaces and the way that technology plays out in very personal spaces. 

We have so many narratives about how AI affects us in terms of work or in terms of public space. And so this series of works dealing with AI and surveillance is really thinking about, you know, what happens in the home? How does it feel when it gets that personal? What do we think about it?

I think of the recent work as sort of an extension of that. So if we take it one step further, and think not just about the home, but about the body and about these really intimate moments like reproduction in the way that we understand the family. What happens when technology intervenes in those processes? How does that shift our understanding of who we are? 

I’m often making work that is really just about what’s most confusing to me at the moment. When I was doing all of the AI and home works, it was partially inspired by the fact that I had just gotten married and moved in with someone, and I was trying to navigate, like, we’re making a home together, and I couldn’t understand what that meant. And then I think this work began because another question on my mind was, should I have kids, and why or why not? And what does that mean? So the work has this personal dimension, and the point of it is not, should I have kids, but thinking about all the things that come up when one starts to explore that question at this moment in time.

BK: Thematically, your work can be very personal. But your approach is not legibly personal in the way that portraiture or family archives are. How do you grapple with that dichotomy, when the themes are really personal but the work doesn’t register that way? And because a lot of your work involves very intense labor on the artist’s side, how do you touch on these really important themes through this methodology without commodifying the artist’s labor? Or, if you do, how do you grapple with that?

LLM: Hmm, that’s a good question. I don’t think it’s necessary that people know how personal the work is for me. That’s a detail they might learn if they hear me speak about it or read about it. What I’m hoping is that, even when the work is quite personal, it’s not really intended to be just a story about me. I’m hoping to make a place where people can really connect with and reflect on their own personal experiences dealing with these topics. And so if there’s a dimension that’s personal about it for me, it’s meant to hopefully open things up for other people, too. It’s not to say, look at me and my story, but instead: what are your stories? And how are you thinking about [the subject matter]?

That comes from this feeling, especially at this moment in time when it comes to technology, that we’re asked to respond to things so quickly, just scrolling through our feeds. It’s like: like this, and download this, and hit update, and buy this. There’s very little time to really form your own point of view. I often feel that the narrative from tech companies is, “Oh, it’s so complicated, you wouldn’t understand anyway.” And so with these works, I’m hoping to make a space where people can just sit with [these subjects] for a little bit, make up their minds, and figure out how they feel about things.

I think we’re all doing a lot of different kinds of labor every day that are not recognized; there are a lot of forms of emotional labor, especially service work, that are either not recognized at all or very underpaid and undervalued. A lot of this work falls disproportionately to women, people of color, or nonbinary people. So, I guess by putting that labor in the work, the intention is to bring some attention to it, to what it means to us, and to whether we value it.

Then of course, with technology, there’s the question of automation. Can these things be offloaded? In some of the pieces, because I was so worried that all the jobs were going to be taken by the tech sector, I was trying to make the argument that one of the jobs that will be hardest to automate is care work. You know, you don’t really want a robot taking care of your child, for example; that feels much harder to imagine. And I’m also curious what that will do to our thinking around labor in those areas. Will we come to value it more, because we’re realizing it’s a really distinctly human thing that one can do?

BK: The LAUREN series focuses on an intimate space. Since you’ve been grappling with these topics for quite some time now, how do you deal with ethics as an artist maneuvering the murky ground around privacy and technology?

LLM: It’s definitely a process of learning, having done this work for a while and continuing to learn. There are a few different things. First, whenever I’m involving people in the work, I’m trying to be really thoughtful about how I do that. There are things I did in earlier projects that, if I did those projects again today, I would do really differently. What feels really important and top of mind is that there’s consent for people to participate. That means trying to be clear about expectations of what might happen. Consent should be clear, enthusiastic, and informed, but it should also be revocable. So, I give people a way to exit the performance or the piece if it turns out they’re not feeling good about their experience. A lot of times with these performances, I’m proposing something like, I’ll sleep in your backyard, or I’ll carry your baby. It’s not something that people have necessarily done before. So sometimes it’s hard for people to imagine or think through intellectually; it might be really different in the moment, when they’re experiencing it. They just haven’t experienced it before.

Another place I’m taking cues from is thinking about our relationship to technology, and the way that I feel that there’s often not consent there, really. Say you get Alexa in your home, and then it just starts pushing updates, and the functionality actually changes, or we have these terms of service, and they’re just so ridiculously long and convoluted. There’s no chance anyone’s going to really read them before they hit “accept.” Those are kind of designed strategies to keep you from really being able to be informed or to consent to something. I’ve got my phone and I don’t really know how often it’s listening or recording even. Like right now, I think it’s not recording but who knows, right? We keep hearing these stories where it turns out it was, or it turns out there’s this security hole. 

So when I’m designing these projects, I’m actually taking cues really directly from that and trying to think, How would I do it differently? How would I like it to be? As an example, for the LAUREN project, when I go to install the devices in people’s homes, I really take time with them to explain each part of the system and then ask, “Is it okay if I set up a camera here or a microphone here?” They can say yes or no. Then I show them how to unplug it, turn it away, or cover the camera, or tell them they can say, “LAUREN, shut down,” and I shut everything down immediately. I say, “Here’s what’s gonna happen with your data, here’s what’s not gonna happen.”

For the SURROGATE project, I made this app where someone could control me. But rather than it really being a control mechanism, it actually became more of a tool for dialogue, where we were asking, “What sort of relationship would we want to have, as your surrogate? How would technology facilitate that?” So in contrast to some of these apps that are trying to impose a programmed behavior on us, we were trying to make an app that opened up our own discussion about what sorts of behaviors we want to have.

BK: I liked how you explained how LAUREN differs from the conventional ways the tech industry has desensitized us to [consent]. Talk a bit more about what exactly the SURROGATE project is.

LLM: I basically had this idea: could I be a surrogate for someone, carrying a baby for a couple or a single parent, and during that time give them an app where they could totally monitor and control me 24/7 for nine months? The idea was that they would have complete control of the body in which their baby was growing. They could decide what I would eat, where I’d go, what I’d do, or what thoughts I’d meditate on.

It was asking questions about what it means to be a proxy or surrogate for someone else. What does it mean that we can use technology to move around this genetic material and make possible things that biologically wouldn’t inherently be possible?

I was thinking about the future. We already have a lot of technologies for embryo freezing, IVF, genetic screening, and all these sorts of things that can make different kinds of pregnancy possible. It seems like we’re heading towards a future where those possibilities expand even further with gene editing. So, what would it be like if we were able to have even more control? To pick the gender of the child, which you already can, or to decide [that a] genetic feature is not desirable and have it edited out? They’re already doing research in that direction. What if that was available to everyone?

The idea that someone controls me with an app is a metaphor for this question: what if you had much greater control over all these aspects of your future child, like what color hair they had, or what sort of personality they had, or editing out a particular disease or genetic mutation?

I can’t really do that as an art project, but I couldn’t stop thinking about it. So, I started making a short film where I interviewed parents about this idea. I wasn’t actually proposing to have a baby, just role-playing or speculating. But then one of the people in the film, a really close friend of mine, said, “You know, I’d like to do this for real.” And I was like, “Well, I would, too. I just thought that wasn’t really something you could do in an art project, you know.”

She was down for it. So, we went through this process where we really tried to make that happen. We built the app, and did all the tests, and got really far into the process, until it was eventually shut down for a couple of different reasons. 

BK: To clarify, the ultimate goal was an app that simulates what it could be like in terms of monitoring or active communication between a would-be surrogate mother and the family.

LLM: The actual project was intended to be a performance where I actually do this: I carry the baby and she controls me, and we made an app to do it. When we got really close to the implantation step, it was shut down by various medical decisions, so we didn’t end up going through with the pregnancy. But now I’ve been making a series of works trying to tell the story of what happened and elaborate on this idea. So, it becomes a little bit more like a thought experiment, but it’s very real in the sense that it got very, very close to actually happening before it was shut down.

It was also interesting because the project started out with these questions about bodily autonomy. Then in the end, the decision to actually carry this baby was taken out of my hands by a doctor who decided I wasn’t capable of it. So it’s not so much about the app itself; making the app was a tool for this performance.

BK: In my head, it was all a simulation, which, consequentially, it became. But what were the ethics of actually doing it? I guess that’s the fundamental debate: when there’s an added layer of documentation and media around it, it feels different.

LLM: It does. I mean, these were some of the really hard questions that made this project really difficult for me. I would just get lost, like, “Can I really do this? Is that wrong? If Dorothy [Lauren’s friend] wants to do this, is she giving consent on behalf of her child? Are there ways to do it so that privacy is preserved? So then it would be okay? I don’t know.” So I just went through all those questions, and at some point I realized it was impossible to know the answers to all of them; we had to work it out in real time. 

I think for me, there’s not that much separation between my art practice and my life. So when we were thinking about this project, it wasn’t like, “Oh, we would have a baby for an art project.” It was more like, “Well, we’re having a baby because Dorothy wants a baby, and we decided to do it in this way because that’s very true to who we are as artists and people.” So for the child born through that process, that’s just the kind of life they’re born into, in the same way that everyone has different parents who create different situations for them from the moment they’re born. But I don’t know; ethically, it was definitely very questionable.

I think everyone is entitled to their own opinion about what’s right or wrong in the situation. On the other hand, there are also these questions around reproductive rights and bodily autonomy. So whatever people feel about right or wrong, I think ultimately what matters is what the parent thinks about it, whether they feel it’s something they want to do, in the same way that we talk about abortion. My perspective is that it should be a very personal decision, not something that the state should decide on your behalf or that other people should decide for you. But that’s my perspective. The piece is meant to open up conversation and discussion. Different people might have different points of view on the ethics of it, and that’s totally fine.

BK: What will you do at Stanford, and what do you hope to accomplish in your time here?

LLM: One of the outputs of this whole kind of exploration has been a live performance for an audience. It’s kind of like a one-woman show where I’m sort of telling the story of what happened. So that’s something that I’ve been developing while I’ve been at Stanford.

Another big part of my research at Stanford has been trying to get deeper into the genetic aspect of this project. What are all the different ways that we can currently, or might in the future, be able to control the outcome of reproduction? How does AI factor into that, in terms of quality prediction, possibilities, genetic screening, even gene editing? There are a lot of startups that have spun out of Stanford around those topics too, like 23andMe. A lot of the research has been talking to different people working in that field, understanding what’s possible, and then thinking about how it relates to this project.

I’m also working on a new piece at Stanford called “SALIVA” that is trying to get at some of these questions. The idea is that it would be a saliva exchange. People could come and spit, because I figured after COVID, we’re all just used to spitting all the time anyway, or swabbing and all that. So, you come in and spit into a tube. Then you fill out a form, similar to a sperm donor form, where you share some qualities about yourself, your personality, or your genetic traits. Then after you’ve made your donation, you go into the next room, where you can select from other people’s samples and take one home in exchange for yours. As you’re selecting, you’re given this information, like this person had brown hair, or this person had a really nice voice, or this person was very melodic, and you can select the saliva you want based on those traits.

In a way it’s an absurdist project, but it’s also a way of talking about this future idea of designer babies and how that could be very problematic or what sort of questions it raises. But it’s also a way of talking about our shifting relationship to biosurveillance after everything we’ve been through in the pandemic. 

I did a test performance [of SALIVA]. In that version, I was just exchanging the saliva one-on-one with people myself. In this version, it’s more of an exchange among participants. So, it would be less about exchanging with me and more about this idea of selection.

BK: You have cultivated an intellectual grounding, or at least a speculative anchor, for work based in science or ongoing research. Talk more about art as a research methodology: how it works in tandem with philosophical and scientific inquiry, but also how it offers alternative outputs that those two forms of inquiry cannot.

LLM: I think a lot of scientific research is really based around trying to find answers to questions. In some cases, there’s a hypothesis about what that answer should be, and that can really skew the way the data that’s collected looks. So, you can sometimes find yourself in a situation where you’re collecting data in support of the answer you expect. Even when that doesn’t happen, the research is often about a question that you’re trying to find an answer or solution to.

I think one of the things that art does really well is that it doesn’t give a lot of answers; it actually raises a lot more questions. I think that [art and science] in conversation with each other can be really powerful. I always advocate for having artists in conversations about technology, or in the room with scientists, engineers, or policymakers, because artists have a very good ability to imagine what’s possible or what hasn’t happened already. That’s part of the job description: imagine what doesn’t already exist and make it happen. That’s what you’re doing when you’re making art, because you’re trying not to make something that already exists.

I think that’s really helpful in conversations that are about how we’re making policy or how we’re relating to each other, or our understanding of our relationship to the world and technology because it opens things up and says, “Okay, here’s the current framework that we’re in, but what if it was like this? What if we could relate to each other in this way? What if this was possible?” I think that’s the first step to making it happen. It’s just saying “what if” and letting your mind go there. So, that’s philosophically how I see those things coming together. 

Practically, as an artist, I really enjoy collaborating with people in really different fields. I think so much of the idea generation, and just the way these projects get made, is by talking with people and working with them in that kind of friction that comes up when you have really different ways of seeing things. I find it incredibly productive to not just be working with artists, but to be working with scientists, researchers, technologists, designers, everyday people that are not doing any of those things, students, and seeing what comes out of it.

I do think of my art practice as a research practice. I am not just trying to make art about things. I’m really trying to explore the boundaries of a topic and see what I can find there. So, the research often consists of experiments. I’m putting myself or maybe other people into a situation, and then we’re seeing what comes out of it and what we observe and notice and learn.

BK: In that way, you see art as a simulation tool for exploring speculative futures and outcomes as they happen in the moment.

LLM: I think so. And yeah, I think simulation is maybe an even better word than speculation, because we’re finding, especially now, as things become so digital, that there’s not that much difference between simulation and what’s real. Think about playing a game or existing in a virtual space: there are ways in which those are simulations, but they’re also very real. Having an interaction with a friend in a space like that is a real feeling, and you have a real relationship.

BK: You said in an interview, “Data is a form of representation of ourselves, our relationship, our lives, our reality, and like any other representation, it is incomplete, imperfect, and subject to interpretation.” How has this perspective changed over time?

LLM: I did my undergrad at MIT, and it was very technical. Early in my career, I thought of data as something more factual, unchanging, or indisputable. Through learning from a lot of different friends and scholars, I have come to understand that data is something that is always constructed in some way; you can’t understand data as separate from the frame in which it was collected, analyzed, or understood. Whenever you look at the world, you have a particular lens that you’re looking through just because you’re a human, but you’re also a human in a particular society that has a particular worldview, so the data that is being collected is a part of that. That’s what I mean when I say it’s a representation. If you think about a sound, there are so many different ways to represent it: visually, auditorily, numerically. All of those are different sorts of representations of something that exists on its own.

Given that point of view, it’s made me think a lot more about the particular frame we’re bringing to things as we observe. How has that influenced our view of the world and our values, especially around things like race, gender, or class? How does our own positionality as humans affect the way we’re seeing data, which data does not exist, or which data seems important to us? And how are we understanding it? Over time, my understanding of data is that it’s much, much more subjective; it’s not a truly objective thing.

BK: Do you have any advice for emerging technologists or artists?

LLM: When I was studying computer science, I felt like there wasn’t a lot of questioning of why things were being done or why we were learning things. It was just learn them, and make the programs, and make the thing work. When I walked into the art department, it was totally different. I started to see possibilities and then that sort of opened this whole space for me, where I realized, actually, I could be doing something in between art and computer science and I didn’t have to follow a predefined path. So, I guess one piece of advice would be to follow what you’re passionate about and don’t assume just because what you want to do doesn’t already exist as a job that it can’t become one. A lot more is possible than what you might imagine or what you might have been taught to believe. 

Ask a lot of questions and make someone tell you “no.” This advice goes especially to women, nonbinary people, BIPOC, and queer people. Having been in spaces where maybe you feel you’re not accepted or not welcome, I think we sometimes tend to shut ourselves down and think, “Oh, I wouldn’t fit here” or “What I want to do doesn’t make sense.” But I have this mantra in my head all the time: I’m not going to tell myself “no.” Just ask; there’s no harm in asking for whatever it is you’re trying to do or looking for.

I think when you’re really motivated, or really excited about your work, it can feel like you’re trying to make something happen that you’re not sure is possible, and you want to put all of your energy and time into that. But it’s really important to hold onto the things in your life that are more important than work. The work that you’re doing will sometimes go great and sometimes won’t, but the life that you build for yourself beyond that is the thing that will carry you through.