Fantasy Factory: One Filmmaker's Fight Against AI w/ Valerie Veatch


Show Notes

The way artists make art matters. And some artists, like filmmaker Valerie Veatch, are exploring what role AI has in the craft of filmmaking.

More like this: Fantasy Factory: AI Supervillains w/ Anat Shenker-Osorio

Valerie Veatch is the director of Ghost in the Machine, a new film that explores the depths of the Silicon Valley fantasies around AI, and platforms all the people who challenge these fantasies. With this film, Valerie is working to change the culture of AI: it is not inevitable, in many ways it's not even possible, and therefore we have a right to refuse to engage with it. Valerie discusses why she made the film, what she learned, and what impact she's hoping it will have.

Ghost in the Machine will be available for rentals and screenings beginning March 27, via Kinema! Pre-sales are open now (go to Kinema and select the "Watch" tab). Proceeds will go towards the production of the film. The film will also be available on PBS in fall 2026.

Further reading & resources:

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

Computer Says Maybe is produced by Georgia Iacovou, Kushal Dev, Marion Wellington, Sarah Myles, Van Newman, and Zoe Trout

Hosts

Alix Dunn

Release Date

March 27, 2026

Episode Number

112

Transcript

This is an autogenerated transcript and may contain errors.

Alix: Hey there. Welcome to Computer Says Maybe. This is your host, Alix Dunn, and this is our third installment of Fantasy Factory, our mini-series covering the stories shaping AI. Both the delusional stories that are constructed by white dudes who are kind of afraid of death, which I feel like is their main motivation.

And then the stories we need, that help more people come together to build the political power necessary to reshape how AI will influence our world. We've been talking about both. If you've missed those episodes, do go back and listen 'cause they're both bangers. Last week's episode with Anat Shenker-Osorio was super useful for figuring out what to do about it, and how we can communicate more effectively about this topic.

And I learned loads; I felt like I got kind of a free on-record consultation from one of the most powerful messengers on the left. So if you are looking for that, go back to that interview with Anat. And we also interviewed Adam Becker, who is an astrophysicist, who unpacks the science, or lack of science, behind the sciencey stories that big tech bros like to tell us, which is also very good.

But today is the third episode in this installment, and it's timed alongside the theatrical release of a documentary that I watched in advance of its Sundance premiere. It's called The AI Doc, and I can't imagine a more hubristic title for a film about such a wide-ranging subject. Um, but it is a frenzied movie that sort of flips between what Dr. Ruha Benjamin calls two sides of the same Bitcoin: AI could kill us all, oh no, but also maybe AI will, like, save us all. And they sort of frenetically move between these two unevidenced industry narratives, and it sucks the air out of the room where people are trying to question or criticize.

Or even reject AI. For people that feel politically animated by that idea, it just leaves a viewer stressed. Um, no additional information is presented. No evidence is presented, no critical questions are asked. No reorientation around power rather than technology. It's just bad. I came away feeling like I had just watched industry propaganda.

And if you ask those directors, which a magazine did, how the makers of this film got the CEOs from three of the AI companies to be in their film, they admit that they basically went about a path of doing interviews with people they thought the CEOs liked, and then they were delighted when the press relations offices of the AI companies were impressed with their work and reached out and said, yeah, we'd be happy to have our man at the top sit down and talk to you, because the way you're covering this is exactly how we want it to be covered.

So instead of watching a man center his hopes and fears about our techno future, I offer you a different conversation this week with filmmaker [00:03:00] Valerie Veatch. I spoke with her about her AI documentary, Ghost in the Machine, which brings this sort of historical clarity and rigor to our understandings of AI.

And I'm not even gonna say this is the alternative movie about AI this month. It's literally just the movie to watch if you want to dig deeper into how generative AI was built on a foundation of eugenics and white supremacy, and the power and harm its proponents have to sort of force that worldview to become our future.

So this week we have Valerie on to talk about how she's trying to use media to change culture. And the culture she wants to change is one that's largely dominated by men and the stories men tell, and one that says that AI is our natural next step in making media and generally being creative in our modern world.

And Valerie wants to sort of challenge that notion outright. In this conversation, we explore how her freedom, and our freedom, to create real human art is under threat, and how she thinks [00:04:00] about, um, AI as part of the filmmaking process. So here's Valerie to explain all of this in her own words, starting with what pushed her to make Ghost in the Machine in the first place.

Valerie: So I was minding my own business. Not actively making films, not loving that, but just sort of like doing my thing. And a friend of our family came over to our house and he made a song on his phone using ai. And I was like, whoa, that's wild that it can do that. And I sent it to another friend of mine who was like, oh yeah, well actually you would be totally amazed.

I'm a part of this artist group testing this technology for this AI company. And I was like, okay. And he was like, you know what? I'll sign you up. And I did not even have a computer. I was just sort of like a mom of two young kids. And I was like, okay, yeah, I got this email and I clicked this button and all of a sudden this app starts downloading.

And I was like, all right, I'll engage with these folks, [00:05:00] and there was a Slack channel. I downloaded Slack and it was like a group of really sweet, probably 60 or so folks. The usual suspects of early adopters of this kind of technology, along with a few people whom I now recognize as belonging to some of the newer companies dealing with generative AI in Hollywood.

But we were all there and we were experimenting with prompting, and the first piece I made was called Uncanny Valley. Because, um, that's a term in robotics where the closer something gets to looking real, the more it gives you, like, the ick and you wanna throw up, because it's like a zombie. There was something grotesque about the technology that I thought for sure everybody must understand, and

Alix: you immediately felt that way.

Valerie: Absolutely. Yeah. I mean, like you, you generate something and it's horrific. You know, the first thing that really surrounded me with this stuff was the women that these generative systems made and still make, they would be pulsing and like losing their clothes within like a few versions of [00:06:00] generating a scene.

Suddenly she's got no clothes on, and you're like, whoa. It would not make women in positions of power. Basic stuff. As a critical theorist and as a feminist and as a filmmaker, all of my red alarms went off. And then, digging around a little further in that space, the racialized stereotypes are impossible to ignore. You'd write in an occupation that was more associated with labor, and the skin would be darker. Like, it was that stupid and obvious and harmful. And the rhetoric that was coming from the company was: this was gonna be the technology that democratizes Hollywood. This is going to unleash an era of productivity for all of us folks in the group. And aren't we so lucky to be part of this?

And aren't we so lucky to be part of this? My partner, who is not in the arts industry but is in business, would listen to these phone calls and he was just like, they're taking advantage of you guys. Like, this is so pathetic. Like, you know, like this is like a huge company that's using all this rhetoric to [00:07:00] basically source free labor, which was exactly what they were doing.

So I didn't mind, I thought it was kind of fun. The obvious biases were super crazy. And so I thought, Hey, I'll just write a letter to OpenAI. And so I wrote this email saying like, Hey, you know, I don't think that filmmakers are going to use this technology because it sucks and it makes racist and sexist stereotypes and you know, I'd love to work with y'all to figure out how to fix that.

Obviously they didn't email back, and then the next day this amazing artist in the group posted on the Slack channel saying, Hey, you know, I was uploading a selfie into the software and generating a scene of me in an art gallery, and it kept my braids and it kept my outfit, but it made my skin white, over and over again.

And she posted like a hundred thumbnails of, basically, OpenAI whitewashing her. And I sent another email to the folks moderating the artist group being like, Hey, [00:08:00] so this is really problematic. There was a little bit of, like, oh, yes, well, thank you for helping remind us of what's most important.

We're launching a product.

Alix: And this is what drives me mad, this idea that like critique is invited, but in an environment or a context where like the dominant paradigm is gonna like steamroll it anyway.

Valerie: Yeah. So then I reached out to the artist being like, how are you feeling? Are you okay? You know, typically when somebody would post a piece of slop on the Slack channel, there would be like 85 little flames and hearts and leaping frogs, and there was like nothing like crickets, like no one responded.

And so I had responded with like a hundred percent emoji being like, Hey, can we like talk about this? This is really crazy. The whole spirit was like, it's very cringe to bring this shit up. And so anyways, I reached out to her and she was like, I really appreciate you reaching out. It sucks. As an artist, I don't wanna center my work around suffering, which I completely respect and understand. And she was like, [00:09:00] additionally, as somebody who works in this space, I want OpenAI to keep reposting my stuff on Instagram. I want them to engage me. And the inherent power dynamic that you're talking about is there, you know? And so she was like, I don't wanna upset them.

Alix: Toil in obscurity.

Valerie: I was like, okay. And I tried one more time writing these letters to OpenAI, and they were like, we're launching a product.

We're in the launch room. You're exactly right, Hollywood is gonna rebel, unless... and that was sort of the threat I was using. I was like, nobody's gonna like this. Everyone's gonna say it sucks. And then they launched the product, and Sam Altman hops on our Slack channel and says, Hey everyone, I read every single post on this page and I just love it all, and I'm obsessed and I'm addicted to this feed.

And I'm like, either he's lying and he doesn't read every post, he's just saying that. Or he's negligent, knowing that it whitewashed this artist and did nothing about it. Either way, I don't really like this guy, who is he? And then they were like, okay, after Christmas we're gonna have this conversation with you about what you're talking about.

And so I, like, waited. And then after Christmas... I'm a professional [00:10:00] filmmaker. I take myself really seriously, and I thought that surely OpenAI would too if they'd invited me to be part of this group. And then there was like one person on the call and they were like, Hey, so yeah, it's just that the engineers at this company are really smart and they've grown up in a STEM environment and they don't take social issues seriously, and in fact they're kind of grossed out by them. So it's just overall cringe for any of us women to try to bring up what you're talking about. Here's a third-party DEI specialist who can help you deal with equity initiatives in tech. And I was genuinely stunned.

I just sat there and I was like, what? This isn't an equity initiative in tech. This is about product functionality. Like, what? And so then they invited me out to New York for this screening series at the Metrograph Diamond Square Central, and basically I went out there. I was like, I'm so curious, like what am I missing?

What am I [00:11:00] missing? The culture had changed, like this is our reality now and everyone's excited about this really poisonous, toxic technology, and no one is saying anything about very obvious social harms. Cinema is the intentional production of images, right? Like that's filmmaking, that's storytelling, and stories are so powerful.

So the kind of capture of this space really freaked me out. So I went out to New York and I basically decided that everyone was a fascist and I had to come back and figure out how to make this film as quickly as possible. And I pitched it, while I was still in New York, to several folks who commission documentaries, and all of them were like, Valerie, we love your filmmaking.

We really respect you as a filmmaker, but you're being really cringey about being against AI. And in fact, our company is doing an AI initiative and we're really excited about it. And so we think that you're kind of, like, wrong.

Alix: Like a buzzkill. It's not that you're wrong, it's that you're right. And we'd like for you to stop saying it.

Valerie: Maybe.

Alix: Yeah. Or maybe it's genuine. That's the feeling I have is that like people are like,

Valerie: no, [00:12:00] please don't tell me again that this stuff sucks. I do feel that way about driving a car.

Alix: Oh, that's interesting. So when people are like, you shouldn't have a car, you're like, stop. I know.

Valerie: Yep. It's tough. Right? So this all kind of lands us back where we start, you do conversation, which is what mechanisms do we have to really produce the kind of shift that we know is necessary?

Right. And I guess one of the things that I know to be true is the culture is something I can change, you know? Yes I can. As an artist, as a person, culture is ours. And then if we look at what slop is, or what generative AI is, it produces the kind of tech bro, late-capitalist interpretation of a creative object.

It's just, as Nick Cave says, this is not me. You know, like a three-minute song, right? Just the end product. But what makes that song is actually the embodied experience of the [00:13:00] artist literally vibrating through time, discovering and singing, you know, and playing that thing. And then of course, that thing ends up being this square box that you can download as an MP3.

But the whole context of it is friction in space over time in context. And that creates rooted meaning, and that is culture. And so when you just operate in this vacuous space where the product of a cultural gesture is mistaken for the culture, then that's slop, and that is sort of the hollowing of this whole industry.

And so the real sadness that I have is any studio that is going to embrace this technology will feel a slow hollowing of itself. Like even now, apparently, I don't hang out with, like, 15-year-olds, but a couple people have told me that the 15-year-olds they're around will say "AI" as a way of denoting something as meaningless and stupid and cringe and awful.

What they're articulating there is that that lacks culture, that [00:14:00] lacks meaning. I don't identify with that; it has no grip on my soul. And weirdness, in a way, is an assault on the regression to the mean, the, like, finding of the center that AI produces. Right. The surprise, like weirdness.

Alix: Unanticipated. Yeah, totally.

Valerie: I think there's something really powerful about weirdness moving forward, and things that are like super unexpected and wouldn't be pattern-recognized by an LLM. Those things will have more meaning and value, right? So in the context of, you know, the slop as an empty vessel, as something that will not have cultural meaning ongoing, sure, fine, artists and culture always adapt. But one of the things I find really troubling is when we look at this phenomenon in the creative professional context, right? The thing that I was really surprised at when we premiered the film at Sundance, I thought, as soon as everyone sees this film and sees the information therein, which, you know, was all new [00:15:00] to me, all the stuff I didn't know when I started making this film. I was deeply uncomfortable with the end product of what was going on, going into it.

I didn't know any of this. I didn't know what was different about this current moment in this technology, and why now, you know? And my cousin, who's a computer person, explained that this is just unprecedented amounts of compute and unprecedented amounts of data, and the transformer architecture, and this chip design.

And those are the things that were making these generative models function, like these chatbots on steroids basically. But nothing about that was sustainable. And what's unsustainable about the data was that we've run out of new data, and synthetic data doesn't really work, as we're discovering, which is hilarious to me.

Alix: And the data that we do have that's not synthetic is like the worst communication, almost all in English, of the disgusting underbelly of the internet. Yeah. But that's so wild to me, that from an exposure [00:16:00] to the end product, or like encouragement to creatively engage with the end product, you had a hunch immediately that there was something deeper here that disgusted you, and then you basically just, like, started.

How did it start?

Valerie: I started Googling, and like I was looking for, like, how do large language models work? And I was just like, where do you go to find this information? And I found the Turing Institute in England, and the name of it... I was somewhat familiar with it as being the home place of artificial intelligence.

I was like, okay, I'm gonna go look there. It's at the British Library, can't be that weird. And I found on their YouTube channel a talk by Abeba Birhane, which basically forms the spine for the film, and I was so excited by the things she was saying that I literally just started, like, cutting that into a timeline. And my mom was visiting from America and I was like, Mom, I'm learning. And it was just like, yes, absolutely, yes, uh-huh, totally.

It's just like, yes, absolutely, yes. Uhhuh, totally. And it was, you know, AI is a marketing term and what it's built on is [00:17:00] stolen data and what it's powered by are environmentally harmful apparatuses and what powers it and what creates its political staying power. Is corruption. And so then I was like, wow, that's crazy.

And then I found this paper called Stochastic Parrots, which is equally a sort of primer to all of the problems around this specific technology. It was a group of researchers from Google, and a linguist and computer scientist, who traced the contours of all of the problems around this stuff. And so then I emailed Emily Bender.

She emailed back and was like, who are you and why do you wanna talk? And I went to New York, and it was kind of on the flight back, to and from New York, that I was sort of discovering some of this stuff, 'cause I was like, whoa.

And then when I got back to England, I found Dan McQuillan. I had decided everyone had become a fascist, and so I googled AI and fascism and found Dan McQuillan's amazing book, which is Resisting AI, I think is the title, but its subtitle is An Anti-Fascist Approach to [00:18:00] AI. And he was actually the first interview, the first person that would talk to me, and we spoke for like two hours, and he was like, actually, all of this, you can find the roots of this technology in eugenics.

And I was like, okay, sure. And hung up the Zoom call and went about researching more of this and was like, oh wow, that's crazy. I found Timnit Gebru and Émile P. Torres's amazing research paper on the TESCREAL bundle of ideologies, which kind of draws a further net around some of these ideas, which, when all hung together, create a very dehumanizing and problematic paradigm through which to view artificial intelligence and a lot of these technologies.

From there, I met this amazing person called Alix Dunn, who took the time to share with me some of the finer nuances around data centers, and what is happening on a kind of local and political level [00:19:00] around how infrastructure in these technologies is created, and what that means. So it was really a journey. I would just wake up every morning and read white papers and look at the footnotes and find somebody, and then sort of email them and see if they emailed back, and would they wanna talk, and we would talk, and would they wanna do a recording.

And the amazing thing about this film is that all 40 people who I interviewed, and literally there could be 150 more people who would be peers and colleagues of this amazing group in the film, everybody's telling the same story, or telling pieces of the same story, and that's what really stunned me. I really wasn't sure if this would be a film. It was kind of more like therapy. It was more like making me feel less crazy and less alone, to learn the things that everybody had to share and to be putting them together. It was just a way of me organizing my own thoughts. Because throughout last year, okay, maybe a little bit over a year ago, I was trying to pitch a sort of critical perspective on AI story that nobody wanted to hear.

It [00:20:00] was very unfashionable to be against AI, and even though people were like, I respect you, and you definitely know what's up about technology, Valerie, but I don't think you're right about this. It was just so unfashionable. And even now, when people are more vocal about being uncomfortable with generative AI, it still feels unfashionable to take this critical perspective. And so it was very exciting to discover this whole world of people who thought this way and who knew this stuff. And I think the end result is this really compelling story that is so urgent, and it is really important. And it's not, like, flashy, you know, it's not a flashy film.

It's not a film that's gonna make you feel comfortable, even while you're watching it. But I think that the ways in which this story about AI challenges our assumptions around truth and power are really important. And it's like, AI is not a new technology. It's not even a technology, right? But this moment that we're seeing is the consolidation of all of these forces of deregulation and wealth inequality [00:21:00] and corporate consolidation, and they're really stark in this moment, and stark around this technology. And that's what makes it a crazy story.

Alix: Now that you've gone through this, now that you've made something that is cool and premiered at Sundance. And thank you again for inviting me to be a part of a lot of the festivities. That whole Sundance experience was really interesting and cool and bizarre.

And also a manifestation of all of the very issues that you sought to uncover with the movie. Like, it was all kind of on display in a very tiny town in Utah. Um, but do you wanna talk about what you wanna see happen with the film now?

Valerie: We are so excited that PBS will be acquiring the film, and it'll be airing in late August on PBS in the US. And simultaneously in the US it will be available for free for 90 days on the PBS YouTube channel, which is so exciting for us. I'm so excited because it's this moment that we can really build and [00:22:00] harness around, so that grandparents in their living rooms and, you know, kids on their YouTube over the school break will literally trip over the same content.

And it's an opportunity to have a real conversation. And what that conversation needs to be is engaging in the politics of refusal. And what I mean by that is like what Jonathan Flowers says at the end of our film, which I think is so profound, he says, the most radical thing that you can do in the age of AI is ask the very simple question.

Why does this need AI? And if it doesn't need it, or if you can't answer that question, then it doesn't need it, and you can refuse to use it over and over again. And you can say, this adds no value here. And I think that's the most important tool that we have at our disposal: the right to just reject this technology.

And I think that that's a wonderful way to approach this very ubiquitous term AI, right? It's just such a [00:23:00] silly, such a stupid word. I wish we'd stop using it, but let's just say, for the sake of this conversation, what we mean in the context of being a creative and being a filmmaker is generative AI.

Right. And what is it for? Does it add value? No, it doesn't. And it's funny, there's so many different threads to this dynamic of refusing this technology that is being shoved down our throats, with no consent, in every single digital experience we have. And I was really surprised, bringing this film to Sundance, at the conversations that have come out of that.

I'm surprised at how confrontational it feels in the filmmaking space to say that this is rubbish tech and we shouldn't be using it. And I think there's a couple things at play here. One of them is this: I've noticed that especially older folks in the industry are keen to embrace this technology as an extension of [00:24:00] non-linear editing, or as an extension of digital color correcting, and this kind of march of technological innovations, that this is somehow like the digital camera and we should get on board with it or we're gonna be left behind. And that whole narrative is problematic, because what this technology is is fundamentally different than other technologies that have come into the creative space, because it kind of steals authorship, it steals the source of authorship.

So I'm surprised at how saying no to this technology feels so against the grain. And one of the things that I am excited about, in having these conversations with other filmmakers, is that there's a kind of responsibility that we have as storytellers to identify what it is that we're doing with our gesture of storytelling. And the ways in which generative AI enhances that are, like, none, right, in my opinion. But the ways in which generative AI detracts from that gesture are many. And going from there, there's a lot of [00:25:00] ways that we can kind of have these conversations around normalizing the rejection of its use. One of the funny things I heard James Cameron say, the, like, Avatar filmmaker, he was like, well, I'm at the end of my life and, you know, I wanna make at least six more Avatars and AI will help me do that.

I'm like, bro, we don't need six more Avatars. You know, this franchise of a story that's built on colonial narratives is not something we need six more of. AI slop amplifies, like, the worst elements of storytelling in our society. But there's this misunderstanding of the technology also. Like, I heard another filmmaker say, oh, it's the most organic direct expression between an artist's vision and the outcome. Like, you don't have to deal with the lighting department. You can just sort of express your vision. And I think that that's also completely false, right? It fundamentally misunderstands the technology. And that was really sad to see, and I felt kind of like, wow, our film is a bit of this voice crying in the wilderness. And so part of what I'm really excited about this [00:26:00] summer is we're gonna work really hard to get our film into all of the art house cinemas that we can over the summer, before our PBS launch.

And we're gonna have community screenings and university screenings, and we're gonna be here for the folks who are here for it. And similar to the journey of making this film, which was kind of a little bit of finding like-minded people, I think that the process of releasing it will be the same.

It'll be finding institutions and people and groups that do feel like they have the political agency to refuse the technology and say no. To me, it's so simple, because I know it adds no value, because I know it doesn't actually create productivity. And there are all these studies coming out now that, of course, say this, but you can just know it by using it, because there's nothing about it that improves the efficacy of your work or the quality of it.

Alix: I feel like I see three reasons why there's this intense reaction to refusal in filmmaking and in the arts, and this is from my very brief foray into spending some time with [00:27:00] you in these spaces, so this is me not really knowing very much. But I feel like, one, there is this undercurrent, and I don't think it's just within filmmaking, of: are you being naive about the capabilities of these technologies? So yeah, they're bad now, but eventually they could get good. And don't you want to be at the forefront of new tools that will be useful in your craft? Like, we gotta get into this.

The second feels like this argument of precarity: that true artists, real artists, people without access to capital and investment in film, need to do whatever they can to be able to make art and access quality product, and that this is a shortcut, or a technology that's gonna democratize or create a path towards production that currently doesn't exist because of the concentration of capital within the industry.

And then the third one, I think, is just that there's anger that comes [00:28:00] when people feel like they're hearing something that doesn't rhyme with what they think or feel, or with what they feel like everyone else feels. And I think the inevitability narrative of AI is so dominant that when you say no, people go through all of these sort of rhetorical and political cycles in their head, of like: no, no, you can't, what, wait. 'Cause if you can say no, then I can say no, but I can't say no, 'cause no one can say no, 'cause this is happening. And like, what are you doing? What is this? Um, and I think there's this psychological reaction of rejection of the frame of refusal, because it introduces the idea of agency, that we could have some control over things, in a way that I think triggers people. So how do you respond to that first bucket, the, like, eventually this will be a technology integrated into all of this anyway, what is wrong with you? You don't have to rebut these, but I just think that there's, like, an interesting...

Valerie: I think number two is what I would jump in and speak on immediately, and this is this narrative of democratization of the technology. [00:29:00] The hilarious thing to me is, we've had 4K cameras in our pockets for 10 years, okay, maybe only five years, you know? Like, we do not need help producing wonderful quality media. These technologies are ubiquitous, and certainly the tokens, and the cost of the tokens to iterate repeated generative outputs until you get something that works, is way more expensive than shooting something on a camera that's like $200.

But what does need to be democratized is distribution. And so, like all things, like the conversations about climate change, in this conversation what we are really critiquing and what we really want to challenge are the mechanisms of capitalism and the mechanisms of power. And it's a complete distraction to act like the thing that keeps me as a filmmaker from being ubiquitous and having my shit in Hollywood [00:30:00] has anything to do with my ability to use the tools at hand, or that I can't make a fight scene in outer space. The issue is distribution, you know?

Alix: Yeah. And also, like, I don't want what used to be called, like, indie filmmakers to be trying to make Avatar.

Valerie: Avatar as a sort of artifact is an expression of colonialism, is an expression of capitalism, is an expression of male entitlement, right?

Like, those things are the things that need to be challenged. And it's such a lie and such a false narrative that generative AI is somehow democratizing, in the same way that maybe we can draw some kind of vague parallel to how social media democratized access to information. Well, no. Yeah. In a lot of ways it distorts it. It has, it has actually completely fractured our communities. It's distorted the notion of friendship. We could go on; people know about this now. I think there's this trick that happens when new technologies are introduced, or even old technologies are reintroduced, that the democratizing [00:31:00] is actually a lie, and all this technology does is entrench real power.

And we look at Hollywood, what's happening with the massive consolidation of streaming platforms and production houses, and the Paramount-Warner Brothers merger. There are going to be even fewer ways for independent storytellers to get their stories out there in the kind of traditional environments where we, like, create shared meaning and truth. And being able to generate, like, a butterfly inside a light bulb shattering? That's not gonna, that's not gonna, like, challenge power in any sort of real way. It's interesting, I was having a conversation with somebody today who's looking to mobilize folks around rejecting AI. What was so fascinating about that conversation, and this is a conversation I've had a few times with various folks throughout this process of making this film and releasing it, is this idea of having a big tent to hold all of this dialogue around, uh, refusal, or slowing AI down, or rejecting it.

And you sort of have these various voices: like, you've got the [00:32:00] AI risk bros who are very tied up in existential, um, stories, and then you've got people who are kind of on the ethics side of things, who are grounded in the reality of these systems and, like, challenging the way that they enhance corrupt power.

And you know, I used to sort of be like, and I wish I could still be this way, being like, we need a big tent, everyone needs to be invited, and then we can sort of, like, help everybody understand more what this technology is. But it feels like the existential folks have such a loud microphone. They are the people who get the funding.

You know those documentaries coming out that engage in that dialogue of, well, the machine is gonna be so smart, because men have figured out how to make a super smart machine, it's gonna take over the planet. And the existential story is simple. It fits with, sort of, the narrative of capitalism, and it fits with the narrative of white supremacy and male power, as we see in the film. And that's the story that gets the most noise, and it's also the very [00:33:00] movement that is toothless: it will have zero impact on power, and it will only serve to perpetuate the power.

Alix: It's also such an abstraction away from individual decision making that basically it says: if you're a serious person, you're gonna think about these very abstract, very speculative ideas. And that means that if you were to have the arrogance of thinking that making a simple decision, like refusing to use these technologies, has any relevance or meaning in this, you're kind of an idiot. It creates this construct where, like, basically the ask is to do nothing and to let the people that know stuff do the thinking, because that's what's needed right now: thinking rather than action.

Valerie: Yeah.

Alix: Of lots of people.

Valerie: Mm-hmm. Yeah. Or even worse, call your senator. As if, as if kind of optimizing your individual voice and optimizing your anxiety through these political mechanisms is gonna have any, any impact. And [00:34:00] it won't. What will have impact is culture, and that's kind of where I turn my energies.

I really love and admire folks who are doing work in the policy space, and I'm not saying that, like, that matters equally, right? But what I have access to, and what I can understand on a kind of very visceral level, is culture. And so, by creating a culture where refusing AI is viable, where it rhymes with what I'm saying. So, like, what you said earlier about how, when someone hears, oh, just don't use AI, it doesn't rhyme with the kind of meta narratives that we're all absorbing. But maybe if we can create this space and learn how to articulate in a clear way what it means to refuse this technology, or even what it means to refuse the very framing of AI.

Alix: I mean, I feel like the piece about meta narratives trying to suck oxygen outta the room for refusal and agency, I think is important. And I feel like I would be remiss without asking you what it feels like to have had, at [00:35:00] Sundance, another film that does perform that function: it platforms all of the people that do exactly what you're just describing, and it also creates a through line of narrative that is essentially a non-expert director man being emotionally dragged through these meta narratives and getting more and more anxious about all of it, without actually articulating any power analysis or any meaningful understanding of, like, what should be learned and understood and acted on. And then CEOs talking about the most abstract articulation with no specificity, no questions, no challenge, no sense of anything other than narrative. Like, the AI doc is basically just vapors. It's the fumes [00:36:00] of the discourse packaged as a film.

I can imagine many people watching that, that don't know anything about this space, leaving feeling like, if someone came to them and said, I'm not gonna use AI, that that person sounds childish and naive, and they don't understand that there's this thing, and it's coming, and I don't know what the thing is, and no one knows what the thing is, but important people that are really smart just told me that it's gonna change everything. And it's like whipping up this moral panic about a thing that hasn't yet arrived and won't arrive. It sucks the oxygen out of action. Like, there's just no action space in that film. As a person that made a film that is covering the same topic, but from a completely different angle, um, with fewer resources, I don't know. Yeah. What are your thoughts? I don't mean to, like, talk shit about this film.

No,

Valerie: no, no. But like, this is my third feature documentary to premiere at Sundance about [00:37:00] technologies, specifically new technologies as they are being normalized in society. This is terrain that is literally my career. I went to pitch my take on this story a year ago, around to all of the usual places that would fund films, and across the board, what you are saying about the naivete of refusing AI in the face of this inevitability narrative, I wasn't even there yet with my dialogue. I just wanted to make a piece that was engaging the AI critical space. Nobody wanted to fund that, because, oh, their company is doing this AI project, or, we're actually investing in this, like, AI thing. And I'd be like, well, what AI thing? Like, well, what are you investing in? And they're like, well, we don't really know, our boss is just very into it, so we don't really wanna touch those. And I was like, cool. I set about, you know, this journey of, like, putting together these thoughts, and this film is really just a vessel for the, like, amazing work of everyone in the film.

And what was so exciting was to hear everyone speaking over [00:38:00] these interviews; a lot of the story emerges that's like a quilt, all kind of expressing the cohesive story. You know, there's that quite a bit. What I experienced in my career frequently is that stories about the future and about technology belong to men, and my filmmaking has no space in that dialogue, because I, as a woman, as a feminist, am going to decenter certain narratives just naturally. When that occurs, then some of the myths around the tech world fall out of balance. And you know, I made a film about internet addiction 12 years ago, right? That was sort of like, what are you talking about? We're not, we're not addicted. You know, like, just nothing to do with my brain chemistry. And it wasn't, it wasn't like studied yet, but now we know fully, like, all of it. But then The Social Network comes along, and it completely captured that space. So this is actually the second time I've made a film and then there has been another film that has been created, funded and promoted by tech bros that [00:39:00] seeks to occupy the same space.

That film gets a lot more traction than my film, and the kind of important lessons that we learn when we decenter the male narrative get lost. So I think what's really interesting about this moment that we're in right now: there is this other AI documentary that absolutely, viscerally centers the male narrative, and it centers the male narrative that men can create life. And so, when a woman in the story is creating life, the validity of that is challenged, because men have already figured out how to create life, and isn't that important and scary, and shouldn't we be focusing on that? And then you have our film, which just strips away, hopefully really effectively, the very idea that machines can think. We just go to the root of this whole crazy thought process and we just cut it off, because it doesn't exist.

It's all a fantasy built on white supremacy and colonialism. But I think what this illustrates is: if I [00:40:00] hadn't made this film for free, in my, like, spare time between school runs, on Zoom, and been, you know, an especially good editor or whatever, then there would only be this other film. And when 8% of the top grossing films in the last year were directed by women, as a culture we completely lose the truth in our storytelling spaces when we center the male narrative. Like, look what happened with these two films. We start with the same question of, like, oh, there's this idea of superintelligence, what's happening here? And when the male narrative is centered and funded and coddled and not challenged, then you have one kind of story that engages only in the existential risk, and is toothless, and completely, like, collapses the ability to have, um, political engagement or, like, effectiveness. That's interesting to me. I'm just over here, like, swinging away, doing my thing, you know, like a machete in the jungle, like, moving through and finding the path, and wanting to share [00:41:00] this perspective, this way of seeing AI, this kind of truthful, historical, grounded look at it.

Platforming, not that this film platforms anyone in the film in a way that they couldn't already platform themselves, but hopefully, like, making a really wonderful body of work accessible to viewers. This film is like a vessel for those voices. That's really exciting to me. So another factor in all of this is, like, having made a lot of films and projects around technology in society, and having pitched infinitely more of those kinds of projects. Um, it's very difficult to talk about technology without attaching it to a character. You know, you can see the impulse to insert the emotional journey of the filmmaker as being the raison d'être for the film, and then inserting the experts alongside that. Like, that is a tempting way to tell a story, and it looks good on paper, but when that main character is somebody who's not going to interrogate their own internalized narratives of supremacy, then [00:42:00] we end up getting a very loud, highly funded, wildly distributed film that ultimately perpetuates a narrative that's false, and that is frustrating. But I'm also, like, no, I can't really engage that right now, because I'm still in this maybe overly hopeful little space where, like, maybe our film can punch through, like, maybe we're gonna get a little bit of traction in the dialogue.

Alix: I feel like what you're doing is creating a foothold for people that are trying to do the thing that is counter to this dominant, moneyed, male narrative. And so I think you're creating a permission structure and an artifact that is useful for people to consolidate the network. 'Cause there are people that very much agree with you and are looking around and thinking, oh my God, does anyone else feel what I'm feeling and think what I'm thinking? And I wanna learn more, uh, about these instincts I have, even if I'm not super educated or [00:43:00] informed about the histories and politics of these technologies.

I don't think it's punching through. I think what you're doing is, like, you've made a campfire, and it's like, come sit around my campfire. And I think that that's gonna bring people to you that are already thinking in this way, not necessarily puncture some other thing. So, back to distribution. From a financial perspective, the other film has a lot of money and a lot of pretty high profile people that have built their own distribution networks on the backs of these narratives, and on the backs of the power networks that align with what it is they're advocating for, or at least benefit from the narratives they would say are critical of these companies. How this is critical of these companies, I don't really know.

Um, but how do you, like, what's the plan? Like, what do you need to help create the type of distribution infrastructure that would not rival, 'cause I don't think, I think their film's irrelevant, it's basically just part of an existing [00:44:00] structural dominant force. So rather than think about it as rivaling, like, how do you build the distribution infrastructure that translates into a network of passionate people who feel aligned politically with what you're trying to do, and, like, make it possible for people to be connected with other people that are thinking the same way?

Valerie: So one of the really amazing things that has come out of Sundance is the film has now been acquired by PBS Independent Lens, which is on its own incredibly wonderful. It's a huge honor. I mean, it is sort of like the high bar of documentary filmmaking, and I'm like touched that they saw this film and that they understood what it was, and that they're so excited about it.

They are trying something new in a really cool way with an initiative with their YouTube channel. So, day and date, it's gonna broadcast in America on PBS, so, like, our grandparents can watch it or whatever, and then for three months it'll be playing for free on YouTube in America, starting in September.

And that to me is a real opportunity to create this kind [00:45:00] of multi-generational conversation, where anyone who wants to can jump in and just listen a little bit to the perspective of this film. And even if, you know, somebody hardcore believes that there's such a thing as artificial intelligence and that it's coming and whatever, this film is like a knowledge packet. It's like bits of information that, once you understand, you can't unsee, and then the whole big tech narrative looks a little different. So anyways, I'm wandering away from your question. If your question is how do we get it out there, I think one of the really exciting things that we have is PBS, and being able to utilize the networks of public broadcasting in America. Even though it is so much diminished under the current administration, it still has this, like, really strong function. I'm excited about that. We also are launching March 27th on Kinema, which is this amazing platform I'm so excited about.

It didn't exist last time I made a film, and I'm just in love with it: community screenings, virtually and [00:46:00] in real life, around the world. Folks can contact us on our website and sign up for a community screening event, virtually or in real life. We already have, like, over a hundred requests, and we haven't even, like, really officially launched it yet.

And so there's just this really grassroots, natural community. Like, I love your campfire analogy. You know, people are gathering, and it feels wonderful. So many people have emailed me saying, after watching your film, I feel less crazy. Like, I feel like there are other people who see that this is, um, this is a narrative that doesn't make sense. And then I think what is fascinating to me about the kind of AI field, or the critical AI space, is that so many dialogues are consolidating, and what we're really talking about are old school questions around social organizing, around how do we create the society we want? What is the function of democracy?

One of the problems with the other film that you're talking about, and the kind of white bro, tech bro perspective, is [00:47:00] that it says the problem is superintelligence, and then we all look over there, when the real problem is local planning laws for data centers, data sovereignty laws and policies, and, you know, capitalism. Those things are the things to talk about.

So that campfire analogy is wonderful, and that's kind of, like, what I'm hoping the film does. And then on a, on a more tactical level, we're just making it available via Kinema until we go on PBS, and then it will also be available for rent on all of the big tech platforms come May. So in terms of getting the film out there, we're really excited about Kinema as a platform, and our ability to, like, connect with our fans directly and our audience directly.

There's already, like, huge grassroots, word-of-mouth interest in the film, which, I've never been a part of a film that has this before, which is really cool.

Alix: Ghost in the Machine will be airing on PBS later this year, and on YouTube in the fall. If you are itching to watch it sooner, that's great: Valerie's been working on getting the movie into [00:48:00] indie theaters and screenings with organizations working on AI politics. We will add links in the show notes, and may update those links when there are additional availabilities to see the film, and watch it as soon as you can.

And when you do, tell us what you think. Up next week is Naomi Klein. She's gonna share a bit about what our future could actually look like if we keep pushing back on all the things. I went into this conversation knowing what Naomi Klein's work has done to me, mentally, historically. Um, a lot of her books have shaped my politics, and her writing generally, I think, gives us glimpses into how the world really is. And I think sometimes that can feel dark. But because of the way that she engages meaningfully and directly in the work, I think oftentimes you can leave conversations with her feeling so much more hopeful.

And that's what happens in our conversation next week. So do [00:49:00] tune into that to get a dose of historical context and education, and also some really insightful ways of looking at the world that are energizing and sort of leave you wanting to be a part of what comes next. Thank you so much to Valerie for this episode.

Thanks to producers Georgia Iacovou and Sarah Myles, uh, for putting this episode together, and to the team, Zoe Trout, Marion Wellington, Kushal Dev and Van Newman, and all the team at The Maybe that helps put this show together. And we will see you next week.
