Show Notes
Welcome to the final boss of scams in the age of technology: Enshittification
More like this: Nodestar: The Eternal September w/ Mike Masnick
This is our final episode of Gotcha! — our series on scams, how they work, and how technology both amplifies and obscures them. For this final instalment we have Cory Doctorow on to chat about his new book Enshittification.
Is platformisation essentially just an industrial-level scam? We deep-dive the enshittification playbook to understand how companies lock users into decaying platforms, and get away with it. Cory shares ideas on what we can do differently to turn the tide. Listen to learn what a ‘chickenised reverse centaur’ is…
Further reading & resources:
- Buy Enshittification now from Verso Books!
- Picks and Shovels by Cory Doctorow
- On The Media series on Enshittification
- Pluralistic — Daily Links and essays by Cory Doctorow
- Conservatism Considered as a Movement of Bitter Rubes — Cory on why conservatism creates a friendly environment for scams
- How I Got Scammed — Cory on his personal experiences of being scammed
- All of Cory’s books
- All (Antitrust) Politics Are Local — the entry to Pluralistic that Cory wrote on the day of recording
Transcript
Alice: If you go to X, you're seeing porn, you're seeing crypto ads, you're seeing like supplements. You're seeing all of these like scam adjacent things.
Mark: The defining feature of that system is basically a way for wealthy people to control their money with impunity.
Cory: Platforms start to hemorrhage users. And then they panic, although they call it pivoting, and they do all kinds of really dumb shit.
Bridget: And we're taught that it's good because we're all, you know, doing it together.
Lana: Looking at a history of scams is really looking at a shadow history of the economy and a shadow history of our communication systems.
Alix: This is Gotcha! A four-part series on scams, how they work, and how technology is supercharging them.
Alix: Welcome to our final episode of Gotcha! Although I will say this has made me even more curious about scams, so we might do more episodes in the future, and we'll let you know by titling those episodes in this way. So if you ever look back at our stream, you'll see that when there's a little shorthand name on things,
Alix: that's kind of a cluster of episodes we've done around a topic. So if you want more on this, or wanna look back at past series, you can do that hopefully pretty easily, visually, in the feed. And also, as I mentioned in the first episode, if you have topics that you are working on, or that you would love for us to give this series treatment to, please reach out,
Alix: 'cause we're always interested in diving deep on these kinds of topics. This is our last episode in the series, and we are gonna be sitting down with the one, the only Cory Doctorow, and talking about enshittification, which has kind of taken the world by storm a little bit as a, I don't know, like a vehicle for expressing frustration about how terrible it is to engage with any technology product at all.
Alix: But there's a lot more to it than that. And Cory took the idea from, I think, first a blog post, then that really great series on On the Media, which we can link to, where he and Brooke Gladstone got to go through his thinking in a couple of episodes, into book form. And we talked about everything from how the technology industry became kind of an industrial-level scam to what
Alix: we can do about it. Like, how do we unring some of the bells that have been rung in this era where tech companies have become part of the problem? And I think the reason we got Cory on for scams is, one, the flexibility of digital technologies to make deceptive practices easier and faster, which we got into a little bit in the last episode with Alice and Lana about how generative AI has affected these scams.
Alix: But really, like, thinking about how when something is created that is software, how quickly it can morph into something that is weaponized against everyday people. And then kind of how the scale of digital technologies gives platforms outsized power, and then they become incentivized over time to keep users locked into those ecosystems, which is very scammy.
Alix: Like, the goal here is not let's make the best product to attract the most users. It's let's make the best product for now, convince those users to use it, and then basically prevent them from moving on when we start making the value of that product less and less and less, to the point of actually potentially being abusive.
Alix: That sort of scale and power also then makes it really hard to regulate. All right, so let's get into it with Cory Doctorow.
Cory: My name's Cory Doctorow. I'm a science fiction novelist, a journalist, an activist. I've worked for the Electronic Frontier Foundation, a nonprofit, for 23 years, and I'm, um, visiting faculty at Cornell and several other universities.
Alix: I've heard you explain enshittification multiple times. I thought the On the Media series was really good, and they, like, gave you time to, like, really unpack it, and it just
Alix: always felt like a longer-format concept that had been kind of shoved into one word that would travel. Um, and so I'm glad that you had the time to, like, actually spell it out. I mean, I think, like, the pathology piece and, like, getting into the examples was really helpful in the book. I mean, you write so many books. Enshittification:
Alix: is it because it was, like, traveling that you decided to spend more time on this, or, like, what was the motivation for actually turning this into a book?
Cory: Oh, uh, well, you know, I write when I'm anxious. Uh, and so I've written 10 [00:04:00] books now since the lockdown, and I work through my daily anxieties through a newsletter I edit called Pluralistic, which is, you know, this open-access, semi-daily, five-days-a-week
Cory: newsletter where I synthesize everything that seems important. Uh, rather than keeping notes for myself, I keep notes in public, and I try to synthesize it into essays, and that leads me into more complex, more nuanced, more synthetic ideas. You start with just a very straightforward one. You know, today I wrote a piece about the movement of antitrust enforcement to localities, so that localities have been doing really well at doing things like blocking data centers.
Cory: Portland, Maine just prevented Ticketmaster from entering the market, and this is as the DOJ is walking away from its Ticketmaster case. So you have localities picking up the slack. No one is gonna write a book about that, but that is, like, a piece of a puzzle. And by writing out all the things about it that seem important, and how it relates to everything else in my mind, that helps me kind of [00:05:00] get to the next stage in this long synthetic process that eventually turns into a book.
Cory: So enshittification was always gonna be a book, if it were important enough, because books are like the place where you save your game in Zelda, right? Where it's like, okay, I've gotten this far, I'm gonna, like, take everything I thought of now and put it into one argument, and then I'm gonna start on the next thing.
Cory: And so, you know, the nonfiction series that I've done since the lockdowns, which is How to Destroy Surveillance Capitalism, Chokepoint Capitalism, The Internet Con, and Enshittification, and now the AI book, uh, The Reverse-Centaur's Guide to AI, are like a, a series of developing, related thoughts. None of them replace the others, but they constitute a kind of evolving analysis of
Cory: what's going on. And part of the reason that that analysis has to evolve is because I'm getting smarter about it. But partly it's because the world is changing too, and so fast that the analysis has to be updated. Yeah, yeah.
Alix: Yeah. I mean, it makes loads of sense based on even how the sequence of the book is structured,
Alix: 'cause it seemed like, like, the first pass, like that [00:06:00] first kind of, um, I think you call it the pathology, it sounds like that was kind of the first figuring it out, and then kind of moving towards getting more specific about how we got there, and then kind of getting into solutions. It feels like, I can imagine, those being almost sequential times that you were chewing on things and kind of building on the phase before.
Cory: Yeah, and it's definitely the case that, you know, this evolved through a series of speeches and essays. And so the second essay that I wrote, the first one that got a lot of attention, was one about the pathology, right? This three-stage progression. Getting attention for an article is always nice, but part of what it does is it forces you to think harder, because some of that attention is critical, and so, you know, you rub up against people who think you're wrong.
Cory: And so that required that I develop a more detailed critique about the underlying mechanism, and then the causal roots of it. Because I do think that, you know, as a materialist, I fight against people who have a teleological view, who are just like, well, the internet went bad because the internet was going to be bad.
Cory: Iron laws of [00:07:00] economics, great forces of history, here we are. And I really do think that that lets a lot of people who made some terrible decisions off the hook. Like, it is both too personal and not personal enough, because it is personal in the sense that it's like, okay, Mark Zuckerberg was bad and that's why Facebook is bad.
Cory: But it's not personal enough in the sense that it assumes that the policies that allowed Mark Zuckerberg to make Facebook worse arose out of a kind of vacuum, and that there weren't, like, policymakers who had people at their elbow saying, don't do this, this is bad, it will let Mark Zuckerberg destroy Facebook.
Cory: And they were like, no, no, no, I'm sure it'll be fine, Zuckerberg seems like a nice fellow. Right. And that is the, um, personalization I seek to make, because I think you could, like, drop Mark Zuckerberg in a wood chipper and there'd be another Mark Zuckerberg right behind him, ready to do what he's doing. I think you have to create the conditions in which Zuckerbergian conduct is not practical.
Cory: I don't think that, like, changing the man will make it better.
Alix: No. I also, I mean, I think there's a quote in your book: a company can yield to the temptation to do [00:08:00] only those things that are technically possible. And I feel like that's such a powerful concept. Do you wanna unpack that a little bit more, about, like, Mark Zuckerberg having the space to do what he did is probably more of a problem than Mark Zuckerberg particularly being the guy that was doing it?
Cory: Yeah. Well, you know, I think that Larry Lessig, who's a, a mentor of mine, really nailed this. You know, 25 years ago he wrote a book called Code and Other Laws of Cyberspace, and he says that outcomes are determined by a mix of four factors: code, law, norms, and markets. So nobody ever does a thing that's technologically impossible,
Cory: definitionally, right? So once a thing becomes technologically possible, it doesn't mean someone's gonna do it, but it means they might. And so creating a new technological possibility, for better or for worse, opens the door for that to occur. Norms: people do less of the things they're ashamed of. Again, some people do shameful things, but when we are ashamed of things, we don't do them as much.
Cory: And when we're proud of them, we do them differently. So, you know, you can think of the response to [00:09:00] misogyny, at least up until, like, last October, as a thing that changed a norm about the kinds of jokes people made and attitudes that they evinced in public and were overt about, even if it didn't change their views in every case.
Cory: You can also think of, like, the world post-ACT UP and Stonewall as changing what people are willing to say about themselves in public, too. So norms change things. Laws change things, uh, so things that are illegal happen less. Going back to Stonewall: there were lots of gay people in long-term relationships who went through a lot of backflips to incorporate an LLC and make each other's lasting power of attorney and whatever, to try and replicate the structures of marriage.
Cory: But once marriage became legal, and once we had marriage equality, just a lot of queer people just got married, right? You got more of it 'cause it was legal. And markets, right? People do things more when they're profitable. One of the reasons we have a data broker sector, it's not just that there's a bunch of people who are jerks and don't care
Cory: about your feelings, and so they spy on you. It's that it's profitable, [00:10:00] right? And these things are related. So if data brokerage was illegal, right, if we'd had a new privacy law since 1988, when we got the last one, there would still be people doing data brokerage, but they'd be, like, illegal, and they wouldn't have overt sources of capital.
Cory: They'd be the mob. They wouldn't be, like, people who went out and raised tens of millions or billions of dollars to create non-consensual dossiers at population scale, and you'd just have less of it. And so all of these things interrelate. You know, gay marriage became legal in part because of norms, right?
Cory: But then once gay marriage was legal, it became normatively more acceptable to be gay and married. And so when I approach a technological policy question, I am often looking at it through this lens and asking: what levers do we have, that are normative, that are legal, that are technological, and that are commercial, that we can yank on
Cory: to adjust the conduct of firms and the people they relate to.
Alix: Yeah, which makes loads of sense. 'Cause I feel like, I mean, as you said, it's not like [00:11:00] laws of physics. These are rules that we tweak and basically construct, ideally in a democratic society, and you're sort of making these things happen, and then things happen within those systems.
Alix: It's not like the systems manifest out of their own will, I guess.
Cory: Great forces of history, iron laws of economics. Yeah, exactly. And you know, I, I think that, like, people think that someone came down off a mountain with two stone tablets and said, Larry, Sergey, stop rotating your log files and start looking in them for actionable market intelligence.
Cory: Right? Like, they made a choice, right? They could have made a different choice. The reason they made that choice was in part because they needed more money, right? Because it was the dotcom downturn. But the other half of why they thought they could get more money is because we didn't have any privacy laws, right?
Cory: Like, they could have made more money by getting into human trafficking too, but they didn't, because it was illegal. So, you know, leaving aside what we think of their morals or whatever, like, people who are running, or aspire to run, public companies and have VCs to answer to, generally, if they're breaking the law, they're trying to break the law in a way where they're not gonna get caught.
Cory: And if [00:12:00] they do, they're not gonna go to prison. And, you know, the risk is worth the reward. That is clearly the case where child trafficking was not gonna be a good outcome, no matter what other reasons they might have had not to do it, but violating your privacy was, right? And so, like, it's not, I think, crazy to say maybe we would've had a different
Cory: Google, right? Maybe they would've found a different business model. Maybe they would've taken this business model and used it differently, if they had not been in this equilibrium where spying on people is profitable.
Alix: Yeah. Well, let's back up. So I wanna, I mean, I feel so bad doing this to you, 'cause I feel like you have done this one million times.
Alix: But can you just describe and explain the kind of mechanics of enshittification?
Cory: Yeah. So enshittification: it's this funny, dirty word that I came up with to describe a big, complex idea. And it's a big, complex idea that seeks to explain platform decay. Platforms are the kind of indigenous form of enterprise on the internet.
Cory: They are what you might call an intermediary, because they sit between different groups of people. Nothing wrong with intermediaries. [00:13:00] Even if you wanna drive a taxi, you probably don't want to also found a credit card company. And so it's nice if you can take payments without having to also be a payment company.
Cory: And so intermediaries are fine, but when intermediaries become more powerful than the entities they broker relationships between, they can usurp that relationship, and rather than serving both parties and assisting them, they end up, uh, extracting from both parties, to their detriment and to the benefit of the intermediary.
Cory: And so enshittification follows this characteristic three-stage process. Not every platform enshittifies the same way, but this is the kind of platonic ideal. Platforms start by being good to their end users, but they find a way to lock the end user in. So maybe they, like, sell you a bunch of media that's locked to their app,
Cory: and so if you quit the app, you have to throw away your media. So that would be, like, buying Audible audiobooks. Or maybe they find a really nefarious form of lock-in, like they get the Saudi royal family to give them $31 billion to piss up a wall, losing [00:14:00] money on every taxi ride until Uber is the only cab company in town and all the transit systems have been starved of investment for a decade.
Cory: Or maybe they just do what social media networks do, which is, like, have a place where people you like have gathered. And of course we love our friends, but they are a pain in the ass, and the fact that you all hate where you are doesn't mean that you can agree when it's time to go somewhere else, or where to go.
Cory: You can't even agree on what board game to play. So, so long as you love your friends more than you hate the social media network you're on, you'll stay there, because that's how you're staying in touch with them. And so once they've got you locked in, they, they make things worse for you and they make things better for business customers.
Cory: You know, Facebook made it possible for publishers to insert things into your feed that you hadn't subscribed to, and they let advertisers target ads to you based on surveillance data. That brings those sellers to the platform and locks them in too. And sellers are actually very sensitive to lock-in.
Cory: It's much easier to lock them in than buyers. In our discourse about competition and markets, we focus on [00:15:00] monopolies, partly 'cause, like, we've all played the board game. But monopsonies, right, which is powerful buyers, they don't get as much of a look-in. But there are very few businesses that can survive, say, a 15% overnight drop in their sales.
Cory: That is often fatal to a business. If nothing else, it means they have to fire key employees, and then, if business picks up, they have to retrain those people. It can be very hard. And so once you are a business dependent on a customer for, say, 20% of your turnover, that customer just gains an enormous amount of power over you.
Cory: And, you know, anyone who's, like, supplied stuff to Walmart knows what that looks like. So these business customers also become dependent on the platform. Then the platform starts to screw them too. And this is one place where I think my critique departs from a lot of the critiques that predated it.
Cory: Because you have people like, uh, Shoshana Zuboff, or others who are quite critical of platforms, but who embrace the idea that you may have heard as an aphorism: if you're not paying for the product, you're the product. Really, what the platforms are doing is they're selling you out to these business customers, [00:16:00]
Cory: as though the platform and the business customer are in cahoots. But the platform and the business customer are not in cahoots. The platform wants to exploit them too, right? Apple rips off every app seller for 30 cents out of every dollar. Amazon takes 45 to 51 cents out of every reseller dollar. Meta and
Cory: Google take 51% of every advertising dollar. These are extremely predatory actions. And so what they do is they start to drain off the share that the business customers have as well, and they try to attain a kind of stasis where all the available surplus, everything except what's needed to keep users locked to each other and business customers locked to users, has been extracted for the shareholders and for the executives.
Cory: That's where we're at now. But it's also, like, a really difficult place to be in, because the difference between "I hate this, but I can't seem to stop using it" and "I hate it and I'm never using it again" is very thin. And so oftentimes you'll get to a place where these platforms start to hemorrhage users, and then they panic, although they call it pivoting, and they do all kinds of [00:17:00] really dumb shit.
Cory: Like, you know, Mark Zuckerberg arises from his sarcophagus and says, you know, harken to me, brothers and sisters, I've had a vision in the night. I know I told you that the future would consist of, like, arguing with your racist uncle using the primitive text interface I created in a dorm room so I could non-consensually rate the fuckability of Harvard undergraduates.
Cory: But it turns out that the future really consists of me converting you and everyone you love into a legless, sexless, low-polygon, heavily surveilled cartoon character that I'm gonna imprison in a virtual world I stole from a 25-year-old satirical cyberpunk novel, that I call the Metaverse, right? And, like, that's when the platform turns into a pile of shit.
Cory: And we've seen that before, in the kind of 2000s and early 2010s, when you saw platforms coming and going. You know, MySpace would rise and fall. What's changed since then is that platforms are much more durable, even after they're terrible. They're zombies, right? They don't die. And the rest of the enshittification thesis, which I can get into, but it takes a while, is about why that's happened, right?
Cory: What created the [00:18:00] circumstances where all these guys decided it was time to wreck the things that they built, and how come it didn't result in those things going away.
Alix: Yeah. I also really love that idea of, like, the monopoly container. There was a quote in the book about: imagine if the carpenter who installed your new kitchen got a share of the closing price of your house when you moved, as a, as a, yeah, as a corollary.
Alix: Do you, can you explain? Can you actually? I didn't know about the Pantone. Is it Panton or Pan? Pantone. Pantone. Panettone.
Cory: Pantone.
Alix: Panettone is the bread.
Cory: Panettone is Italian, Italian Christmas bread, right? Or maybe, yes. I think you're right. Yeah. Okay. Yes. Yeah. So,
Alix: Pantone, can you tell this story? Sure. 'Cause I, I've never heard this before, and it's insane.
Cory: Well, if you own an inkjet printer, you know your printer has four different colors in it, right? Black, blue, yellow, and red, that's cyan, magenta, yellow, and black. Most colors can be made by mixing these, but there are a bunch of colors that you can't make by mixing those four colors, and those colors are things like glow-in-the-dark colors, or very bright colors, or colors that are matte or shiny or [00:19:00] gloss, or that have lacquers in them, or whatever, right?
Cory: They're called spot colors. And when you're putting something out on paper, you take one of these big industrial printing presses and you run it through four times to get the black, yellow, red, and blue, and then you run it through again to put spots of the spot color on. And Pantone, it's the color swatches that tell you what those spots are.
Cory: So you go to the paint store and you get a big box of paint chips, and it tells you what the colors are, and then you tell 'em what you want and they mix that paint up. Same thing if you're a designer, uh, with a client. You take out your big box of Pantone swatches and you go through, and you're like, you like this neon orange?
Cory: Great. I will tell a printer, and the printer will either mix that up or buy it, and they'll put it in the fifth bucket in the, uh, giant printer when we run your job. And so designers totally rely on this. Pantone colors are as foundational to printing as type, almost. And, uh, since time immemorial, [00:20:00] if you bought a copy of Photoshop or Illustrator or any other Adobe design product, they licensed the Pantone colors.
Cory: And so you could just say, like, make this color, whatever, bright orange, and it would try and simulate it on the screen, but when it went to the printer, it would be fine. So somewhere in there, Adobe starts to move away from selling you a box with a CD in it that has a copy of Photoshop, and replaces it with a thing called Adobe Creative Cloud.
Cory: Adobe Creative Cloud lives in the cloud; you use it in your browser. The fact that you no longer own the program means that they can take features away. And they had some kind of negotiation with Pantone. No one knows what the content of that negotiation was, but at the end of it, they said, we are no longer licensing Pantone colors.
Cory: If you open any of your files ever again, all the Pantone colors will be black. And so if you've got a document composed of five Pantone colors, it's now just a black blob. And the only way to get those colors back is you have to give Pantone $21 a month for the rest of time. And if Adobe had [00:21:00] still been selling you CDs with software on it, which I admit is, like, not the ideal way to get software, but maybe they sell you a download, uh, with the software on it, and say what happened was, Pantone went to Adobe and said, you've been paying us $21 a user,
Cory: we wanna make it $50 a user. And Adobe was like, no way, we'll walk away from the deal before we do that. And, uh, Pantone said, no, you'll never walk away from the deal, and called their bluff. Pantone just wouldn't have called their bluff if they knew that you could go on using Pantone colors even after they severed the deal with Adobe, because the Pantone libraries were in the software that was already installed on your hard drive, and it just couldn't be taken away.
Cory: So they created the technical capability to confiscate a feature, and then either they or one of their key suppliers confiscated the feature, right? They put the gun on the mantelpiece in act one; they don't really get to act surprised that it's gone off in act three. I often find technologies that have these capabilities [00:22:00] that are clearly ripe for abuse.
Cory: And as a science fiction writer, when I see this, I think about all those science fiction movies where someone slips on the bridge and presses the self-destruct button. And I always think, like, this would be a better spaceship if it didn't have a self-destruct button on it. Why did they put this self-destruct button on this spaceship?
Cory: It's a terrible idea. And then I look at the world and I'm like, oh, we are putting self-destruct buttons everywhere, and then we keep acting surprised that someone comes along and hits the self-destruct button.
Alix: Yeah, I think that's a really good analogy. Following from that, you hear a lot about platform economics, and, like, being able to deliver to a lot more users, um, without necessarily increasing the cost of producing the products that are delivered to those users, thereby reducing the price for people to engage with goods, and that that in some way maybe is
Alix: positive, um, in an era where more people want more stuff. So I feel like maybe there's, like, I don't know, interesting, good parts of platforms that we can set aside. But one of the things you [00:23:00] say in the book is that you think it's important to make platforms less important, rather than making them less terrible.
Alix: Do you wanna talk a little bit about, like, where you want platforms to sit in this understanding of value creation for people?
Cory: Yeah. So, to get back to what I said at the start, I don't think there's anything wrong with a platform per se, with an intermediary per se. There are lots of people who wanna be designers who don't want to invent Photoshop,
Cory: and that's fine, right? It's fine to have, like, someone in your supply chain who is giving you those tools. The problem is when they get very, very powerful. And one of the things that we've tried to do with powerful platforms is make them wield their power more wisely. So the best example would be, like, with social media platforms.
Cory: We've had, like, a 20-year experiment in trying to get social media platforms to behave themselves better, mostly in the European Union, with rules about things like hate speech and harassment and, you know, other kinds of bad user-to-user conduct. And I don't wanna minimize the harms of that stuff. I don't think it's, like, fake
Cory: to worry about hate speech and harassment. And I'm keenly aware that the people who are [00:24:00] most subject to hate speech and harassment aren't middle-class white guys who speak English as a first language and live in North America. And so I never want to say that this is, like, not a problem because it's not a huge problem for me.
Cory: But I think we can say that it's been a failure, trying to get them to prevent it. Right? So, 20 years of policing hate speech, and we are at this place where we just keep getting tripped up by the fact that really adjudicating a claim that a platform failed in its duty to police user speech is so fact-intensive and time-consuming that it never really seems to work, right?
Cory: If you've had something horrible happen to you on a platform and you ask a regulator to look into it, well, first they have to agree on what is hate speech. Then they have to agree on whether the thing that was said to you meets the definition. Then they have to determine whether the platform took reasonable technical steps to prevent it, which, you know, given that the platforms are all purpose-built hardware
Cory: and infrastructure, means deposing their engineers, because, like, nobody understands how Facebook works except for Facebook's senior [00:25:00] ops people, 'cause it doesn't work like anything else that's ever been built works, and you wouldn't build something new like it today; it's sui generis. And then after all that, five years, seven years, you figure out whether Facebook wronged you.
Cory: And this is for, uh, an occasion that occurs on Facebook a hundred times a minute. And it's just not well matched. And so I think we should look at why people who are marginalized, in harm's way, and so frequently subjected to hateful conduct on the platforms don't just leave. And the answer is that the only thing worse than being marginalized and subject to hateful conduct and in a lot of trouble is to have all of those problems and be alone.
Cory: Right. Be isolated from the people on the platform that matter to you. And so those people are staying because they're holding each other hostage. Now, there is nothing about a platform, a social media platform. That intrinsically says that if you leave it, you can't stay in touch with the [00:26:00] people who stayed behind.
Cory: When Mark Zuckerberg started Facebook, he had this problem that everyone who he might approach to join a social media network already had a social media account on MySpace. His pitch to MySpace users wasn't, forget your dopey friends. Come use Facebook. It was, come use Facebook and here's a bot. And if you give us your login and password, we will log into MySpace several times a day, pretending to be you.
Cory: Grab everything waiting for you there and put it in your Facebook inbox and you can reply to it. We'll push it back out to MySpace so you don't have to choose between your friends and a superior service where they care about you more. 'cause Mark Zuckerberg's other big pitch was Facebook is like MySpace, but we will never, ever spy on you.
Cory: Understanding how he came to break that promise, not whether he meant it sincerely, but why he thought he could break it and get away with it, is a very important matter, because maybe we could have arranged things so that he didn't think he could get away with breaking that promise, and he might have kept it.
Cory: And I think a lot of why he thought he could break that promise is because he made sure that no one ever did [00:27:00] unto Facebook as Facebook did unto MySpace. You cannot legally, today, make a tool comparable to the one that Facebook made to allow users to ease their passage outta MySpace without running afoul of a bucket of laws we call IP law.
Cory: Which is a really incoherent category. But I think in commercial contexts, when we say IP law, what we mean is a law that allows a firm to reach beyond its own walls and exert control over the conduct of its critics or its customers or its competitors. And so Mark Zuckerberg avails himself of contract law and copyright law, anticircumvention, trademark, patent, noncompete, nondisclosure, and a bunch of other laws that allow him to make it legally very fraught to build the kind of tool that he built to free users from Rupert Murdoch's grasp on MySpace.
Alix: We just interviewed Mike Masnick a few weeks ago, and I didn't know about the power.com platform fight. I didn't realize that it was basically an API fight to protect [00:28:00] them, but I didn't know that they had done that very thing to MySpace. Which, like, really? Yeah.
Cory: Yeah. Oh, uh, I mean, when they do it, it's progress.
Cory: When you do it to them, it's piracy. So users could conceivably leave Facebook, and rather than these kind of guerrilla warfare ways of getting off Facebook, where you have the scraper that Facebook doesn't support, and they're trying to break it and you're trying to fix it, and so on,
Cory: we could just order them to do it. And we've done this before: if you have a Verizon phone and you go out and you get a Google SIM and you put it in your phone, your phone number works, and, like, your friends don't know, they don't even care. It would be, like, deeply weird to call a friend up when you're just talking to them and say, you would not believe whose SIM is in my phone right now,
Cory: 'cause that's what you want your phone for. It's entirely possible to imagine a future where you leave Facebook and you go to, whatever, Bluesky or Mastodon or something that doesn't exist yet, because it's so hard to get market oxygen for new social media, and your friends don't even know you've left
Cory: Twitter or Facebook, [00:29:00] they just send you messages the way they always have, and they come to you wherever you are, and that just makes the platforms less important. We still want the platforms to like pull their socks up and be good to their users, but if they're not, you don't have to wait for a lawmaker to figure out how to crack the nut of making Mark Zuckerberg treat people like they're worthy of moral consideration.
Cory: You just get to go somewhere else. Maybe the fact that Mark Zuckerberg is now losing money, as opposed to merely being loathed, sharpens his attention and makes him behave better. Maybe it doesn't, in which case, you know, you can leave. And so I really think that taking away platform power, rather than improving its use, is the regulatory direction we should all be applying ourselves to in the coming decades.
Cory: I think getting the buggers to behave is a dead letter, and it's time to stop pretending that Mark Zuckerberg will someday have a Damascene conversion.
Alix: I think that's right. And one of the aspects I found really interesting when you were engaging in solutioning is the [00:30:00] policy administrability, the expression you used in kind of describing what good policy looks like.
Alix: So as an example, it is extremely difficult to police harassment on platforms. That's a very normative set of things; it's very difficult. By the time you deal with one incident, a hundred more have happened. And a lot of the regulation right now seems to be targeting, um, direct user experiences, but it sounds like you want more structural requirements that will take longer to actually manifest in what we would wanna see.
Alix: One of the questions I wanted to ask you is how to deal with the sense of societal urgency, this pressure on policymakers of, please make a law to make it all better, when actually some of the medicine these technologies need is so structural that to course correct is gonna take time.
Alix: And I mean, just for starters, do you wanna explain this idea of policy administrability? Yeah. And then we can talk about it.
Cory: Yeah. Well, I [00:31:00] mean, you know, I was just talking about this idea that you could leave a platform and go somewhere else.
Cory: Now we do have smaller platforms that do that. So Mastodon, which is like an open source alternative to Twitter. Unlike Twitter, you don't have just one server that everyone's on. There's lots and lots of different servers and they all talk to each other the same way that like you can be on Gmail and I can be on Outlook and we just send each other messages.
Cory: I can be on one Mastodon server, you can be on another, and I can subscribe to your feed. The underlying standard for that makes it really easy to jump from one server to another. Built into the software is a link that you click as a user when you're logged in, and it just gives you a little blob of data, and then you quit that server.
Cory: You go somewhere else and you send them that blob of data, and everyone you follow and everyone who follows you and all your blocks and mutes and everything, they all just come over to the new server. No one even notices that you've moved, in the same way that you can port your number from T-Mobile to Verizon.
Cory: If we said to Twitter, Facebook, and their rivals, you know, you have to support this standard, we could easily administer that. Because say you are running a [00:32:00] server, and I wanna leave 'cause I don't like how you run the show, and I want my little file so I can go, and you haven't given it to me. I can go to the regulator, and the regulator can say, Alix,
Cory: I know you say you've given Cory his file. Cory says you haven't. I don't care who's telling the truth, just give a copy to me and I'll pass it on to him. Right? And then the regulator knows that I have the file. And you could imagine, like, 10 people in an office somewhere administering this policy for 4 billion social media users and really not breaking much of a sweat to do it,
Cory: 'cause that's not a fact intensive question. And so, you know, obviously there are some policies that are really easy to administer because they're trivial, right? They don't get anything done. But I really do believe that this is not actually a hard policy to set up, and I don't think it's one that would take a long time to pay dividends.
Cory: I think if we made that the rule tomorrow, and remember, the European Union [00:33:00] crafted the Digital Markets Act in like 2023 and brought it into effect in 2024, so we're not talking about extremely long timescales, then there would be lots of people who would stand up things like the Mastodon and Bluesky servers that already exist.
Cory: So, getting back to it: things that don't exist don't happen; things that do exist might. So we already have the endpoint that you could go to if you left Twitter or Facebook. We just don't have a way to leave one and go to the other. And I do think that we could get somewhere really quickly. You know, earlier you said something like, um, no one can fix this.
Cory: And I think you're right. Like, we can't fix this individually. And you know, some of the early reviews of the book have said, you know, readers will, uh, possibly be frustrated by the solutions that Doctorow offers, because they're all policy oriented and there's nothing for individuals to do.
Alix: That's interesting.
Cory: I completely agree with that. It's true. And I wish it weren't, because it shouldn't be our responsibility. But also, like, you can't [00:34:00] fix climate change by recycling, right? It's just not gonna happen, you know? This is 40 years of neoliberal brainwashing, that the only way to make an impact on society is by shopping carefully.
Cory: And you know, I think joining, uh, a polity that puts pressure on policymakers to change the policy, that's how we get it. Now, it is true that once the policy is in place, individuals can make a really big difference, if they can make the software. Right? Like, Mark Zuckerberg could make the bot that let people leave MySpace.
Cory: Once he made that software, it made a huge structural difference. But he was only able to make that software because we had a different legal regime.
Alix: I also think there's something to this dynamic of the monopolies that are now kind of husks, these large administrative bodies that manage lots and lots and lots of users on a platform, and the speed with which that dynamic would be disrupted if infrastructure were in place to allow for competition, which I think is one of the areas you describe as having kind of eroded away.
Alix: And that [00:35:00] without that sense of competition in these spaces, it makes it extremely difficult to even stress test whether users would leave. Until we see that, it's really hard to even let ourselves imagine that possibility. But do you wanna talk through the four parts of your solution?
Alix: You talked about competition, regulation, interoperability, and tech worker power. Do you wanna, like, walk through those as domains where you see potential?
Cory: Those are the corollaries of why it all happened. So I've alluded a couple of times in this conversation to, like, what allowed this to occur. Why can Mark Zuckerberg get away with hurting you without getting into trouble?
Cory: We used to have forces that discipline Mark Zuckerberg, so we had competition. Um, he had to worry about. You know, people leaving for a competitor. I just love this idea
Alix: of him being disciplined. I'm sorry, it's just so funny to imagine him personally thinking, oh, there's rules. But sorry, carry on. Yeah,
Cory: yeah.
Cory: But of course, I am not an economist, but some econ jargon has kind of slipped into my brain, 'cause it's, like, sticky and it gets on everything and you can't wash it off. Uh, and so [00:36:00] discipline is econ jargon. But when Mark Zuckerberg bought Instagram, he sent a memo to his CFO saying, I'm buying Instagram
Cory: 'cause people like it better than Facebook, and so even if they leave Facebook the platform, they'll still be customers of Facebook the business. And so, you know, they used to fear competition. They don't anymore; they bought all their rivals. Google's a company that had one good idea in the previous millennium,
Cory: about how to be a better search engine. And in this millennium, they have had zero or next to zero successful product launches. All they've done is buy other people's successful products and operationalize them, whether that's Android or Google Maps or Docs or their whole ad tech stack or YouTube. You know, Google Video was a failure.
Cory: They had to buy someone else's successful video platform to have a successful video platform. So they don't worry about competition. And then when competition collapses, you don't have to worry about regulation, because cartels find it really easy to capture their regulators. They solve the problem that your friends can't solve when you [00:37:00] wanna leave Facebook, right?
Cory: When there's hundreds of you on Facebook, you can't agree on when to go. But if there were just, like, three of you, it's a lot easier. Which is why it's often the case that if there's a bunch of you at a party, it's three of you who leave, and then the other ones get pissed off, and they're like, why didn't you tell us you were leaving?
Cory: Well, we did tell you, but you wanted to say goodbye. So we just left. Because when there's a small number, you can solve collective action problems. And so they can solve the collective action problem of capturing their regulator. Uh, they can agree on what it is they're gonna say to their regulator. And not only that, but because they're not really competing.
Cory: You know, these companies, they divide up the market like the Pope dividing up the new world, right? Google pays Apple $20 billion a year not to make a search engine. That means that they have a lot of surplus capital because they're not competing. So they don't have to worry about what Peter Teal calls wasteful competition.
Cory: They don't have to like pay their higher salaries or offer lower prices or increase product quality to retain your business because they're not competing and so they have excess capital. They solve the collective action problem and they can capture their regulators and [00:38:00] you know, like that's why the last time.
Cory: Uh, Congress passed a consumer privacy law. It was a law that banned video store clerks from disclosing your VHS rentals in 1988. Right? And so, like, why do we have privacy violations? Well, congress has been captured by the surveillance industry, and so that used to discipline firms and it doesn't anymore.
Cory: And then there's interoperability, there's technology itself. You know, we, we saw how Mark Zuckerberg was able to overcome the collective action problem on MySpace by giving users a bot. And the fact of the matter is computers are quite remarkable in that they have this feature of universality that, um, is, uh, not like anything else we've ever seen.
Cory: Formerly speaking, a computer is a, uh, touring complete universal von Neumann machine, which is like a fancy way of saying it could run every valid program. And, uh, that means that anytime someone builds a 10 foot pile of shit, someone else can build an 11 foot ladder. Uh, you know, every ad has an ad blocker.
Cory: Every ink ripoff has got a [00:39:00] third party ink cartridge, and the only thing that stops you from doing that is the fact that it's illegal. IP law has been mobilized to make that stuff illegal. And so over the last 20 some years, we have lost competition in tech. We've had regulatory capture. We've seen the expansion of IP law that made it harder for people to disrupt in a good way, disrupt bad rent seeking businesses with technology that restored power to users.
Cory: But we did for all of that time until pretty recently, have powerful tech workers and the tech. Workforce is uniquely constituted in that even though it's not unionized, it had a lot of power. And that power came from scarcity. Uh, and that scarcity has ended. So we've had half a million layoffs in the last three years, and US Tech alone and tech workers just don't have.
Cory: The juice they used to have, and a lot of tech workers, you know, they had been motivated to work very long hours, even though they had so much power by an appeal to their sense of mission. They really cared about what they did, and they cared about the users that they [00:40:00] felt they were, they were working on the behalf of, I call them Tron pilled, you know, I fight for the user and when the bosses got the whip hand over them.
Cory: They could no longer say like, I'm not gonna inify that product. I, I missed my mother's funeral to ship on time. They had to say, yes, boss. Uh, I'll have it for you on your desk by Monday, because there was just 10 people lined up to take their job if they wouldn't do it. And so, you know, you, you add all those things up and you see that this is how we got to the situation we're in.
Cory: And then finally, we are at a moment in which more workers want to be unionized. And public support for unions is higher than it's been since the seventies. Yeah, Trump did away with the National Labor Review Board, but like he's indulging in the Grinch fallacy. The Grinch thinks that if you take all the who's tinsel and trees and bells and presents that they'll lose Christmas because he thinks that they feel Christmasy because they've surrounded themselves with these things.
Cory: The thing is the who's feel Christmasy, so they surrounded themselves with these things. The reason we have [00:41:00] labor law is not. That labor law, uh, people thought it'd be good to have unions, so they created labor law, and then we got unions. We have labor law because unions, when they were illegal, fought for labor law.
Cory: And Trump thinks that he's ended the game, he's just thrown away the rule book because one of the things that labor law really does is limit what unions can do. And so now that there's no rules, um, I don't think there's any reason not to be the most militant moment in labor history. Especially given that Trump has taken away the contracts from 1 million unionized federal workers illegally in the last three months.
Cory: So all of these policy prescriptions, competition, better regulation, labor rights, reforming IP law to give people more rights to repair, adapt, and modify the technologies they use, protect themselves from privacy predation and so on. All of these are not. Just tech prescriptions. They are issues that people who care about climate and gender and labor and lots of other issues should be caring about.
Cory: And it means that [00:42:00] we have a much bigger coalition than we would if all we were doing was trying to fix, you know, iPhone stealing 30 cents of every dollar from every performer on. Patreon who gets their money that way.
Alix: Yeah, I think the pastiche of policy prescriptions there is what's compelling about this, because I feel like a lot of, um, the focus I see is on one piece of that pie.
Alix: Um, and then you're like, well, even if that all. Came to pass, it wouldn't solve these other problems. But like if you look at that altogether, it feels like that kind of structure we've been talking about where removing the self-destruct, uh, button from the spaceships feels a little bit more possible where you explain what a chicken eyes reverse sent ta.
Alix: Because I, because I read it and was just like, you're, you're, so there's like a cumulative. Concept set of adjectives they like get longer and bigger and like more laden with all kinds of really interesting political concept modifiers. Yeah, totally. Yeah. So what's a [00:43:00] ized reverse center?
Cory: So, uh, chickenisation comes from labor studies, and centaurs and reverse centaurs come from automation studies.
Cory: So in labor studies, ization refers to. Workers who are treated as though they are independent contractors, but who have their conduct controlled to a very fine degree in a way that allows their employer to effectively pay them just above a starvation wage, but keep them hooked to the job. And it comes from chicken farming.
Cory: Because there's three major poultry processors left in America and they've divided up the territory. So any given farmer really only has one poultry processor they can sell to. And so if you want to be a chicken farmer, you have to buy your baby chicks from the poultry processor. They tell you what you have to feed, how often you have to feed them, how much to feed them, which drugs you can use, which vet you can use, how the coop is to be lit, when the light is to be on, what spectrum, bulbs to use, and so on.
Cory: They tell you all of that, but they don't tell you how much they're gonna buy your birds for. And what they do is they wait to observe across the [00:44:00] whole supply chain of how many birds have been made to set a price that allows each farmer to roll over their loans, but no one to get out ahead so that it's like the minimum price they can give you.
Cory: And they do all kinds of really sleazy things, like they will tell some farmers, you're on this light schedule or this feeding schedule, or you're using this kind of food. They won't tell them this is an experiment which may produce a greater yield but may produce a lower yield. Only they know this. And so you have these people who are effectively indentured because they're carrying lots of debt and because they're these three processors, they also stick together.
Cory: So there are farmers who speak out against one processor, can't sell their birds to any of them. And indeed there was one processor who, when a, A farmer spoke out against them. Not only did they not buy chickens from him, but then he started a business fixing coops and they told other farmers that if you hire this guy to fix your coop, we won't buy your birds.
Cory: And so they just, they just made him economic roadkill. So that's ization reverse cent. A centor in automation theory [00:45:00] is a person who is assisted by a machine. So when you drive a car, you're a centor. When you wear glasses, you're a centor. If you are typing and you have a spell checker that tells you when you make a typo, you're a centor.
Cory: All of that is cent. You are a, a human head on a kind of tireless machine body, a reverse center. Is a person who has to assist the machine and we see reverse center all over the place. You know, Amazon drivers who have these impossible quotas and they have to pee in bottles. Amazon warehouse workers who also have impossible quotas and are injured at three times the rate of other warehouse workers, Uber drivers and and so on.
Cory: You have these machines that are just using humans as like a kind of inconvenient last step actuators to get some part of the job done without regard to the wellbeing of the human and a chicken eyes reverse center. Is someone who has to buy their own equipment and then gets abused by that equipment and forced to act as its assistant.
Cory: So there's a company called Arise. It's a kind of pyramid [00:46:00] scheme. They recruit mostly black women and they say you can be a customer service rep for Disney Carnival Cruises or many other large blue chip companies. You have to go and you have to buy a special computer, a certain kind of headset. You have to pay us to train.
Cory: To be a Carnival Cruise customer service rep. So you have to pay for the training, and then we listen in on all your calls. And so if we hear your kids in the background or a neighbor with a, you know, leaf blower, we're gonna dock your salary. Or we may in fact just say, you can no longer work for Carnival Cruises, so all the money you pay to work for it is gone.
Cory: Uh, so everything is being observed and scripted to the finest degree, and yet you are technically your own boss, right? And you, you have all this debt you're carrying from buying this specialized equipment that they charge a premium for and that you can really only use to work with them. The same thing happens with Amazon delivery drivers, that, that companies that hire Amazon delivery drivers are not Amazon.
Cory: They're called delivery service providers, delivery service platforms, DSPs. And [00:47:00] so they find some like entrepreneur hustler and they say, do you wanna be your own boss? Take out a giant loan, buy a bunch of vans, give them Amazon paint jobs, buy Amazon uniforms, hire drivers, fit them out with all the cameras and sensors that we require, that watch those drivers and penalize them if they open their mouth while they're driving.
Cory: 'cause you're not allowed to sing if you're an Amazon driver. And then you can get paid for being an Amazon delivery service provider and you are the last mile and you, you're your own boss and you own your own business. But Amazon then will like get five or six in a territory that can only support four and then they'll fire two of them.
Cory: And it's not like you can. Deliver for someone else in your Amazon van, right? So you are a chicken eyes and then your employees are reverse centers 'cause they're having their, their movements scripted to this fine degree by AI cameras that, you know, Andy Jassy has insisted on. So yeah, chicken eyes, reverse cent.
Cory: Very important idea from, uh, both labor and automation theory.
Alix: Yeah, it's super interesting. I mean, I feel like we should also join up enshittification theory with some of the other, like, scam, structural, sort of speculative economy [00:48:00] things that we're seeing, where basically companies are like, oh, YOLO, there's no rules and regulations.
Alix: These norms are, you know, changing by the second. All of the possibilities, technically, are changing. Um, and also, I dunno, the whole risk calculus of being a company right now feels off, in terms of labor power and economic power from businesses. Um, so, I mean, isn't enshittification basically an industrialized scam?
Alix: I mean, it has, so
Cory: I am very, very catholic in my view of who can use enshittification to describe what. By all means, describe industrialized scams as enshittification. The way I use enshittification, there are some specific technical characteristics: it has this pattern, good to end users, good to business customers.
Cory: It has this underlying mechanism, which I think is characteristic of a modern scam that I call twiddling, which is changing the underlying economics on a per user, per session basis. So, for example, um. You know, hospitals hate paying union scale for nurses. So most hospitals don't employ enough nurses and they bring in contract nurses [00:49:00] on a daily basis and they source them from, uh, you know, labor agencies, staffing agencies.
Cory: And it used to be that there were a lot of staffing agencies up and down the country, several competing in every market, but they have all been supplanted by four giant apps that I'll call themselves Uber for nurses and these apps. Buy the nurse's credit history because we haven't had a new privacy law since 1988.
Cory: So they buy the nurse's credit history and um, they calculate how much the nurse owes on their credit cards and how delinquent that debt is. And if the nurse owes more money, they pay them a lower salary. Because they're more economically desperate. They'll take a lower wage. Right. And you know, that is a very scam dynamic, but it's a scam dynamic that is labor intensive without automation.
Cory: Right? So, you know, black Heart of Coal bosses in Tennessee Gurney, Ford songs would surely have love to have paid their. Miners on this basis, but you just couldn't pay enough guys in green eye [00:50:00] shades to like do this with a ledger and you know, a pen pot and a quill, right? You need high speed digital tools to do this.
Cory: And what computers do is they take these very simple scam dynamics and they speed them up. And I think one of the mistakes that people make when they talk about the platforms and their power is they ascribe to them a kind of sorcerer capability. The people who believe in surveillance capitalism, hypothesis and dopamine hacking think that somehow like we finally arrived at the era of the mind control Ray, you know that where Mesmer and Rasputin and MK Ultra and pickup artists all failed.
Cory: Mark Zuckerberg has succeeded with AB splitting and warmed over skin area behaviorism, and I think that like it's much. More straightforward than that, which is good news, right? Because it means that it's easier to fight, it's much more straightforward. What they're doing is this simple trick, but they're doing it fast.
Cory: They're doing a trick that con artists have [00:51:00] have done since time immemorial, but their hands are so fast because they're computer assisted. They're a center that they can rip you off in this remarkable. A way where, where you don't even notice the knife going in until your guts are spelled out on the floor.
Alix: Yeah. It's a tangent, but one of my main issues with how Cambridge Analytica was covered is that it made it sound, I mean, it was just like, that's not how this works. Was it even worth the media attention that it got, treating this sector as having this, like, magician quality, as if we've passed into this realm of influence and persuasion that is beyond anything we've ever seen?
Alix: And it's like, no, they just like. Didn't respect data privacy and got slightly scammed and like transferred a bunch of data and then a bunch of people, I don't know, like the whole thing just felt really off.
Cory: Well, I also think, if you're a lib, it's nice to think that the reason people voted for Donald Trump is because there was a mind-control ray, and not because institutions have failed them so badly that they've been made into easy marks for
Cory: Dumb hucksters like Trump and Bannon.
Alix: In the acknowledgements, you talk about how dirty [00:52:00] words are politically potent, and I just wanted to hear your reflections on enshittification as a term, um, and where you felt its power comes from. I'm sure you've thought a lot about it.
Cory: I've come up with lots of words and analogies and explanations.
Cory: I've had some success. But it didn't take off until I started swearing. And, uh, the word in acidification I came up with just to describe something that I was not enjoying. I had been on a family vacation. We went to, um, Puerto Rico and we rented a cabin in a cloud forest. It had microwave internet. And, uh, microwave, relay internet.
Cory: If you know anything about microwave relays, you know that it doesn't go through clouds. And we were in a cloud forest and so it was not good. And we were really far from town. And so every time we wanted to go into town, we would try and look up what restaurants there were and which, what was open and so on.
Cory: We'd go to TripAdvisor and TripAdvisor loads, 250 trackers before it loads the restaurant review. It would just [00:53:00] time out. And I went to Twitter and I said, has anyone a TripAdvisor ever been on a trip? This is the most in ified site I've ever used. And a lot of people went, ha ha ha, inify. That's a funny word.
Cory: And then like a year and a half later, I was writing about platform decay, and I used the term acidification to describe it. And that was very successful. And I have concluded on that basis because, you know, I used the word in acidification to describe TripAdvisor. And no one kept using it, right? Like it came and went, right?
Cory: It wasn't like it suddenly became the word. Everyone used to describe bad things, but when I wrote a long detailed technical explanation and included the swear word, suddenly the two things came together. It was very sticky. And it's funny because, you know, I have a friend and colleague who speaks in very elite security circles.
Cory: He said like, I wish you would use a different word. 'cause I can't use that word in front of a NATO general. And I'm like, I'm pretty sure NATO generals have heard the word [00:54:00] shit. But if you think that that's a real problem, I encourage you to come up with a different word. Because I didn't come up with this word first.
Cory: I came up with it last. I stopped because that word worked.
Alix: I mean, it is really effective. Thank you so much, both for the book and also for the combination of a political concept and a curse word that has traveled so well and carried, I think, so much of the meaning. I'm sure sometimes it gets used in glib ways that don't exactly align with it,
Cory: I don't mind.
Cory: You have my blessing. Yeah.
Alix: Um, but this is great and yeah, thank you so much.
Cory: Well, thank you very much. I've really enjoyed talking with you, and, um, you know, scams and pyramid schemes are a subject near and dear to my heart. So thank you very much.
Alix: All right, so Cory's book is out now, I think as of yesterday. Um, so we've dropped a link in the show notes to buy it. It's really worth the read, and, um, we have really enjoyed making this series. It is the last episode of Gotcha. Thanks so much for listening, for those of you that have stuck with it through [00:55:00] the whole series. If you've got any thoughts on the series itself, or any ideas for future series around particularly meaty topics,
Alix: Let us know, and thanks to Georgia Iacovou and Sarah Myles for putting it all together. Up next we have Hillary Ronan, who is a former elected official in the Bay Area, who is recently exercised about how important it is for local policymakers to understand. What role they play in AI regulation. And we get into some things she learned when she was an elected official about how policymaking works on technologies, um, and procurement.
Alix: And then now that she's no longer an elected official, she gets into what she hopes other local policy makers will do to protect their constituents. So we will see you next week.
