Show Notes
Grammarly launched a feature that no one wanted and now they’re getting sued. They used the names of writers, journalists, and editors to pretend that AI versions of those people were making writing suggestions via the application. None of these ‘expert reviewers’ had any idea. Grammarly pissed off the wrong journalist.
And now Julia Angwin is suing them.
More like this: The Toxic Relationship Between AI & Journalism w/ Nic Dawes
In this episode Julia (and her lawyer Peter) discuss what happened with Grammarly, why she’s suing, and how neither of them can believe that this tool made it through their legal team and into the public realm.
Please email info@prflaw.com for more info, or if you would like your name to be searched in the list of experts that Grammarly used for their tool.
Further reading & resources:
- Julia’s op-ed in the New York Times
- Pre-order Julia’s new book On Courage: How to be a Dissident in an Age of Fear
- Check out The Markup, founded by Julia
- Grammarly pulls AI author-impersonation tool after backlash — BBC 12th March 2026
- Shishir Mehrotra’s (CEO of Grammarly) apology on LinkedIn
- Grammarly Is Offering ‘Expert’ AI Reviews From Your Favorite Authors—Dead or Alive — Wired 4th March 2026
- Grammarly is using our identities without permission — The Verge 6th March 2026
- Grammarly turned me into an AI editor against my will and I hate it — Casey Newton, Platformer 9th March 2026
- Details of the case, from PRF Law, Julia’s representative firm
**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**
Computer Says Maybe is produced by Georgia Iacovou, Kushal Dev, Marion Wellington, Sarah Myles, Van Newman, and Zoe Trout
Transcript
Alix: Hey there. Welcome to Computer Says Maybe. This is your host, Alix Dunn, and we have a little treat for you today. You may have heard about the recent Grammarly scandal, where the AI company, if we can call it that, I think they call themselves that, basically trained a model to act as if a particular person, a particular expert, academic, journalist, or editor, was doing the editing on your work, so you could, like, pick: do I wanna have Stephen King edit this blog post?
Uh, well, Grammarly thought it would be cool to make that possible for people without telling the people that they were naming, and they made the mistake, which I feel like they're probably regretting now, of imitating Julia Angwin, one of the hardest hitting tech journalists in the world, I would say. And Julia's response was to sue them.
So she's now the lead complainant on a class action lawsuit against Grammarly, saying, hey, you can't do that, that's illegal. And we are delighted to have her on the show today. And it's not just Julia; it's also Peter Romer-Friedman, her lawyer, who's gonna walk us through the legal stuff. Julia will walk us through what happened, and how she found out about this secret feature that they didn't tell her about in advance.
So I'll stop now and let Julia tell the story in her own words.
Okay. So Julia, how did you find out that you were being, I don't even know. How would you describe what they were doing to you, uh, and how did you discover it?
Julia: I described it as a deep fake of the mind, which I just made up. But I feel like you think of deep fakes as, like, you know, an avatar of you visually, or maybe audio.
But this was a weird attempt to, like, recreate what my mind might be like as an editor. And the way I learned about it was from Casey Newton's newsletter, Platformer. So it was Monday night. I think his newsletter comes out at, like, 8:00 PM, and almost as soon as it landed, people started sending it to me.
His article was about how he had been named in Grammarly as an AI editor, and he was like, this is weird, why are they using my name, whatever. So he wrote about how he had learned that they had this feature called Expert Review and noticed that his name was one of the expert reviewers. And then he also mentioned in that article other journalists, tech journalists mostly, who were listed, and my name was one of them, right?
So I was not unique in this. And there were lots of very famous people, like Stephen King, the novelist, you know, who were also listed as expert reviewers. But of course, when you see your own name, it becomes personal.
Alix: So how fast did you decide that you wanted to take legal action?
Julia: To be [00:03:00] honest, it wasn't my first thought.
When I first saw it, I was sort of shocked, as one does. I went to Bluesky and said, what the hell? You know, I just kind of threw it out there, like, I can't believe this, what is happening? And then people started reaching out to me. So a friend wrote to me and said, I think you have a name, image, and likeness claim.
And I was like, I don't even know what that is, right? Another friend sent me, like, a video recording that they had made of what it looked like in Grammarly to have me pop up as an editor. So my friend had basically made this recording. He put in an article from The Markup, the newsroom that I founded back in 2018, and asked it to edit that.
And then what happened was, like, little bubbles popped up: Julia Angwin, as if I was editing, the way it would look in Google Docs, right? With little suggestions. And then you could click on each suggestion and get more information about that suggestion. And when I saw that, I was, like, [00:04:00] horrified. And then my friend sort of explained to me that there was actually a New York law, the right of publicity.
And so I put these two things together, and I was like, this isn't like the usual AI thing where you're like, I wish there was a law, right? Which is where we often are with AI. But this was like, no, there is a law. And so I texted Peter, because I've known Peter forever. He had brought a bunch of lawsuits, civil rights cases, based on some of the reporting I'd done on discriminatory advertising back at ProPublica.
And so I knew that he was good at this tech and civil rights combination. I knew that he was just a good guy that I trusted, and so I basically just said, hey, should we do this? And so basically that's how it started. I don't think I ever had in my mind plaintiff in a class action suit as, like, a bucket list item.
So one thing I wanna mention is, not only was it impersonating me, [00:05:00] but it was a terrible editor, criminally bad. It was like a real affront to my reputation, right? I have spent years honing my skills as an editor, and it made some suggestions that looked a little bit anodyne on the surface. So, for instance, there was a news article that basically was, like, a classic Markup story.
It said, like, Facebook is invading the privacy of these people, according to an investigation. And the tool said: sharpen the lede, like, Julia Angwin says, you know, make the emotional stakes higher, which is, like, a very boring suggestion and not appropriate for an investigative piece. But when you clicked on it to expand further, it said, try starting with an anecdote.
And it made up a story. It said, Laura went to the doctor and her personal information was sent to Facebook. And then it had a little button, which you could press to paste that into the article, right? And so it basically just [00:06:00] created a fictional anecdote to stick in. This is the problem with AI generally, right?
Which is that it has no context, right? That might be okay for fictional writing, but it's not okay for a nonfiction investigative piece. In my profession, it's basically illegal. Like, I'd be fired from my job, and so would any journalist who took that advice, right? So it was a terrible effort, and I think what also really made me mad was that they're trying to pretend like these horrible suggestions are coming from me.
So that's one thing I just wanted to make sure was clear: it wasn't doing a good job at the thing that it said it was doing. And the other thing I wanna mention is, when I was researching to write about it, I did a New York Times opinion piece about it, I learned that we don't have a federal right of publicity.
Different states have different state rights of publicity, and New York's is good. In some states it's only [00:07:00] really for celebrities, and in some states it covers posthumous rights, but in others it doesn't. So, like, there's all sorts of variety in the state laws, and it feels like a classic example of where we need a federal right of publicity.
And one thing that I just wanna point out is there have been a lot of proposals for, like, sort of anti-deepfake laws. Not that any of these proposals are going anywhere, since Congress is sort of a myth at this moment, but let's pretend that they were functional. The ones that have been proposed have been mostly about visual representations, right?
The idea of a deep fake has been, oh, it's gonna be a picture of you, or it's gonna be your voice, it's gonna be some sort of representation of you bodily. And I think it's important that this case really put a point on the fact that we need to expand that. It's not just visual, right? This was an attempt to represent me by using my name to represent sort of an abstract skill that was not visual.
And so one thing I wanna just [00:08:00] add to this deepfakes discussion is that, when we're thinking about what kind of protections for deepfakes should exist, they do need to expand beyond just the visual to this kind of representation of skills. That's why I call it a deep fake of the mind, which, I just like the way it sounds.
Alix: I like it too. And I think there's something about, 'cause I find the word publicity strange, because it feels less about the accumulation of knowledge and expertise that then is faked. And also, I mean, I think the fact that it's bad at it is also really relevant, because it feels like someone could say, well, that's a bad edit. Is Julia Angwin personally suggesting that I make something up? Like, is Jayson Blair also an option in Grammarly? I'm wondering, like,
Julia: Can you imagine? That would be amazing. Yeah.
Alix: Um, can you, like, describe what Grammarly is, for people that are maybe less familiar with it as a piece of software? And if you've learned anything about how many people were part of the Expert Review, [00:09:00] or any other kind of dimensions of Grammarly's politics and work?
Julia: So Grammarly is, like, kind of a glorified spelling and grammar checker, and I've used it before, I think maybe the free version, back a million years ago.
But to be honest, I think that Microsoft Word's built-in spelling and grammar checker is pretty much fine. And also, as a writer, I see that as sort of an area of expertise that I don't need to outsource. So I haven't been a big Grammarly user, but I know it's extremely popular. And apparently, what I learned from Casey Newton's reporting, and the other fine reporting at The Verge and Wired, was that last summer Grammarly had added this new Expert Review feature, which was essentially, like, an AI feature.
And I don't know that much about their thinking, but my guess, based on the world we live in, is that every company feels like they have to add some sort of AI thing to juice up their product. And the Expert Review thing [00:10:00] is weird because you don't choose the experts. It's not like you say, oh, I want Stephen King to edit this.
It generates them based on the content you put in. So that's why my name popped up on a story from the newsroom I founded, because I think perhaps it was associated, right? That's why there's a lot of confusion about who was listed as an expert, and lots of people have reached out saying, was I an expert? Or, I think I most likely was.
And I think one thing we just don't know until this lawsuit progresses is exactly what that list looked like, but it was pretty wide ranging. Like, even in the example that my friend shared with me, there was Julie Brill, the former, um, Chief Privacy Officer of Microsoft, right? There was, like, Daniel Solove, a law professor at George Washington University.
So it's wide ranging. Like, these are not household names,
Alix: but they are names of people that, if they discovered this was happening, would obviously take issue with it. I also find just, like, the idea that [00:11:00] they thought this was gonna be acceptable bizarre.
Julia: Yeah. I mean, the CEO's defense, which he wrote on LinkedIn, says that he thought the experts would be sort of happy to have their views shared more widely. Um, I will say that I did not see a single positive comment on his post, so I'm not sure how well that argument landed. Um, it's also worth noting that Grammarly renamed itself Superhuman. Do you remember Superhuman, Alix?
Way back in the day, Superhuman was the first email provider to do email tracking. So they would embed an invisible pixel in your email and see if you had opened it. And it was really controversial, of course. Unfortunately, that has become standard practice. But I remember I was writing my book on privacy and covering privacy at that time, [00:12:00] and I remember thinking, this is a really privacy-invasive email system. And I think it's sort of funny that they've merged with now this weirdly invasive grammar editing system.
Alix: That's so wild. So Superhuman, the mail client, is now merged with Grammarly?
Julia: Yes.
Alix: Um, weird. Okay.
Julia: I could not be happier with Angwin v. Superhuman as a name for the lawsuit.
Alix: Oh, 'cause it's actually the name of, like, the LLC, the incorporated entity that you're gonna be suing. Oh my God, that's so great. This is like a Marvel movie. I love it.
Julia: It is. I want a movie deal after this.
Alix: Incredible.
Julia: Peter, can you get me a movie deal?
Alix: I feel like, uh, he's gonna be too busy trying to make a legal case as to why, um, they broke the law. But Peter, do you wanna, I don't know, how was it from your side? Were you expecting this call? Did you see the Grammarly stuff before Julia contacted you?
Peter: I hadn't seen the Grammarly reporting before Julia contacted me, but it was pretty clear to me, when I read Casey's piece, did some basic investigation, and saw the [00:13:00] materials, that this was a violation of the New York and California rules on the right of publicity, which have been around for decades in California and actually over a hundred years in New York State.
So, um, I've litigated other cases before where people's names, uh, were being used for commercial gain by entities without their consent, and, uh, was very familiar with this area of the law, and thought this was a pretty clear violation of it. In a lot of these cases where a lot of people are impacted, we spring into action very quickly, and in part we wanted to file this suit quickly so that it would encourage Superhuman and its product Grammarly to stop this practice. We do see sometimes, when we file lawsuits that affect a lot of people, that the company will take action, and it just so happened that the same day we filed the lawsuit, Grammarly took down the expert tool, temporarily, to rethink how they were gonna do it.
Presumably in the future they'll try to do it while complying with the law, and do better. But I mean, honestly, as a lawyer who's [00:14:00] done public interest litigation in a range of areas, I'm pretty shocked that this idea of the Grammarly Expert Review tool made it off the cutting room floor. A lot of lawyers who I've talked to about this have said to me, how did this get past their lawyers?
It's shocking. And in our view, the law is so clear that it is shocking that their lawyers greenlit this kind of program and put it out there to the world without asking any of these people, whether they're really important or not, for their consent. I should note, the right of publicity in New York and California and a lot of other states doesn't just apply to people like Stephen King or Neil deGrasse Tyson, who are some of the people who were used as the expert reviewers.
Experts. It also applies to boards your people or people that no one even knows about. So we've heard from well over a hundred people who, as Julia said, some know that they were included. Some don't know and wanna find out, but it's very clear that this dragnet of experts brought in a lot of people who were famous, but a lot of people, probably more people [00:15:00] who were not famous, lots of academics who have one book or a certain number of articles, their name and identity was used in this.
And we think that that violates the law.
Alix: What types of knock-on effects are you hoping this has? Are there other practices that you've seen that are maybe a bit less on the nose for this particular legislation, but where you imagine, if you win this case, there might be impacts in the broader AI industry?
Peter: Absolutely. To be clear, these are laws that have been on the books for a very long time, and in my experience, whether you're doing civil rights or consumer protection or tech accountability litigation, the practice of a public interest lawyer is finding those old laws and applying them to new situations and, you know, novel technologies. We don't necessarily need new law to address this, although, as Julia pointed out in her op-ed, a federal right of publicity would be great. But setting a precedent that this kind of thing is a violation of the right of publicity, and potentially other laws, would have a salutary effect in a lot of other ways, right? For [00:16:00] example, there's, like, Character AI, and I know some other companies have rolled out products where the chatbot pretends to be a particular character. If the chatbot owner is not getting permission from the person that they're impersonating, that could be a similar problem, right?
So we can see this in almost every application of AI where they're using someone's name, and I think it's important to nip this in the bud and make clear that you may end up paying more in damages than you ever earned in revenue for something like this if you roll it out illegally, right? I mean, let's just think about this.
If they had come to Julia and said, how much are you willing to sell your name and your identity for, for this tool, right? She might've said, hell no, I'm not gonna do this, ever. Or, I have some questions, let's talk about what this is gonna look like, so that this is an accurate portrayal of me. But let's say they even got through those hurdles, and then they had to talk about how much she's willing to be paid for using her name or identity.
I'm sure it's a lot more than, you know, a few hundred bucks, right? And given the [00:17:00] number of people whose names and identities were used, I imagine that there was just an enormous exploitation of these people's, uh, economic opportunities. And that's unfortunate, because Julia and so many other folks, whether they're journalists or authors or editors or academics, they've spent years or even decades honing and developing their skills.
Those skills really shouldn't be taken away or impersonated willy-nilly, without consent, for no compensation. So I think that lawsuits like this, and lawsuits that are being filed against some of the AI companies for chatbots that are killing people, are really important, because it sends a message to the owners and developers and investors of these companies that you may create a really cool product, but if it's gonna kill or dramatically harm people, you've gotta think about providing guardrails and protections before you launch these products to millions or billions of people, so that we don't have all these negative impacts. And so I think, you know, it's interesting: representing Julia [00:18:00] was certainly one of my bucket list items. She's one of my favorite journalists of all time.
And in my view, no journalist has made a bigger impact on tech and tech accountability than Julia. So it's interesting that now she's part of holding a tech company accountable. Usually journalists aren't part of the story, but in this case, Grammarly made her part of the story by misappropriating and exploiting her name for profit.
But we need more people fighting for accountability, whether that's in court, or journalists like Julia and folks in the tech journalism world who are out there investigating, learning, and exposing these kinds of problems. So, you know, Casey did a real mitzvah for all the people out there who heard about this for the first time and didn't know that for eight months their names and identities were being exploited for profit.
So kudos to Casey, and kudos to Julia for having the courage and the wherewithal to know about her rights, seek legal help, and not just do it for herself, but represent others in the class action, right? We're seeking to help everybody who's been [00:19:00] impacted by this, whether it's Stephen King, who never thought he'd be a class member in one of my classes, or a University of Iowa professor who's looking for tenure and, uh, had their book used to impersonate them.
At first, when the reporting on this came out, the concern of most folks was that the names were being used, the identities were being used, and that there were these kind of generic comments. But when we dug a little bit deeper, as Julia said, we saw not just stylistic comments but also substantive ones, about actual content, right?
And it raises questions like, why are they having someone like a former FTC commissioner or a former Microsoft Chief Privacy Officer giving comments on people's writing? She's not a journalist, she's not an author, she's, like, a public figure. Are Ted Cruz and Bernie Sanders giving expert editor reviews in this product?
And why would we want that? Why do we want companies providing substantive edits to people's writing, you know, without the authority or even awareness of these people, [00:20:00] whether they're public figures or local librarians, right? Their thoughts should not be taken and used in this way.
So, to your question earlier, we're gonna find there probably were tens of thousands of people, is my guess, who were involuntarily conscripted into this, you know, AI expert review tool. And it's gonna be wild, and I think very interesting, to see who is a part of this, and we're looking forward to getting justice for them.
I mean, the last thing I would note is, anyone who does any sort of writing could be part of this. And there are a lot of, I think, apt analogies. So, like, think about lawyers, right? We use AI sometimes, uh, when we're writing. What if Westlaw or Lexis had a tool that offered edits from John Roberts or Sam Alito or Elena Kagan on how to write a complaint or how to write a brief?
I think those justices would find it repugnant and absurd that their names would be used without consent for how to write. And frankly, we don't know whether Grammarly used those Supreme Court justices [00:21:00] as experts. They very well could have, right? So I think in a democratic society where we have rule of law, it's important that the law be respected, especially when you're talking about professionals who have built their reputation and skills, and not have that just pilfered and taken without their consent. So many people's professions and expertise and livelihoods are at risk right now, so we couldn't have a more important time to defend reporters, authors, editors, and people of all types of backgrounds from this kind of harm.
Alix: Amazing. Okay. Well, keep us posted, um, on what happens with the case. I am gonna be following it closely. Julia, uh, good luck. I feel like it seems like a slam dunk, but who knows. What else do you have going on, aside from this lawsuit?
Julia: I have a new book coming out in June called On Courage: How to Be a Dissident in an Age of Fear.
It's about what it takes to fight authoritarianism. My co-author and I interviewed more than a hundred dissidents around the world and [00:22:00] came away with sort of what it means, on a personal level, to fight against a power that feels unstoppable. And I guess I would just say that it has changed my view of my own role, and I think it also did lead me to be more willing to do something like this.
As a traditional journalist, I've always felt like, oh, my position is to be a witness on the sidelines. And, uh, reporting this book has made me see how much I am in the fight, and fighting on every terrain possible, right? And so the fight against AI stealing our identities is part of the fight against authoritarianism.
And so I would just like to tell all your listeners to pre-order On Courage.
Alix: We will add a link to the, uh, pre-order in the show notes. I have pre-ordered it, and I encourage everyone else to as well, 'cause I think combining technology and politics questions with authoritarian logics, and what everyday people can do about it, feels essential right now.
So thank you for writing the book and for sharing it with folks. Um, and thank [00:23:00] you, Peter, for suing.
Peter: If folks want more information about the case, they can go to prflaw.com, and one of the case pages is the case, uh, that Julia filed. Or you can email us at info@prflaw.com. We do want people to contact us, so that we can make sure that we search for their names once we get the list, to confirm whether they were part of this dragnet.
Alix: Okay. So, um, as a next step, if you think you may have been a part of this Expert Review and wanna find out, um, you can contact Peter, um, and maybe be a part, um, of this merry band of litigators, who I'm very supportive of. Um, cool. Okay. Well, thank you both. This was great. Thank you so much.
Alix: If you wanna know more about the case, we have left a lot of links in the show notes for you, as well as the email address you can use if you wanna check whether you're on the list of experts that Grammarly used in their tool. Peter is looking for other people that wanna join the class action. So do check that out if you think you might be, I don't know, important enough, online enough, for it to be possible that Grammarly tried to steal your likeness.
Thanks to producers Sarah Myles and Georgia Iacovou, and also the team at The Maybe: Kushal Dev, Marion Wellington, and Zoe Trout. And obviously, thanks to Julia and Peter. Keep on keeping on, and we are gonna be following the case. Up next, uh, we are continuing with our Fantasy Factory series, with Valerie Veach, coming out this Friday.
Thanks for listening.
Stay up to speed on tech politics
Subscribe for updates and insights delivered right to your inbox. (We won’t overdo it. Unsubscribe anytime.)
