E86

Who Knows? Independent Researchers in a Platform Era w/ Brandi Geurkink


Show Notes

Imagine doing tech research… but from outside the tech industry? What an idea…

More like this: Nodestar: Turning Networks into Knowledge w/ Andrew Trask

So much of tech research happens within the tech industry itself, because it requires data access, funding, and compute. But what the tech industry has in resources, it lacks in independence, scruples, and a public interest imperative. Alix is joined by Brandi Geurkink from the Coalition for Independent Tech Research to discuss her work at a time when platforms have never been so opaque and funding has never been so sparse.

Further Reading & Resources:

Disclosure: This guest is a PR client of our consultancy team. As always, the conversation reflects our genuine interest in their work and ideas.

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

Hosts

Alix Dunn

Release Date

November 28, 2025

Episode Number

E86

Transcript

This is an autogenerated transcript and may contain errors.

Alix: Hey there. Welcome to Computer Says Maybe. This is your host, Alix Dunn. And in this episode, we are going to look at independent tech research. Each of those words on their own might make sense, but, you know, when you put 'em all together, maybe it's not so obvious that most research that gets produced about tech platforms is produced by, you guessed it: the platforms themselves, whether that's through them financing and funding academic research, or hiring the best people they can find to come in-house and produce research for them. Some of that research may be made public, and if it shows bad things about platforms, maybe it isn't made public.

So we've got this problem. I would say these digital platforms are such a big part of our lives, our political systems, and that role is only getting bigger and bigger. But the resources available to research on this are getting smaller and smaller. And not only that, the companies that used to say, "Hey, here's a bunch of data about what's happening on our platforms, can you help us understand what's going on?"

They're instead oftentimes trying to prevent researchers who aren't operating within their own walls or corporate towers from accessing that research and data at all, because they don't necessarily want independent researchers showing the world what's going on in these platforms. So that's the dilemma.

What do we do about that? Obviously, tech companies have a lot of money. Obviously academia famously doesn't, and neither does civil society. So this is where a new organization, a new-ish organization, becomes really interesting, I think, and we're gonna dig into it now. And that is the Coalition for Independent Tech Research.

This is a network of researchers who produce research on platforms to help us understand what is going on, and they are increasingly under threat, not just because they don't have enough money to do their work, but also because these tech companies are increasingly acting adversarially towards them. And that's what we're gonna get into today, sort of that [00:02:00] operating environment for people that are doing God's work, producing research for us to understand what's going on in our world, which is increasingly so digital.

So with that, let's jump into my conversation with Brandi Geurkink, the executive director of the Coalition for Independent Technology Research.

Brandi: Hi, I am Brandi Geurkink. I'm the Executive Director of the Coalition for Independent Technology Research.

Alix: What's the Coalition for Independent Technology Research?

Brandi: The coalition is a global community of more than 500 independent researchers who study the societal impacts of technology. And we work to defend, advance, and sustain the right to do this important research. When we're talking about independent tech research, we, as a coalition, are primarily concerned with [00:03:00] research that is conducted independently of the technology industry into the societal impacts of the industry. So, questions about how the behavior of the large tech companies impacts our communities, our children, our bodies, our workplaces, these kinds of questions.

Alix: And the independent part is that we can't trust tech companies to be the producers of all the knowledge about what's happening on the platforms.

Brandi: That's right. They don't have any incentive to produce research that is genuinely unbiased and informative about their own impacts on society. That doesn't mean that their research isn't important or that there's no public value in research that comes from the technology industry, but just as we wouldn't rely on the tobacco industry to give us all of the research about the impacts of tobacco on the human body, why would we [00:04:00] trust research that is solely coming from the technology industry about the societal impacts of their products?

AI research is a really good example of this. When you think about the resources that are needed, just in terms of financial resources, in terms of compute, to be able to do the research that we need as a society about technology's impacts, those resources are almost solely concentrated within the technology industry.

Alix: Are there any corollary industries that have taken on such a giant role in the public consciousness and in the economy and all these things, where the public has had to navigate the challenge of doing good research on them? I mean, yeah, another industry other than platforms where we've had to deal with this dynamic.

Brandi: I mean, an example that's talked about frequently is the fossil fuel industry, and the need to have essentially scientific consensus, some sort of reliable [00:05:00] understanding about climate change and the pace at which our climate is changing, and the potential extent to which we could mitigate those harms to our environment.

You saw the development, for example, of the Intergovernmental Panel on Climate Change, I think is what it's called, the IPCC, and the need for scientists to come together to essentially be able to answer some of these fundamental questions at a really high level. And that's an example that people often talk about in our field.

And there have been a few different attempts to create something similar as it relates to, say, the information environment. I think one of the challenges is that, at least from the coalition's side, when we talk about the societal impacts of technology, it's not just about the information environment and platforms, for example. There are also these broader societal questions about surveillance and the role of surveillance in our societies and our communities, and the [00:06:00] extent to which the technology industry plays a role in that.

When we think about things like defense contracts, when we think about government procurement of technological systems, that is also fundamental information that the public needs reliable and trustworthy information on, and that is different from research on the information environment. And so it's sort of a question of, you know, your question got at the scale of what we're actually talking about when we say technology's impacts on society and how we might be able to get there.

Alix: Yeah, I think it's a good corollary. I will say that it feels like understanding whether or not a drug or a cigarette is addictive or is causing, like, a public health crisis feels easier to measure, and it feels like the second you get into trying to do empirical study on information systems, it becomes so loaded.

It feels like maybe 10 years ago studying these platforms was seen as, like, a rigorous scientific effort that was being done in earnest, but it seems like it's really changed in the last 10 years how these researchers are [00:07:00] conceptualized by partisan political actors who, like, don't want this to happen.

I mean, you and I have talked about like a couple of like key moments when that's changed. Do you wanna share the story of what happened? Maybe we start with Laura Edelson.

Brandi: Yeah, sure. I'm happy to get into those stories. And I also just wanna add to something that you said just then. It is absolutely possible to study rigorously and scientifically many, many, many of the questions that the public has about technology's impact on society. It's not that easy to study rigorously and scientifically if you don't have the data to be able to do that, because the companies don't wanna give it to you; if you don't have the funding, because the ecosystem is largely dominated by funding from technology companies; and also if you're facing political retaliation and politicization of your work, so you have to spend all your time defending yourself from harassment and smear [00:08:00] campaigns by political actors instead of doing science. And so I just wanna put it out there that, you know, it is absolutely possible to do this work.

And the barriers that are there are intentional ones that have been put there, just as we're seeing the politicization of a lot of different threads of scientific inquiry in other areas like climate change, for example.

Alix: I think it's a really good shout that empirical work in this area is possible, has been done, and when the resources are there, fantastic researchers and kind of new fields have emerged to produce research that's really useful. I think it is a political attempt to prevent that very good research from happening, because the results of that research are often not very flattering for the platforms, and it's also not very good news for policy makers who are sort of confronted with the way these platforms are affecting all kinds of things that they might have to do something about and maybe don't want to. So, noted: research is possible if the conditions are right. Something tells me that the conditions for Laura Edelson's research were not right for her to be able to pursue what [00:09:00] she was doing. Um, so, 'cause she was using Facebook data to, what was the actual piece of research she was doing when they sent their lovely cease and desist letter?

Brandi: Yeah. So, Laura Edelson and Damon McCoy at the time were two researchers based out of New York University. They were essentially studying political advertising on Facebook around elections and helping people to understand what kinds of advertisements they were being targeted with by political parties, and helping the public ultimately understand how Facebook was being used to shape public opinion around elections. And they did this by using a browser extension, which is a common way of studying political advertising, especially as companies become more cagey about data that they actually make available proactively about political advertising that's happening on the platform.

So back in [00:10:00] 2020, just a few months actually before the US presidential election, Laura and Damon received a cease and desist letter from Facebook that essentially told them that they needed to stop their research, with, as cease and desist letters do, an otherwise vague threat sort of looming in the background.

And so when this happened, a few different communities came together in solidarity with Laura and Damon and really defended the research that they were doing, and especially they defended the privacy protections that Laura and Damon and the other researchers involved in the work upheld in undertaking their research, because Meta was trying to make the point that they were violating the privacy expectations that users had on Facebook in their research, when in reality this is completely false. The browser extension that they used [00:11:00] did not even collect any personally identifiable information about users. So it was essentially a lie that Meta put forward in order to try to justify their attempts to stop Laura and Damon's work a few months before a critical US presidential election, and we can speculate about what the reasons for that were. Maybe they didn't want Laura and Damon to really uncover the extent of what kinds of political advertising may have been happening on the platform.

Nonetheless, a group came together to really defend their work, their right to do their work, the principles that they were upholding in doing their work around ethical principles, around the privacy protections that they had in place, and ultimately launched a pretty successful defense of Laura and Damon and their work.

I believe that Facebook nevertheless shut down their personal accounts, which essentially stopped them from being able to do their work. But you know, in the [00:12:00] end, as far as I am aware, they didn't actually file a lawsuit or something similar, right? And ultimately, the coalition that came behind them to support them, not to be confused with the Coalition for Independent Tech Research, which did not exist at the time, but it was one of the reasons.

Alix: Good, because I'm immediately confused by that. But carry on. Yeah. Yeah.

Brandi: Which is one of the reasons why we went on to create this coalition, 'cause we sort of realized that there would need to be a group that could do this again and again, as we expected attacks against researchers to pile on. That group got the acting head of the FTC, for example, to write a letter countering the claims that Meta was making, and got Mozilla to do a full privacy review of the browser extension that Laura and Damon used and verify:

Look, this is completely sound from a privacy perspective. And so essentially this group really single-handedly, like, undermined all of the arguments that Meta was trying to falsely make about the work. This led to Laura and Damon ultimately feeling really supported to continue on in their work, [00:13:00] because the point of these campaigns, when they happen, is to essentially isolate you and make you feel like you have done something wrong. It's to create the conditions where people feel like, oh, my reputation has been irreparably damaged, and I have no choice but to change my research topic or leave this field entirely, or, you know, decide to go silent for two years.

And when that happens, we lose out on amazing research from amazing researchers who do this important work every day. And so that's exactly why we need concerted community solidarity behind people, to be in defense of them when the shit hits the fan.

Alix: Yeah. It's also, I feel like academics are also quite conservative, like small-c conservative, and I can imagine, like, how terrifying it would be, for an object of study that's a core part of what you're interested in, in your academic career, to then be accused by that actor of some type of methodological or data access illegality or, like, wrongdoing, [00:14:00] and how hard it would be in the face of that to basically just assert that, no, you're doing this because you just don't want my research to proceed.

So then after this happened with Laura, is that when the coalition, 'cause the coalition was starting around that time? Yeah.

Brandi: Yeah. So we really, it was about a year later in 2021 that sort of a core group of folks came together to really ask the question of, do we need something like this as a sustained community effort to support researchers when they face retaliation?

And to advocate for better conditions for independent research to really sustain, but also advance the right to do independent tech research. And the coalition was born out of that. So we officially became an organization in 2023 with about 50 members who were primarily academic at that time, and primarily based in the United States.

And today we are a coalition of over 500 researchers based across 51 different countries, in virtually all parts of the world except for Antarctica. But we're looking, in [00:15:00] case anyone's listening who's doing research in Antarctica.

Alix: Yeah. Amazing. That's incredible. I didn't realize that it was growing that fast. 'Cause the presumption is that when you join, it's not that you're doing particularly precarious research that you're worried might be under threat. It's that you are joining in solidarity with other researchers, so that there's a growing force that can basically respond at a moment's notice if platforms or companies that are being, uh, investigated retaliate.

So like what's the ambition? Like, how big do you wanna get?

Brandi: Yeah, so I think that the ambition is for us to have a coalition that actively fights for independent tech research, fights for the rights of researchers at a time when that work is getting more and more precarious. So yeah, as you say, we have some members that joke, like, I'm in this coalition as an insurance policy. I really hope that it doesn't happen to me, but if it does, then I'm glad I have a coalition behind me.

Alix: I am Spartacus vibes. I like it. Yeah, yeah, yeah,

Brandi: Yeah, yeah. But you know, in reality, what's most heartening to me, I think, is that [00:16:00] members of the coalition resist this urge that's often present within academia that is all about, like, individual achievement, right? It's often a competitive vibe. It's a desire to achieve novelty and to achieve, like, success at a very, very individualistic level. And the coalition understands that the threats that we are facing to our independent tech research are threats that are gonna be facing the whole community, right?

And so we actually have and embed within the coalition a bit of this idea of, like, privilege hacking, in a sense, which is like, I will support you and you will support me, and we have each other's back. And we realize that when we are standing up for one another, and standing up for the right for independent tech research to exist and to thrive as a genuine counterpart to the research on technology that is coming from the industry itself, that raises the bar for everybody, so the floor becomes the [00:17:00] ceiling. That's our ambition.

Alix: Okay, so Laura had to go to the lengths of making a browser extension because data is not regularly and systematically and structurally made available to researchers who want to explore and do investigations into what's happening on the platforms.

Do you wanna say a little bit about the importance of data access, and then potentially, like, how the platforms have sort of changed posture about whether or not they're gonna provide it and, like, under what conditions?

Brandi: So I often talk about this matrix that exists with almost all independent tech research today, which is that the questions that we can ask about the societal impacts of technology are essentially filtered by, like: what data can I get access to to be able to answer those questions? What can I get funding for to be able to actually answer those questions? And what is my appetite for legal risk [00:18:00] in answering those questions? Filter everything that we need to know about technology and its impacts on society through that matrix, and you get an extremely limited amount of questions that we can actually ask.

And so I think that's a place to start with data access, right? It's not just the fact that it's not possible to get data any other way, and so we have to make browser extensions. It's also the case that some researchers have questions about using a company's API, or, like, using an interface that is provided by a platform to essentially give researchers that data. There are questions about, are we sure that we can actually trust that data? Or, you know, there's a sense in which you have to, in some cases, apply for access from the platform to get that data. Like, do I wanna tell Meta in advance what I'm researching about their company? Some people would say, hell no, I don't.

Right? And so it's not just about the inability to get the [00:19:00] data through any other means. But for many researchers, accessing data through these unpermissioned means, which is either through, for example, scraping of public data or the use of browser extensions where users actually donate their data to researchers, as in the case of Laura and Damon's browser extension, that was a data donation one, so people actively opted into sharing that information with researchers. Those are also important means of accessing data that preserve the integrity and independence of the researchers and their ability to do that work completely independently of the technology industry. That is getting harder for many reasons.

Laura and Damon's story being one of them, but also the means of getting access to data through these permissioned means, right? So some of what I talked about, like accessing interfaces or having access to APIs, is also getting more and more difficult. The tech companies are pretty much systematically shutting down [00:20:00] avenues for public interest researchers to access this information.

And this is even though, or really even in spite of, or maybe even because of, regulations that we've seen, for example, in Europe with the Digital Services Act, which actually require companies to provide this kind of data access to researchers. So in my view, it's no surprise that as the costs of corporate misbehavior have increased because of regulations that have been introduced, the companies wanna make it as hard as possible to find any evidence of that kind of corporate misbehavior and anything that could essentially be used in, you know, a regulatory investigation or a court case in order to hold the company accountable for those impacts. So it's getting a lot harder. It's getting a lot riskier, and it's no surprise, right?

That's not just a phenomenon that is happening. It is a concerted strategy.

Alix: It's also not hard, in my head at least, to connect the dots with, like, Marc [00:21:00] Andreessen talking about essentially wanting to destroy academia, and some of these fusions of these venture capitalists or people that would benefit from completely unaccountable corporate behavior in this space. And basically, just, like, as long as the people don't know what's happening, then we can kind of do what we want. And they recognize that there's just not that much research capacity in our society to study these things. And so it's quite easy to imagine squeezing the people that are doing this to prevent research that might affect their ability to just yolo govern their way into wherever they want us to go as a society, which is really depressing.

Brandi: Yeah. Well, yeah, and as demand from policymakers for scientific research just completely goes away, right? Because distrust in science is a pervasive trend now. And so yeah, this is all linked, right? This distrust in science. It's like, well, before, if you were trying to lobby legitimately as a company around, [00:22:00] you know, regulation and be taken seriously, you at least had to have, like, legitimate independent scientists on your side, right? And I don't know that that is actually the case anymore.

So yeah, that broader distrust in science and attack on academia, and just attack on research as well, it's not just within academia, it's also community scientists, it's also journalists, that broader trend plays into the hands of corporations. And it's really sad, because without that kind of work, we wouldn't have the consumer protections that we have today that are so important, that make sure we don't have, like, lead in the paint of our houses and in the toys that our children play with, right?

Like, incredibly important things have come from this kind of independent science that has been holding corporations to account, and now that is being eroded not just in the technology industry, but across the board. Not to be too depressing.

Alix: No, no, no. But I think, but I think it is all joined up in a really interesting way.

I keep thinking about where this is all going, 'cause as the world gets more and more complicated, if anything, we want more and more [00:23:00] robust infrastructure to help us understand, like, the implications of that and, like, what's happening. And it feels like there is a concerted attempt to, like, turn out the lights in a way that I just find really, I mean, pun intended, I guess, dark.

There's an irony too: they insist that their products are transforming society and that they should get credit for that, but at the same time they don't seem to understand that there is an ancillary responsibility to actually understand what's happening in a way that isn't just based on the technology, but is based on the way that it affects people.

And it feels like they're just like, oh yeah, no, this technology is universally important and is transforming everything, and yet I don't want to know how, or the implications, because then I'm exposed to liability for the way I'm accumulating wealth and power. Maybe this is a good segue, in that I feel like the best example of an adult-brained, astronomically wealthy person who basically needs some exposure to other thoughts than [00:24:00] his own is Musk. And when he took over Twitter, it felt like it was a sea change in terms of how Twitter might position itself vis-a-vis independent tech research. 'Cause it felt like, before, Twitter was actually more invested than Facebook in APIs and structures that would allow research.

I think back 10 years ago, like, big Twitter studies were made possible because Twitter was like, hey, we, you know, we think this is important. And then Musk took over, and he's never found something in the public interest that he wasn't willing to privatize and destroy. But do you wanna talk a little bit about what happened with CCDH?

Brandi: Yeah. So you're definitely right that Twitter was the most studied social media platform. It also just is, like, more open by default than other platforms, right? Like, it's a channel that's mainly meant for broadcasting, um, your messages out to the public, with the exception of people who have private accounts. Right. So there was just a lot of research that was done on Twitter, and they also had a [00:25:00] relatively open API that provided researchers with a lot of information that they could use to study conversations and other, you know, types of things on the platform. And yeah, that all changed, right, when Musk took over. He infamously shut down access to the company's free API, which, you know, a lot of researchers and also developers and different kinds of people were using, and introduced a pricing tier that was $42,000 a month to access. So yeah, for those researchers

Alix: that are notoriously rich, um, oh yeah, and well-heeled.

Brandi: Yeah, we're drowning in cash. Yeah, exactly. So that was really one of the first moves that was clearly sending a signal. I think it's important to say that that wasn't something that at face value was necessarily directed at public interest researchers, right? This was about this broader trajectory that the industry was going in.

They realized that people would be [00:26:00] accessing that API in order to retrieve data for training large language models, for getting on that train, right? Earning a lot of money that way. And so they wanted to limit the amount of people who were accessing the API in that way, or essentially make them pay for it, right? What this did, in the absence of, like, a plan for actually continuing to provide access to public interest researchers, is leave all of those researchers in the dark.

So a lot of people were using, and still continue to use, alternative commercial tools like Brandwatch, for instance, which is made for marketing. It's not made for public interest researchers. They're essentially, like, social listening tools that allow you to access data that can support research that folks are doing.

And Elon Musk became really upset by research that was done by the Center for Countering Digital Hate, which is a research and advocacy organization, and sent them a letter, [00:27:00] well, the lawyer sent them a letter, that was essentially threatening that they were going to sue them because of their research. And the first letter that was sent suggested that they were going to sue them on the basis of their speech, right?

Like a classic defamation case. The actual cease and desist letter that was sent the following weekend said, actually, no, we're gonna sue you because of the way that you've accessed data. So it made it very clear: what the company is actually upset about is the speech of this organization; what they're going to challenge them on is the way in which they've accessed the data that they did in order to do their research, which then pissed off Musk, and so they filed a lawsuit against the organization in 2023.

That really marked the first moment, at least that I'm aware of, that a major technology company sued a major research and advocacy organization where it was about [00:28:00] their speech but the legal claims had to do with the way in which they accessed their data, which obviously has ramifications for a lot of different groups in the space who are also accessing data as best they can, in the ways that they can.

Given that, you know, no one has this $42,000 a month, necessarily, to pay for the API access. And so ultimately with CCDH, similarly, a group of people within civil society, also within academia, like, a huge groundswell of support came through for CCDH, really standing up for their right to do their research, really calling out what Musk was doing as an attempt to silence them and to prevent their First Amendment protected speech from continuing to exist. And the judge ultimately dismissed the case. The ruling was really damning. They wrote, you know, that this was very clearly an attempt to limit the speech of the organization that was brought forward under different legal [00:29:00] claims.

And in that ruling, one of the things that was cited was a survey that had been run by three coalition members, Joe Lato, Megan Brown, and Kai Chung Yang, who had really surveyed the public interest research community and found that an overwhelming majority were concerned about being sued by Musk, and as a result they had changed their research projects or topics.

And so the judge saw that and was like, that is the chilling effect. That is evidence of, like, what a chilling effect is. That actually was something that they cited in their ruling, in their decision to dismiss the case.

Alix: Yeah, and it also, I imagine, emboldens Zuckerberg and these other people sort of watching the brazenness of, like, going after a nonprofit who's pursuing knowledge about something that's of societal import and also part of the discourse right now.

I mean, the fact that it was about antisemitism, given everything that's going on right now, that's such a sensitive topic, and, like, any information about that is really helpful given the rise in hate [00:30:00] crimes and antisemitism globally, and also Musk's behavior, in terms of, you know, he constantly is flirting with or explicitly promoting tropes about the Jewish community and basically, like, has this, like, globalist conspiracy theory brainwash.

So the fact that he was able to do this, even if the case wasn't ultimately successful, I think feels like it paved the way for other types of very overt attempts at shutting down this type of work, in ways that you would think would have more PR blowback than it did. Did this cost Musk anything, aside from his legal fees, to, like, broadside attack an organization that's obviously just trying to, like, learn some stuff?

I don't think so. I mean, personally, yeah. Yeah.

Brandi: It's, it's a very depressing question in a way, because I'm, I'm really thinking about it. Like, did it? I mean, after that he, he continued. Last summer he sued GARM, the Global Alliance for Responsible Media, part of the World Federation of Advertisers, saying that they [00:31:00] had illegally orchestrated a boycott against X, which is, like, the most anti-capitalist perspective that I can ever think of. Like, I wanna prevent advertisers from being able to choose where they advertise is completely anti-capitalist. It's a fascist perspective, right? And that lawsuit ultimately led GARM to shut down in two days.

So like, if it works, it works. Right? And that's unfortunately what we're, we're seeing him kind of continue to, to trudge on with. So,

Alix: Yeah, I think also he's, like, learning a lesson from Trump about, if you have the money to pay lawyers, it's like, other people don't, and it's, it's much more traumatic to be sued if you're not Musk or Trump than it is if you are.

Brandi: But I will say that the impact of the case being dismissed may be meaningless for Musk, but it's incredibly meaningful for our community, [00:32:00] because that shows that, like, you know what, you can't just be the richest person in the world and use all your money to sue a small research organization and get away with it and basically succeed.

And so I think that that for our community was really meaningful. It's meaningful when the court system works as it should. We shouldn't like discount the importance of that. And so that's, you know, really what the, the dismissal showed was that you can actually stand up and fight back and not just fold because you get sued.

Like you can go to trial and you can win and you can push back. And that's what we have to do. Like that's what we have to keep doing.

Alix: Yeah, no, the cost of not fighting feels, yeah, unimaginable actually. Yeah. Well, so then these are, like, two examples where a company, Meta, goes after Laura Edelson, and then a person, a psychopath, goes after, uh, CCDH, a psychopath who happens to own a platform, which feels slightly different [00:33:00] than an organization's posture, like an OG platform posture towards emerging research. But then we get into, like, the, I don't even know how to describe Jim Jordan as a character, but, like, the absolute red-pilled Congressperson who has no shame and also no suit jacket ever, and is, like, just weaponizing, I mean, the irony, him weaponizing the state for the purposes of his big tech friends, but also to kind of turn the lights off so that we don't know what's happening, given the amount of radicalization happening on platforms. But doing that, in his words, to prevent the weaponization of government.

I get so disoriented when I think about the number of inversions happening in the political process in the US, where you have people like Ted Cruz and Jim Jordan making claims that the government is attempting to suppress speech on platforms by jawboning, or partnering with companies to influence speech.

They [00:34:00] use that as a mechanism to use the government to censor speech on platforms and jawbone and change behavior. Yeah, it's like, it's so turtles all the way down, I can't even. But do you wanna talk a little bit about, this feels like a very new, much more aggressive effort, much more from the heart of the most powerful government in the world, arguably, to target these researchers?

Brandi: Yeah, yeah. I mean, yeah, you have to do, like, several somersault leaps of logic to really, like, follow what Jim Jordan's been doing, which I will not try to do in its entirety. But yeah, for the last several years there has been a concerted effort by Jim Jordan's committee, also by a handful of state attorneys general, and journalists as well, to essentially launch coordinated smear campaigns against public interest researchers and organizations working on things like online violence and harassment and [00:35:00] disinformation and, you know, you name it. And recently they have turned their sights on Europe, which is weird in a bunch of ways.

I think that one of the reasons is because they're essentially falsely claiming that there's this censorship cabal that is created by the Biden administration and also some academics and also some NGOs. They claim that it's a, a global censorship industrial complex, right? And this was largely used by Jordan, even though he had no evidence that suggested that this was actually happening.

Nonetheless, he, you know, deposed researchers. He held a number of hearings where researchers, you know, had to come up and answer questions that were not legitimate ones; like, it was never a legitimate investigation or inquiry at all. Then they would basically use [00:36:00] clips from, like, the hearings or things that were said during these different hearings, and then take them out of context, uh, as well as take information out of context that was retrieved through, like, freedom of information requests, and then give it to the right-wing media that would, like, play it ad nauseam. Um, and, and essentially make people look like they're saying something that they're not actually saying. Right? So it was really sort of a bad-faith effort and campaign, and it was largely used as, I think, a campaign tactic right ahead of the election in 2024, the US election. They won that election and then they decided, well, now we're gonna focus on Europe, I guess because there's not anything to focus on other than, you know, the gross violation of free speech that's happening in the United States, where the call is coming from inside the house.

They're like, no, no, we're gonna look at Europe. Europe's the one doing all the censoring of speech. And so what they really claim is that European governments are pressuring American technology companies to [00:37:00] censor the speech of American citizens through Europe's laws, which is one of those weird gymnastic leaps that you have to do to even understand how that makes sense, because it doesn't, right? Like, Europe's laws have absolutely nothing to do with Americans and, like, the speech of Americans at all whatsoever. But it doesn't really matter what's true. It's never really mattered what's true. It's mattered, like, how much these lies can be used to a political end. And so that's been the attempt, really to try to water down or essentially kill entirely regulation of American companies by foreign governments. And so that is where we've seen Jim Jordan in particular become increasingly aggressive towards researchers who are involved in, like, advocating for the Digital Services Act, for example, which is one of those regulations.

Um, in Europe, also the regulation I talked about that has to do with data access. So this is how it's all connected. Or, you know, [00:38:00] researchers who produce scholarship that then regulators use in informing the way that they enforce the laws here in Europe. Right. And so what we saw in particular was an instance in July of this year where Jim Jordan's committee published a report which tried to make the argument of how Europe's laws were being used to censor the speech of Americans, and it just, like, completely failed. It's very nonsensical if you actually read it, which unfortunately I did. But in the report, there's absolutely no evidence that the European Commission or national regulators pressured companies to take down any speech, not only speech of Americans, but, like, any speech at all. Instead, what the report focused on was a workshop that the European Commission held with platform representatives, with civil society, and with regulators, where they talked about fictitious scenarios and really used, as, like, a thought exercise, how might enforcement look in [00:39:00] this fictitious scenario. What Jordan's committee did is send subpoenas to the technology companies for all information about what was discussed during those meetings, and what they got from at least one of the technology companies were internal readouts that they had sent to their colleagues where they were essentially smearing the reputations of researchers.

So they named researchers and research organizations, and we're talking about some of the most well-respected free speech and free expression organizations, and wrote things like, this organization is extremely pro-censorship, they think that all content, including legal content, should be removed from the internet, when in reality the researchers said no such thing. But, you know, it doesn't matter, because those are the documents that the tech companies handed over to Jordan, which he then published, and which are being used as the basis for attacking researchers on a completely false premise. And so that is the level that we're dealing with of collusion between political actors who have been waging [00:40:00] this campaign for years and the technology companies very willingly stepping into this space and recognizing, hey, now we have an ally that's in a position of political power and we can join forces in order to crush these regulations, and essentially the people who do the important work of providing this kind of information, not only to regulators, but also to the European public, the American public, and who, ironically, are the people who are actually fighting for free expression online. Right? Like, they're trying to push the narrative that these are researchers who are trying to censor speech online, when in reality, like, these are organizations that are trying to get to the bottom of, like, is shadow banning happening?

Who is it happening to? And how can we hold companies accountable when they are doing that shit? Right?

Alix: Yeah. And also it goes to your point earlier that, with the appropriate resources, not in an environment where you're under a political threat, it is possible to do empirical research on this that is useful regardless of your political persuasion or views. That's why, when [00:41:00] you hear these kinds of bizarre claims, it's to prevent this research from being produced. It's not to prevent the weaponization of this research. It's not to prevent platforms, all those poor little platforms, from feeling pressured. None of that is what's happening.

Brandi: The richest companies in the world. Yeah, totally.

Alix: Like, oh, my heart bleeds, you know? But it is to basically disincentivize, undermine, and drown out actual empirical work. Let's leave Jim Jordan aside for my last, my last question, which I feel like is one, I don't know, I feel like we don't get to ask this question enough, but, like, if this was all working well, if there was really good data access, if there were appropriate levels of financial resources, if researchers weren't under political attack constantly for this work, what would it look like if we had a healthy way as a society to know what's going on with these platforms, in a way that was independent of the corporate line?

Brandi: Love that you asked me that. I mean, I think, you know, one [00:42:00] of the most exciting things for me to think about is: what if a person in your community, your mother, your best friend, your colleague, has a legitimate question about the internet, you know, that they're experiencing, that their children are experiencing, that somebody in their church is experiencing, and they can ask a researcher, hey, is this actually happening? Like, is this happening at scale? What might we do about it? Is this happening only to me? Am I isolated? What I see going on in my community and my own life and my child's life seems kind of bad.

And like, can we actually get to the bottom of it? Can we figure it out? And then researchers are actually empowered to answer the questions that really matter to people. Like not people in academia, not people working in professional civil society, but just everyday people that are using technology, like they can have answers that they can trust about the real impacts that [00:43:00] these technologies are having on their lives and in their communities, and also how it might be different.

And so I think that also has to do with, you know, right now people are essentially in a position where, like, they see research saying, oh, okay, social media is, like, bad for children's mental health, and then they see research, like, refuting that. And it's kind of the way that research works, right? But the ability to actually verify or refute claims that are coming from the technology companies, that have a vested interest in the answer being one thing or another, that is what independent researchers can provide.

And if we actually had information like that that we could trust, the choice is up to people to decide. And I mean that from the perspective of, like, a consumer. I use a lot of these technologies myself in my everyday life. I wanna know what the information is so that I can make the best choice for myself, so that people can make the best choices for their families, for example.

And then the same goes for policy makers. Like, in a [00:44:00] democracy, we elect people to make laws for us on the whole, right? And so it's also about having information that can inform those laws. And guess what? If we don't like 'em, if we don't like those laws, then under democracy we can elect new people, and I hope democracy stays for a long time.

But I think that that's the promise, right? It's just about the everyday person's ability to have, like, information that they can trust, that informs their decisions, that informs the decisions that policy makers might make, and ultimately that informs the way in which technology is developed on the whole.

So, like, we've talked a little bit about the ways in which, like, capital screws that up, but there are plenty of examples of, like, startups, of public benefit corporations, for example, that are doing things differently, that rely on independent research in terms of how they build better technologies. And so I think that unbiased, trustworthy information about the societal impacts of technology helps us to ultimately get better technologies. We're not gonna get that if we [00:45:00] only rely on information coming from technology corporations. Like, look where that's gotten us right now. You know? So I think that there is a way out of that, and that independent research is a really important part of that puzzle.

It's not the only part, but it's a very important part, and it's a part that is often discounted and where people have so many barriers working against it that are relatively easy to remove if we want to as a society.

Alix: Love it. And I feel like none of this is slowing down. So the idea that we're gonna, like, not be able to keep building this capacity as a society, it's just not an option.

Um, so yeah. Okay. Well, thank you. This was great. I have respected what you put together with the coalition. It's so exciting to see how much bigger it's gotten, and also how uniquely positioned it is at this moment to basically, of the troops we've got, um, that are doing this work, make sure that they're allowed to keep doing it and not, I don't know, not having to be alone [00:46:00] in going up against these giant platforms.

So thank you for all the work you're doing and for this conversation.

Brandi: Thank you. It was great to be on.

Alix: All right, so next week we're gonna continue on this theme and dig into other kind of vantage points on this challenge of doing research on politically contentious topics, both human rights and also emerging technologies and how they're affecting society. And we're gonna be talking to three people in the next episode.

We're gonna be talking to Megan Price, who is the executive director of an organization I love called HRDAG, the Human Rights Data Analysis Group. Sounds really boring. What they do is actually so interesting. Um, and we're gonna get into how it all started when they used statistical models to show, in a legal setting, that something that happened was a genocide. And then we're gonna be talking to Janet Haven and Charlton McIlwain. Janet is the executive director and Charlton is the board chair for Data and Society, which is one of [00:47:00] the research organizations that generates a lot of independent tech research, and does so on kind of a longer timeline than lots of other civil society organizations may be resourced to.

And we're gonna dig into both of their kind of organizational approaches and work, with this idea being: I've just gotten interested in talking to people who I've known for a long time, but haven't necessarily addressed this issue on the show, of how hard it is to do research that makes a difference in policy conversations.

Actually, what it means to try and do research on an incredibly fast-changing, incredibly resourced sector like the tech sector, which is so influential and has so much power in society. So that's what we're gonna do next week. Um, and I will see you then. Thank you to Georgia Iacovou and Sarah Myles for producing this episode.

And to all of our independent tech researcher friends out there who are doing so much work to help us and also the public understand what's going on: we here at the show really respect the work that you're doing in not-so-great conditions. Um, so keep on keeping [00:48:00] on, and we will do our best to keep platforming voices and research that we think is both useful, but also brave, in our current environment.

And with that, we'll see you next week.

