
Local Laws for Global Technologies w/ Hillary Ronen


Show Notes

What’s it like working as a local representative when you live next door to Silicon Valley?

More like this: Chasing Away Sidewalk Labs w/ Bianca Wylie

When Hillary Ronen was on the Board of Supervisors for San Francisco, she had to make lots of decisions about technology. She felt unprepared. Now she sees local policymakers on the front lines of a battle over resources and governance in an AI era, and is working to upskill them to make better decisions for their constituents. No degree in computer science required.

Further reading & resources:

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

Hosts

Alix Dunn

Release Date

October 17, 2025

Episode Number

E80

Transcript

This is an autogenerated transcript and may contain errors.

Alix: Hey there. Welcome to Computer Says Maybe. I'm your host, Alix Dunn, and in this episode we are gonna go local. We have talked a ton about how local policymakers should or could participate in better decisions related to how technology companies and technologies generally are incorporated into city things like law enforcement or public health or public transit. Even though local policymakers get this kind of name check a lot, we haven't actually sat down with someone who was democratically elected locally to represent a community. And that's what we're gonna do today. And it's super interesting to hear how someone who maybe doesn't come from a tech background is confronted in those types of governance roles with all kinds of decisions that need some type of underlying understanding of the politics and the technologies that we are being bombarded with.

Alix: So with that, let's dig into it with Hillary Ronen.

Hillary: Hi, my name is Hillary Ronen. I recently served two terms on the San Francisco Board of Supervisors, and in the six months I have been out of office, I have been working with the Local Progress Impact Fund and the AI Now Institute to write a report, recently published, called Local Leadership in the Era of Artificial Intelligence and the Tech Oligarchy. It aims to educate local elected officials and policymakers about why and how they can step up right now to protect their constituents from the harms of AI technology, and to make sure they're having the right conversations, ones that are nuanced and important, when deciding whether or not to purchase and adopt this technology in their cities and counties throughout the country.

Alix: How did you decide to run for [00:02:00] office?

Alix: I feel like that's like such a big deal and a big thing to do.

Hillary: It is, and I can't say that I came into it in the most typical way.

Hillary: I was sort of a reluctant politician, which, it's good to have reluctant politicians, because we tend to stay true to public service in a way that people who have ambitions and want to be a politician lose quickly, if they ever had it. But on the other hand, I'm not sure I would fully recommend it, because it's such a grueling job that you sort of need that crazy ambition and quest for power to be able to sustain it.

Hillary: The way that I came to local politics is that I went to law school. I wanted to work on social justice issues, just from my own upbringing and issues I saw growing up. And I worked for about seven years straight out of law school at a local nonprofit serving immigrant workers.

Hillary: [00:03:00] I ended up becoming much more of an organizer than a lawyer, because I worked mostly with undocumented immigrants in the underground economy, and the law does not work in those contexts. Even if we'd bring a case and win the case, actually recovering unpaid wages for a worker, or attaining anything that would resemble justice, was virtually impossible.

Hillary: So I ended up becoming much more of an organizer, and tried to organize low-wage immigrant workers, mostly day laborers and domestic workers, to fight for their rights and to participate in social movements for change. Within that, I ended up working with domestic workers to write a law at the state level in California called the Domestic Worker Bill of Rights, which eventually became state law, and I just kind of fell in love with that process of, oh wait, we can actually write laws so that they work for people, [00:04:00] and we can actually include the people that are gonna be impacted by the laws in the process of writing those laws.

Hillary: I so enjoyed that process that when there was an opening to be a legislative aide for my predecessor on San Francisco's Board of Supervisors, I went for it. (San Francisco is a city and a county, so we have sort of double powers within one body; it's called the San Francisco Board of Supervisors.)

Hillary: I applied to be a legislative aide with him, David Campos, my predecessor, and I got the job. Honestly, that was the best job I've ever had in my life. I loved being a legislative aide much more than being the supervisor myself, because I love the work of organizing: bringing people together, putting together puzzle pieces to create solutions and turn that into law, fighting with the city attorneys when they tell us we can't do something, and finding ways around them telling us no.

Hillary: And that was working on every issue under the sun, because at the [00:05:00] local level you're really at the ground floor. You're looking eye to eye with people that want to live in safe, healthy communities, and they expect you to make that happen. So I loved that process.

Hillary: When my predecessor was termed out of office, the way the lineup of candidates looking to replace him unraveled, we were gonna lose our progressive leanings in the district. I was gonna be out of a job anyway. After a lot of pushing and prodding from a lot of different people, I just decided, well, I know how to do this job, and I don't want our work undone, most importantly the work I'd been engaged in for about seven years.

Hillary: So I ran for office and won the seat, and then ended up serving two terms. I just finished my last term at the very end of December of last year.

Alix: Amazing. I don't follow local San Francisco politics closely, but it feels really entangled [00:06:00] in a lot of the technology discourse that's happening at like the national and global level.

Alix: And it feels a bit like a front line, in a way, of how we're conceiving of what cities should be, what local governments should do, what good looks like in governing. Um, do you wanna talk a little bit about what it was like to be working in a city government where the technology industry is so dominant, and where they're kind of constructing our imagination about what cities should look like?

Hillary: San Francisco local government is a particularly powerful place in every sense of the word. All you have to do is look: Kamala Harris was our district attorney in San Francisco, in our local government. Gavin Newsom, the governor of California, was the mayor, served on the same Board of Supervisors I did, and came from there.

Hillary: Nancy Pelosi, Speaker of the House, came from San Francisco. I mean, we tend to go on to very high places in San Francisco because of the cutthroat, intense nature of local [00:07:00] politics. To run as a supervisor in San Francisco nowadays, you have to raise millions of dollars for a local race. And mind you, San Francisco has a population of less than 800,000 people.

Hillary: So we're tiny. We're a tiny big city. And yet, in order to run a competitive race in the city, you have to raise so much money, because the amount of tech influence and tech money in the city, and directly involved in politics in San Francisco, is extraordinary. And it's gross. It's disgusting, and it's extraordinary, and it gets worse every single cycle.

Hillary: From the very beginning of working as a legislative aide in City Hall, all the way through my two terms as supervisor, we were dealing with the issues that technology and the big tech companies are creating in society, from the get-go. So one of the very first fights that I worked on was trying to rein in [00:08:00] Airbnb's complete upending of the rental market and the hotel industry in San Francisco.

Hillary: We have a very, very powerful hotel workers' union that has pretty much unionized all the hotels in San Francisco: living-wage jobs with retirement benefits and health benefits, and these are for very low-wage, mostly immigrant workers. And all of a sudden their jobs were all under threat because of Airbnb.

Hillary: And so there was a massive effort to limit who could rent their house out to tourists in San Francisco, and Airbnb spent millions of dollars trying to limit our ability to limit their takeover of the housing market. And we kind of landed somewhere in the middle in San Francisco. The vestiges of that sort of power really continue to this day.

Hillary: Airbnb was created in San Francisco, and Joe Gebbia, one of the [00:09:00] founders, is now working in DOGE, right at the federal government. So there is a direct line from what starts here in San Francisco to where it goes, and the links to the federal government. Because of preemption laws at the state level, we were never able to regulate Uber or Lyft.

Hillary: But what we were able to do, by getting special permission from the state government, was to tax them, eventually, through a ballot measure, to mitigate the impact on traffic and on traffic safety. So when you ride an Uber or Lyft or a Waymo in San Francisco, you pay a little tax, and that goes to helping make our public transportation system more robust.

Hillary: So again, always this intersection between the tech industry and really the kind of bread-and-butter issues that local policymakers, local legislators, deal with on a daily basis. Then we had this big fight over the Twitter tax break, which was [00:10:00] another issue that local governments deal with all over the country.

Hillary: But we started right here in San Francisco, giving huge tax breaks and subsidies to billion-dollar corporations to lure them to headquarter in the city. I'm not sure there's a net positive when that happens, but our former mayor, Ed Lee, won that fight and lured Twitter to Mid-Market. It was a revitalization effort, because it was an area that had large vacancies and a lot of social issues on the streets.

Hillary: Twitter came in and didn't pay taxes for many years, despite being valued at billions of dollars. Then, once Musk took over, he left San Francisco, and I think it's pretty widely noted that the Twitter tax break never really brought the prosperity that the mayor had hoped for. Instead, the constant luring of the tech industry to San Francisco has created one of the worst housing crises in the world.

Hillary: We are now one of the most expensive cities in [00:11:00] cost of living. The rents, the housing costs here, are unlivable for teachers, for restaurant workers, for many city workers. So it became quite a disaster. But because, up until recently, we still had a progressive, un-bought-by-corporations majority on the Board of Supervisors,

Hillary: we were the first city to ban facial recognition technologies used by the city. We were the first city to ban rental price-fixing and companies like RealPage using algorithms to drive up rental costs, by getting proprietary information from big landlords and then colluding to use that information to drive up costs in an organized fashion.

Hillary: So up until recently, we were still really able to place limits and controls on how this technology causes harm to people. But I fear that that's changing.

Alix: Yeah, there's so much in there. [00:12:00] You used the word preemption, which I think implies that making a law at the local or state level that conflicts with a federal law can't be done in certain situations, because that has been prevented. Is that how preemption works?

Hillary: So the way that preemption works is that a higher level of government, either the state or the federal government, will say, we occupy the field in regulating this particular area, and lower governments are not allowed to make laws and regulate in this area. So, you know, the example that I gave: the state legislature in California said that local governments cannot regulate TNCs.

Hillary: What are they? The Ubers, the Lyfts, the Waymos. That can only be done by the state Public Utilities Commission. And so that's the preemption that took place there. So even though Ubers and Lyfts and Waymos are creating traffic hazards, and even though workers are being exploited and all [00:13:00] these horrible things are happening in neighborhoods, the Board of Supervisors has no power to regulate or control the industry.

Hillary: I'm sure you're asking me this because right now, as we speak, with the horrible BBB bill, I won't even call it by its name, they are attempting to completely occupy the field and preempt all state and local government regulation of artificial intelligence. And if that passes, it will be a disaster. There are many people that I'm working with that are doing everything in their power to try to prevent that right now.

Alix: So I wanna talk about the report that you've been working on, but first I wanna hear a little bit more. I mean, one, were you involved in procurement of AI systems as part of your role as an elected official? Or was this something that, as you worked in that capacity, you realized was an emerging issue that no one was really navigating with any foresight or proactivity?

Alix: Like, how did you get involved in this?

Hillary: I was not [00:14:00] following these issues at all, Alix, until about maybe a year ago, other than what I was talking about. You know, San Francisco is a hotbed of technology, but AI in particular, and some of the more harmful technologies that exist today that are being employed in our communities, I wasn't paying close attention to that.

Hillary: And the Board of Supervisors had to approve the acquisition of automatic license plate readers, which are being used all over the country and are becoming increasingly problematic. And in many ways we had, at the time, an extremely progressive Board of Supervisors: extremely well-educated, pro-immigrant-rights, you know, a board that definitely tried to rein in the excesses of police and their use of technologies. I mean, really a great Board of Supervisors. And we have, you know, the Electronic Frontier Foundation in our city. We have the ACLU. We have wonderful civil society organizations.

Hillary: Not one person came [00:15:00] out to protest the contract, and it just glided through our budget committee, which I wasn't on at the time, and came to the full board for approval. I voted for it, and it passed without discussion. So I, of course, was not conscious of this at the time. You know, there's a million things going on every single day when you're a local legislator; there are hundreds of things you're voting on every single meeting.

Hillary: So if nobody is making noise about it, if your colleagues who are on the committee, and who usually study the issues more deeply, don't find any problems with it... It was being billed as, oh, you know, it doesn't record faces, it just records the license plate, and it's just gonna be used if there's an active criminal investigation involving that license plate; the police will be alerted and nobody else. So it seemed relatively [00:16:00] innocuous. It just glided through.

Hillary: And when I started, about a year ago, reading and studying all these issues and getting involved, I thought back to that vote, and I just couldn't believe it. You know, I couldn't believe I voted for it, and I couldn't believe several of my colleagues, who are as passionate about social justice issues and protecting community as I am, voted for it and didn't say a word, and that nobody from the outside came to draw attention to the issue.

Alix: I think this is really common, actually, that people who historically have engaged on political issues see technology issues as distinct, and as not actually questions of power. Because it can look like it's about efficiency, or it can look, I don't know, just kind of boring and technocratic, in a way that doesn't feel nearly as threatening as other types of power politics we're used to engaging in.

Alix: Is that your experience, in terms of separating these two things out? What do you think it would look like to awaken policymakers to understanding that some of these technology questions [00:17:00] are the vector of the power politics they're used to historically? Like, what do we do to make that epiphany faster and easier for people?

Hillary: Absolutely, and I'm the perfect example of this, right? I don't think I ever paid much attention to any technology issue unless it was so in my face, related to a social justice issue I was dealing with on the ground. I sort of had this awakening, and we're just watching it happen before us.

Hillary: Musk and Peter Thiel and Marc Andreessen and David Sacks, you know, have these positions, either officially or unofficially, that are really directing so much of what's happening at the federal level. And then you look at who they are and how they imagine society should look in the future, and all the myths that they're creating.

Hillary: I just woke up and said, whoa, this is bigger than just whether or not to [00:18:00] allow automatic license plate readers in our communities. This is about surveillance and the capture of data, and how it's going to be used by a very limited number of very powerful, mostly white men to control and undermine and ultimately destroy our democracy.

Alix: Didn't they try this in San Francisco? I mean, I feel like they tried it.

Hillary: They're trying it right now in Alameda, right? Yeah, right across the way. They are trying to build these network states, where they buy massive plots of land in remote areas that haven't been developed yet and create new cities that are outside the laws of any city, state, or country, where they can roam free, free of any sort of regulation, and innovate to their hearts' content. I mean, this is how they describe it.

Hillary: And they've tried to do that here in California, in the Bay Area, not in San Francisco particularly, but in the Bay Area. They're trying to do it right now across [00:19:00] the bay from us, in Alameda, on a piece of federal land. They're trying to get Donald Trump to declare a state of emergency so that they can basically take over the land and occupy it and create this network state. I mean, this is happening. This is seriously happening.

Hillary: And it's funded by these people, the same people that are hiring all the individuals to work in DOGE, that are sending people from their own companies to work in DOGE, that are capturing federal government data and creating a massive dossier system on every single resident and human being in the United States of America. When you start to look at what's happening and how it's happening, you realize it doesn't matter if I don't know how to program; it doesn't matter if I didn't get a degree in computer science. I understand power, I understand money, and I can read the articles and see the tea leaves on these people's intentions.

Hillary: That's enough for me to engage. [00:20:00] So the first thing I would say is, I think many elected officials don't engage on this issue because they're intimidated by it. I certainly was. I don't understand how technology works or how to program, so this isn't my issue, you know? Other people can work on that and protect us on that. And we have to get past that.

Hillary: We have to say, first of all, that's not necessary. You don't need that technical knowledge in order to understand how to regulate systems to make sure they don't harm human beings, because you can rely on your IT departments to make sure the technical aspects are covered. What you need is to step up as a policymaker and say:

Hillary: you know, I believe in our democracy, and I believe citizens have a right to privacy in our democracy. And therefore we're gonna minimize the surveillance technology that we use in this government, and we're gonna say no to some of these companies that are marketing to our police departments left and right, every single day, to [00:21:00] sell every technology under the sun. Just say no: we continue to value privacy.

Hillary: We continue to value the protection of our data. We don't want our own data used by these billionaire companies to further price-gouge us, surveil us, and control our movements, and in many cases now deport and disappear our coworkers and our neighbors and the parents of our kids' friends in schools, which is happening as well.

Hillary: So again, I think this is a really important moment for local legislators to wake up and say, whoa, let's really understand what's happening here and get beyond the hype of AI technology. You know, my husband loves ChatGPT, for example. He thinks it's so great, and I'm like, yeah, ChatGPT does some really wonderful things.

Hillary: I've used it to plan a vacation, you know, like where should I go visit and what are the best things to see. There [00:22:00] are some limited uses of the technology that are really, really great. But at the expense of what? We are not having those conversations, and we are not doing that analysis. And that's precisely the type of work and analysis that policymakers should be doing.

Alix: Amazing segue to talk about this report, 'cause I feel like it's really nice to hear about your own sort of self-education on these issues, and then realizing that there's basically no, I wouldn't say neutral, 'cause I don't think you're coming at this from a neutral lens, but there's no non-industry method of educating yourself on these issues, or knowing, as a local policymaker, how you should consider incorporating different technologies:

Alix: procuring them, overseeing them, and kind of being clear-eyed about the opportunities they present and the pitfalls they also present. And I feel like it's just really nice to have an actual policymaker make a report like this, particularly starting from zero. If you could travel back in time five years and give yourself this report, how [00:23:00] might it have changed your work?

Alix: So I wanna give you a chance to just share a little bit. Was that the motivation, to kind of make something you wish you had? How did you get involved in this? And maybe say a couple words about what it is and what it was meant to do.

Hillary: Yes, absolutely. So, as I explained, I had my own awakening about the overwhelming power grab that these billionaire tech corporations, and the individuals at the helm of them, have made in our society, and about how much the hype around AI, and the way these systems are being developed, is actually causing real harm to us today in our communities.

Hillary: I started reaching out to different people that work in this area, and there was no one really focused at the local level. There is tons of work, and increasingly the labor movement has been getting involved, trying to pass regulation and laws at the state level, where [00:24:00] it's sorely needed and where it should happen. I mean, ideally we would have a functional federal government that actually cared about the people of this country, as opposed to a handful of billionaires, and they would be doing it at the federal level.

Alix: What an idea.

Hillary: What an idea. Given that... well, we should say, under the Biden administration, even though there wasn't any direct regulation or laws on AI, there was incredible enforcement work being done, especially by Lina Khan and the FTC, and in other regulatory agencies. So it was happening; it's all but completely stopped now. And so there are tons of activists and advocates and groups and policy organizations and the labor movement working at the state level to fill in the gap, the void where the federal government is failing to protect communities from the dangers of this technology.

Hillary: But at the local level there really wasn't anything, and there wasn't much advocacy. When I was a supervisor, I was a member of an [00:25:00] organization called Local Progress, which is a wonderful national organization of local elected officials. It's a membership-based organization that helps create connections and spread good ideas among self-proclaimed progressive elected officials at the local level who are trying to solve some of the hardest issues of our time: homelessness, mental illness, police abuse, protection of immigrant communities, et cetera.

Hillary: And so I approached that organization and I said, you guys aren't doing anything around technology and AI, and this is increasingly becoming an extremely important part of everything we do. It crosses every issue area. It will either help, if we steer it in the right direction, or exacerbate, which is what's happening right now, every social issue we care about and work on. Would you partner with me, and can I start off and write a report trying to convince my peers, my local elected peers around the country: [00:26:00] I wasn't paying attention either; now I am, and you need to be as well. And so they said, yes, let's do it.

Hillary: And they supported me in writing this report. So first of all, you know, our powers are limited at the local level in terms of what we can do. So I had to figure out, well, really, where does our power lie? How can we be part of this growing movement of local legislators, state legislators, civil society, unions, and community groups that are gonna work together and start to be a countervailing force against the overwhelming power of, you know, the dozens of billion-dollar tech companies that are increasingly shaping our future?

Hillary: There are a few ways. So let me start with the first one, which is the government itself being a user and a purchaser and a promoter of this technology.

Hillary: And here's what I would not have known a year ago, when that automatic license plate [00:27:00] reader contract came before me: I created a list of questions that I wish I had known to ask at the time. It starts with this first question: what is the problem you're trying to solve, and why is AI the best way to solve it?

Hillary: So you're asking these questions of the department head that's seeking to purchase this technology. And if they can't clearly answer, then that is your first sign that this is more hype-driven than a tailored response to a problem that perhaps other strategies have not been able to solve.

Hillary: Right? And if we could just get policymakers around the country to ask those two questions, and be serious and engage deeply about it, that would be a tremendous help. Because again, there's just so much hype around AI, and the technology, quite frankly, isn't ready for prime time. But because the creators of the technology have [00:28:00] raised millions, if not billions, of dollars in venture capital, they have to show results. They have to start turning a profit on the technology eventually, otherwise it'll all explode in their faces. And so they're just selling this technology like crazy to governments, right?

Alix: Because, partly, I think government procurers are gullible consumers. And also, once you sign a contract with a vendor, the relationship between a government and a vendor is so different than between an individual consumer and a company,

Alix: because it means, basically, if you get locked in, you gotta keep buying it, and also you have much bigger budgets, and there's a desire... I mean, before we move on, I feel like this hype point is really interesting, 'cause again and again I'm watching people that have formal roles in government being taken in by it.

Alix: It makes me want to ask: can you describe a little bit what the incentive structure is around a local policymaker who is considering these technologies? Why might they rush to purchase these [00:29:00] solutions, even if ultimately it's gonna be embarrassing?

Hillary: So, you know, our political system in the United States is broken.

Hillary: Let's just start there.

Alix: It's a wonderful disclaimer. We should make it more often. Yeah.

Hillary: Um, there were already huge problems, but the Supreme Court case Citizens United, which basically opened the door to unlimited money in politics, really just turned the knife, right? It just broke American politics.

Hillary: This is happening all over the place; you mentioned it in your report on data centers that you recently put out. These companies have so much money. First, for the individual politician: they are giving millions of dollars to politicians that they think will do their bidding, that won't challenge them or challenge what they want, and will rapidly believe all their hype and buy their technology.

Hillary: So those people are getting into office more and more. So that's one issue and one problem. Another issue and problem is that we have years of not investing in the social safety net. [00:30:00] Right? Going way back to the Reagan years, the Republican Party in particular has been trying to shrink the role of government in society.

Hillary: Government is supposed to do the things in society that capitalism fails at, right? Which is take care of people that aren't making it, have a social safety net, and make sure that everyone has their basic needs met so that they can live with dignity and be productive members of society. Over the last 60 years or so, we have been shrinking away from that duty as a government, and we've been giving tax cuts to the rich and shrinking the social safety net for the poor.

Hillary: I mean, we're doing it right now. Congress is trying to do that right now with the big bad bill they're trying to pass, which would shrink Medicaid to such an extent that millions of people would lose healthcare coverage.

Hillary: It's constantly happening. It's been happening at the federal level, at the state level, at the [00:31:00] local level, and so governments are increasingly starving for funds. We have to do something about the fact that people are dying in our streets, especially at the local government level, right? People come to me and scream at me when people are pitching tents in front of their homes, in neighborhoods, in our communities. We can't ignore that. We have to fix that.

Hillary: But how do you fix that? You can't disappear human beings. So we have to actually have a place for them to go. Well, building shelters, building affordable housing, that is all extremely expensive, and we need revenue to do it. And instead of investing our tax dollars in, you know, sort of the overall social good, at every level we've been starving communities.

Hillary: So what does that mean? It means that local lawmakers are the ones on the ground failing the residents of their city on some of these issues. Oftentimes in San Francisco, you know, we've been the ridicule of the world on this.

Hillary: You see [00:32:00] the signs of poverty now all over the streets, where people are sick, dying, addicted to fentanyl, and homeless because we don't have a robust enough social safety net to take care of people. And so politicians have to look like they have a solution to this, right? Because otherwise they're afraid they're not gonna get reelected, or they're not gonna go on to higher office.

Hillary: You know, each term is four years. We have a two-term limit in San Francisco. So they have very little time to show that they're effective, or to show that they have ideas to fix things, to get reelected, and for many of them who wanna go on to higher office, to get elected to higher office. So what are the ideas?

Hillary: What do you do when, you know, we have to balance our budget at a local level? In San Francisco, we have a $14 billion budget. We have a massive budget, but we have to balance it. We can't deficit spend. Same at the state level. It's only the feds that get to, you know, increase our national debt and print money and spend, whether they have it or not.[00:33:00]

Hillary: They have just refused to give more money to help the social safety net, so we don't have a real root-cause solution to this problem. So we have to pretend to people that we have a way of fixing it. And what's everyone's favorite way to pretend? Efficiency. If we were just more efficient, if we just stopped wasting money, then we could solve these problems, right?

Hillary: And AI gives a perfect excuse and way for them to do this, right? This will help us figure out where the waste is. And I'm not saying there's no waste; I'm just saying it's an overblown excuse and a way for people to pretend that they have solutions to problems that they don't have. Because the solution is simple.

Hillary: Instead of wasting money on tax cuts for billionaires and nonstop wars in foreign countries, we should be spending on the basic needs of the residents of the United States: [00:34:00] quality housing, quality education, nutritious foods, and healthy communities. Right. And it's a fraction of the cost of all that other stuff.

Hillary: Right. And healthcare, of course, quality healthcare. If we were able to provide these things, then the people in office who were making them happen would be incredibly popular, right? But we're failing. And unfortunately, at the local level, we just do not have enough money to fix these problems for the country.

Hillary: And so politicians look for, like, new, innovative excuses and ideas to say: oh, I have the idea for how we're gonna fix this. And AI becomes a really good excuse or plan, and if you say anything negative about it, then you're anti-progress. That's what comes into the picture.

Alix: Well said. No, seriously, I think a lot about the desire to perform governance, and that technology gives you this, um, I don't know, this like [00:35:00] vaporware thing. Even though what it actually is, is an additional misappropriation of resources to an unrealistic set of solutions, and a complete denial of the structural issues that you're actually trying to confront, and an attempt, on a sometimes very short term in office or short budget cycle, to be seen to be doing something regardless of whether that's actually a good idea, and to just kind of draft on these vibes of progress and utopia.

Alix: And the tough-talking efficiency person who's gonna come in and be the reasonable manager of everything, even though most of the time that doesn't work and shows that you have no idea what the actual problems are or the appropriate solutions. But it's really well said.

Alix: And I think focusing first on, like, what are the actual problems and the structural issues, really getting under the hood of those issues, and recognizing that there is no simple solution; if there was, someone else would've done it already. So I'm curious how you engage with those politics, about [00:36:00] the kind of bad-faith use of technology to obscure, versus the good-faith policymakers out there who are actually earnestly trying to figure out how to procure technology that makes things better.

Hillary: Yeah, and there are those people, you know, and there are some ways that technology can be really helpful, and as policymakers we need to be able to tell the difference. In the first place, when we're procuring technology, there are a couple things; we need to make sure it's safe, right?

Hillary: There's all of these different ways that we have to dig in deep and really question departments that seek to procure this technology. And I almost think that cities should have an overarching policy of sort of hyper-vigilance when it comes to AI products. They should have special hearings and request special reports from departments that really answer all of these questions.

Hillary: And many vendors won't even [00:37:00] share the information with cities. And if that's the case, then maybe we shouldn't be working with those vendors, right? Like, if it's a black-box system that we're never gonna understand, then perhaps we shouldn't be purchasing it. Also, we need to figure out, like you mentioned this, Alix, what happens down the line: are we gonna get hooked on this technology and dependent upon this corporation to the point where, if they end up going out of business or decide to stop supporting the product, then we're screwed?

Alix: Or just charge you more. Or, you know, use that leverage they have in a myriad of ways that I'm sure they're mindful they can.

Hillary: Exactly. Exactly. And then there's this issue of job replacement. Sadly, the big tech companies right now are all, you know, in a race to produce artificial general intelligence or superintelligence, whatever they call it: basically a one-size-fits-all product that is smarter than any human being that ever lived and can do everything we do, only 10 times [00:38:00] better than us, or a thousand times better than us.

Hillary: And that's what they're all pursuing, even though so many of them acknowledge that if they were able to reach that (and I'm very skeptical that they will be), the impact it would have on society would be devastating: millions of people would lose their jobs, upending our entire social order.

Hillary: They have not figured out how to do this in a way that doesn't advance climate change, perhaps the biggest issue of our time. And they're doing it to replace most human beings' jobs and upend society. Like, what the heck? This makes no sense. Why are we doing this, so that, like, 10,000 white billionaires can get even richer, can become trillionaires?

Hillary: Like, how is this good for us? Instead, let's figure out how to use these breakthroughs in technology to build very narrow interventions that start solving our biggest social problems. That is not the discourse that is happening in society. Instead, if you bring this up, you know you're labeled [00:39:00] anti-technology, and money is thrown to your opponents in the next race for your seat.

Hillary: We need to start having this conversation in local legislatures all across the country. We need to start asking these questions. We need to start thinking critically, not saying no to technology, but asking: well, what is its purpose, and does that help us or harm us, and what does that mean for these other issues? How does it intersect with other issues we really care about? And have the much more nuanced conversation.

Alix: Yeah, I love it. I mean, yes, 100%. And I think what I also like is hearing you kind of awakening a sense of confidence in the appropriate role that a democratically elected official should have vis-a-vis technology.

Alix: And it's not to become a technical wonk; it's to really step into the role that the public needs you to play in stewarding resources and dealing with actual problems. And this kind of, I don't know, stiff-arming, uh, that technology people have done, of basically saying you don't [00:40:00] get it, like you can't be a part of the actual decision-making, that's bullshit and it has to stop.

Alix: And I'm really glad to see that you're awakening in that way, and that there's the possibility of others doing so similarly. And I just wanna share a little bit of the report, 'cause I feel like even just the structure got me thinking a lot more specifically about how local policymakers should be thinking about these issues.

Alix: So you have a section of the report, which we'll link to in the show notes, obviously, that kind of takes policymakers through buckets of thinking they should be doing, with some normative suggestions about what you think good looks like in those sections.

Alix: I love this idea of an inventory of AI management. So, like, thinking: if you're responsible for the budget of a city, you should know when you have procured what technologies, and what is the inventory of AI that the city uses, so that you can put processes in place in a specific and kind of granular way.

Alix: And the only way you could do that is to start by knowing what you've procured. Have you seen this done in practice, [00:41:00] and do you wanna describe a little bit more about what you think it could look like?

Hillary: Sure. Yes. It was the last piece of legislation that I passed as a San Francisco supervisor. It was creating this inventory and it goes into effect next month.

Hillary: So San Francisco, starting next month, should, on its main data webpage, have an inventory of every AI product that the city and county uses, and not just the product, but quite a bit of information about that product, so that the public is aware and can evaluate and can chime in on whether or not that makes sense.

Hillary: In addition to just sort of the basic information about the product: how it was developed, how it was trained, how it keeps, stores, and uses data that's inputted into it, what kind of testing was done around bias and discrimination, the results of those tests, how the product mitigated those issues, et cetera.

Hillary: But then there are some questions in there. So most of those questions go to the vendor itself to answer, about how the product was developed and created and tested. [00:42:00] But then there's a list of questions for the department that's going to use that product, including: what are some potential impacts that use of this product could have on the people who might interact with it or be impacted by it? To get them to think about those impacts.

Hillary: What impact is this product gonna have on public-sector jobs and the people using it in city government? Does it change their work conditions? Does it replace their job altogether? So that we can start to understand the impact on our city workforce. So it's an inventory, so that we know all of the products that the city and county are using, including information about the vendor that's selling the product, but also about the impact of using that product on the residents of San Francisco and on the city workforce.

Alix: Yeah, that makes loads of sense. Um, and I feel like it also just seems so basic to me. I'm really sad that this isn't already in place, but, um, I think it's a really good way of [00:43:00] spelling that out. So then you have this inventory; you know what you own, control, or are responsible for as a city. And then it comes to: okay, so a company comes to us and says, hey, there's this new cool thing that's gonna transform elementary school education in your state. Here, buy it from us. How does one go about deciding if one should buy that technology?

Alix: And I liked this whole section on pre-purchase review, and on conceptualizing procurement as both a detailed process that is important and a lever of change in terms of the expectations you have towards companies. But do you wanna describe a little bit about what that procurement review would look like in your ideal world?

Hillary: Sure. And the report lays out a list of questions that lawmakers should be asking any department that seeks to use the product. It asks a lot of similar questions to what I talked about in the inventory, but it asks even more questions about effectiveness and oversight: whether there have been [00:44:00] independent evaluations of the system, how the effectiveness of the system will be measured over time, whether or not the AI system could be shut down or modified if it's found to be harmful or ineffective, how often it will be reviewed, the total cost, and this question of what happens if the vendor stops supporting the AI or the contract ends. So it suggests a list of 20 questions that get into the nitty-gritty of sort of the major issues with using this AI.

Hillary: But in general, I'm wondering if it makes sense for cities, given the problems with this technology that we've seen all over, given the potential that it has to replace so many jobs in the future, that there should just be overall heightened scrutiny at the city level when there's consideration to procure AI products. And it's not just the impact on jobs; it's also the impact on privacy and data. Having policies of data [00:45:00] minimization, like: do we really need to collect this data on our residents? Let's only collect what's absolutely necessary to perform the job, and then let's protect that data at all costs from getting to third parties.

Alix: Yeah, also, federal privacy law would be really great, wouldn't it? Oh my goodness. Wouldn't that be lovely?

Hillary: Oh my goodness.

Alix: Or at least...

Hillary: At least protecting the state ones we have here in California, yeah, our privacy. It's true, the CCPA ain't bad.

Hillary: You're getting...

Alix: They're getting gutted.

Hillary: Yes, it's true.

Alix: Um, okay. And then I also really liked, I mean, the two pieces you talk a little bit about in how to consider buying technology too: public consultation, and engaging workers in the process of understanding the implications of procuring this technology. Do you wanna talk a little bit about what good would look like, in your head, in those two areas?

Hillary: Absolutely, yes. I think the workers who are gonna use the technology should be intimately involved in, A, deciding whether to procure it in the first place, and then, B, design. Let's say we decide, yes, AI could [00:46:00] really help us perform a particular task better.

Hillary: Let's have the workers who are currently performing those tasks intimately involved in every part of the creation of the technology, the implementation of the technology, the decision to use the technology, the, uh, evaluation of its use, evaluating the impact on people. We wanna design this technology to augment and make workers more effective and better at their jobs, but I don't think we should have a goal of replacing jobs with this technology. One way of really ensuring that that's the end result, if we decide to use it, is having the workers themselves involved in every stage. And there are many ways that local elected officials can help facilitate making that happen.

Hillary: The main roles of public officials: you have the power to write laws, then you have the power of inquiry, and you have the power of the purse. Those are the three powers. So with the power of inquiry, you [00:47:00] can hold hearings in any legislative body on whatever you want to and bring attention to the issues. Require departments to report on those issues, to bring data. You can have hearings on how workers are gonna be involved in this decision-making of using technology in the city going forward. You can direct your human resources manager to bargain with unions to create language in collective bargaining agreements to involve workers in the process of bringing and integrating new technologies into the workplace.

Hillary: So there are several ways that local elected officials can, you know, I say, democratize the process of procurement: bring civil society and bring workers in to be part of this conversation. And then we'll have a much more nuanced conversation, and ultimately we'll get products that will actually help us, without causing the harms that so many of these products are causing in cities throughout the country today.

Alix: [00:48:00] Totally. You've brought up money a bunch, which I think is really important, 'cause I think sometimes we talk about technology power in terms of the political power it gives, but the fact that it's so connected to money feels really important to keep bringing up. And I feel like one of the things I also really like about the report is you get into taxes.

Alix: We just did this report on how data center development has been happening in different countries, and one takeaway seems to be that when tech companies approach a city, there's oftentimes this deep excitement about those companies coming into their backyard. Obviously, physical infrastructure is different from what we're talking about, although you do get into it in the report. And cities are rushed to subsidize this stuff, like actually spend money modernizing the grid for the purposes of serving a spike in the electrical needs of a data center, in that example. Or just giving them tax breaks for really, really long periods of time, or not requiring things that they would require of any other business asking to zone a piece of land in the same way. That power of [00:49:00] subsidy, it feels like it's changing a little bit, but do you wanna talk a little bit about what you wanna see from local policymakers when they consider tax breaks, benefits, subsidies, when considering partnership with these kinds of companies?

Hillary: Absolutely. I mean, just study the record; study what's happened in other cities and other places. Because when there have been these subsidies to lure, let's say, a specific company in, those have rarely brought widespread general economic development that benefits the city.

Hillary: This has happened so much in the United States that you can actually see the results over time. Like, did they get what they were hoping to get out of it? I mentioned the Twitter tax break in San Francisco. Mid-Market just looks the same to me as it's always looked. Twitter's gone now, even though they didn't pay taxes for years like they should have.

Hillary: Meanwhile, that brought in a wave of high-paid workforce that pushed out an entire population in San Francisco and drove up housing [00:50:00] costs. So is this truly a net benefit to communities, when you use tax breaks and subsidies to lure companies in? I think the evidence does not show a positive result there in general, and that is times 100 when we're talking about data centers, which are a major way right now that cities have given away hundreds of millions of dollars to major corporations to build data centers in their communities. And the impacts that those data centers have had on communities have been, in many cases, devastating.

Hillary: Instead, let's start experimenting with how we can tax these companies to mitigate the impact that some of their products are having on our communities. Taxation is never simple, especially at the local level; you know, there are limited abilities to tax. One very interesting tax that [00:51:00] Chicago was the first to do is called the cloud tax. They just increased that tax to 11% and are bringing in over $880 million a year.

Hillary: So, an annual tax bringing in that amount of money by taxing products that people purchase on the cloud. The slight problem with that is that it's not necessarily progressive taxation. You know, regular people purchase Netflix and are paying a tax 'cause they're downloading that from the cloud.

Hillary: That's not the ideal tax. You really wanna get taxes that mostly impact the corporations that are creating the harm in society, or that are creating the social impact that you're trying to mitigate. In San Francisco, a long time ago, we started talking about a local robot tax. So if this new automation is truly gonna displace so many jobs, and that's gonna transform the social order in ways that we can't even begin to imagine, let's start thinking about how we would mitigate that.

Hillary: People constantly say, oh, we'll have a [00:52:00] universal basic income. Well, how are we gonna pay for that universal basic income? If you look at the record of these big corporations that are building the AI: a few years ago, they were raising their hands, asking to be regulated, and then when push came to shove and legislators started regulating them, they were the first to come out and oppose that regulation and threaten legislators by funding their opponents if they even dared take on the industry.

Hillary: These are the same corporations that are pushing for federal preemption of local and state laws around AI. This is a double-edged area of work. On the one hand, let's rethink and be very skeptical about offering subsidies to billion-dollar corporations. And then on the back end, let's think about how we can actually tax them, to make up for the harm they're causing to individuals and to our social safety net. And you know, given that they're about to get a massive tax break at the federal [00:53:00] level, now would be a great time to tax 'em locally so that we can protect people from the harms of their products.

Alix: Totally. And I think it's, you know, the classic windfall example, where if we don't design systems where the people extracting from them are compensating to sort of maintain those systems, it all collapses, and that feels really obvious at this point. I feel like your example with Airbnb is a really good one, or with Uber and Waymo: having a small tax that actually contributes to building the infrastructure necessary to deal with the explosion of cars, or other things that a city government would need to deal with for systems like Uber and Waymo.

Alix: So it feels very sensible to me, and I'm glad that you're structuring a way for local policymakers to think about it. Cool. Okay. Well, that was all from the report I wanted to talk about. Um, I'm wondering, how do you want policymakers to use this? What do you want people to do with this? Should we be sharing it with local policymakers? Like, how should we be supporting the report and making it more useful, so you can have the impact you're [00:54:00] describing here?

Hillary: I hope first and foremost that it's read by local policymakers, and that it disrupts this myth that we aren't responsible for, or capable of, engaging with and regulating this industry.

Hillary: That is first and foremost: I hope it breaks down that myth. And I also hope it shows that the potential harms of AI are not in the future. It's not that when AGI is developed and everyone becomes unemployed, that's when we'll have to deal with the problem. There are real harms happening today. You know, in many instances there are AI products in widespread use in cities throughout the country that don't work, that we're spending, you know, millions of dollars on, and they waste our resources and waste the limited time of police, for example.

Hillary: In public safety, there are so many, like ShotSpotter and resource router, these predictive technologies that don't work. And yet we're spending so much time and energy chasing [00:55:00] down, you know, reports from one of them that don't result in an actual crime.

Hillary: I mean, that's wasting our limited resources, the opposite of efficiency. So those are problems. You know, we have to start thinking about the unbelievable surveillance, in every aspect of society, that these products are creating; that workers all over the country are getting injured at rates, you know, three times those of the past, because of the unbelievable constant productivity management and surveillance that's happening.

Hillary: That's a social harm. We should be very wary of using AI to distribute public benefits; there has been disaster after disaster on this front all throughout the country. So the report explains some of those examples, to make clear that we can't just ignore the technology today and wait until things potentially get worse.

Hillary: There are problems happening right now that we need to pay attention to. And then, when communities come to policymakers [00:56:00] and say, this is a problem, we need you to step up and do something about it, I'm hoping this report primes those policymakers to listen to community, and not to ignore them, and not to think that this is a secondary issue.

Hillary: Local policymakers are so busy today, especially under the Trump administration, dealing with the life-or-death issues at their doorstep every single day. So it's easy to put off this issue as a secondary issue, not something I need to worry about today. I fear that I did that, you know, when I was a local policymaker, and I hope...

Alix: It's okay. You're making up for it. You're making up for it.

Hillary: I'm trying, I'm trying to make amends. I'm hoping that, uh, policymakers read this report, so they'll be primed to listen to community and listen to activists when they knock on their door and say: I want you to lead on this issue. I want you to stop the use of this technology. I want you to say no to this contract. I want you to say no to this data center. I want you to protect our limited water [00:57:00] source. All of those things. I hope this report helps policymakers pay closer attention and take those concerns seriously.

Alix: Yeah. Hear, hear. Well, thank you. We will share the report with all the people we know.

Alix: Um, I feel like you probably know more local policymakers. Um, but thank you for writing it, and thank you for coming on and talking more about it, 'cause I think it's a really great piece of work, and I'm hopeful that it will change how all this stuff happens at the local level, and at least give us a running head start on some of these really important questions.

Hillary: Thank you so much.

Alix: All right, next up is a very different episode. Um, we're gonna be talking to one of the co-authors of a paper that came out a few months ago now, basically asking the question: are chatbots good at therapy? And taking a step back and saying, what are we trying to achieve when we enter into a therapeutic relationship?

Alix: So, um, stay tuned for next week with Stevie Chancellor. Just as a quick plug: a couple weeks back, I sat down with Sarah West from AI Now, and we dug into the circular [00:58:00] transactions that are building this appearance, within the AI sector, of all kinds of economic activity that might not be real. So if you missed that episode, do jump back and listen to it.

Alix: Um, we've actually been getting a lot of people reaching out and saying how helpful it was for understanding some of what's going on, kind of piercing the veil of all of this sort of sound and fury signifying nothing, when every week there's an OpenAI announcement, you know, a hundred billion dollars here, $10 billion there, et cetera.

Alix: So if you haven't listened to that, give it a listen. And aside from that, we will see you next week. Thank you as ever to Georgia Iacovou, who makes the rambling conversations I have with guests somehow sound coherent; thank you for that, both I and the guests appreciate it. And to Sarah Myles, who does all the audio engineering work that makes all this sound so nice.

Alix: Um, so thanks to both of them and we will see you next week.

