Computer Says Kill: How a Calculator Company Reshaped Modern Warfare w/ Jeff Stern

Show Notes
Precision weapons are meant to make warfare more exact. But what happens when the executive branch uses precision as an excuse to make more war and target with less and less accountability for accuracy?
More like this: Computer Says Kill: Collapsing the Chain w/ Matt Mahmoudi
In part two of Computer Says Kill, Jeff Stern shares how a calculator company transformed modern warfare by making more precise weapons. After the Second World War, the US military wanted to be able to wage more war and target with more accuracy. At first it was about saving American troops. Over time it became a permission structure for more executive control over lethal force.
What does this history tell us about the role of precision and accountability in war?
Further reading & resources:
- Get The Warhead by Jeff Stern now
- More on Weldon Word and the development of precise weaponry during the Vietnam War
- Operation Desert Storm: 25 years on — CNN 2019
- Right to strike when your boss sells AI to the military? — Cori Crider, The Register Lecture, 2019
**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**
Computer Says Maybe is produced by Georgia Iacovou, Kushal Dev, Marion Wellington, Sarah Myles, Van Newman, and Zoe Trout
Transcript
Alix: [00:00:00] Hi, I'm Alix Dunn, and this is part two of Computer Says Kill, a series tracing the people, decisions, and systems that have, I would argue, recklessly ushered AI into the business of war. And this week, we are talking with Jeff Stern, whose recent book, The Warhead, I think offers a striking account of how military decision-making and the spectacle of war was slowly transformed by the development of so-called precision weapons.
And really, this is a story about how a calculator company reshaped twenty-first century warfare. So Jeff walks us through how the approach to military weaponry shifted from being broadly, almost randomly destructive to a much more precise and in many ways, even more destructive set of technologies and practices.
And it's a story that spans several U.S. presidencies, and it shows along the way this ever-growing concentration of control in the executive and an expansion of scale of violence that has huge implications for our world today. [00:01:00] Jeff is gonna start us off with the story of Dragon's Jaw Bridge, which I didn't know about until reading his book, but it was a major target during the Vietnam War that the U.S.
repeatedly tried and failed to destroy. And that failure helped spark an absolute obsession within the military with precision. Um, and it also ended up transforming U.S. military might.
Jeff: Dragon's Jaw Bridge was a suspension bridge that connected North and South Vietnam. It was the main route that the Viet Cong used to transport materiel from north to south. So it was an important target for the Americans, for the American Air Force. It was a very hard target to hit. It's a narrow span bridge.
It was very well-defended. And it became not just sort of tactically and [00:02:00] strategically important, but almost symbolically, it became this sort of mythical thing. Over years, pilots tried to attack it, and it was hard to hit because it was so relatively small, narrow of a target. The order of the day was basically dive bombing.
You'd have to dive, go into a very steep dive, and get very close to the bridge in order to have any chance of hitting it, and that was very dangerous because the lower you got, of course, the more in range you were of all sorts of anti-aircraft artillery. And you ended up in this situation where if you pulled up too soon and too fast, pulling high Gs, you could, like, strip the wings off a plane.
If you didn't pull up fast enough, you'd, you know, go right into the water. So it was really dangerous, and a lot of pilots got shot down, relatively a lot of pilots. So for this whole constellation of reasons, the Dragon's Jaw was this really dangerous place for pilots who were continually getting shot down and missing the bridge.
And in the U.S. at Eglin and all these different air bases and places where we were designing weapons, we just kept coming back to, "How can we get a weapon to steer itself into the bridge?"
Alix: Yeah, which wasn't possible to do at the time. The book does a really good job of showing this [00:03:00] problem that basically any attempt to, um, attack from the air was just extremely imprecise.
So you're dropping bombs and sort of crossing fingers, which meant you were essentially trying to create as much damage as you could, not necessarily trying to specifically target things, and you could only do things that could accept a huge error rate in terms of, like, where the thing was dropped.
Jeff: Right. That or trying to get very close to the bridge, which was dangerous.
Alix: Right. Okay.
Jeff: Or there were, you know, there were other attempts at guided, self-guided weapons, just none of them really worked that well.
Alix: So that's a theater of war that I think people might say, "Well, the US just shouldn't have been there."
And you also tell this amazing story about John F. Kennedy's older brother, Joe, which I didn't know he had an older brother, um, who is basically sent on this, not necessarily a suicide mission explicitly, but as dangerous as you can get. Do you wanna say a little bit about what he was sent to do?
Jeff: Yeah. So there were
The Nazi vengeance weapons are relatively well-known, V1 and V2. What I think is less well-known is V3, which was this super gun that they were building in a [00:04:00] bunker in occupied France. The intelligence was that it would be able to fire these, I think, like, seven-meter long bullets at seven times the speed of sound or something.
And the fear was that if this thing got up and running, it would make the Blitz look tame. It would just sort of, like, London would be wiped off the map. But it was in a bunker, and they couldn't figure out a way to get enough firepower close enough while getting pilots back home safely or, like, a plane back home safely.
One of the ideas was what if we remove "get home safely" from the equation? They developed this plan to repurpose war-weary bombers, strip all unnecessary equipment out, pack it with something like 30,000 pounds of Torpex, which was, like, torpedo explosive, and rig it with, like, robot hands and levers and television cameras, and basically fly it remotely.
And they figured out how to do that, but taking off and landing a four-engine big bomber was still too complicated a maneuver to do with robot arms. So the best solution was to put a pilot and a co-pilot in it [00:05:00] to do the takeoff, establish the connection with the mother plane that was gonna steer it, and then parachute out, which meant parachuting out at, like, 200 miles per hour or something like that, which is pretty extreme.
But they did some tests and figured this was the best way to do it. Joe went up in one of the first actual missions to attack the bunker. I say Joe like I know him personally. But he ... It, it seems like what happened was there was insufficient shielding all around the electronics where the camera was transmitting, and just because of the improvised wiring, that happened to be close to the detonator, so essentially the plane detonated, whatever it was, 30,000 pounds of explosives in the sky over England before they even got in range, and Joe died.
I try not to be too editorial, but if there's an argument, one is that the idea of sort of bloodless, pull the trigger, disappear, our target is eliminated, and we're done, is a little bit of a, a fallacy. And we've seen this especially in, like, modern wars since the 2000s, where these presumptively, like, very precise, [00:06:00] very clinical surgical air campaigns turn into these protracted land-based wars where we have to send people.
So the idea of sort of removing the person from the loop is appealing and still is not quite reality.
Alix: Yeah. Mm-hmm. Even if you remove the people attacking, you can't remove the people that are also being affected by the strikes. Reading the book, especially the first part of the book, it was very just eye-opening about how blind a lot of these military folks were in terms of what they were doing, and it was kind of terrifying.
The more I thought about it, the more I was, like, thinking about how many casualties happened during theaters of war in the last 100 years that I, like, would never have thought about happening because basically they had no technical competency. Like, they couldn't be precise at all. Like, I'd always thought about, like, precision as this pursuit, but not the absence of precision.
And when you start thinking about these stories where there is, like, zero ability to do this, it's really terrifying. I'm very glad we don't live in that world [00:07:00] anymore, even though I'm also not very happy with what's going on now.
Jeff: Well, yeah. It's a bit of a double-edged sword because we still have the illusion of precision.
I mean, one of the facts I trot out probably too frequently is that by some estimates, within six months or a year of the intervention in Iraq in 2003, the second Gulf War, more civilians had been killed than in Hiroshima and Nagasaki combined, and the Iraq War was called the most precise air war ever.
And by proportion of precision munitions used, it's, it's a valid assertion, and yet we precisely killed, by some estimates, like, millions of civilians.
Alix: And does it make a difference if you did it on purpose or did it nominally to be doing something else-
Jeff: Well, right ... but like,
Alix: yeah.
Jeff: I mean, there's an argument that without that technology, you know, the firebombing of Dresden, no one was like, "We're gonna spare civilians."
It was like, "Well, we have to do this because this is the only way." I guess there's an argument that if you don't have that ability or, you know, illusory ability, then you make decisions differently. You know, are we gonna topple Saddam if we know that we are gonna [00:08:00] kill millions of civilians? Maybe the calculus is a little different.
Alix: Also, I think there seems to be a consistent incapacity to learn or to be held accountable for when decisions are made that are bad, which I think increases the chance of the bad decisions being made because there's an understanding that you won't be held accountable for a miscalculation or an intentional-
Jeff: Or we learn the, the wrong lesson because of Vietnam and because of the idea that, uh, you know, incrementalism, by, you know, these rules of engagement where we're gonna kinda go light 'cause if we bomb too much and we kill a few Russian advisors, it's gonna escalate.
There's an argument that that prolonged the war. And so then we get to Desert Storm, the war planners are very conscious of that and wanted to not be sparing because they thought it would be worse for everyone, including civilians. One of the sequelae is that after a lot of Saddam's military intelligence apparatus and headquarters were destroyed, they started going after secondary locations where they thought that the Mukhabarat or something would relocate and ended up hitting bunkers full of civilians.
And there's almost a [00:09:00] direct line between the lesson learned that led to bombing all those civilians because it could have been a Mukhabarat headquarters from Vietnam and the idea that, you know, incrementalism allowed the, the war to go on longer and more people to suffer.
Alix: Okay. Well, I wanna ... I don't want us to deprive people of knowing about Weldon Word, who is this kind of, I don't know, very determined, visionary, and, like, I don't know, single-minded almost guy who kind of upends everything.
So post Dragon's Jaw, post what happened with Joe Kennedy, the military's like, "We gotta be able to send stuff with some precision. How would we, you know, tell bombs where to go," essentially. And then Weldon Word enters the equation. Do you wanna say a little bit about him as a character?
Jeff: Yeah. So Weldon is sort of the godfather of this particular bomb, Paveway, and as such is also sort of the godfather of precision strike.
And he worked for Texas Instruments, and he was brought down to Eglin Air Force Base in the [00:10:00] mid-'60s to try to drum up some business for this new product that Texas Instruments had invested a lot in called the silicon transistor, uh, basically the f- the first silicon semiconductor, which the company thought could be kind of a revelatory product but didn't really have customers for.
The kind of prologue to sending him down was Operation Paperclip when in the, in the aftermath of World War II, the OSS sent- The, the sort of the precursor to the CIA sent people to Germany to try to recruit essentially Nazi scientists, both to sort of capture some of what they were doing and to prevent it from getting into the Soviets' hands.
But they were trying to essentially kind of refine the Nazi weapons into rockets that could steer, which they weren't very good at 'cause they'd never really had to; like, Hitler hadn't really been interested in precision. He just wanted something to hit London. Every test flight, they would burn up, you know, a bunch of these heavy vacuum tubes.
They needed a way, they were looking for a way to sort of do more tests, to have something that was lighter, cheaper, and through a confluence of [00:11:00] events, they got hooked up with Texas Instruments, and that became one of the first applications of the transistors, the, the semiconductors that Texas Instruments were working on.
And that kind of lit the light bulb of we're trying to find a customer for this product that doesn't mind being an early adopter, that can buy in bulk and can spend a lot. Let's find more Defense Department products. And so Weldon Word was sent down to try to find a way to kind of drum up some business, and he met with, with an Air Force colonel who was trying to solve this bombing problem with the Dragon's Jaw.
Weldon kind of brought together... He saw the convergence of a few different technologies, obviously the silicon semiconductor, also laser, which largely the Army had been working on as a weapon and hadn't quite been able to figure out a way to make it man-portable to get enough power to actually hurt anyone or to disable anything.
It just was impractical. It largely seems to be Weldon who brought together the idea of like, well, what if we use laser not as a weapon, but as essentially a [00:12:00] flashlight for a weapon to point at the target.
Alix: It worked pretty quickly. Like, he figured it out on a pretty low budget in competition with another very large military contractor, and then Paveway has a monopoly basically on laser-guided weapons for, like, how long?
Jeff: Yeah. I mean, it's still used. It's gone through iterations. There's Paveway I, Paveway II, Paveway III, and it's since been combined with INS, with inertial navigation, and GPS, which also Texas Instruments was largely responsible for bringing to us. For a while, it was the only laser-guided weapon and also was just really the only usable guided weapon.
It was inexpensive. It was very easy to use, uh, and it was very precise, and that also the sort of easy use was by design because Weldon and, and another person he brought on to help develop the first Paveways was, I think, a, a former Naval aviator. Jack Sickle was his name. But he kept saying, "Okay, this is, this could work, but this design is too complicated.
You gotta think of, like, an 18-year-old kid just out of high school or whatever who's working the [00:13:00] ground crew or is the pilot. They won't use it. The Defense Department can say whatever they want. Pilots and ground crews won't use it if it's not easy to use." They discarded a lot of ideas that were kinda too elaborate or too intensive, and I think that's part of the reason why it got adopted so quickly is 'cause the results, the bombing results coming back kind of spoke for themselves, but also pilots and ground crews liked it.
Alix: Yeah. Super interesting. Okay, so Paveway is successful. It gets taken up by the US military. I mean, do you wanna take us through a couple of the times it was used in a way to kind of help us understand how it changed what military interventions were kind of on the table, or sort of how presidents, um, have used this capability?
Jeff: Yeah. So one of the first times post-Vietnam it was used was in Libya, an operation called El Dorado Canyon, which was in the aftermath of... Th- there was essentially a cold but becoming-hotter conflict between Gaddafi's Libya and kind of the West. There was a terrorist attack in Berlin where some American [00:14:00] soldiers were killed and hurt at a nightclub, and in response, the US launched this strike on Libya.
This is sort of noteworthy partially because it's the first example of what is called a preemptive strike, which is what was ultimately used also in Iraq and, and elsewhere. But it was the idea that you could launch an operation unilaterally to strike a foreign power that was sponsoring or exporting terrorism.
Also noteworthy because it's a faraway target. You know, Libya's far away. It required flying planes from Lakenheath in, in England. They had to do a longer route to avoid the airspace of other countries that we didn't think would help or that, that didn't cooperate. This mission was so secret that even people in the White House didn't know about it, let alone Congress, and they were able to do that because they had Paveway and planes that had the range, and it required a lot of refueling.
It was a strike that used primarily Paveway, a few other bombs too, to destroy a bunch of [00:15:00] Gaddafi's intelligence military apparatus, and ended up killing a lot of civilians. Libya's response to that was the Lockerbie bombing. They bombed the 747 over Lockerbie, Scotland, on its way to the US, which was because the two countries that had participated in the raid on Libya were the US and the UK.
That jumbo jet bombing was sort of a way to get both of them.
Alix: I really liked in the book, too, how you were able to kind of subtly shift perspectives between the people planning the strikes and the actual reality, the longer term reality. So there was this, after those strikes, there's, like, a celebratory, "We did it.
Isn't it amazing that without congressional oversight we could tactically attack with precision? Nothing bad is now gonna happen. They're gonna... Libya's, you know, Gaddafi's gonna, like, leave us alone," or something. I don't know what they thought was gonna happen. Because when you see what happened, I'm not suggesting that, like, the presumption should be there's gonna be a really dramatic terror attack when an administration does something bad necessarily, but, like, I'm [00:16:00] just surprised 'cause then we go to Bush Senior and his use of Paveway and, like, it seems like he didn't learn very much from what Reagan experienced.
You wanna say a little bit about how Bush used them?
Jeff: So Bush used them most notably in Desert Storm. Desert Storm, in a way, was sort of the coming out to the public of Paveway. I don't think necessarily people in the public knew this is a thing called Paveway, but it was kind of the first televised war.
El Dorado Canyon in Libya, there was footage, there was targeting pod footage, but anyone who saw it saw it, you know, weeks later. And in Desert Storm, the opening strikes were virtually live. Some of them were live. And then-
Alix: On CNN, right? That was, like, right when CNN came-
Jeff: On CNN. That was kind of the birth of CNN.
Yeah. So what we saw also was fairly sterile. I mean, we saw the opening salvo was a new Paveway bunker buster. So this is a Paveway that had the ability to penetrate bunkers and not detonate until it was, you know, deep into a building or underground, taking out a telecommunications building in downtown Baghdad.
[00:17:00] But you don't see people. You see a building being destroyed, then you flash to F-14s or whatever taking off from aircraft carriers. So it all, it all looks sort of very video game, very sterile, very righteous, and you don't really see what's happening on the ground. And one of the things that's happening on the ground is we begin bombing bunkers where civilians are staying because we think they might be secondary headquarters for intelligence services.
Alix: You deploy a precise bomb to kill people you can't see or know with certainty who they are, which I feel like is this very bizarre combination of not knowing anything and having precision.
Jeff: Well, right.
One of the lines is that precision is only as good as the intelligence on the ground. Like it can't replace intelligence, and that was one of the problems with the second Iraq War, where by that time we didn't have a functioning embassy in Baghdad where we could even have people under diplomatic cover.
We had no intelligence presence there at all. We had to sort of rebuild it from the ground up, or we just wouldn't know what to strike, even with the latest generation of GPS laser-guided [00:18:00] weapons that, you know, can ostensibly strike within a, you know, pinpoint whatever. If you don't know where the good guys and bad guys are, it doesn't really matter.
Alix: What's the point? Yeah, and then it becomes this performance because they, they need targets also.
Jeff: Yeah. And I think this happened in the Balkans too, where there were, you know, strikes trying to pressure Slobodan Milošević to the negotiating table.
Alix: Yeah, tell us, talk about Clinton, and Clinton's use of them.
Jeff: So Clinton had seen Desert Storm and been very impressed by it, and came into office and, you know, there's a Balkan crisis. There was the siege of Sarajevo. There was Srebrenica. He believed that air power alone could stop the genocide and sort of bring people to the negotiating table. I think you could probably argue both that it did and that it didn't.
We ended up going in with, I think, three different series of airstrikes. But that also is noteworthy because that was really where we began experimenting with drones, the Balkans, because this was a place you couldn't really determine the good guys from the bad guys. There were guerrilla types who kind of blended in, even with spy [00:19:00] planes or, you know, with satellite recon.
Also, the weather wasn't very good. And there's a whole story, of course, of how drones became used in surveillance and then became armed. That was the, really the first time that we used drones for surveillance. Then that created another problem because someone operating a drone would see a target, and every house might look the same, so the drone operator's trying to tell the pilot, "It's this house," or, "It's that house."
And maybe by the time the pilot comes on station with the right weapon or whatever, you know, it's a fleeting target, or you can't tell. And that sort of initiated a push to put a laser designator on a drone so that it could just designate the target. And then it was before 9/11, the strikes where they tried to get Bin Laden in Afghanistan, and it became clear that, like, even if we have a laser designator on a drone, we still have to wait for a pilot to come from wherever.
To bring a plane. Yeah. What if we just...
Alix: Put a weapon on a drone.
Jeff: Put a weapon on it, yeah.
Alix: So then the Obama administration basically inherits this new muscular drone weapon combination, and then it just [00:20:00] becomes drone war on steroids and then, like, an obsession with, like, strikes with basically no oversight, as far as I...
I'm not a lawyer, but I feel like they did, they killed a lot of people without anyone having any real oversight on it.
Jeff: Yeah. And I think the JAG definition of civilian versus combatant essentially was: fair game if it was a military-aged male.
Alix: Yeah. Cori Crider did this amazing presentation, like, uh, it's almost been, like, eight years ago now, looking at the way that kill orders were allowed to be issued for drone strikes, and it's basically a percentage of likelihood that the person is a terror, an enemy combatant, and then a percentage likelihood that the SIM that they're carrying with them is theirs.
So it's like this double step of probability, and it's much lower percentage than you would want it to be for them to be allowed to, to kill someone. It became, again, precision without intelligence, which I think is a really great way of thinking about it. [00:21:00] What is the point in being able to precisely target something if you don't know that you should target that thing?
Like, what is that about?
Jeff: 'Cause we can.
Alix: Yeah. So what happened to Texas Instruments? Paveway, hugely financially successful, became this, like, cornerstone of the military arsenal of the biggest army in the world. What happened to them? Like, how big did they get? How big did Paveway get?
Jeff: So Texas Instruments went through a bit of a transition in the aftermath of the Cold War when there was the idea of the peace dividend, which was that we had maybe won the Cold War less by fighting than by outspending the Soviets or by forcing them to outspend their own capability and sort of bankrupting themselves.
And now is the time to sort of spend all this money domestically. And so, you know, orders-- The, the defense industry faced a bit of a, a crisis. You know, orders collapsed. That brought about a big consolidation. The way that the company lore describes it is that they kind of lost the recipe. Paveway started coming off the assembly line riddled with problems, [00:22:00] so much so that the Defense Department opened up a license and Lockheed Martin now also makes Paveway.
But that also created a bit of a push to develop, you know, a shiny new version of Paveway that could kind of restore the luster of the Paveway brand. That ended up being part of the drive for the second version of the Paveway bunker buster, which would have laser guidance, but also GPS and inertial guidance.
That weapon was used in the opening strike in the Shock and Awe campaign in 2003, when we thought we actually had a chance to take out Saddam before the war started. The first bunker buster was kind of a collaboration between Lockheed Martin and Texas Instruments because there was the idea of low-level release.
If you could fly in low, below radar, it was safer, but to give the weapon range, you then had to pop up. That was sort of a point of vulnerability. So what if you get the weapon to pop up on its own? And Texas Instruments was working on this. The Air Force wasn't really [00:23:00] all that interested, at least not enough to put any money into it.
But at the same time, Lockheed had this black skunkworks program, Have Blue, which became the stealth bomber. The idea there was that if you could have a plane that was stealth, that was sort of on its own impervious to radar, you didn't have to fly low. But what potential rivals were doing was then kind of building better and better bunkers.
Lockheed was working on a bunker-busting bomb on their own to use with the stealth bomber, and they had the kind of materials science figured out. But what they found was essentially, like, no matter how strong a nail is, it won't go into a board if you just, like, chuck it at the board. It has to hit the board at a particular angle.
That essentially was what Texas Instruments was doing for an entirely different reason. So those two things were fused, and there was this weird period where there was this white program inside of a black program inside of a white program where Texas Instruments engineers were working on a bunker buster inside of the black program at Lock- Lockheed.
But essentially, the Paveway bunker buster and the stealth bomber were kind of co-invented, like purpose-built for one another. The bunker buster was purpose-built for the stealth bomber, [00:24:00] both to get it to sort of drop and delay the detonation, but also to get it so that it could kind of know where it was in space and align itself to land at such an angle that it could penetrate.
Alix: Setting aside briefly that these are tools of carnage and empire, um, it is really interesting. I found myself getting, like, really interested in the technical details and then being like, "No, stop it. These are, these are bad." Um, so we now have a president who kinda thinks the '80s are back generally and is using precision both in this bunker buster context in Iran, but now in Venezuela, to basically say, with precision, without congressional authorization, we're gonna go in, in a precise way, without loss of life on the American side.
It also f- feels like he's, like, of a Paveway mental- There's, like, something about it that seems like a political orientation almost about war that, like, you can do these elegant, short, sharp [00:25:00] things, reshape the geopolitical landscape, but without there being any unintended impacts that are negative and blow back in your face, even though literally every lesson, every precedent learned through this series that you've just gone through, is the exact opposite outcome.
I don't know, thoughts on what's happening now?
Jeff: I mean, part of me is like, he-- To the extent that, like, perception is reality, he may be more right than ever. The pullback always seems to be the next administration's problem. And even in, even in 2003, even in, in the Iraq War, we didn't have a news cycle like we do now. We didn't have social media like we do now. You know, two days later, who even remembers the Iran strike that was, like, twenty minutes ago, when so much has happened since then? I think you can virtually guarantee that there will be blowback, but you can't guarantee that it will necessarily be someone holding a sign that says, "This is because of what you did," you know, "This is 'cause of the strike on the bunkers."
And so, you know, someone like Trump can carry out one of these strikes and then move on to the next thing, and most of us kinda move on with him. I actually view it kind of like the USAID cuts, where the cost of that, I mean, the [00:26:00] cost on the ground is obviously felt immediately, but the cost to us of, you know, the lack of goodwill, the lack of intelligence that is shared with us, it's immeasurable both in that it's big and that it is literally hard to measure or to track.
Or, you know, another e-example of this is the, is the shooting of the National Guardsmen, I think National Guardsmen in front of the White House, which was, as best I can tell, you know, a former essentially Afghan CIA contractor who had been kind of on the front lines, had been brought to the US, and then kind of abandoned.
And you know, there was some whispering of like, "Oh, is this some kind of, like, sleeper cell?" It seems like, no, it was just someone who was really struggling and kinda lost his mind. And that is another sort of sequela of, of an intervention that ended really terribly, and in a way, kind of the chickens coming home to roost.
But, you know, I don't think anyone looked at that and said, "We should really revisit how we go in and pull out of countries." It's, it's a shame because I think that's why we won't, we don't learn the lesson that we have just stepped on that rake again, I think arguably in Iran and I, I think probably in Venezuela.
Even though right now we're not seeing what's happening on the ground or what [00:27:00] two years from now or what... It's a bit of a boiling frog. I've, and I've used like four different metaphors for this one analogy, but yeah.
Alix: But I mean, history is a, is in some ways a crystal ball. And like, I feel like this is a very...
I don't know. It's impossible to imagine that there aren't gonna be negative consequences, and also, just generally, there's the way that these things have been conducted. I think the precision negates the need for there to be more robust decision-making. Um, I think it's something we felt a lot in the Obama administration with the drone strikes: that, like, a constitutional lawyer was all of a sudden killing people extrajudicially, sometimes including US citizens, with drones, without much control from Congress, after an unprecedented relaxing of a lot of the accountability infrastructure post-9/11.
But even in, like, reading those chapters about Reagan and, like, how he was going back and forth with his national security advisor and making the choice about those Libya strikes, it was so interesting that it never crossed their minds to get congressional approval, because part of the point of the precision [00:28:00] was that it put the full agency and decision-making and autonomy in the presidency.
And it feels like the sort of precision, or, like, the perceived precision, of some of these tools is really connected to this dramatic, uh... Maybe not dramatic. Maybe it's been, like, a 40-year project, 50-year project to, like, consolidate military power in the executive.
Jeff: Yeah. And I don't know how deliberate that was, but I feel pretty comfortable saying that Paveway and precision provided the capability that allows presidents to act relatively unilaterally.
Because if the Libya mission, for example, had required, you know, 200,000 ground troops to be moved, that's a lot harder to do without people knowing, without oversight. It wouldn't have happened, yeah.
Alix: So there's the current political moment of Venezuela and Iran and Trump being Trump, but there's also this just gold rush for technology companies that, similar to Texas Instruments, are saying, "I have this product, it's generative AI, pretty sure it's gonna be transformational.
I'm not quite sure what the use cases [00:29:00] are. Can't get individual people to buy it, because who wants to use this stuff? In the same way that I, as an individual, wouldn't necessarily know what to do with a semiconductor. So I'm gonna drum up military contracts and try and basically make a pitch for this headlong rush into what is a modern military.
It should incorporate chatbots or something." How did-
Jeff: That would be a pretty good weapon against our rivals, if you have to-
Alix: Yeah, send them an AI psychologist-
Jeff: ... to talk to customer service chatbots.
Alix: Yeah. We'll give them all an AI girlfriend and then hope that, um-
Jeff: Yeah. Yes. Not a bad idea.
Alix: Uh, yeah. Um, but so how do you...
I don't know, when you think about Paveway, like, what lessons do you think we should have learned from how a new technology can, might, should affect the way that the military goes about its business? Any thoughts on the political economy of some of this stuff now that you've spent so much time thinking about Paveway?
Jeff: One thing that comes to mind that I think is kind of interesting is, during that defense industry consolidation, you know, the post-Cold War defense industry [00:30:00] consolidation, one of the reasons why Texas Instruments' missile and defense work was appealing was that even if you're trying to cut back on spending, and you're no longer preparing for a land war with Russia, or, you know, you no longer need to find a way to deliver thousands of tons of explosive power deep into Russian territory or whatever-
I think it's sort of a fair generalization that the Pentagon tends to wanna maintain, you know, a posture of, uh, preparedness for the next conflict and also a technological edge. So Texas Instruments at the time kind of provided both of those things. I mean, their work was in, you know, electronics and technology and the next thing.
I mean, a, a part of the company lore with Texas Instruments was that it had started as a company called, I think it was called Geophysical Service Incorporated, but essentially they-
Alix: Oh, right. Wait, we gotta make the fossil fuel connection. I totally forgot this. They were originally trying to sell semiconductors or, like, intelligence to find oil.
Jeff: They would develop something. This was... So this was [00:31:00] pre-Texas Instruments, but it was called a magnetic anomaly detector. It was like a dome you hung out of a car or whatever and drove across the fields, the idea being that, uh, it detected magnetic anomalies underground, which could signal the presence of hydrocarbons.
So it was like an oil prospecting thing. They incorporated it-
Alix: And it was incorporated on P- Was it in- or incorporated on Pearl Harbor?
Jeff: The day before. It was the day before Pearl Harbor.
Alix: Yeah.
Jeff: So all of a sudden, you know, oil fields were nationalized or whatever, and, and the lesson... You know, I don't know how much of this is true and how much of this is just the way they tell the story, but: we no longer wanna be in a position where we're preparing for the last, for yesterday's thing.
We wanna prepare for tomorrow's thing, even if we don't quite know what it is. And that's why they ended up spending so much money trying to make semiconductors out of silicon, when at the time the first transistors were being made with germanium, and everyone kinda knew that if you could figure out a way to do it with silicon, it'd be much better.
But everyone thought that was, like, the material sciences guys at Harvard or whatever, and Texas Instruments was like, "This is our thing. We're gonna spend a ton of money on this thing that we don't understand and we don't know what it's for." The lore is true at least in that I think that, like, the corporate kind of posture was always like, [00:32:00] "Let's look over the horizon a little bit."
I think the legacy of that is partially why it was an attractive acquisition target even as a lot of other defense products were becoming less appealing. And so I think there's a bit of a corollary there with AI, because, you know, you also hear a lot of, like, "Well, we need to, we need to sort of take the brakes off, because what's China doing?
What's Russia doing?" It can be a little intoxicating, uh, and it's a little dangerous, because it's... I think it's pretty easy to convince someone that, like, we should be investing in this so we can maintain an edge, even though often it's like, well, what's actually the application, and does it work? And it sort of doesn't matter.
And it also reminds me a little bit of the nuclear arms race, where, you know, we had to keep building and building and building because we thought the Russians were way ahead of us, and then we come to find out after the Cold War that they were actually way behind us.
Alix: And also, all we now have is, like, a massive nuclear stockpile and a bunch of non-functioning nuclear power plants. Like-
Jeff: Mm-hmm.
Alix: Like, we don't- Yeah ... have that much to show for all those investments as well.
Jeff: Yeah. So we're gonna have- Hey, uh, we're gonna have a lot of data centers. We're gonna have no water or electricity. [00:33:00] But- Yeah ... we'll have chatbots.
Alix: Yeah. And a, a recession/depression in there at some point because of misappropriation of invest-
Jeff: Mm-hmm.
Alix: ... of investment capital.
Jeff: Yeah. Yeah. Right.
Alix: Gonna be great. Well, so what, um, what are you working on now? Like, what's next? I mean, this book, I d- I, I-
Jeff: Why, do I have to be working on something now?
Alix: You don't have to be. I really love the book, though. Like, I think it's a real- Oh, thank you ... it was such a good read, and, like, it was such a, a journey for me 'cause, like, I, I knew a little bit about some of the things, but, like, putting it all together in the way you did was just really, really interesting.
Thank you ... And there were so many corollaries and parallels that were, were informative and helpful, and, like, um, I'm still kind of... It's still unfurling in my head, um, which is a good sign, I think, for a book.
Jeff: In mine, too, in probably a different way. I appreciate that. I am do- I am doing, like, a few film things.
I'm excited to begin working on finding out about something new that I have no idea about and pretending to be an expert. It's been a while since I've started a project fresh.
Alix: How did you pick this topic?
Jeff: So it started with a story I did about one airstrike in Yemen, and I'd been [00:34:00] thinking about how to get people to care about Yemen, which was obviously far away and exotic, where people don't look like us or talk like us or eat the foods we eat, and where this massive humanitarian catastrophe was happening.
There had been reporting on it, but it began to seem to me like part of the reason it wasn't sticking was actually because of how big it was, and I sort of needed to distill it down to a smaller aperture to tell the story. But also, how do you get people to care about something that's so far away, where everything is exotic and different?
It's hard to empathize with someone, you know, named Muhammad Akbar, even if you want to, because you can't really picture it. And so the idea emerged to track one of the bombs, sort of starting there, from the White people in Arizona putting it together. I had read an article about, I think, architecture, and there was this term, MAYA: most advanced yet acceptable.
And I think the idea was, like, you wanna build some sort of nouveau type of building that doesn't respond to any of the conventions? Great, but, like, the doorway [00:35:00] still needs to be a doorway. There has to be some entry point. So, just in service of trying to get readers from the land of White people to Yemen, I began researching the bomb that was used there, and at some point it turned out that this bomb, made by Raytheon, had been invented by Texas Instruments.
When the article came out, we were talking about: is this a book? I'd always liked the idea of telling a bunch of different stories and finding a way to connect them, and I thought it could just be sort of a collection of short stories that each use Paveway, and- Turned out it had to, like, have a point and a thesis and whatever, and so, so yeah, that's the-
Alix: And characters.
Jeff: That's the long version. Yeah. Main characters.
Alix: Okay. Well, eight years later, here you are. Mm-hmm. Um, and it's good. It's a good book. Um, okay. Yeah, thanks. Well, um, Jeff Stern, the book is linked in the show notes. Good luck with the book tour, um-
Jeff: Appreciate it.
Alix: ... and hopefully we'll be talking to you soon.
Alix: Understanding the history of these weapons is an important part of the story we're telling in this series, and as you heard in Jeff's description of the change in American practices, [00:36:00] weapons can be as precise as you want, but that precision doesn't mean much without the intelligence needed to accurately identify targets, and also systems of accountability that control and constrain executive power.
Because, as we've learned, there was a series of successive US presidencies that failed to learn from past mistakes, and to actually understand that a clean, short intervention using laser-guided missiles is not as simple as it seems. There are always radiating issues that emerge after something like that, and there are always casualties and harm that American presidents have sort of not managed to learn lessons from.
And now we have an American president who is essentially an unrestrained autocrat, um, who doesn't really care at all about casualties, it seems, and these weapons introduce capabilities and possibilities in that scenario that are really terrifying. In part three next week, Amos Toh is gonna bring us into the present day and explain how this history has culminated in a military landscape [00:37:00] full of turbocharged autonomous weaponry with a fraction of the accountability. And he's also going to explain how procurement, so the purchase of weapons by the US military, has changed over time and sets the stage for the sale of AI into, uh, departments that maybe shouldn't be buying it.
Uh, so stay tuned next week for the third installment of our series, Computer Says Kill. And I wanna thank the production team, Sarah Myles, Georgia Iacovou, Van Newman, Kushal Dev, Zoe Trout, and Marion Wellington, who have helped put the series together. And thanks also to Jeff, both for writing the book, which you should read, it's linked in the show notes, and also for coming on and I think giving a really important longer range history to some of the conversations we want to platform in this series.
So we will see you next week.