
Drone panic: New weapon, old anxieties

December 23, 2011

Poor drones and drone operators: as the latest generation of weaponry on the battlefield, it’s now their turn to be subjected to that great generational ritual of moral and ethical suspicion. Poor thinking public: we get treated to these arguments as if they were all original or unique to drones.

This piece from the Atlantic addresses (among other things) many of these themes. Unfortunately, it perpetuates a number of very tired tropes about military technology and tactics.

The first overused and under-scrutinized argument is the fear that drones make war “easier to wage” because “we can safely strike from longer distances.” Well, we’ve had that ability since the birth of air power and missile power; drones just make it a lot easier to hit certain kinds of targets at a certain tempo. After all, it’s only the pilots who are “far away.” The drones themselves still operate from bases with real, flesh-and-blood people who are potentially exposed to retaliation, and those bases are not necessarily any further away than bases for manned aircraft (in most cases they service both). It’s notable that the US actually requires significant international cooperation to maintain the bases it wants to launch drone strikes from, and only in Libya has the US ever used drones against a government it admitted to being at war with (as opposed to Yemen, Pakistan, and Somalia, where the US strikes non-state actors with some degree of compliance from the local government).

This does of course make war-making easier to some extent, but why we should find this alarming is unclear. Similar effects could be achieved with advances in manned aircraft, stand-off guided missiles, advanced naval weapons, and so on. Drones are an incremental advance in technology; so far they have not produced a major alteration in the offense-defense balance. They have merely made military operations we already preferred to fight – such as the US targeting of al Qaeda with air strikes in 1998 – more effective. Without the acquisition of bases capable of generating large volumes of combat sorties, a sustained war would not be “easy” by any means – and those bases would also increase manned-aircraft sortie generation. The loss-of-strength gradient still applies.

Lin, in the same thought, notes:

He compared our use of drones with the biblical David’s use of a sling against Goliath: both are about using missile or long-range weapons and presumably have righteousness on their side. Now, whether or not you’re Christian, it’s clear that our adversaries might not be. So rhetoric like this might inflame or exacerbate tensions, and this reflects badly on our use of technology.

I’m going to have to assume he’s alluding to Buddhists or Hindus or some other faith, because the story of David and Goliath, being in the Old Testament (it’s a Jewish reference, if that’s not clear), is present not just in Christianity but in Islam! Let’s review (Quran 2:251):

So they defeated [the Philistines] by permission of Allah, and David killed Goliath, and Allah gave him the kingship and prophethood and taught him from that which He willed. And if it were not for Allah checking [some] people by means of others, the earth would have been corrupted, but Allah is full of bounty to the worlds.

So, that covers that. I really doubt that China is concerned about Abrahamic rhetoric by American policymakers, even if it does distrust the expansion of Christianity in China (the CCP is far more piqued by American policymakers’ secular political self-righteousness), and – wait, weren’t we talking about drones?

On that subject, we get to one of my favorite charges against new military technology, that it will embolden our enemies because they will think we are cowards (emphasis mine):

Relatedly, we already hear criticisms that the use of technology in war or peacekeeping missions aren’t helping to win the hearts and minds of local foreign populations. For instance, sending in robot patrols into Baghdad to keep the peace would send the wrong message about our willingness to connect with the residents; we will still need human diplomacy for that. In war, this could backfire against us, as our enemies mark us as dishonorable and cowardly for not willing to engage them man to man. This serves to make them more resolute in fighting us; it fuels their propaganda and recruitment efforts; and this leads to a new crop of determined terrorists.

This has been said, I think, about virtually every military innovation since the machine gun, and critics would have said it about artillery too if they didn’t. Sebastian Junger, in his excellent War, explains it well:

A man with a machine gun can conceivably hold off a whole battalion, at least for a while, which changes the whole equation of what it means to be brave in battle…. Machine guns forced infantry to disperse, to camouflage themselves, and to fight in small and independent units. All that promoted stealth over honor and squad loyalty over blind obedience….

As a result much of modern military tactics is geared toward maneuvering the enemy into a position where they can essentially be massacred from safety. It sounds dishonorable only if you imagine that modern war is about honor: it’s not. It’s about winning, which means killing the enemy on the most unequal terms possible. Anything less simply results in the loss of more of your own men.

Does anyone think it matters that soldiers of counter-insurgent or counter-guerrilla forces think insurgent tactics are dishonorable? Do you think Taliban or Badr Brigade members wrote missives to each other worrying whether Dragunovs, L-shaped ambushes, and IEDs with explosively formed penetrators were going to cause them to lose the respect of the American occupiers? War is not won with style points. It is conducted with the maximum amount of lethal force a country feels appropriate for the accomplishment of its political goals, and policymakers must intervene to ensure it does not exceed that. Now, yes, avoiding being hated by the civilian population does matter, but if you replace “robot patrols” with “tank patrols” or “white men with body armor and wraparound sunglasses,” you get a similar answer – the problem isn’t unique to drones, because people tend to have a basic and healthy distrust of armed foreigners anyway.

As for winning the enemy’s respect, we should remember, as Betty White once put it, “I’m going to attack you with this, and you use respect to defend yourself.” The enemy already thinks American soldiers are cowards with expensive machines; this is nothing new. Beyond the tanks, the over-the-horizon artillery barrages, and the enormous bases, there is air power. To the average terrified insurgent soldier, there is little moral difference whether it is a Harvest Hawk or a Reaper slinging Griffins at them. In either case, it is a nigh-invincible machine from which the Americans can dispense death. To a guy with a Kalashnikov or a DShK, that Harvest Hawk’s crew might as well be in Creech – they can’t be stopped. It’s not that we’re killing people with drones that drives terrorist recruitment, it’s that we’re killing people. If we just sent American fighting men and women into hostile villages wearing nothing but loincloths and swinging heavy sticks, people would still be very mad at us when we beat somebody to death.

The additional disrespect that weapons technology earns soldiers is generally more than outweighed by the casualties it inflicts and the fear it induces in the enemy. To the extent that it causes anger, it does so through collateral damage and the death of compatriots, but drones are hardly unique in that. The way in which weapons – any weapons – are used in war generally far outweighs the weapons themselves in the overall psychological effect.

Many of these new ethical questions are in fact very old. Take this example:

Without defenses, robots could be easy targets for capture, yet they may contain critical technologies and classified data that we don’t want to fall into the wrong hands. Robotic self-destruct measures could go off at the wrong time and place, injuring people and creating an international crisis. So do we give them defensive capabilities, such as evasive maneuvers or maybe nonlethal weapons like repellent spray or Taser guns or rubber bullets? Well, any of these “nonlethal” measures could turn deadly too. In running away, a robot could mow down a small child or enemy combatant, which would escalate a crisis. And we see news reports all too often about unintended deaths caused by Tasers and other supposedly nonlethal weapons.

So what we’re worrying about is a robot creating a situation where an American covert asset uses lethal force against armed pursuers and tries to evade capture, creating a massive diplomatic crisis? Gee, I didn’t know Ray Davis was a Terminator!

An additional misplaced concern the drone program is disproportionately saddled with is that of collateral damage:

Another worry is that the use of lethal robots represents a disproportionate use of force, relative to the military objective. This speaks to the collateral damage, or unintended death of nearby innocent civilians, caused by, say, a Hellfire missile launched by a Reaper UAV. What’s an acceptable rate of innocents killed for every bad guy killed: 2:1, 10:1, 50:1? That number hasn’t been nailed down and continues to be a source of criticism. It’s conceivable that there might be a target of such high value that even a 1,000:1 collateral-damage rate, or greater, would be acceptable to us.

How on earth is this a new problem because of drones? This was a problem when we were using manned aircraft and it is a problem with naval missile and gunfire support and artillery as well. We’ve never had a magic ratio that resolved this debate and never will. This problem, again, precedes drone warfare and will outlive it. But the concern is particularly strange since drones actually allow us to reduce collateral damage significantly compared to many alternatives, since it is easier for a drone to loiter and strike with greater precision than it would be for the pilot of a jet aircraft.

This next problem isn’t Lin’s, but it is reflective of one of the stranger standards in international humanitarian law (IHL):

Let’s say we were able to create a robot that targets only combatants and that leaves no collateral damage–an armed robot with a perfectly accurate targeting system. Well, oddly enough, this may violate a rule by the International Committee of the Red Cross (ICRC), which bans weapons that cause more than 25% field mortality and 5% hospital mortality.

This rule does not particularly make sense to me, since, again, the lethality of the weapon seems to depend on the circumstances under which it is employed. A well-trained sniper with a rifle, who’s setting up his shots to kill, is going to be more lethal than an infantryman who is going to use a rifle to provide suppressing fire. In some cases ammunition which is extremely lethal is effectively banned – hollowpoints and exploding bullets come to mind (although countries have no qualms about firing explosive-tipped rounds from vehicular chain guns at people). But again, I don’t see this problem as being perfectly unique to robots, since in theory a sufficiently advanced targeting system on another vehicle-mounted weapon would accomplish the same thing, except the human pressing the button would be physically closer to the action.

Then there’s the final proliferation argument, which I’ve addressed tangentially in a previous post.

Related to this is the all-too-real worry about proliferation, that our adversaries will develop or acquire the same technologies and use them against us. This has borne out already with every military technology we have, from tanks to nuclear bombs to stealth technologies. Already, over 50 nations have or are developing military robots like we have, including China, Iran, Libyan rebels, and others.

So what? Our adversaries are going to develop technologies to fight us with regardless of what we do. Does anyone really think that if we stopped using drones, our enemies would do the same? Of course not; countries develop military technologies to further their own interests, not to spite Americans. My previous skepticism about the drone war “coming home” still holds, as it should be extremely clear by now that any country stupid enough to host a drone base outside US borders for striking American targets, let alone the country stupid enough to try flying the drones in the first place, would be on the receiving end of a very large amount of firepower.

The piece concludes:

Integrating ethics may be more cautious and less agile than a “do first, think later” (or worse “do first, apologize later”) approach, but it helps us win the moral high ground–perhaps the most strategic of battlefields.

No, the moral high ground is not the “most strategic of battlefields.” War is not a morality play. In every war, the victor has done horrible, horrible things, not necessarily any better than the things the vanquished did to avoid their fate. There are plenty of legitimate ethical trade-offs in war, particularly because ethical questions intersect with the political nature of war itself. However, the notion that we can “win the moral high ground” and that this is a “strategic battlefield” is quixotic. What adversaries, neutrals, and observers understand as “moral” is not going to be the same in war, and war can absolutely be won without proving one’s moral worth to the enemy population: just ask the Japanese after World War II. Yes, there are ethical considerations about drones, but these are largely the same questions that have dogged us since the advent of the machine gun and air power. Overemphasizing the supposed uniqueness of drone war ethics and the importance of being “respected” by the enemy the US is trying to maim and kill will just set us up for another round of disappointment, when the new supposedly ethical and humanitarian variations on warfare fail, yet again, to make the violent imposition of a foreign power’s will a palatable experience for those on the receiving end.

12 Comments
  1. Jonathan Jeckell
    December 23, 2011 5:09 pm

    Some minor quibbles with an otherwise excellent article…

    The irrational dimension of human behavior means that there is not a linear relationship between killing and winning in war. The means do matter in war and can cause a disproportionate backlash, and a willingness to suffer more death and destruction without bending to the political aims of an attacker. So sometimes style and the perception of honorable victory are very important. Pearl Harbor may be one such example. I think you are right, though, that this concern does not apply any more to drones than any other weapon we use, and drones are, if anything, more fair and discriminate than other weapons.

    Moreover, this does carry over to the strategic aims and the moral high ground. Regimes that flagrantly violate norms (or indicate dangerous intentions) have lost the support of allies or had other states bandwagon against them. Unrestricted submarine warfare was not the only reason the US joined the alliance in WWI, but it was a consideration. Again, though, this is just a quibble on that point, and it does not apply to drones, which are, if anything, more discriminate.

    • December 23, 2011 5:37 pm

      Even with the case of Pearl Harbor, I don’t think a more ‘honorable’ attack would have had a ‘better’ psychological response as far as the Japanese were concerned. We still would have wanted to destroy Japan. Even then, our perception of what’s dishonorable is subjective – many Westerners (Americans, and I think even TR, included) praised the genius of Japan’s very similar surprise attack raid on Port Arthur in the Russo-Japanese War.

      As for norm-violation, we have to ask, “whose norms?” Not even all Western states agreed with our anger at unrestricted submarine warfare, which only trade-dependent maritime states found particularly offensive. (There’s a quote from Raoul Castex in this post that shows our continental allies didn’t necessarily find our outrage over submarine warfare particularly sensible.) Notably, to return to WWII, the United States actually waged the most successful campaign of unrestricted submarine warfare in history against the Japanese (in addition to other ethically questionable things, like firebombing and nuking Japanese cities, or considering the use of poison gas for an invasion of the home islands). Norms are simply too slippery, and the moral high ground too ephemeral, to be a Clausewitzian center of gravity important to the winning of the war.

  2. December 23, 2011 6:11 pm

    While I totally agree with your rebuttals of Lin’s ethical arguments, I think that drones do make certain types of warfare “easier to wage,” and that this poses a strategic problem for US policymakers. You note that using drones requires an on-the-ground presence and oftentimes cooperation with local governments, but it appears that US policymakers are much more willing to violate another country’s sovereignty with a drone strike than with an airstrike, and that local governments are much more willing to permit and cooperate with US drone strikes as compared to airstrikes or deployment of US troops. In difficult situations, such as our relationship with Pakistan, the relative ease of resorting to drone strikes – a tactical measure – as compared to the challenge of formulating a serious strategy to address the broader issues in the region causes policymakers to eschew strategizing entirely and focus only on the tactical issues a drone strike can address, as they may appear more tractable and the results of a strike are immediately visible. I imagine this mindset will fade over time as its deficiencies become visible, and that drones will be viewed as another important yet limited tool in the US arsenal rather than as a shiny new toy to be played with. In the meantime, though, the ease with which we can strike militants in the tribal regions encourages us to avoid asking harder questions: why they are there in the first place, whether the strikes are actually reducing their numbers in the long run, and what effects our actions might have on the stability of the Pakistani government and the region as a whole.

    • December 23, 2011 6:22 pm

      I still don’t really agree – the “violation of Pakistan’s sovereignty” is done with a very high degree of complicity, and considering Pakistan is only just now deploying even shoulder-launched SAMs to the border, it has never put up a serious military defense against the drone strikes.

      Sure, there’s an appeal to using drones, but policymakers were also willing to use helicopters and SOF when the stakes were high enough. Drones just allow us to conduct those kinds of missions with greater frequency. Even without drones, I doubt the fundamental strategic mismatch between US and Pakistani interests would be resolved – the employment of drones as a stopgap measure is a symptom of the problem, not the cause, and the fact that we have currently suspended our drone strikes in Pakistan demonstrates, I think, that point. It’s a secondary issue; the greater reason we’ve avoided formulating a comprehensive strategy is that we’re avoiding the ugly fact that US and Pakistani interests in Afghanistan are diametrically opposed in a number of critical areas.


