Modern conflict is increasingly a struggle for strategic influence rather than territory. This struggle is, at its essence, a battle over perceptions and narratives within a psychological terrain shaped by local and global pressures. One of the unspoken lessons embedded in the Counterinsurgency Manual (FM 3-24) is that we put strategic success at risk by relying on a lawyerly conduct of war that rests on finely tuned arguments of why and why not. Because both too much defense and too much offense can be detrimental, we must consider the impact of our actions: their information effects. The propaganda of the deed must match the propaganda of the word.
As Giulio Douhet wrote in 1928,
“A man who wants to make a good instrument must first have a precise understanding of what the instrument is to be used for; and he who intends to build a good instrument of war must first ask himself what the next war will be like.”
Secretary of Defense Robert M. Gates has said that there is too much spending geared toward the wrong way of war. I find this to be particularly true in the area of battlefield robots. Much (if not all) of the unmanned systems planning and discussion, especially with regard to unmanned ground combat vehicles, fails to take into account the nature of the next war, let alone the current conflict.
Last year I posted an unscientific survey that explored how a ground combat robot operating away from humans (remote controlled or autonomous) might shape the opinions of the local host family. The survey also explored the propaganda value of these systems to the enemy, in the media markets of our allies, Muslim countries, and here in the United States. The survey results weren’t surprising.
Serviam Magazine just published what could be construed as an executive summary of a larger paper of mine to be published by Proteus later this year. That paper is about four times longer and adds a few points with more details. In the meantime, my article that appeared in Serviam, “Combat Robots and Perception Management,” is below.
This article originally appeared in the June 2008 edition of the magazine Serviam. It is based on a paper and presentation I gave at the U.S. Army War College.
Robots will figure prominently in the future of warfare, whether we like it or not. They will provide perimeter security, logistics, surveillance, explosive ordnance disposal, and more because they fit strategic, operational, and tactical requirements for both the irregular and “traditional” warfare of the future. While American policymakers have finally realized that the so-called “war on terror” is a war of ideas and a war of information, virtually all reports on unmanned systems ignore the substantial impact that “warbots” will have on strategic communications, from public diplomacy to psychological operations. It is imperative that the U.S. military and civilian leadership discuss, anticipate, and plan for each robot to be a real strategic corporal (or “strategic captain,” if you consider their role as a coordinating hub).
As unmanned systems mature, ground systems operating among and interacting with foreign populations will substantially affect perceptions of our mission, both at home and abroad. Robots will exert significant influence in three overlapping information domains. The first domain is the change in the calculus of foreign engagement as the public, Congress, and future administrations perceive a reduction in the human cost of war (on our side). The second domain is the psychological struggle of the local populations in conflict and post-conflict zones, and the third is the overarching global information environment.
The first domain and the most touted benefit of robots is their ability to reduce the exposure and vulnerability of America’s warfighters. The Defense Department’s Unmanned Systems Roadmap 2007-2032, approved in December 2007, leads with this point and repeatedly emphasizes it. Unlike President Clinton, who lobbed cruise missiles at Al-Qaeda in Sudan and Afghanistan, a future president will be able to deploy remote-controlled and autonomous robots to accomplish the same mission with greater precision. However, few have considered the true cost of lowering the bar for kinetic action in a world of instant communications. There are parallels here between outsourcing to machines and outsourcing to private military contractors, both of which circumvent public and congressional oversight by avoiding the use of uniformed soldiers.
The second critical domain is the psychological struggle for the minds and hearts of the men and women in conflict and post-conflict zones. There is a real risk of undoing the lessons on the importance of personal contact with local populations that were learned at such a high price in Iraq and Afghanistan. Mapping the human terrain becomes, by implication at least, not only unnecessary but impossible in the sterility of robot-human interfaces.
In 2007, Lieutenant General Raymond Odierno issued guidance emphasizing the importance of engaging the local population and building a “feel” for the street. This guidance instructed Coalition forces to “get out and walk” and noted that up-armored Humvees limit “situational awareness and insulates us from the Iraqi people we intend to secure.” Criticism of mine-resistant ambush-protected vehicles that prevent local engagement is just as applicable to robots operating in the sea of the people.
If deployments are not accompanied by intelligent and constant two-way conversations with the people and the media, the propaganda about our deeds becomes the story of how the United States is not willing to risk lives for the mission or the host population. The media must not create the impression that the mission is not important enough to sacrifice our own men and women, lest the local population wonder why it should sacrifice its own. The result may be more than replaying improvised explosive device attacks against robots on YouTube; it may lead to a modern propaganda contest and an escalation of spectacular attacks against humans designed to influence U.S. public opinion and increase extra-regional sympathy for the insurgents.
The third domain is the discourse in the global media, both formal and informal, with foes and their base, allies, “swing voters,” and our own public. This discourse includes not only justifying actions but also containing and managing failures. On the former, work is underway today to formulate rules of engagement for robots designed around Western notions of an ethical practice of war codified in the laws of war. But new media’s collapse of traditional concepts of time and space leaves consumers and reporters little room for careful consideration of information. The noble pursuit of “lawfare,” of knowing the truth through careful reflection and analysis to validate Western-justified ends and means, simply does not work. Attempting to justify acts based on what can be done according to Western laws actually permits an engagement model that is too permissive and ultimately detrimental to a mission where, as Lieutenant General James Mattis put it, “ideas are more important than [artillery] rounds.” In other words, international law may permit firing into a house with women and children, but the blowback will be significant. Further, if private military contractors are perceived as skirting the laws of war, then the application of those laws to a robot and its human handler (if one exists) is even more unclear.
Without capable information management from the strategic to the tactical level, accidents and failures of unmanned systems will receive harsh treatment in the global media, amplifying an endemic view in the Middle East and elsewhere that the United States commoditizes death. The United States cannot afford technological failures or induced failures (i.e., hacking) that kill civilians. The U.S. military can blame “out-of-control” human contractors, even if they were operating under the rules of engagement set by their government clients, but the principal is absolved of responsibility to a much lesser degree when the agent is a machine. Previous incidents of “technical failure” causing civilian deaths, including the shootdown of Iran Air Flight 655 by the USS Vincennes in 1988, are examples of a strategic communications apparatus that cannot handle technical failure.
It is essential that the information effects of what we do be considered from the outset, including the impact of information campaigns. Strategic communicators, public diplomats, and information operators must be involved from the inception of unmanned warfare, but they are not. Conversations with proponents of unmanned systems in the Defense Department and think tanks make it clear that the U.S. military has yet to understand that deploying robots to augment the human warfighter is not the same as swapping the M-16 for the M-4 carbine. The uniformed warfighters the robots will replace reflect the country’s commitment to the mission, shaping local and global opinions that garner or destroy support for the mission. Robots, regardless of their real or perceived autonomy, will also represent, reflect, and shape these opinions. The informational effect of robots is substantial, but little research has been done on the subject. Failing to recognize the effect that unmanned systems may have on the struggle for the minds and wills of men and women will have tragic unintended consequences.