Robot Wars: The Hal Factor

As Predator drones, ground robots and other military technologies become bigger, better and deadlier, our need to account for the moral and ethical challenges they represent becomes ever more important, or so argues Simon Roughneen.

Over the past decade, the use of unmanned aircraft and ground-based robots in combat has grown rapidly.

The first drone killing, of an al-Qaida suspect in Yemen, took place in 2002. The US military does not release precise figures, but since 2008, at least 40 people have been killed in dozens of UAV strikes in Pakistan, including Pakistani Taliban leader Baitullah Mehsud, who is thought to have been behind the 2007 assassination of former Pakistani prime minister Benazir Bhutto. When the US invaded Iraq in 2003, there were no unmanned systems in use on the ground.

However, according to PW Singer – author of Wired for War: The Robotics Revolution and Conflict in the 21st Century – by 2008 there were 12,000 such systems, in a variety of forms and carrying out different tasks, as part of the US deployment in Iraq.
Technological advances mean that an increasing share of the battlefield workload is being carried out by machines, a trend that looks set to grow exponentially.

As John Pike, founder and director of GlobalSecurity.org, told ISN Security Watch, “Most of the technical advances of the past few decades are manifestations of Moore's Law - the doubling of constant cost computer power every 18 months.”
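To give a sense of the compounding Pike describes, the sketch below works through the arithmetic of his formulation – a doubling every 18 months – over a given number of years. The function name and figures are illustrative, not drawn from the article.

```python
def moores_law_factor(years, doubling_months=18):
    """Growth factor in constant-cost computing power after `years`,
    assuming one doubling every `doubling_months` months."""
    return 2 ** (years * 12 / doubling_months)

# At Pike's cited rate, a decade of doublings yields roughly a
# hundredfold increase in computing power for the same cost.
print(f"Growth over a decade: ~{moores_law_factor(10):.0f}x")
```

Run as written, this shows computing power multiplying by about 100 over ten years, which helps explain the rapid leap from zero ground robots in 2003 to thousands by 2008.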

Some of the devices currently in use include the Predator, a UAV (unmanned aerial vehicle) that can spy on and attack positions and personnel in a war zone, as well as supply images and real-time information to troops on the ground, while being controlled by a single 'pilot' from a US base in Nevada. These aircraft can hover for long periods and return to base for refueling and reloading, all while their pilots remain out of harm's way.

On terra firma, between 6,000 and 12,000 robots help locate and disarm improvised explosive devices (IEDs), or roadside bombs – one of the major threats to western troops in Afghanistan and Iraq. Others, equipped with cameras and sensors, are used by soldiers to carry out surveillance of caves, side streets and buildings.

Some of the devices resemble mini-tanks, with caterpillar tracks and mounted turrets, and have drawn on home/civilian technology such as robotic vacuum cleaners. More intimidating models have been developed, such as the Talon, complete with chemical, gas, temperature and radiation sensors. It can be mounted with a grenade launcher, machine gun, incendiary weapon or .50-calibre rifle, and versions with heavier firepower have been developed.

Unmanned vehicles, aircraft and robots come in many shapes and sizes, some as small as a fly. The biggest is a 700-ton robotic dump truck that can haul 240 tons of earth at a time and inspired the Long Haul character in the Transformers series.

The drone revolution

As befits its position as the world's largest defense spender, the US leads the way in research and development in this field. However, other countries are getting in on the act.

Professor Noel Sharkey of Sheffield University tells ISN Security Watch that Israel Aerospace Industries (IAI) is the second-biggest producer and exporter of unmanned vehicles, and supplies India and Turkey. All in all, about 50 countries have, are acquiring or are developing unmanned systems, including Russia and China, while South Korea has unmanned military capabilities deployed along the DMZ bordering North Korea. Even Hizbollah used what appeared to be relatively unsophisticated drones during the 2006 war with Israel.

These developments appear to constitute a revolution in how wars can be fought. Machine is not yet ready to replace man, but as PW Singer, Director of the 21st Century Defense Initiative at the Brookings Institution, outlines in his book, military officers think that new and emerging prototypes will soon make human fighter pilots obsolete.

For now, however, the known systems in use keep what is called a ‘man in the loop’: a person makes the decisions, such as the ‘pilot’ operating an unmanned drone over Afghanistan from his or her terminal in Nevada.

As Professor Sharkey explains, “These are not simply remote controlled systems – they have a certain amount of autonomous function and this is increasing continually.”

The logical next step, albeit one that could be some years away, is fully autonomous robots – self-directed machines fighting without human control and ‘taking decisions’ on the battlefield or its environs.

It all sounds like sci-fi made real: akin to Isaac Asimov's I, Robot, or worse still, Stanley Kubrick's 2001: A Space Odyssey, in which a computer named HAL 9000 took control of the spaceship.

Weighing risks

Unmanned vehicles and aircraft reduce risk to those who deploy them, and can reduce the number of lives lost in combat. Moreover, UAVs played a key role in killing al-Qaida's Iraq-based leader Abu Musab al Zarqawi in 2006, and have killed more than half of al-Qaida's top 20 leaders worldwide, as noted by PW Singer.

However, these developments throw up some difficult, perhaps unanswerable, questions. Technological advances could lower the threshold for going to war, as soldier and air force casualties are minimized and warfare takes on quasi-video-game attributes – conflict on a different scale and in a different way.

The ethical and moral issues around the new and still-to-come technology have to be fleshed out. Should autonomous unmanned systems become a reality, it is not clear how they will be able to distinguish between civilians and combatants, however advanced the technology gets.

Friendly fire deaths and unintended attacks on civilians by even the most professional and advanced militaries in the world are a reminder that such discrimination “is difficult enough for humans to calculate,” according to Professor Sharkey.

Beyond the currently feasible prospect of autonomous systems, more alarmist voices have revived the 'technological singularity' thesis – that machines could self-generate a level of artificial intelligence beyond that which their human designers intended. This is Kubrick's HAL 9000 nightmare. The idea has had on-off currency among scholars, technologists (including the founder of Sun Microsystems) and sci-fi writers, and has in turn been rubbished by skeptics.

In the meantime, however, new regulations will almost certainly be needed, and these require international discussion, which has yet to take place.

As Ronald Arkin of the Georgia Institute of Technology, author of Governing Lethal Behavior in Autonomous Robots, put it to ISN Security Watch, “The laws of war have changed over the years in response to new technology - for example blinding lasers are now explicitly banned.”

To kickstart this, Professor Sharkey and others have set up the International Committee for Robot Arms Control (ICRAC) to investigate how some of these armed systems could be limited, perhaps even prohibited, if necessary. The first gathering is scheduled for next summer.