
CC BY-NC Open access

Lethal autonomous weapons

BMJ 2019; 364 doi: (Published 25 March 2019) Cite this as: BMJ 2019;364:l1171
Emilia Javorsky, director,1 Max Tegmark, professor,2 Ira Helfand, co-president3

  1. Scientists Against Inhumane Weapons, Watertown, MA, USA
  2. Department of Physics & Center for Brains, Minds and Machines, Massachusetts Institute of Technology, Cambridge, MA, USA
  3. International Physicians for the Prevention of Nuclear War, Malden, MA, USA

Correspondence to: E Javorsky ejavorsky{at}

It’s not too late to stop this new and potentially catastrophic force

Advances in artificial intelligence are creating the potential to develop fully autonomous lethal weapons.1 These weapons would remove all human control over the use of deadly force. The medical community has a long history of advocacy against the development of lethal weapons, and the World and American Medical Associations both advocate total bans on nuclear, chemical, and biological weapons.2 But while some nations and non-governmental organisations have called for a legally binding ban on these new weapons,3 4 the medical community has been conspicuously absent from this discourse.

Third revolution in warfare

Several countries are conducting research to develop lethal autonomous weapons. Many commentators have argued that the development of lethal autonomous weapon systems for military use would represent a third revolution in warfare, after the invention of gunpowder and nuclear weapons.5 Although semi-autonomous weapons, such as unmanned drones, are in widespread use, they require human oversight, control, and decision making …
