Countries spending billions on ‘third revolution in warfare’ and
UN debates regulation of AI-powered weapons.

The US is developing the X-47B, a tailless unmanned aircraft prototype that can take off and land in extreme weather conditions and refuel in mid-air. The country has also completed testing of Sea Hunter, an autonomous anti-submarine vessel that can stay at sea for months without a single person on board and is reportedly capable of sinking other submarines and ships. Crusher, a 6,000kg autonomous tank, can navigate extremely difficult terrain and is advertised as being able to “tackle almost any mission imaginable”.

The UK is developing its own unmanned vehicles, which could be weaponised in the future. Taranis, an unmanned aerial combat vehicle named after the Celtic god of thunder, can avoid radar detection and fly in autonomous mode.

Russia, meanwhile, is amassing an arsenal of unmanned vehicles, both in the air and on the ground; commentators say the country sees this as a way to compensate for its conventional military inferiority compared with the US. “Whoever leads in AI will rule the world,” said Vladimir Putin, the recently re-elected Russian president, last year. “Artificial intelligence is the future, not only for Russia but for all humankind.”

Noel Sharkey, professor of artificial intelligence and robotics at the University of Sheffield, who first wrote about the reality of robot war in 2007, warns: “This could automatically invoke a battle that no human could understand or untangle. It is not even possible for us to know how the systems would interact in conflict. It could all be over in minutes with mass devastation and loss of life.” (The Guardian, 4/9/2018)

Will the UN, governments and manufacturers be able to prevent a situation like this? Would it be possible to regulate unmanned vehicles and program them with a predetermined code of ethics?

See HERE our bioethics approach to a code of ethics for autonomous cars.
