Blog Archives

Was the meeting of UN’s Governmental Experts on Lethal Autonomous Weapon Systems cancelled to delay action affecting UK and US investment?

In 2015 Max Tegmark (professor, MIT) reported, via the Future of Life Institute, that Artificial Intelligence and Robotics researchers had warned in an open letter:

“Autonomous weapons select and engage targets without human intervention. They might include, for example, armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions. Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is—practically if not legally—feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.”

Today (Aug. 21), Quartz reports that in a second open letter a group of specialists from 26 nations, including Tesla CEO Elon Musk and DeepMind co-founder Mustafa Suleyman, as well as other leaders in robotics and artificial-intelligence companies, called for the United Nations to ban the development and use of autonomous weapons.

In recent years Musk has repeatedly warned against the dangers of AI, donating millions to fund research aimed at ensuring artificial intelligence is used for good, not evil. He joined other tech luminaries in establishing OpenAI, a nonprofit with the same goal in mind, and part of his donation went to support the Future of Life Institute.

“As companies building the technologies in Artificial Intelligence and Robotics that may be repurposed to develop autonomous weapons, we feel especially responsible in raising this alarm. We warmly welcome the decision of the UN’s Conference of the Convention on Certain Conventional Weapons (CCW) to establish a Group of Governmental Experts (GGE) on Lethal Autonomous Weapon Systems. Many of our researchers and engineers are eager to offer technical advice to your deliberations . . .

“Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.”

The first meeting for the UN’s recently established Group of Governmental Experts on Lethal Autonomous Weapon Systems is now planned for November. It was to be held today, but was cancelled, the letter notes, “due to a small number of states failing to pay their financial contributions to the UN.”

Critics have argued for years that UN action on autonomous weapons is taking too long.

The UK and the US have increased investment in robotic and autonomous systems by committing to a joint programme, announced by UK Defence Minister Philip Dunne and US Under Secretary of Defense Frank Kendall.

Observers say the UK and US are seeking to protect their heavy investment in these technologies – some directly harmful and others servicing military operations – by ‘watering down’ an agreement so that it only includes emerging technology, meaning that any weapons put into practice while discussions continue are beyond the reach of a ban.
