Summary
15 November 2013
Countries are meeting in Geneva to decide whether to begin talks on banning 'killer robots', weapons which can decide by themselves when to kill people. Human rights groups say such weapons raise serious moral questions about how we conduct war.
Reporter: Imogen Foulkes
Report
Drones have already raised questions about 21st century warfare – but while they have no pilots, they are controlled by humans on the ground. Lethal autonomous weapons, or 'killer robots', are programmed in advance; on the battlefield it could be the robot, not the human, which decides who to kill.
The United States, Britain and Israel are all developing lethal autonomous weapons, although all three countries say they don’t plan to take humans out of the decision-making loop.
Supporters of the new technology say it could save lives, by reducing the number of soldiers on the battlefield, but human rights groups question the ethics of allowing machines to take decisions over life and death.
Now the 50 countries which have ratified the Convention on Conventional Weapons – the countries which have already approved a ban on blinding laser weapons – will consider whether to begin talks on banning killer robots.
Vocabulary
- drones
aircraft which are controlled by people on the ground
- warfare
the activity of fighting a war
- lethal
causing death
- autonomous
independent, able to make its own decisions
- out of the decision-making loop
not part of the process of making decisions
- ethics
set of beliefs or principles that tell people what is right and wrong
- ratified
made (an agreement become) official
- blinding
causing blindness (not able to see)