Machines With The Power To Kill
The next generation of autonomous weapons, dubbed “killer robots”, could cause mass atrocities or even start a war, a former Google Project Maven engineer has warned.
Laura Nolan left Google in 2018 in protest at being assigned to work on Project Maven, a programme to improve US military drone technology. She has since called for a ban on all AI killing machines not operated by humans, arguing that such weapons should be prohibited by a treaty similar to those banning chemical weapons.
Google will not renew its contract to do artificial intelligence work for the US Pentagon, company sources say, a decision that follows strong opposition within the technology giant's workforce. A number of Google employees resigned over the project, including Nolan, a software engineer, who quit when she discovered the work she was doing was for the US military.
Project Maven involved using machine learning to distinguish people and objects in drone videos. Employees feared it was the first step towards using artificial intelligence for lethal purposes.
According to company sources, there will be no follow-up after the current contract expires next March, but a journalist for the technology news website Gizmodo says that Google has not cancelled Project Maven and does not appear to have ruled out future work with the military.
“The likelihood of a disaster is in proportion to how many of these machines will be in a particular area at once,” Nolan said. “What you are looking at are possible atrocities and unlawful killings, even under the laws of warfare, especially if hundreds or thousands of these machines are deployed... There could be large-scale accidents, because these things will start to behave in unexpected ways. That is why any advanced weapons systems should be subject to meaningful human control; otherwise they have to be banned, because they are far too unpredictable and dangerous.”
Unlike drones, which are controlled by military teams often thousands of miles from where the flying weapon is deployed, Nolan said killer robots have the potential to do “calamitous things that they were not originally programmed for”.
The contract is reported to be worth less than $10m (£7.5m) to Google but could potentially lead to more extensive cooperation with the Pentagon.
In April up to 4,000 Google employees signed an open letter saying that, by its involvement in Project Maven, the internet giant was putting users' trust at risk and ignoring its "moral and ethical responsibility".
Gizmodo: Guardian: BBC: Express: