US Military Is Building Advanced Autonomous Drones
For the Pentagon, drones are cheaper to buy and to operate than regular fighter jets. An armed MQ-9 Reaper drone runs about $14 million, compared to $180 million or more for an F-35 Joint Strike Fighter. But unlike barrel-rolling a jet, the business of actually operating an unmanned aerial vehicle, or UAV, for the military is lonely, thankless, and incredibly difficult.
It's no wonder the Pentagon doesn't have enough drone pilots to meet its needs, a problem certain to persist as the military increases its reliance on unmanned systems, especially in areas where it has no interest in putting boots on the ground, like Pakistan or Iraq. The solution that the military is exploring: increasing the level of autonomy in UAVs to allow one pilot to manage several drones at once.
The Defense Advanced Research Projects Agency, or DARPA, put out a call for ideas as part of the "Collaborative Operations in Denied Environment," or CODE, project. Today, the majority of the drones the military is using in the fight against ISIL require two pilots. The agency is looking to build packs of flying machines that communicate more with one another than with their operator, which, in turn, would allow a single operator to preside over a unit of six or more drones. Together, the flying robot pack would "collaborate to find, track, identify and engage targets," according to a press release.
It's the "engage" portion of that release that rings of Skynet, the robotic tyrant system made famous by the "Terminator" movie franchise. But the drones that DARPA is envisioning would not be firing on human targets without approval from another human. The request also states that the targeting would be under "under established rules of engagement." What are those rules when it comes to robots? In deciding what drones should and should not be allowed to do, the Defense Department relies on a 2012 directive that states that autonomous weapons "shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force."
Even so, the legality of US drone strikes, particularly those in which civilians die as a result of the strike, remains a matter of some dispute. Ben Emmerson, the United Nations' special rapporteur on human rights and counter-terrorism, authored a report in 2013 that found that 33 drone strikes may have violated international humanitarian law.
A separate U.N. report by Christof Heyns, the U.N. special rapporteur on extrajudicial, summary or arbitrary executions, noted that improvements to military drones would inevitably trickle down to civilian systems. The report questioned whether any government could hold true to a promise never to allow a robot to pull the trigger.
Military technology, likewise, transfers easily into the civilian sphere. And even proponents of greater military investment in unmanned systems have cautioned that increasing the autonomy of armed flying robots carries some big risks.
Improving the autonomy of armed drones reduces their dependence on constant uplinks, and with it the likelihood of communications hacking. But the only thing scarier than a heavily armed robot that can do (some) thinking for itself is a flying, missile-tipped droid that's been hijacked by the enemy.
http://www.defenseone.com/technology/2015/01/us-military-building-gangs