Google and the Department of Defense are exploring the use of artificial intelligence to identify objects in drone footage. The tech giant has been working with the Pentagon’s Project Maven, an initiative focused on big data and machine learning. According to sources, when the pilot project came up for discussion inside Google, some employees were angry that the company was supplying the military with surveillance technology for drone operations. Google’s Eric Schmidt acknowledged the tech community’s concern that the military-industrial complex will use Google’s research to kill innocent people.
Gizmodo reports that, “Project Maven, a fast-moving Pentagon project also known as the Algorithmic Warfare Cross-Functional Team (AWCFT), was established in April 2017,” with the mission to “accelerate DoD’s integration of big data and machine learning.”
The Wall Street Journal reported that, in 2017, the Defense Department spent $7.4 billion on artificial intelligence-related areas. Center for a New American Security adjunct fellow Greg Allen reports that the first assignment for Project Maven is “to help the Pentagon efficiently process the deluge of video footage collected daily by its aerial drones — an amount of footage so vast that human analysts can’t keep up.”
“Before Maven, nobody in the department had a clue how to properly buy, field, and implement AI,” wrote Allen. According to the Pentagon, the initial goal was “to provide the military with advanced computer vision, enabling the automated detection and identification of objects in as many as 38 categories captured by a drone’s full-motion camera.”
Gizmodo notes that, “although Google’s involvement stirred up concern among employees, it’s possible that Google’s own product offerings limit its access to sensitive government data.” According to Google, the company is “providing the Defense Department with TensorFlow APIs, which are used in machine learning applications, to help military analysts detect objects in images.” The company is also developing “policies and safeguards,” in acknowledgement of “the controversial nature of using machine learning for military purposes.”
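Google has not published details of how its TensorFlow APIs are wired into Maven, but object-detection models built with TensorFlow conventionally return parallel lists of bounding boxes, class IDs, and confidence scores per image. A minimal, hypothetical sketch of the kind of post-processing an analyst-assist tool might apply (filtering raw detections against a score threshold and mapping class IDs to human-readable labels) could look like the following; the label table, threshold, and detector output here are invented for illustration and are not drawn from Project Maven:

```python
# Hypothetical post-processing of detector output. TensorFlow object-detection
# models conventionally emit parallel lists: boxes, class IDs, and scores.
# The labels, threshold, and sample data below are illustrative only.

LABELS = {1: "vehicle", 2: "building", 3: "person"}  # invented label table

def filter_detections(boxes, class_ids, scores, threshold=0.5):
    """Keep detections whose confidence meets the threshold; attach labels."""
    results = []
    for box, cid, score in zip(boxes, class_ids, scores):
        if score >= threshold:
            results.append({
                "label": LABELS.get(cid, "unknown"),
                "score": score,
                "box": box,  # (ymin, xmin, ymax, xmax), normalized 0..1
            })
    # Present the highest-confidence detections first
    return sorted(results, key=lambda d: d["score"], reverse=True)

# Fabricated detector output for a single video frame
boxes = [(0.1, 0.2, 0.3, 0.4), (0.5, 0.5, 0.9, 0.9), (0.0, 0.0, 0.2, 0.2)]
class_ids = [1, 3, 2]
scores = [0.92, 0.41, 0.77]

for det in filter_detections(boxes, class_ids, scores):
    print(det["label"], det["score"])
# → vehicle 0.92
# → building 0.77   (the 0.41 "person" detection falls below the threshold)
```

The thresholding step reflects the stated goal of triaging a “deluge of video footage”: low-confidence detections are dropped so human analysts review only the machine’s strongest candidates.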