AI drone “terminator” project meets internal resistance
By Ethan Huff
More than a dozen Google employees have quit the tech monolith in recent weeks in response to the company’s forging ahead with a Terminator-esque artificial intelligence (AI) program that could end up weaponizing machine-learning technologies for use in military war games.
Known as “Project Maven,” the Department of Defense (DoD) collaboration seeks to expand the functionality of AI technologies by “teaching” AI drones how to better identify “people and objects of interest.” According to a Google spokesperson, the program won’t be used to harm anyone, but will instead be “scoped for non-offensive purposes.”
But many Google employees aren’t convinced: almost 4,000 of them have signed an internal company petition calling on Google to abort the project and get back to not being evil, a principle the world’s largest search engine claims as part of its mission statement.
According to reports, those in the know recognize that, despite Google’s claims about its supposedly innocuous nature, the technology “is being built for the military, and once it’s delivered, it could easily be used to assist in [lethal] attacks.” The petition letter also states: “We believe that Google should not be in the business of war.”
“Amid growing fears of biased and weaponized AI, Google is already struggling to keep the public’s trust,” the letter adds. “The argument that other firms, like Microsoft and Amazon, are also participating doesn’t make this any less risky for Google. Google’s unique history, its motto Don’t Be Evil, and its direct reach into the lives of billions of users set it apart.”