Documents Reveal Advanced AI Tools Google Is Selling to Israel
Training materials reviewed by The Intercept confirm that Google is offering advanced artificial intelligence and machine-learning capabilities to the Israeli government through its controversial "Project Nimbus" contract. "The project is intended to provide the government, the defense establishment and others with an all-encompassing cloud solution," the ministry said in its announcement. Though some of the documents bear a hybridized symbol of the Google logo and Israeli flag, for the most part they are not unique to Nimbus. Still, the documents obtained by The Intercept detail for the first time the Google Cloud features provided through the Nimbus contract. Google does not appear to share Microsoft's concerns.

[Image caption: An AI can comb through collected surveillance feeds in a way a human cannot, both to find specific people and to identify, with some error, people who look like someone. Credit: Google]

The employee, who, like other Google workers who spoke to The Intercept, requested anonymity to avoid workplace reprisals, added that they were further alarmed by potential surveillance or other militarized applications of AutoML, another Google AI tool offered through Nimbus. The training process yields what's known as a "model": a body of computerized education that can be applied to automatically recognize certain objects and traits in future data. Training a model typically demands vast quantities of data and computing power, but this is not so much of a problem for a world-spanning company like Google, with an unfathomable volume of both money and computing hardware at the ready. AutoML, on the other hand, streamlines the process of training a custom-tailored model, using a customer's own data for a customer's own designs. Google has placed some limits on Vision, for instance restricting it to face detection (whether it sees a face at all) rather than recognition that would identify a person.
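The train-then-apply workflow described above can be sketched in miniature. The following toy example is not AutoML or any Google API; it is only an illustration, under invented data, of the general pattern: training on labeled examples yields a "model" (here, one average point per label) that can then classify new data it has never seen.

```python
# Toy illustration of "training yields a model": a nearest-centroid
# classifier learns the average feature vector for each label, then
# labels new data by whichever centroid is closest. Purely a sketch
# of the train/predict pattern, not any real Google product.

def train(examples):
    """examples: list of (features, label) pairs. Returns a 'model'
    mapping each label to the centroid (average) of its vectors."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [x / counts[label] for x in acc]
            for label, acc in sums.items()}

def predict(model, features):
    """Apply the trained model: return the label whose centroid is
    nearest (squared Euclidean distance) to the new vector."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, features))
    return min(model, key=lambda label: dist(model[label]))

# Hypothetical "customer data": 2-D feature vectors with labels.
training_data = [
    ([1.0, 1.0], "cat"), ([1.2, 0.8], "cat"),
    ([5.0, 5.0], "dog"), ([4.8, 5.2], "dog"),
]
model = train(training_data)
print(predict(model, [1.1, 0.9]))  # → cat
print(predict(model, [5.1, 4.9]))  # → dog
```

The point of the sketch is the division of labor the article describes: the expensive step is producing the model; applying it to fresh data afterward is cheap, which is what makes "edge" deployment of a downloaded model practical.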
"Google’s machine learning capabilities along with the Israeli state’s surveillance infrastructure pose a real threat to the human rights of Palestinians," said Damini Satija, who leads Amnesty International’s Algorithmic Accountability Lab. "The option to use the vast volumes of surveillance data already held by the Israeli government to train the systems only exacerbates these risks."

Custom models generated through AutoML, one presentation noted, can be downloaded for offline "edge" use, unplugged from the cloud and deployed in the field.

Another Google representative then jumped in: "It is possible, assuming that you have the right data, to use the Google infrastructure to train a model to identify how likely it is that a certain person is lying, given the sound of their own voice." "The answer should have been ‘no,’ because that does not exist," the worker said.

[Photo caption: Google CEO Sundar Pichai speaks at the Google I/O conference in Mountain View, Calif.]

Google pledges that it will not use artificial intelligence in applications related to weapons or surveillance, part of a set of principles designed to govern how it uses AI. Israel, though, has set up its relationship with Google to shield it from both the company’s principles and any outside scrutiny.
This news aggregator is non-commercial and provided as a public service by the Magnusson Institute, a Nevada 501(c)(3) non-profit. All articles and images are copyright of their respective owners.