Google And Amazon Choosing Profit Over Privacy? Take a Hint from Project Nimbus

The primary concern for the employees is the 'potential breach of data privacy' and misuse of technology against Palestinians

When the Cambridge Analytica scandal broke in 2018, few assumed things would only get worse. Now, the technology seems ripe enough to reveal its obnoxious side. Knowing some facts about a person is one thing; being able to predict what they are going to do is quite another – a creepy aspect of technology that many people do not know exists. The recent protests by Google employees, who allege that the company is offering advanced machine learning and AI tools to the Israeli Government through Google Cloud as part of Project Nimbus – purportedly designed to exploit online data to further the country's military interests – are a stark example of our collective ignorance. The $1.2 billion project, announced in April 2021 by the Israeli Government, involves a cloud computing system built by Google and Amazon, intended to provide the Government, the public, and the army with a comprehensive cloud solution. The technology under the contract covers facial recognition, object tracking, and sentiment analysis meant to gauge the emotional undertones of pictures, speech, and writing. The primary concern for the employees is the 'potential breach of data privacy' and the misuse of the technology by the Israeli Government against Palestinians. The protests spread across four cities – San Francisco, New York, Seattle, and Durham, North Carolina – with workers chanting, "No Tech For Apartheid" and "Another Google Worker Against Apartheid".

The protests took off a week after Koren, a Google employee, resigned over a tiff with the company for publicly criticizing the contract. She was offered the impossible choice of either relocating to Brazil or leaving the company. Koren, who is Jewish, was against Google using its influence to suppress Palestinian, Jewish, Arab, and Muslim voices. More importantly, Google employees worry that their work could indirectly support the Israeli occupation of Palestine, as they believe Google's tie-up might add to the existing tech arsenal and result in the militarization of data. Surprisingly, most of the employees involved in creating the technology behind Nimbus have no idea how the project will be operationalized. "The former head of Security for Google Enterprise — who now heads Oracle's Israel branch — has publicly argued that one of the goals of Nimbus is preventing the German government from requesting data relating to the Israel Defence Forces for the International Criminal Court," said Jack Poulson, who resigned in protest from his job as a research scientist at Google in 2018, in a message to The Intercept. The Intercept's review reveals that Google's training material contains references to government personnel and the Ministry of Defence undergoing training on the online portal Coursera. Google denies the allegations and steers clear of the military angle. Its spokesperson Atle Erlingson said in a statement to Forbes, "As we have stated many times, the contract is for workloads running on our commercial platform by Israeli government ministries such as finance, healthcare, transportation, and education. Our work is not directed at highly sensitive or classified military workloads relevant to weapons or intelligence services."

The case for the 'benefit of the doubt' may not apply here, as this is not the first time Google's intentions have appeared suspicious. In 2020, Google Cloud and its image recognition AI technologies were used by US Customs and Border Protection to set up a surveillance system. Earlier, in 2018, around 3,000 Google employees signed a petition protesting the company's involvement in a US Department of Defense artificial intelligence (AI) project intended to improve battlefield drone strikes using Google's aerial imagery data. Some experts agree the issue is an edgy one: tech companies should not venture into areas where they have no oversight of how their technology is used, as it creates a risk of abuse, says Liz O'Sullivan, CEO of the AI auditing startup Parity and a member of the U.S. National Artificial Intelligence Advisory Committee.

Analytics Insight
www.analyticsinsight.net