Google's work with the Pentagon to analyse drone footage could prove to be a watershed moment for ethics at the tech giant, as the uproar led staff to call for ethics training to ensure they were developing virtuous technology.
The revelation that Google had been providing open-source object recognition software that helped military analysts detect objects in images triggered deep internal discussions. User experience engineer Kenric McDowell told Techworld that his Google colleagues wanted guidance on how they could make ethical design decisions.
"There was a call for internal classes in ethics, and that type of education is necessary when you're talking about technology having a human impact," said the softly-spoken McDowell at the and& Festival in Leuven, Belgium.
"For some reason we were able to build digital technologies from the last 40, 50, 60 years with this assumption that they had no ethical impact, which is really bizarre in hindsight."
The collaboration with the US Department of Defense added to concerns that homogeneous teams of technologists can develop an insular mentality that leaks into their products, which then fail to reflect the ethical codes of their diverse range of users.
McDowell and his team at Google intend to change this.
He heads the Artists and Machine Intelligence (AMI) programme, which brings together artists, philosophers and engineers to find new ways of developing intelligent systems that reflect the breadth of human experience by incorporating dissenting views and alternative ideas.
His objective is to help Google understand the needs and creeds of its billions of users, which helps the company design better and more ethical technology.
"There's a big problem in Silicon Valley, which is that STEM education makes it possible to miss out on a lot of knowledge about human culture and humanities," said McDowell.
"And when you're developing a civilisation-level technology like AI, that much like the internet will have unforeseen consequences, it's really important to get a broad range of expertise and visibility into aspects of culture that are excluded from big technology corporations right now.
"Just by bringing in these different points of view you suddenly shift a lot of assumptions about what we're actually doing at Google."
The Google AMI programme
The work at AMI ranges from exhibitions of art made by neural networks to road trips from New York to New Orleans in a Cadillac sedan connected to an AI poetry system, which generated 5,000 words of verse by interpreting images captured by a camera mounted to the trunk.
Other projects are more prosaic. McDowell recently held a workshop on the ontology of users, which generated a new design paradigm for AI interaction.
Artists and thinkers including sociologist Benjamin H. Bratton, philosopher Tim Morton and professor of computer art Sheldon Brown explored how landscapes, social hierarchies and urban environments affected people's views and needs and their relationship with technology.
"Just by shifting our point of view ontologically we were able to escape from certain assumptions that always sneak into the design process," said McDowell.
"That type of thinking will never happen unless you bring in experts from other fields."
The engineer's long black hair and array of arm tattoos may not fit the stereotypical image of a technologist, but they offer a glimpse into his other job as an electronic music producer. McDowell believes the arts can teach us much about the relationship between humans and technology.
"Artists enable the cognitive modes that are latent in technology to be felt, and feeling technologies gives you a totally different sense of what they are," he said. "Sensing them through your eyes and ears and your body is very different than using them the way that they're normally used.
"That kind of expanded engagement sensorially with technology opens up a whole range of perceptions of what's happening that's generally hidden."
Google's famous mantra of "Don't Be Evil" rings more hollow with every controversy at the corporate giant, but protests had rarely arisen internally before its partnership with the Pentagon.
The collaboration led more than 3,100 Google employees to sign a letter addressed to CEO Sundar Pichai that called for the project to be cancelled as the company "should not be in the business of war".
McDowell believes the company has both the will and the resources to place ethics over profits and peace over war.
"Everybody of course always wants more money-printing machines, but a big benefit to any massive entity like Google is good will and benefit to humanity," he said.
"If we can actually deliver that, that's worth a lot, but you can't deliver that with a short-term mentality."