Consumers could develop closer personal relationships with their digital assistants, if plans to make Google Assistant more personalised and proactive come to fruition.

Humans are already forming intimate bonds with technology. The kind of digital anthropomorphism that led Joaquin Phoenix's character to fall in love with his AI operating system in the 2013 film Her is already upon us.


In 2015, almost half of the 12,000 surveyed users of digital assistant Assistant.ai admitted they could fall in love with the software. Google Assistant could hold even more appeal for human hearts than love rivals such as Siri, believes product lead Yariv Adan.

"It's fun, it's witty, it's nice," says Adan, who heads the team behind the AI capabilities of Google Assistant. "I don't think it's sassy like Siri - which is fine, that's their concept - I think the personality is super-rich, super-powerful."

At the RE•WORK Deep Learning Summit, Adan gave a demo of an internal version of Google Assistant that included a number of unreleased features. Personalised intelligence is already deeply embedded in the one-year-old system, and looks set to grow more sophisticated as it ages.

"Personalisation goes all across the assistant," Adan explains. "Really knowing you and understanding you and understanding your preferences, from the fact that you are vegetarian, the fact that you like daily updates, to where you're currently traveling and how you like to interact and your family members and whether you allow us to look at your email, or you don't allow us to look at your email, a bunch of stuff." 

This deeper understanding of the user will be complemented by enhanced humanisation, such as the newly released male voice option. Adan says this will be followed by an option to give the assistant a nickname.

Finding your voice

Voice assistants have traditionally relied on direct queries as the trigger for a response. Google Assistant increasingly recognises indirect ways of eliciting information, allowing it to offer advice without the user explicitly commanding it.

Adan barks an order into his phone as an example.

"Translate to German: 'Hey I have a problem can you help me'."

The AI Assistant responds by repeating his words in German. 

"What about: 'I lost my baggage in the train'?"

Again it translates his words.

That "what about" would throw a less sophisticated system off track, but Google Assistant recognised the connection to the request that preceded it, and understood that the intention was still to translate.
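Google has not published how this works internally, but the follow-up handling in the translation demo can be sketched in outline: a query such as "What about ...?" carries no intent of its own, so a dialogue manager falls back on the intent of the previous turn. The Python below is a toy illustration under that assumption; the intent names and parsing rules are invented.

```python
# Toy sketch of follow-up handling: a hypothetical dialogue manager reuses
# the previous intent when a query like "What about ...?" carries no intent
# of its own. Intent names and parsing rules are illustrative only.

FOLLOW_UP_PREFIXES = ("what about", "how about", "and")

class DialogueContext:
    def __init__(self):
        self.last_intent = None  # e.g. "translate_to_german"

    def resolve(self, utterance: str) -> tuple[str, str]:
        text = utterance.strip().lower()
        if text.startswith("translate to german:"):
            self.last_intent = "translate_to_german"
            return self.last_intent, utterance.split(":", 1)[1].strip()
        # A bare follow-up inherits the intent of the preceding turn.
        for prefix in FOLLOW_UP_PREFIXES:
            if text.startswith(prefix) and self.last_intent:
                return self.last_intent, utterance[len(prefix):].strip(" :?'\"")
        return "unknown", utterance

ctx = DialogueContext()
print(ctx.resolve("Translate to German: 'Hey I have a problem can you help me'"))
print(ctx.resolve("What about: 'I lost my baggage in the train'?"))
```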

This contextual understanding can extend to overhearing a conversation between two people and interjecting with a suggestion.

"If I ask a friend of mine: 'Do you know what the weather's like this weekend in London?' it would be nice if the system could understand person-to-person talking and actually jump in and say: 'Hey, maybe I can help you out'?" says Adan. "A lot of our work is to support this kind of query."

Contextual information such as location, time of day, planned tasks and external events allows it to make situationally aware smart suggestions. These signals combine with its constantly growing knowledge of the user to create a system that knows how to help an individual in a specific situation, according to their personal needs and tastes.
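As a rough illustration of how such signals might combine (the field names, rules and thresholds below are invented, not Google's), a suggestion engine could cross-reference situational context with a stored profile before deciding whether to speak up:

```python
# Illustrative sketch only: combine situational signals with a stored user
# profile to decide whether a suggestion is worth surfacing.

from dataclasses import dataclass

@dataclass
class Context:
    location: str
    hour: int
    planned_tasks: list

@dataclass
class UserProfile:
    vegetarian: bool
    likes_daily_updates: bool

def suggest(ctx: Context, profile: UserProfile) -> list[str]:
    suggestions = []
    # Around midday with no lunch plans, tailor the cuisine to the user.
    if 11 <= ctx.hour <= 14 and "lunch" not in ctx.planned_tasks:
        cuisine = "vegetarian restaurants" if profile.vegetarian else "restaurants"
        suggestions.append(f"Nearby {cuisine} in {ctx.location}")
    # Offer the daily briefing only to users who have asked for one.
    if profile.likes_daily_updates and ctx.hour == 8:
        suggestions.append("Your daily update is ready")
    return suggestions

print(suggest(Context("London", 12, []), UserProfile(vegetarian=True, likes_daily_updates=True)))
```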

"How is my team doing?" Adan asks his Google Assistant.

"Barcelona is first in group A," it replies.

At some point in the past, Adan had told the system that he supports FC Barcelona. It can then access that fact whenever it’s relevant to a query, and thus become a more personal assistant as it picks up details about the user.
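In outline, that behaviour resembles a personal fact store: remembered preferences are substituted into later queries before they are answered. The sketch below assumes a simple key-value scheme for illustration; the real system is undoubtedly far more elaborate.

```python
# Toy personal-fact store: a remembered preference ("my team is FC Barcelona")
# is substituted into a later query before it is answered. The storage scheme
# and resolution logic are assumptions for illustration.

personal_facts = {}

def remember(user_id: str, key: str, value: str) -> None:
    personal_facts.setdefault(user_id, {})[key] = value

def resolve_query(user_id: str, query: str) -> str:
    facts = personal_facts.get(user_id, {})
    for key, value in facts.items():
        query = query.replace(f"my {key}", value)
    return query

remember("adan", "team", "FC Barcelona")
print(resolve_query("adan", "How is my team doing?"))
# -> "How is FC Barcelona doing?", which can then be answered with live standings.
```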

Google's huge database of queries gives it a head start on the competition. Extensive search logs provide a deep understanding of the conversational terms and vocal triggers that lie behind human desires, and of how to interpret their intent.

This vast set of examples is used to define what specific questions mean, so the assistant can match each query to one of its wide range of model answers.

Even if a user garbles their speech or mispronounces a word, the voice recognition can identify the error and correct it before responding to the intended meaning. AI assistants are already determining our intentions.
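One simplified way to picture that correction step (not Google's actual pipeline, which draws on acoustic and language models trained on those query logs) is fuzzy matching against a vocabulary of common query terms before the query reaches intent matching:

```python
# Sketch of correcting a garbled word against a vocabulary of common query
# terms before intent matching, using simple string similarity. Purely
# illustrative; the vocabulary and cutoff are invented.

from difflib import get_close_matches

QUERY_VOCABULARY = ["weather", "weekend", "translate", "german", "baggage", "train"]

def normalise(utterance: str) -> str:
    corrected = []
    for word in utterance.lower().split():
        match = get_close_matches(word, QUERY_VOCABULARY, n=1, cutoff=0.8)
        corrected.append(match[0] if match else word)
    return " ".join(corrected)

print(normalise("what's the wether like this weekned"))
# -> "what's the weather like this weekend"
```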

Making decisions for you

Google Assistant's growing understanding of its user will be augmented by more autonomy. Adan says that it will "definitely" become more proactive, by knowing the user well enough to anticipate what they need. This will allow it to either suggest an action to the user, such as buying a train ticket, or even make that purchase for them independently.
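One plausible way to gate that behaviour (purely hypothetical; the thresholds and permission flag below are invented) is a confidence cut-off: below it the assistant merely suggests, above it, and with the user's standing permission, it acts.

```python
# Hypothetical sketch of gating a proactive action on prediction confidence
# and an explicit user permission. All values here are invented.

SUGGEST_THRESHOLD = 0.6
ACT_THRESHOLD = 0.9

def handle_prediction(action: str, confidence: float, user_allows_autonomy: bool) -> str:
    if confidence >= ACT_THRESHOLD and user_allows_autonomy:
        return f"Done: {action} completed on your behalf."
    if confidence >= SUGGEST_THRESHOLD:
        return f"Suggestion: shall I {action}?"
    return ""  # Not confident enough to interrupt the user.

print(handle_prediction("buy your usual train ticket to work", 0.95, True))
print(handle_prediction("buy your usual train ticket to work", 0.7, True))
```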

Personalisation makes consumers more likely to let their digital assistant make decisions for them, as it humanises the software by showing that it understands them. It also makes users more comfortable talking with or to technology. This gives them the confidence to reveal more sensitive information, from embarrassing personal details to email passwords. 

Telling your digital assistant your deepest, darkest secrets comes with an inevitable risk. The microphone is listening and may even be recording.

There is also a danger that the software could misinterpret what it hears, such as mistaking a user's colleague for a spouse and letting something slip in front of their actual other half.

Privacy advocates and cyber security specialists have long raised concerns over how digital assistants collect personal data and the potential for it to be intercepted by hackers.

Adan adds that Google has a range of measures in place to reduce these risks. These include extensive internal testing, a safety-first approach to releasing features, restricted access to sensitive information, and quick and easy recovery from any bugs that do slip through the net.

"We know there are errors," Adan admits. "We assume that we will fail, so we need to make sure that these failures are not drastic and we need to make sure that they are fixable."

He acknowledges that users often ask their digital assistants personal questions, such as advice on their relationships, but he doesn’t yet expect them to replace interpersonal conversations.

"We speak a lot with each other and we chat a lot with each other and I think we need to bring the assistant to these places and not me with my assistant talking. That's a little bit bizarre in my opinion."
