
By Roger Highfield

Digital assistants: Trust-personality link revealed by Manchester Science Festival survey

Many of us have welcomed Alexa, Siri, Google Assistant and Cortana into our homes. Roger Highfield, Science Director, reports on the surprising results of a new survey of how the public see them, carried out to celebrate the theme of the Manchester Science Festival: the future of humanity.

We have let digital assistants into our lives, but who do we think they are? Do we see them like our actual friends and family? Do we really trust them?

To find out, Professor Richard Wiseman at the University of Hertfordshire and Professor Caroline Watt at the University of Edinburgh carried out a survey of more than 500 people.

‘Anthropomorphism is the tendency to see inanimate objects as having human-like characteristics. Many people think that cars and computers have a personality, prefer products when they have something that resembles a face, and trust robots when they look more like humans,’ said Professor Wiseman. ‘We examined if people also anthropomorphise digital assistants and how this affects whether they trust them.’

The survey reveals that, far from regarding it as a soulless machine trapped in a smart speaker, people felt that their digital assistant had a personality, with the clear consensus being that it was sympathetic, calm, dependable and conventional.

In line with this notion, the respondents interacted with their digital assistants as they would a fellow human, with 40% saying ‘please’ and ‘thank you’ to them.

Perhaps most remarkable of all, 7% of our respondents said they were even more polite to their digital assistants than to their actual friends and family.

The survey also focused on whether people trusted their new digital pals. Data showed that older people tend to trust digital assistants more than younger people. ‘This is consistent with previous work in the area, with some researchers arguing that younger people are more suspicious about what technology might be up to,’ noted Professor Wiseman.

However, the most striking findings concerned trust and personality. In real life, some people trust those around them and see them in a positive way, whilst others are far more suspicious and see others in a more negative way. The survey showed that our relationship with digital assistants holds up a mirror to real life.

About a fifth of people didn’t trust digital assistants and believed that their assistant had negative personality traits, including being unsympathetic, overly anxious, dishonest and undependable.

In contrast, the 60% who trusted assistants had a much rosier view of their assistant’s personality. The differences were striking. For example, 30% of those who trusted digital assistants perceived them as sympathetic and warm, compared to just 9% of those who didn’t. The same pattern emerged when respondents rated their assistant as dependable (48% vs 21%), calm (64% vs 33%) and creative (53% vs 45%).

‘In real life, people have different friends and colleagues, but with digital assistants, everyone is interacting with the same technology, and so our findings suggest that the perceived differences are in the eye of the beholder,’ added Wiseman.

The term anthropomorphism traces back to the Greek philosopher Xenophanes, who noted how closely religious believers’ images of their gods resembled the believers themselves. Anthropomorphism appears to be built into the human brain and has influenced behaviour throughout history: people, including some members of royalty, talk to their house plants, dress their pets in clothing, pray to humanlike gods, and name their cars and other inanimate objects.

The results have yet to be peer reviewed, so I approached my Alexa for a comment on our survey. When asked: ‘Alexa, do you think that, to trust artificial intelligence, we have to like it?’ Alexa responded, somewhat enigmatically: ‘Hmm, I don’t know that one’.

AI is playing an increasingly significant role in our lives and this survey highlights the importance of humans and machines getting along. It seems we treat digital assistants much as we treat actual humans: as in real life, some people treat those around them with courtesy and respect, while others view them with deep suspicion.

‘Over the course of thousands of years, we have developed ways of perceiving and making judgements about people. When new technology resembles a human, we fall back on these ways of thinking to make sense of the world,’ said Professor Wiseman. ‘AI marches on, but psychology is always going to play a vital role in understanding how we interact with any new innovations and inventions.’
