Who's storing your conversations?

It may feel like chatting with Alexa and Siri is fairly innocuous, but it can be disconcerting to know that your conversations are stored somewhere beyond your control. A research team from the department is investigating how to make virtual assistants more respectful through a combination of technical and social techniques. DPhil student William Seymour explains.

There is a large body of human-computer interaction literature describing the potential perils of treating a machine as if it were human. At the same time, people have for years anthropomorphised everything from their kitchen appliances to their Roombas. While this behaviour is mostly harmless, new technology threatens to change that.

Virtual assistants such as Siri and Alexa offer us a means of automating simple and mundane tasks. But more so than the devices that came before them, they invite us to personify them. They have names, speak in human voices, and can, to a limited extent, converse with us and tell jokes. So what's the problem? This behaviour primes us to expect our technology to behave in a human-like manner.

Combined with distinctively human interaction methods, such as speech, this leads us to expect our technology to behave in ways that would be obvious to a human, but that are simply beyond the current capabilities of machines.

In addition to this, the relationships that we have with these devices are now built on a very different footing than they were in the past. While Cortana might tell you that she likes you, the legal agreement between you and the company that created your virtual assistant's software contains no clause obliging them to like or respect you (and when was the last time you read any terms and conditions?). Normally this is not something we notice, but when the divide between the friendly persona and the legal reality collapses, it can lead to creepy behaviour and a feeling akin to betrayal. In a 2017 murder trial in Arkansas, recordings from an Amazon Echo owned by the accused were turned over to the prosecution as evidence. This is not something one expects from a gadget primarily used to play music. We expect to enjoy privacy in our own homes, but this is ultimately incompatible with the reality that the data collected by your Echo belongs to Amazon, and not to you.

Is the solution to start reading privacy policies? An analysis of 184,000 privacy policies reveals that the reading comprehension needed to understand the average policy is roughly the same as that required to read an article in Nature. Simplifying policies might help, but only if they are actually read. To find a longer-term solution, Professor Max Van Kleek, Reuben Binns and I have been asking what a respectful virtual assistant might look like, as part of the EPSRC-funded Respectful Things In Private Spaces (ReTIPS) project (http://goo.gl/VkDyoW).

Perhaps a respectful assistant stores and processes the data it collects locally, so that it never leaves your home, or offers different voice recognition options so that users can trade off the accuracy of cloud-based models against the privacy of local ones. We also think that the most respectful devices will incorporate tools that help users make their other devices more respectful when they cannot do so themselves.

Along the same lines, we are also putting together an 'honest' version of the Amazon Echo. When using the modified Alexa, users are told at the moment they make a request how that data will be used: where it goes, whether it is used to profile them, and which third parties will have access to it. By doing this we hope to examine how people use this information to make decisions about the devices in their homes, and in particular how they change their behaviour after interacting with our speculative lab prototypes.
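To make this concrete, here is a minimal sketch, in Python, of the kind of per-request disclosure such an 'honest' assistant might produce. It is purely illustrative: the DataDisclosure class, its fields, and the example values are assumptions made for this article, not the actual prototype code.

from dataclasses import dataclass, field

@dataclass
class DataDisclosure:
    """Hypothetical record describing how a single voice request is used."""
    request: str                  # transcript of what the user asked
    destination: str              # where the recording is sent
    used_for_profiling: bool      # whether it feeds a profile of the user
    third_parties: list = field(default_factory=list)  # who else can access it

    def spoken_summary(self) -> str:
        """Render the disclosure as a sentence the assistant could speak aloud."""
        profiling = "will" if self.used_for_profiling else "will not"
        others = ", ".join(self.third_parties) if self.third_parties else "no one else"
        return (f"Your request was sent to {self.destination}, "
                f"{profiling} be used to profile you, "
                f"and can be accessed by {others}.")

# Example: what the assistant might append to its answer to a weather query.
disclosure = DataDisclosure(
    request="What's the weather tomorrow?",
    destination="cloud servers in the US",
    used_for_profiling=True,
    third_parties=["the weather skill's developer"],
)
print(disclosure.spoken_summary())

The point of the sketch is the shape of the interaction: the disclosure is attached to each individual request, rather than buried in a policy document the user agreed to months earlier.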

So while virtual assistants themselves are not inherently problematic, they do bring to the fore many of the problems that have built up around our use of centralised, data-driven services, and especially around home Internet of Things devices. We believe that using the notion of respect when thinking about smart devices, while not solving all of the problems identified above, will help make the smart home of the future a better place to live.

This article first appeared in the summer 2018 issue of Inspired Research.