Being human-centric amidst the technology hype

Antonia Mann

Artificial intelligence (AI), machine learning, wearables, virtual reality (VR) and other emerging technologies hold exciting possibilities for insight research. However, it’s useful to take a human-centric (rather than technology-centric) look at how people are adopting new technologies into their lives and use this to make our research more meaningful and authentic. In particular, the adoption of AI voice assistants into homes and lives could offer researchers new avenues for understanding people more meaningfully.

Omnipresent technology

Technology is becoming a presence without a central focus. It’s becoming an entity we interact with by voice – unconstrained by screen, device, time or place, with less effort or deliberation and more instinct.

Interactive, voice-activated AI personal assistants are becoming the most mainstream manifestation of this shift. Unlike chatbots, which live on-screen and are tied to a narrow task, AI voice assistants like Amazon's Alexa, embodied by the Echo or Echo Dot, live in the home and can help out in a variety of roles. With an open application programming interface (API) that allows developers to build new 'skills' for Alexa, her capabilities are always growing in both expected and unexpected ways.
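To make that concrete, here is a minimal sketch in Python of what the backend of a custom skill can look like, written as an AWS Lambda handler. The intent name and speech text are invented for illustration; the JSON envelope follows the published Alexa Skills Kit request/response format.

    # Minimal sketch of a custom Alexa skill backend (AWS Lambda handler).
    # The intent name and speech text are illustrative, not a real skill.

    def lambda_handler(event, context):
        request_type = event["request"]["type"]

        if request_type == "LaunchRequest":
            speech = "Hello, what would you like to know?"
        elif request_type == "IntentRequest":
            intent = event["request"]["intent"]["name"]
            if intent == "GetTipIntent":  # hypothetical custom intent
                speech = "Here is today's tip: take a short walk."
            else:
                speech = "Sorry, I can't help with that yet."
        else:
            speech = "Goodbye."

        # Alexa Skills Kit v1.0 response envelope.
        return {
            "version": "1.0",
            "response": {
                "outputSpeech": {"type": "PlainText", "text": speech},
                "shouldEndSession": True,
            },
        }

    if __name__ == "__main__":
        # Local smoke test with a fake launch request.
        fake_event = {"request": {"type": "LaunchRequest"}}
        print(lambda_handler(fake_event, None))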

The race between the major technology companies to improve their AI assistants and get them into homes is about much more than the device. The voice of the software that ultimately wins (currently led by Amazon's Alexa) will be the voice (and software) that people will expect all of their devices, appliances and programmes to interact seamlessly with.

AI: part of the family?

As AI becomes more human-like and practical, we are becoming more comfortable interacting with it. Early adopters of the Amazon Echo frequently report that the device quickly becomes more like a member of the family. One user is quoted in Nellie Bowles’ article in The Guardian last April as saying, ‘Even when I’ve tried to call her “it”, it feels wrong. She has a name. She’s Alexa.’ Close to 90 per cent of people in the US who have purchased an Amazon Echo since 2014 say they are satisfied with the device. 

AI can help us change behaviour

Consumer products such as Vi, an AI bio-sensing voice-activated personal coach, show that these voice-activated assistants can be powerful coaches and mentors, helping people to achieve goals. Put Vi earphones in and she will monitor your physiology, answer questions like ‘Vi, what’s my heart rate?’, check in with questions like ‘Looks like you’re fatigued, are your legs done?’, motivate you to keep going or advise you to take it easy.

Alexa can do the same. One user programmed his Echo to help him quit smoking by telling him how many days it had been and the amount of money he had saved since his last cigarette.
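The logic behind such a skill is simple arithmetic over a quit date. A hypothetical sketch in Python (the quit date, consumption and price figures are made up for illustration):

    from datetime import date

    # Hypothetical figures for illustration only.
    QUIT_DATE = date(2016, 1, 1)
    CIGARETTES_PER_DAY = 15
    PRICE_PER_PACK = 9.50  # price of a 20-cigarette pack
    PACK_SIZE = 20

    def quit_progress(today=None):
        """Return days smoke-free and money saved since QUIT_DATE."""
        today = today or date.today()
        days = (today - QUIT_DATE).days
        saved = days * CIGARETTES_PER_DAY * PRICE_PER_PACK / PACK_SIZE
        return days, saved

    days, saved = quit_progress()
    print(f"It has been {days} days since your last cigarette. "
          f"You have saved {saved:.2f} dollars.")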

According to a Computerworld UK article published in November 2016, Capital One is working on an Alexa 'skill' to promote money-saving behaviour. When a user asks what they did the night before, Alexa will reply, 'I don't know what happened to you but I know what happened to your money,' and then encourage the ostensible partygoer to make better choices.

AI assistants capture authentic data

Researchers ideally want sources and methodologies that capture people naturally – unadulterated, in the moment and unbiased. When this is not possible, the second-best option is to piggyback on existing behaviour.

With AI assistants accompanying and helping people through everyday life, tapping into or creating AI assistants can provide researchers with data about moments and content they would not normally be able to access. AI assistants can provide real-time, native, uninfluenced qualitative and quantitative data that can be analysed efficiently. But how would this be facilitated in practical terms?

Could there be an Alexa ‘skill’ that willing participants install, allowing their behaviour to be recorded and even asking questions at the appropriate moment, where Alexa essentially becomes a proxy for the researcher? There are already Alexa skills that can detect mood in voice and such psychographic and biometric skills are bound to advance, either designed for AI assistants or in combination with other AI technologies. It is a rich area in which to experiment and explore.
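As a purely hypothetical sketch of what such a researcher-proxy skill might do behind the scenes, the Python below logs a consenting participant's spoken answer for later analysis. Every name here is invented; the point is only that consent checking and timestamped capture are straightforward to build.

    import json
    from datetime import datetime, timezone

    # Hypothetical: IDs of participants who have explicitly opted in.
    CONSENTED_PARTICIPANTS = {"participant-001"}

    def record_response(participant_id, question, answer):
        """Append a timestamped answer to a local log, only with consent."""
        if participant_id not in CONSENTED_PARTICIPANTS:
            return  # no consent, no recording
        entry = {
            "participant": participant_id,
            "question": question,
            "answer": answer,
            "recorded_at": datetime.now(timezone.utc).isoformat(),
        }
        with open("responses.jsonl", "a") as log:
            log.write(json.dumps(entry) + "\n")

    record_response("participant-001",
                    "How are you feeling about today's shopping trip?",
                    "A bit rushed, to be honest.")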

For research, being artificial can be beneficial

As we have seen with the praise for Alexa and in other research, humans find it quite easy to 'humanise' robots. A famous anecdote about ELIZA, the first chatbot, created by MIT professor and computer scientist Joseph Weizenbaum, tells how his secretary asked him to leave the room because having another human present while she was speaking with the chatbot made her uncomfortable. Studies have also shown that, in certain contexts, people open up more to robots than to humans, perceiving them as unbiased and non-judgemental.

While much effort is going into making robots and artificial intelligence more human, the fact that robots have always been presented as non-emotional, without judgement or reaction, can work in their favour, particularly when the context may be taboo or illegal. Had polls been conducted via AI assistants in the recent US election, with people feeling unjudged by the voice they were talking to, would the results have been less shocking? Depending on the context, making AI more or less human can be beneficial to the research objective.

This is not to say researchers should worry about being out of a job. Fazekas suggests that machines lack the creativity to imitate the human analytical ability to understand context. Though this may change as technology progresses, 'we believe that humans working hand in hand with machines is the winning route.' There are still many dots amongst the data that only humans can connect, but the dots are now more fruitful and have the potential to add immeasurable depth and breadth to the insights.

Note: There is still the issue of privacy, security and permissions and the importance of designing ethics into AI and research, not to mention what it would mean for society if our best friends become disembodied voices, but these are topics for another article.

 
