Attitudes Towards Emotional Artificial Intelligence Use: Transcripts of Citizen Workshops Collected Using an Innovative Narrative Approach, 2021

The data were collected during citizen workshops, conducted online via Zoom, exploring attitudes towards emotional artificial intelligence (EAI) use. EAI is the use of affective computing and AI techniques to try to sense and interact with human emotional life, ranging from monitoring emotions through biometric data to more active interventions.

Ten sets of participants (n=46) were recruited for the following groups:
• 3 older (65+) groups: n=13
• 3 younger (18-34) groups: n=12
• 2 groups of people self-identifying as disabled: n=10
• 2 groups of members of UK ethnic minorities: n=11

Other demographic categories were balanced where possible. Participants were grouped by age, as this has been shown to be the biggest indicator of differences in attitude towards emotional AI (Bakir & McStay, 2020; McStay, 2020). It was also considered important to include the views of those who have traditionally been ignored in the development of technology, or have suffered further discrimination through its use, so the opinions and perspectives of minority groups and disabled people were sought. Participants were recruited through a research panel for the workshops, which took place in August 2021.

A novel narrative approach was used: participants were taken through a piece of interactive fiction (developed using Twine, viewable here: https://eaitwine.neocities.org/), a day-in-the-life story of a protagonist encountering seven mundane use cases of emotional AI, each structured as a) a neutral introduction to the technology; b) a binary choice involving the use of the technology; c) a ContraVision component demonstrating positive and negative events/outcomes. The use cases were:
• Home-hub smart assistant
• Bus station surveillance sensor
• Social media fake news/disinformation and profiling
• Spotify music recommendations (using voice and ambient data)
• Sales call evaluation and prompt tool
• Emotoy that collects and responds to children's emotional data
• Hire car in-cabin customisation and driving support

Each workshop lasted 2 hours. Audio files were transcribed by a transcription service before being corrected and formatted by a project researcher.
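For readers who prefer a schematic view, the following is a minimal, illustrative sketch (in Python, not part of the deposited materials or the Twine source) of the three-part structure that each use case in the story follows; all names and fields are assumptions introduced purely for illustration.

```python
# Illustrative only: a minimal representation of the narrative structure
# described above; not part of the deposited data or the Twine story itself.
from dataclasses import dataclass


@dataclass
class UseCase:
    name: str           # mundane EAI use case encountered by the protagonist
    introduction: str   # (a) neutral introduction to the technology
    choice: tuple       # (b) binary choice involving use of the technology
    contravision: dict  # (c) positive and negative events/outcomes


use_cases = [
    UseCase(
        name="Home-hub smart assistant",
        introduction="Neutral description of the technology.",
        choice=("Use the technology", "Decline the technology"),
        contravision={"positive": "Benign outcome shown",
                      "negative": "Troubling outcome shown"},
    ),
    # ... the remaining six use cases (bus station sensor, social media
    # profiling, Spotify recommendations, sales-call tool, Emotoy,
    # hire-car cabin) follow the same three-part structure.
]
```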
References:
Bakir, V., & McStay, A. (2020). Profiling & targeting emotions in digital political campaigns. Briefing paper for the All Party Parliamentary Group on Electoral Campaigning Transparency.
McStay, A. (2020). Emotional AI, soft biometrics and the surveillance of emotional life: An unusual consensus on privacy. Big Data & Society, 7(1), 1–12. https://doi.org/10.1177/2053951720904386

CONTEXT
Emotional AI (EAI) technologies sense, learn and interact with citizens' emotions, moods, attention and intentions. Using weak and narrow rather than strong AI, machines read and react to emotion via text, images, voice, computer vision and biometric sensing. Concurrently, life in cities is increasingly technologically mediated. Data-driven sensors, actuators, robots and pervasive networking are changing how citizens experience cities, but not always for the better. Citizen needs and perspectives are often ancillary in emerging smart-city deployments, resulting in mistrust in new civic infrastructure and its management (e.g. Alphabet's Sidewalk Labs). We need to avoid these issues repeating as EAI is rolled out in cities. Reading the body is an increasingly prevalent concern, as recent pushback against facial detection and recognition technologies demonstrates. EAI is an extension of this, and as it becomes normalised over the next decade we are concerned about how these systems are governed, their social impacts on citizens, and how EAI can be designed in a more ethical manner. In both Japan and the UK, we are at a critical juncture where these social, technological and governance structures can be appropriately prepared before mass adoption of EAI, to enable citizens, in all their diversity, to live ethically and well with EAI in cities-as-platforms. Building on our ESRC/AHRC seminars in Tokyo (2019) that considered cross-cultural ethics and EAI, our research will enable a multi-stakeholder (commerce, security, media) and citizen-led interdisciplinary response to EAI for Japan and the UK. While these are two of the most advanced nations in regard to AI, the social contexts and histories from which these technologies emerge differ, providing rich scope for reflection and mutual learning.

AIMS/OBJECTIVES
1. To assess what it means to live ethically and well with EAI in cities in cross-cultural (UK-Japan) commercial, security and media contexts.
2. To map and engage with the ecology of influential actors developing and working with EAI in UK-Japan.
3.

Geographic Coverage:

United Kingdom

Temporal Coverage:

2021-08-04/2021-08-13

Resource Type:

dataset

Available in Data Catalogs:

UK Data Service

Topics: