
Computers as Bad Social Actors: Dark Patterns and Anti-Patterns in Interfaces that Act Socially

Abstract

The Computers Are Social Actors (CASA) paradigm suggests people exhibit social/anthropomorphic biases in their treatment of technology. Such insights have encouraged interaction designers to make automated systems act in more social (chatty or even friend-like) ways. However, like typical dark patterns, social-emotional responses to systems as (seemingly sentient) agents can be harnessed to manipulate user behaviour. An increasingly common example is app notifications that assume person-like tones to persuade or pressure users into compliance. Even without manipulative intent, difficulties meeting contextual social expectations can make automated social acting seem rude, invasive, tactless, and even disrespectful -- constituting social 'anti-patterns'. This paper explores ways to improve how automated systems treat people in interactions. We mixed four qualitative methods to elicit user experiences and preferences regarding how interfaces "talk" to/at them. We identify an emerging 'social' class of dark and anti-patterns, and propose guidelines for helping ('social') interfaces treat users in more respectful, tactful, and autonomy-supportive ways.
