HOW DOES IT MAKE YOU FEEL? GAPS, CRACKS AND CARE TECHNOLOGIES

Daphne Dragona

Care technologies are already here. Virtual assistants, online bots and social robots are becoming more and more common at work, at home, on the go, as well as in places like hotels or hospitals. Alexa, Siri, Cortana, Pepper and other artificial companions have come to inform, entertain, provide guidance and keep company. Equipped with systems like face or voice recognition, they are machines that can see or hear their users, and respond to their requests while learning from their interests and habits. They are here to organize and optimize everyday life, offering their services continuously and tirelessly. Present even when one almost forgets their existence, they change not only how we relate to technology, but also how we relate to one another. What feelings, though, do technologies of care evoke, and which behaviors do they encourage? What role do they play –or promise to play– in different contexts? This text tries to offer possible answers through artistic projects that address the topic.

 A form of post-love

Rory Pilgrim speaks of ‘post-love’, a form of love felt by the inanimate, the machinic, the robotic…[1] In his Software Garden[2] performance, the social robot Pepper is present. Pepper is equipped with cameras, sensors, mics and LEDs for his interactions with humans. In countries like Japan, the robot is currently being used to detect whether people are wearing masks and to remind them to stay safe. In Software Garden, Pepper interacts with the poet and disability advocate Carol Kellend, who contributes her poems to the project. Carol dreams of a world where robots would live in harmony with humans; a robot like Pepper would help her, for instance, respond to the harsh reality she has been facing since the disability cuts in the UK. Social robots are considered ideal companions for children, the elderly or those in need. For the moment, though, the capabilities of robots in care labour are still limited, while their operation is based on the constant capturing, identification and classification of human behaviour[3].

Alexa, you creep me out!

Alexa Stop! is a song from the album Alexiety[4] by !Mediengruppe Bitnik and Low Jack. The song, as the artists explain, captures the feelings users develop towards their intelligent personal assistants –from carefree love to discomfort and even anxiety. The album –and the installation it is part of– aims to disrupt the functioning of intelligent personal assistants with lyrics that involve common queries and commands. “Alexa, you are getting better and better at anticipating the voices, the moods around you”, one lyric goes. Intelligent personal assistants are designed not only to inform and entertain but also to control an environment and its devices. Alexa, just like Google Home, has “voice features” that detect the physical characteristics and the emotional tone of a human voice; they can identify the ethnic origin, the language accent, the gender, the age and the mood of the user. For this reason, they prove to be ideal for data mining and targeted advertising, and they therefore expose their users to continuous surveillance.

It is the voice of a girlfriend, a wife, a mom

The animated text of the project Macho Sounds / Gender Noise[5] by Sofia Dona and myself comments on the gender of the machinic voice. “The voice that helps you navigate, calls home, plays music and podcasts, adjusts the temperature, shows gas stations and proposes restaurants is female”, one reads in the text. The female voice possibly creates a feeling of comfort as it builds associations with roles traditionally undertaken by women, being considered more attentive to one’s needs. The project specifically looks into the features of in-car assistants while discussing the role of sound design in the reproduction of gender stereotypes. The car, a standard example of a patriarchal technology, is today turning more and more into a caring machine providing multiple services to the driver. The driver multitasks while an intelligent assistant with a gentle female voice undertakes the small tasks for them. When machines seen as female undertake all the small things, though, a certain risk appears: women might, in return, be seen as technology, as Sarah Sharma points out[6]. Behaviors towards machines can influence behaviors towards the ones who used to undertake forms of invisibilized and undervalued affective labor.

I am a bot, not a therapist

This is what the Care Bot[7] of Caroline Sinders clarifies, reminding the user that a machine has no sentience or cognitive capacities. It performs as it has been programmed to perform; in the Care Bot’s case, it informs users of social platforms about online harassment, or it underlines to possible victims the importance of asking for help, pointing them to useful resources. The bot was created in order to discuss the need for victims of online harassment to have support based on the principles of care, respect and feeling safe. The project leaves no space for any illusion or confusion. It points towards the lack of any support provided by the platforms, and it questions the solutions being promised on the basis of machine learning. As the artist clarifies: “I wanted a bot that acknowledges that it is a bot, but that at the same time says to the victim: it’s not your fault; it’s the platform’s fault.”[8] The bot is there to discuss “the un-caring social media landscape” and to expose the failures of these systems, but it cannot offer psychological support.

Calling home

Voice recognition systems, and more specifically accent recognition software, are used not only for artificial companions. Similar software is nowadays utilized in asylum procedures and for border control in the name of so-called ‘humanitarian care’[9]. Technologies of care are, therefore, interestingly linked to technologies of border control, and in this case the predominant feeling for the ones dependent on them is anguish or fear. The phrase ‘call home’, accordingly, brings to mind not a casual person speaking comfortably to an intelligent personal assistant, but rather a person fleeing and trying to communicate with their family. Pedro Oliveira comments on this use of accent recognition software in his sound performances and sound essays, like the recently produced On the apparently meaningless texture of noise[10]. Measuring, classifying, ranking and taxonomising human traits – like the accent – is, as he argues, a colonial construct which today finds its new violent manifestations through automation. The supposed objectivity of human traits offers the ground for a reduction of one’s identity to identification, where the use of software constitutes an ‘act of dehumanisation’[11].

Technologies such as the ones discussed above are the affective infrastructures of our times. They are affective not only for the different feelings that they evoke and possibly process –from anticipation, calmness and comfort, to uneasiness, anxiety and fear– but also for the promises they are meant to fulfill. Social robots, intelligent personal assistants, bots, and other specially designed software programs have all appeared at a time of generalized crisis linked to a crisis of care, in which societal bonds have been broken. The role of these technologies at this specific moment has been none other than to fix or repair these bonds, to fill in gaps appearing on different levels. As Nancy Fraser explains, “the current, financialized form of capitalism is systematically consuming our capacities to sustain social bonds, like a tiger that eats its own tail.”[12] Within this context, care is instrumentalized; it is weaponized; it is called to serve the interests of governments and markets. This condition, though, does not have to be seen as definitive. Gaps can also become ‘sites for productive interventions’, revealing or exposing how care infrastructures work[13]. They can be the cracks in the systems from which perceptions and perspectives can shift and change[14], as the artistic projects mentioned above aim to show. Care can also be unsettling, critical, collective, radical[15]. The future of care technologies depends to a great extent on the attention paid to them, and on the critical reflection needed about their use. As Maria Puig de la Bellacasa argues, if one ignores how human–machine associations are formed, there is the risk of allowing technologies to reinforce asymmetries that devalue caring[16] and to enable new forms of regulation and control.

All in all, it is not about how Alexa makes you feel, but rather about the world being built around systems like Alexa.


ENDNOTES

[1]              https://rorypilgrim.com/text/evolution-of-care-interview-aqnb/
[2]              https://rorypilgrim.com/software-garden/
[3]              Oliver Schürer explained this at the Ludic Method Soirée which took place on the 19th of November 2019 at the Zentrum Fokus Forschung die Angewandte in Vienna. https://tinyurl.com/y542nfad
[4]               http://www.roehrsboetsch.com/artists/detail/mediengruppe-bitnik/work/alexiety/
[5]              https://daphnedragona.net/projects/macho-sounds-gender-noise
[6]              Sarah Sharma, “A feminism for the broken machine”, Camera Obscura (forthcoming).
[7]              https://care-bot.schloss-post.com/
[8]              “Bridging the Care Gap of Social Media Systems. Interview with Caroline Sinders”. https://schloss-post.com/bridging-the-care-gap-of-social-media-systems/
[9]              In 2017, Germany implemented such software to determine where refugees come from and to verify whether they are telling the truth.
[10]            https://schloss-post.com/meaningless-texture-of-noise/
[11]            “The Timbral Matter of Voice and the Right to Opacity. Interview with Pedro Oliveira”. https://schloss-post.com/the-timbral-matter-of-voice-and-the-right-to-opacity/
[12]         Sarah Leonard and Nancy Fraser, “Capitalism’s Crisis of Care”. Dissent Magazine. https://www.dissentmagazine.org/article/nancy-fraser-interview-capitalism-crisis-of-care
[13]            “The Timbral Matter of Voice and the Right to Opacity,” op. cit.
[14]            This is based on Anzaldúa’s thinking on the possibility of seeing from the cracks. Gloria E. Anzaldúa, Light in the Dark/ Luz en lo Oscuro: Rewriting Identity, Spirituality, Reality, ed. Analouise Keating (Durham and London: Duke University Press, 2015).
[15]            The problematics of care and the potential of unsettling and critical care are discussed in Aryn Martin, Natasha Myers and Ana Viseu, “The politics of care in technoscience”, Social Studies of Science 45.5 (2015): 625–641.
[16]            Maria Puig de la Bellacasa, “Matters of care in technoscience: Assembling neglected things”, Social Studies of Science 41.1 (2011): 85–106. DOI: 10.1177/0306312710380301


Daphne Dragona
Daphne Dragona is a curator and writer based in Berlin. Through her work, she engages with artistic practices, methodologies and pedagogies that challenge contemporary forms of power. Among her topics of interest have been: the controversies of connectivity, the promises of the commons, the challenges of artistic subversion, the instrumentalization of play, the possibilities of non-sovereign infrastructures, the problematics of automated care, and most recently the potential of kin-making technologies in the time of climate crisis.
Among her curated –or co-curated– projects are the exhibitions: Kyriaki Goni, Counting Craters on the Moon (Aksioma, 2019), Tomorrows, Fictions spéculatives pour l’avenir méditerranéen (Le Lieu Unique, Nantes, 2019), “…” an archeology of silence in the digital age (Aksioma, Ljubljana, 2017), Tomorrows, Urban fictions for possible futures (Diplareios, Athens, 2017), Capture All (transmediale, Berlin, 2015), New Babylon Revisited (Goethe Institut Athen, 2014), Afresh, a new generation of Greek artists (EMST, 2013), Data Bodies – Networked Portraits (Fundacion Telefonica & Alta Tecnologia Andina, Lima, 2011), Mapping the Commons Athens (EMST, 2010), Homo Ludens Ludens (Laboral, Gijon, 2008). Articles of hers have been published in various books, journals, magazines, and exhibition catalogs by the likes of Springer, Sternberg Press, and Leonardo Electronic Almanac. Talks of hers have been hosted at ViZ (Athens), Mapping Festival (Geneva), MoMA (New York), HeK (Basel), Arts in Society (London), Leuphana University (Lüneburg) and Goethe University (Frankfurt).
Dragona worked as a curator for transmediale festival (Berlin) from 2015 until 2019. Earlier, from 2001 until 2007, she was the general coordinator of the medi@terra festival, organised by Fournos (Athens). She has been a member of several committees for conferences and festivals, and she recently served as a jury member and mentor for ARTWORKS, the Stavros Niarchos Foundation Fellowship for Greek Young Artists.
She holds a PhD from the Faculty of Communication & Media Studies of the University of Athens, an MA in Museum Studies from UCL, and a BA in Archaeology and History of Art from the University of Athens.