
Designing Virtual Assistants as Virtual Workers

Virtual assistants are software technologies that assist people with computer-related tasks using sociality as a key feature of the interface.5 These technologies range in sophistication from chatbots that use basic dialog systems (e.g. Microsoft’s Clippy) to artificially intelligent “smart” systems that incorporate speech recognition, voice user interfaces, and natural language processing into machine learning processes (e.g. Amazon’s Alexa).6 Smart virtual assistants like Alexa are becoming popular as interfaces to personal and home technologies, connecting through the “Internet of Things” (IoT) to create seamless networks of service applications. Outside of personal domestic use, there is a growing market for virtual assistants of all kinds in customer service applications, where they are often employed as a first-line approach to handling customer queries and as a supplement to, or replacement for, human customer service representatives.7

Airus Media’s AVAs fall into this customer service category and are implemented as public-facing information kiosks that can be placed throughout airports wherever “wayfinding, public guidance, and advertising” applications are needed.8 The AVAs are designed as life-sized holographic avatars that “create the illusion of a real person” by simulating an airline service worker or TSA security officer (figure 2). The holograms are constructed from scripted video recordings of real actors delivering responses such as, “You can speed up the inspection process by removing all items in your pockets and placing them in your carry-on baggage.”9 The holographic videos simulate the experience of receiving information from a human worker, though the interactivity of the technology is limited (at this time) to a mostly one-way encounter. The current AVA models in service are not “smart” technologies, meaning they do not currently incorporate machine learning, voice recognition, or fully interactive capabilities. Airus Media frames the absence of these features as a limitation of budget, not desire, one that may be bridged in future versions as it becomes economically viable for the company to upgrade the technology.10 Nonetheless, Airus Media describes the AVAs as “effective in capturing attention” and passengers as “amazed by the technology.”

Figure 2. Examples of a Latina AVA installed at the San Antonio airport

All of the airport AVA products featured on Airus Media’s website are represented as women, which is consistent with the design norms for virtual assistants performing customer service roles.11 The problematic gendering of virtual assistants has a long history and has been well documented by scholars across interdisciplinary domains. Stereotypes about women’s ‘natural’ abilities for service work have long been encoded into the programming and design of computer interfaces, marking a continuation from the feminized labor of human computers to the design of conversational agent technologies.12 Representations of virtual women as computer assistants have only become more normalized and culturally embedded with the overwhelming popularity of smart voice assistants like Amazon’s Alexa and Apple’s Siri. The logics that guide virtual assistant design in human-computer interaction continually recycle consumer market models that find that people prefer interacting with feminized virtual assistants for service-related and domestic tasks.13 These models are circulated uncritically in ways that harness and repurpose prevalent gender stereotypes, creating a cycle of virtual assistant gendering that comes to appear natural over time. Miriam Sweeney describes how these logics become embedded “as a kind of cultural ‘common sense’ design practice, obscuring their linkages to historically specific and socially produced systems of oppression.”14 However, the mainstreaming of virtual assistant technologies has prompted more public reflection on the cultural ideologies that shape gendered design, along with their real-world consequences for women. Notably, the United Nations Educational, Scientific and Cultural Organization (UNESCO) published an influential report outlining the ways the gendered representation of virtual assistants can further normalize gendered abuse and harassment of women globally.15

Virtual assistants are specifically gendered and racialized in anticipation of their use, their functions, and the audiences they are targeting. For instance, virtual assistants fulfilling customer service, domestic assistance, or care-giving roles are overwhelmingly designed, either explicitly or implicitly, as women.16 These industries are heavily feminized, meaning that women workers are overrepresented in positions that are often low-paid, precarious, low-status, and heavily surveilled, with little or no worker autonomy. The digital labor force is imagined as having many of these same qualities, and identity functions rhetorically to amplify, obscure, or cohere various social scripts around gender, race, and labor as “a key part of user experience (UX).”17 For example, Microsoft’s Ms. Dewey search engine made use of a racially ambiguous virtual assistant that enabled a range of sexist and racist fantasies to be encoded in the interface and enjoyed by white, heterosexual male searchers.18 Meanwhile, Thao Phan observes that Amazon’s virtual assistant, “Alexa,” is designed according to white middle-class aesthetics, which imagine an idealized domestic service model that offers an aspirational pathway to white middle-class membership.19 Phan argues that Alexa’s cultural representation does important political work in eliding the historical realities of Black women and other women of color in domestic service roles, obscuring important questions about the interlocking systems of whiteness, gender, labor, class, and surveillance in the process.20 Through these examples, virtual assistant design and representation emerge as a site of politics where ideologies about labor and identity are encoded and mobilized in specific and strategic ways.

Virtual assistant identity is integral, not incidental, to understanding a global labor environment shaped by the legacies of racialized capitalism and colonialism. Winifred Poster argues in her research on virtual receptionists that employers make conscious decisions about the aspects of identity and “humanness” that they want customers to interact with, and that this selective visibility of the worker lies “at the heart of reconfiguring the labor processes of these services.”21 Jennifer Rhee identifies these dynamics as a key part of “the robotic imaginary,” which she describes as the twinned processes of the anthropomorphization of robots and the dehumanization of labor.22 Rhee deftly argues that the robotic imaginary inscribes humanness through normative, “familiar” frameworks that reflect existing racial and gendered hierarchies. Using these frameworks, she demonstrates the processes through which marginalized people are dehumanized as “unfamiliar” and “nonnormative,” an erasure that is fundamental to enabling labor alienation and exploitation. Rhee identifies virtual assistants, specifically, as critical technologies where these ideologies play out and processes of dehumanization take place. Similarly, Jessa Lingel and Kate Crawford argue that the fashioning of virtual assistants as feminized digital secretaries is integral to their dehumanization, an outcome that designers view as an asset for ensuring users entrust these programs with their private data.23

Scholarly research that focuses specifically on Latina identity and virtual assistants is still nascent. However, recent research suggests that Latina identity is being strategically deployed in applications, markets, and geographic regions that target Latinx audiences.24 For instance, research on “Emma,” the United States Citizenship and Immigration Services’ (USCIS) Latina virtual assistant, finds that Latina identity in this interface is specifically constructed in accordance with colonial-capitalist ideologies to make normative claims about which identity markers and consumer models constitute the “ideal” Latina/o citizen.25 Similarly, we are interested in exploring the normative claims that Latina AVAs may make about the interconnected issues of labor, immigration, and national identity. In the next sections, we situate the design and marketing of Airus Media’s Latina virtual assistants in the historical context of Latinas’ information service work and immigrant labor. Weaving these threads together reveals the deep ideological continuities between the historical and digital constructions of the Latina information service worker.
