Technology claimed to be objective, scientific or progressive too often replicates and reinforces sexism, ableism, transphobia, racism, classism and other forms of inequity.
Traditionally, designers and engineers are not trained in reflection and practices that tackle social inequity; thus, willingly or unwillingly, they encode certain (negative) values into the systems they design. At times, designers, developers and engineers claim that the racist, sexist or ableist results of their designs are entirely exterior to the development process, and that they have not embedded judgements, biases, stereotypes or their own values in technical systems.
For this reason, it is important to nurture awareness of the narratives and values that inform technology development, as well as to update the methods, techniques and ways of working used to develop technology. Ultimately, embodied AI technology should serve human flourishing rather than contribute to inequality and social division.
Designers, researchers, and stakeholders have the responsibility to reflect on the values, perspectives, biases and stereotypes they embed in embodied AI technology. The things dubbed ordinary are in fact deeply cultural: they embody values, beliefs and narratives that influence how we collect and use data, how we craft algorithms, how we define agency, how we mold AI embodiment, how we design interaction and how we define embodied AI interventions.
What are we going to do?
Transdisciplinary workshops for the future of EAI
We plan four workshops and a final event, each with international guests and open to the 4TU NIRICT community, to reflect on and co-develop inclusive narratives and methodologies.
Tangible outcomes
The workshops will produce a catalog of inclusive narratives describing the DEI future of embodied AI, and a methodological roadmap offering practical guidance for research and engineering practice in embodied AI.
Critical and practical activities
Through critical design methods, we will bring intersectional feminism to the center of the debate, with the aim of reflecting on and re-imagining a more inclusive future with embodied AI.
Academics, artists and societal players
We target faculty and students, but we are also open to non-academic audiences within the 4TU universities (e.g., diversity officers, LGBTQ+ associations). To promote inclusiveness and diversity from the bottom up, we will also engage societal associations in the Netherlands and invite them to the workshops.
Not only social justice, but also good scientific practice.
Any community hoping to eliminate inequity, exclusion and bias must sustain attention, resources and effort toward meaningful change. Yet change in the name of social justice is not the only reason to adopt DEI practices. A field that does not embrace DEI cannot make scientifically accurate claims about the efficacy of its object of study once it is embedded in society. Society at large (however defined) is heterogeneous, reflecting a diversity of races, gender identities, bodies, ages, cultural practices and so on.
Scientific knowledge built on homogeneous views of society that do not do justice to this diversity fails to represent the lived experience of the general population, most of whom do not fit the WEIRD moniker (Western, educated, industrialized, rich and democratic) that has been prevalent in science. Therefore, embracing DEI is not 'just' a matter of social justice; it is also good scientific practice.
Listen, make space, enable.
We want to take action as academics, aware of the privileged and powerful role we hold. We come from different positions of privilege and marginalization, and we have had a range of experiences navigating issues related to diversity, equity, and inclusion. Our experiences and outlooks cannot and do not represent everyone who shares a particular identity. We hope to engage in a meaningful conversation with the embodied AI community at large, listening and co-creating in a spirit of reflexivity.