Gender Bias in Social Chatbots: A Conversation Test Study Based on Xiaoice Series of Chatbots

MA Zhonghong, WU Xichang

Chinese Journal of Journalism & Communication ›› 2024, Vol. 46 ›› Issue (4) : 72-89.



Abstract

As AI social chatbots come to be seen as human-like communicators, it is crucial to understand the problem of gender bias in their interactions with humans. Using the conversation test method, this paper designs a series of questions for detecting gender bias and applies them to three mainstream social chatbots in China. The interaction texts are analyzed through qualitative coding. The results indicate that social chatbots exhibit significant gender bias in gender self-perception, gender stereotypes, attitudes toward gender equality, and responses to gender harassment, and that this bias is unrelated to whether the chatbots themselves are assigned male or female gender roles. As products of human-computer interaction technology, social chatbots acquire their gender bias through joint construction by user participation, the technical design of dialogue systems, technology companies, and program developers. The result is that AI, as represented by social chatbots, replicates and reinforces the constructive power of gender bias embedded in human society's gender culture through learning and imitation.

Key words

artificial intelligence / social chatbots / Xiaoice series / gender bias / conversation test

Cite this article

MA Zhonghong, WU Xichang. Gender Bias in Social Chatbots: A Conversation Test Study Based on Xiaoice Series of Chatbots. Chinese Journal of Journalism & Communication. 2024, 46(4): 72-89.

Funding

This paper is a phased result of the National Social Science Fund project "Research on the Practice of Young Women's Digital Media Culture" (No. 19BXW112).