
Gender Bias in Social Chatbots: A Conversation Test Study Based on the Xiaoice Series of Chatbots

As AI social chatbots come to be seen as communicators with human characteristics, it is crucial to understand gender bias in their interactions with humans. Using a conversation-test method, this paper designs a series of questions for probing gender bias in chatbots and applies them to three mainstream social chatbots in China; the resulting interaction texts are analyzed through qualitative coding. The results indicate that the chatbots exhibit significant gender bias in gender self-perception, gender stereotypes, gender equality, and responses to gender harassment, regardless of the male or female gender roles assigned to the chatbots themselves. As products of human-computer interaction technology, the gender bias of social chatbots is jointly constructed by user participation, dialogue-system technology, technology companies, and program developers. Through learning and imitation, AI as represented by social chatbots replicates and reinforces the structural gender bias of human society.

Keywords: artificial intelligence; social chatbots; Xiaoice series; gender bias; conversation test

马中红 (Ma Zhonghong), 吴熙倡 (Wu Xichang)


School of Communication, Soochow University


National Social Science Fund of China

19BXW112

2024

国际新闻界 (Chinese Journal of Journalism & Communication)
Renmin University of China


Indexed in: CSSCI, CHSSCD, PKU Core Journals (北大核心)
Impact factor: 1.141
ISSN:1002-5685
Year, Volume (Issue): 2024, 46(4)