Interface Effects of ChatGPT: Do Parasocial Interaction Experiences Reduce the Willingness to Accept Technology in Attachment-Avoidant Individuals?
ChatGPT has evolved into an artificial intelligence (AI) assistant for everyday work. However, the term "intelligent assistant" embraces the interface metaphor of "assistant" and thereby shapes both the form and the norms of interaction between AI assistive technology and human beings. At present, the user interface of an "intelligent assistant" typically takes the form of dialogue, which is essentially a social interaction context. This paper argues that adopting the "assistant" interface metaphor may create a digital divide in technology acceptance and use along the lines of socially relevant personality traits. Drawing on the parasocial interaction perspective, cognitive load theory, and attachment theory, this paper surveyed 453 college students about their use of ChatGPT. The results show that while parasocial interaction with ChatGPT directly and positively predicted users' continuance intention to use ChatGPT, parasocial interaction also indirectly predicted continuance intention through cognitive load, with the direction of this indirect path depending on the user's level of attachment avoidance. For individuals high in attachment avoidance, parasocial interaction positively predicted cognitive load, which in turn negatively predicted their continuance intention to use ChatGPT. In contrast, for individuals low in attachment avoidance, parasocial interaction negatively predicted cognitive load and was thereby positively associated with continuance intention to use the technology. These findings indicate that researchers should attend to how the interactive interface design of AI-assisted technologies affects technology acceptance among people with different personality traits. They also suggest that designers should develop a variety of interaction interfaces to accommodate the usage preferences of people with different social traits. At the same time, the interface-design tendencies implied by academic terms such as "intelligent assistant" and "human-computer collaboration" should be treated with caution, lest the use of such terms become a vehicle of technological and cultural hegemony.