Do robots have moral status, and what kind of moral status should they have? This is not only a pressing question raised by technological progress but also an important issue that the development of ethics must address. John Danaher argues that if robots are roughly performatively equivalent to other entities that possess significant moral status, then robots should likewise have significant moral status. Although this "ethical behaviorism" has theoretical simplicity and ethical foresight, many doubts remain about its conceptualization, justification, and defense. By distinguishing "program" from "understanding" and "I do" from "I think," it can be seen that behavioral "signs" are not the same as "understanding," and that such "signs" are not sufficient for "mind": assigning moral status to robots on the basis of behavioral similarity alone is merely a judgment that something is so; to make a sufficient judgment of why it is so, one must also consider whether human characteristics such as autonomy and comprehension underlie the robot's behavior.