There has been extensive research on knowledge base question answering (KBQA) over complex semantics and complex syntax, but most of it assumes that the subject entity of the question has already been identified, and insufficient attention has been paid to questions containing multiple intentions and multiple entities, even though identifying the core entity of an interrogative sentence is key to natural language understanding. To address this problem, a KBQA model that introduces core entity attention is proposed. Based on an attention mechanism and attention enhancement techniques, the proposed model assesses the importance of each recognized entity mention, obtains the entity mention attention, removes potential interfering items, and captures the core entity of the user's question, thereby resolving the semantic understanding problem of multi-entity, multi-intention interrogative sentences. The resulting importance scores are introduced into the subsequent question-answering reasoning as weights. Finally, comparative experiments are conducted against KVMem, GraftNet, PullNet, and other models on the English MetaQA dataset, a multi-entity-question MetaQA dataset, and a multi-entity-question HotpotQA dataset. On multi-entity questions, the proposed model achieves better results on Hits@n, accuracy, recall, and other evaluation metrics.
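To make the core idea of weighting entity mentions concrete, the following is a minimal sketch of attention-based mention scoring, assuming a PyTorch-style implementation. The module name MentionAttention, the bilinear compatibility scorer, and the hidden dimension are illustrative assumptions and do not reflect the paper's actual architecture.

```python
# Illustrative sketch only: a generic attention scorer over entity mentions,
# not the paper's implementation. All names and shapes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MentionAttention(nn.Module):
    """Scores each recognized entity mention against the question representation
    and returns normalized importance weights ("entity mention attention")."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # Bilinear compatibility score between question and mention vectors (assumed scorer).
        self.score = nn.Bilinear(hidden_dim, hidden_dim, 1)

    def forward(self, question_vec: torch.Tensor, mention_vecs: torch.Tensor) -> torch.Tensor:
        # question_vec: (hidden_dim,); mention_vecs: (num_mentions, hidden_dim)
        q = question_vec.unsqueeze(0).expand_as(mention_vecs)   # broadcast question to each mention
        scores = self.score(q, mention_vecs).squeeze(-1)        # (num_mentions,) raw scores
        weights = F.softmax(scores, dim=-1)                     # normalized importance weights
        return weights                                          # passed to downstream QA reasoning


# Usage: importance weights for three candidate mentions in a multi-entity question.
model = MentionAttention(hidden_dim=128)
question = torch.randn(128)
mentions = torch.randn(3, 128)
print(model(question, mentions))  # e.g. three weights summing to 1; the largest marks the core entity
```

In this sketch, the softmax-normalized weights play the role the abstract describes: low-weight mentions act as potential interfering items, while the highest-weight mention is treated as the core entity whose weight is carried into later reasoning.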