In the task of Chinese named entity recognition, the vectorization of words is an important step. However, the traditional word vector representation method maps each word to a single vector and therefore cannot represent the ambiguity of a word. In this paper, the BERT pre-trained language model is introduced. BERT can enhance the semantic representation of words and dynamically generate semantic vectors according to their context. To address the problem that fine-tuning BERT requires high computational resources, this paper applies BERT as a fixed-parameter embedding method and builds a BERT-BiLSTM-CRF model. The experimental results show that the F1-score of the named entity recognition model based on BERT reaches 94.48% on the MSRA dataset, which is superior to traditional machine learning models and other methods based on deep learning models. The results of this paper show that the BERT model has good application prospects in named entity recognition tasks.
Key words: named entity recognition; BERT; fine-tuning; BERT-BiLSTM-CRF model; deep learning
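The fixed-parameter embedding idea described above can be sketched in PyTorch. This is a minimal illustration, not the paper's implementation: a plain `nn.Embedding` stands in for the pretrained BERT encoder (a real system would load BERT, e.g. via the `transformers` library, and freeze its weights the same way), and the model emits per-token tag scores that a CRF layer would decode. All class and parameter names here are hypothetical.

```python
import torch
import torch.nn as nn

class FrozenEmbedBiLSTMTagger(nn.Module):
    """Sketch of the fixed-parameter embedding + BiLSTM architecture.

    The embedding layer stands in for a pretrained BERT encoder whose
    parameters are frozen, so no gradients flow into it and the cost of
    fine-tuning is avoided, as the abstract proposes.
    """

    def __init__(self, vocab_size, embed_dim, hidden_dim, num_tags):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Fixed parameters: exclude the embedding from gradient updates.
        self.embed.weight.requires_grad = False
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Per-token emission scores over the tag set; in the full model
        # these would feed a CRF layer for structured decoding.
        self.emissions = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):
        x = self.embed(token_ids)   # (batch, seq_len, embed_dim), no grad
        h, _ = self.bilstm(x)       # (batch, seq_len, 2 * hidden_dim)
        return self.emissions(h)    # (batch, seq_len, num_tags)

model = FrozenEmbedBiLSTMTagger(vocab_size=100, embed_dim=16,
                                hidden_dim=8, num_tags=5)
scores = model(torch.randint(0, 100, (2, 7)))
print(scores.shape)  # torch.Size([2, 7, 5])
```

Because the embedding weights have `requires_grad=False`, an optimizer built over `filter(lambda p: p.requires_grad, model.parameters())` would update only the BiLSTM and emission layers, which is what makes this setup far cheaper than full BERT fine-tuning.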