Abstract: This paper proposes an entity linking method based on BERT (Bidirectional Encoder Representations from Transformers) and TextRank keyword extraction. The BERT pre-trained language model is introduced into the entity linking task to analyze the correlation between the context of an entity mention and the descriptive information of each candidate entity, improving semantic analysis and thereby the quality of entity linking. TextRank keyword extraction is used to emphasize the topical information in the comprehensive description of the target entity, which increases the accuracy of the text similarity measurement and further optimizes the model. The model is validated on the dataset of the CCKS2019 Evaluation Task II; the experimental results show that the proposed method effectively solves the entity linking problem and achieves significantly better entity linking performance than other entity linking methods.
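As a minimal sketch of the TextRank keyword-extraction step the abstract refers to (not the paper's actual implementation), the following builds a word co-occurrence graph over a token window and ranks words by power iteration of the PageRank-style update; the tokenization, window size, and damping factor here are illustrative assumptions:

```python
from collections import defaultdict

def textrank_keywords(tokens, window=2, damping=0.85, iters=50, top_k=3):
    """Rank words in `tokens` by a TextRank-style score (illustrative sketch)."""
    # Build an undirected co-occurrence graph: words within `window`
    # positions of each other are linked.
    neighbors = defaultdict(set)
    for i, w in enumerate(tokens):
        for j in range(i + 1, min(i + window + 1, len(tokens))):
            if tokens[j] != w:
                neighbors[w].add(tokens[j])
                neighbors[tokens[j]].add(w)
    # Power iteration of the TextRank score update:
    # S(w) = (1 - d) + d * sum over neighbors u of S(u) / degree(u)
    score = {w: 1.0 for w in neighbors}
    for _ in range(iters):
        score = {
            w: (1 - damping)
            + damping * sum(score[u] / len(neighbors[u]) for u in neighbors[w])
            for w in neighbors
        }
    # Return the top-k highest-scoring words as keywords.
    return [w for w, _ in sorted(score.items(), key=lambda kv: -kv[1])[:top_k]]

tokens = "entity linking maps entity mentions in text to knowledge base entities".split()
print(textrank_keywords(tokens))
```

In the paper's setting, keywords extracted this way from the candidate entity's comprehensive description would be used to strengthen its topical signal before measuring similarity against the mention context.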