A text-image feature mapping algorithm based on transfer learning
Abstract The traditional uniform distribution algorithm does not filter image data when extracting approximate features of text-image data for an event, so the similarity between the image data and the text is low, which limits the algorithm's accuracy. This paper proposes a text-image feature mapping algorithm based on transfer learning. Existing data are filtered by clustering to obtain data similar to the target data. Salient text features are computed with a latent Dirichlet allocation (LDA) model based on Gibbs sampling, combined with information gain. The bag-of-visual-words (BOVW) model and the naive Bayes method are used to model the image data. With the help of text-image co-occurrence data from the same event, the text feature distribution is mapped into the image feature space to approximate the feature distribution of the image data under that event. Experimental results show that the proposed algorithm obtains the feature distribution of image data under different events with an average cosine similarity as high as 92% and an average dispersion as low as 0.06%, confirming its high accuracy.
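The core mapping step described in the abstract can be illustrated with a minimal sketch: a text feature distribution (e.g., over LDA topics) is projected into the image (visual-word) feature space through text-image co-occurrence counts, and the result is compared to an observed image distribution by cosine similarity. All function names, the toy data, and the normalization scheme below are illustrative assumptions, not the paper's actual implementation.

```python
# Hedged sketch: projecting a text feature distribution into the image
# (visual-word) feature space via text-image co-occurrence counts.
# The toy data and variable names are assumptions for illustration only.
import numpy as np

def map_text_to_image_space(text_dist, cooccurrence):
    """Project a text feature distribution onto the image feature space.

    text_dist:    (T,) distribution over T text features (e.g. LDA topics).
    cooccurrence: (T, V) counts of text feature t co-occurring with
                  visual word v in the same event's text-image pairs.
    Returns a (V,) distribution over visual words.
    """
    # Row-normalize counts into conditional distributions P(v | t).
    row_sums = cooccurrence.sum(axis=1, keepdims=True)
    p_v_given_t = cooccurrence / np.where(row_sums == 0, 1, row_sums)
    # Marginalize: P(v) = sum_t P(t) * P(v | t).
    image_dist = text_dist @ p_v_given_t
    return image_dist / image_dist.sum()

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy example: 3 text topics, 4 visual words.
text_dist = np.array([0.5, 0.3, 0.2])
cooccurrence = np.array([
    [8, 1, 1, 0],
    [1, 6, 2, 1],
    [0, 2, 7, 1],
], dtype=float)
mapped = map_text_to_image_space(text_dist, cooccurrence)
# Hypothetical observed image feature distribution for the same event.
observed = np.array([0.45, 0.25, 0.25, 0.05])
print(cosine_similarity(mapped, observed))
```

In this sketch, a high cosine similarity between the mapped distribution and the observed image distribution plays the role of the abstract's accuracy criterion; the paper's actual estimation of the co-occurrence statistics is more involved than simple row normalization.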