Application of GRU-based feature fusion in machine translation
Neural machine translation models are essentially encoder-decoder structures. Before encoding, characters must be abstracted into mathematical notation and passed to the encoder in the form of word vectors. Most methods that process the internal information of word vectors require substantial training resources and cannot simply and quickly improve machine translation performance. To address this issue, a technique is proposed that uses a gated recurrent unit (GRU) to filter word vectors on top of the Transformer model, operating directly on the training data. First, the word vector and the position vector are added together and passed to the GRU. The output filtered by the GRU is concatenated with the word vector. The concatenated feature vectors are then deeply fused through a linear layer, so that the resulting word vector representation incorporates the gating unit's filtering before being passed to the encoder and decoder. Experimental results on the Multi30k and IWSLT2016 datasets, and across different models, show that the proposed method improves the BLEU score of machine translation and achieves better translation performance.
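The fusion pipeline described in the abstract (add word and position vectors, filter through a GRU, concatenate the result with the word vector, and fuse with a linear layer) can be sketched as follows. This is a minimal illustrative NumPy version, not the authors' implementation: the dimensions, the single-layer GRU cell, and all weight matrices are hypothetical random stand-ins for trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8   # embedding / model width (illustrative)
T = 5   # sequence length (illustrative)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical GRU parameters: input weights W, recurrent weights U, biases b
# for the update gate (z), reset gate (r), and candidate state (h~).
Wz, Uz, bz = rng.normal(size=(d, d)), rng.normal(size=(d, d)), np.zeros(d)
Wr, Ur, br = rng.normal(size=(d, d)), rng.normal(size=(d, d)), np.zeros(d)
Wh, Uh, bh = rng.normal(size=(d, d)), rng.normal(size=(d, d)), np.zeros(d)

def gru_step(x, h):
    """One GRU step: gates decide how much of the input to let through."""
    z = sigmoid(Wz @ x + Uz @ h + bz)            # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)            # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)  # candidate state
    return (1 - z) * h + z * h_tilde

# Random stand-ins for trained word and position embedding lookups.
word_emb = rng.normal(size=(T, d))
pos_emb = rng.normal(size=(T, d))

# Step 1: add word and position vectors, pass the sum through the GRU.
x = word_emb + pos_emb
h = np.zeros(d)
gru_out = []
for t in range(T):
    h = gru_step(x[t], h)
    gru_out.append(h)
gru_out = np.stack(gru_out)                       # (T, d) filtered features

# Step 2: concatenate the GRU-filtered features with the word vectors.
concat = np.concatenate([word_emb, gru_out], axis=-1)  # (T, 2d)

# Step 3: a linear layer fuses the concatenation back to model width d;
# the result is what would be fed to the Transformer encoder/decoder.
W_fuse = rng.normal(size=(2 * d, d)) / np.sqrt(2 * d)
fused = concat @ W_fuse                           # (T, d)

print(fused.shape)
```

Because the GRU state is a convex combination of its previous state and a tanh candidate, the filtered features stay bounded in (-1, 1), which is one intuition for treating the GRU as a soft filter on the raw embeddings.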