Ancient poetry is an important part of Chinese culture, and projects such as Jiuge have already combined it with deep learning. The language of ancient poetry is highly refined, and understanding its meaning requires rich imagination; automatic translation is therefore difficult. This paper makes a preliminary attempt at the task: on a data set we collected ourselves, we train deep encoder-decoder models, namely GRU, LSTM, and Transformer models. We compare the results of the three models, each of which has its own advantages and disadvantages. However, because of the limited size of the data set and the limitations of the models themselves, the results are not yet ideal and still need improvement.
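To illustrate the recurrent half of the encoder-decoder approach, the sketch below implements a single GRU cell in NumPy and uses it to encode a toy input sequence into a fixed-size context vector, as the encoder side of a GRU-based translation model would. This is a minimal illustration, not the paper's actual implementation: the dimensions, random embeddings, and initialization are placeholder assumptions.

```python
import numpy as np

def gru_cell(x, h, params):
    """One GRU step: the update gate z decides how much of the previous
    hidden state h is kept versus overwritten by the candidate state."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    z = sigmoid(Wz @ x + Uz @ h + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)   # candidate state
    return (1.0 - z) * h + z * h_tilde

def init_params(input_dim, hidden_dim, rng):
    # Small random weights for the three gate/candidate blocks.
    shapes = [(hidden_dim, input_dim), (hidden_dim, hidden_dim), (hidden_dim,)] * 3
    return [rng.standard_normal(s) * 0.1 for s in shapes]

# Encode a toy "source poem": a sequence of stand-in character embeddings.
rng = np.random.default_rng(0)
input_dim, hidden_dim, seq_len = 8, 16, 5   # assumed toy sizes
params = init_params(input_dim, hidden_dim, rng)
h = np.zeros(hidden_dim)
for t in range(seq_len):
    x_t = rng.standard_normal(input_dim)    # placeholder for a real embedding
    h = gru_cell(x_t, h, params)
# h is the context vector a decoder GRU would be conditioned on.
```

An LSTM encoder differs only in carrying an extra cell state with three gates instead of two, and the Transformer replaces recurrence entirely with self-attention over the whole sequence.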
Title: A Comparative Study of Different Models in Ancient Poetry Translation
Authors: Wang Boyuan, Le Xiangli, Wang Hainan, Zhang Baochang