LIU Wenbin (刘文斌), HE Yanqing, LAN Tian, WU Zhenfeng. Research on system combination of machine translation based on Transformer [J]. High Technology Letters (English edition), 2023, 29(3): 310-317.
|
Research on system combination of machine translation based on Transformer |
|
DOI: 10.3772/j.issn.1006-6748.2023.03.010
Keywords: Transformer, system combination, neural machine translation (NMT), attention mechanism, multi-encoder
Authors: LIU Wenbin (刘文斌), HE Yanqing, LAN Tian, WU Zhenfeng
Affiliation: Research Center for Information Science Theory and Methodology, Institute of Scientific and Technical Information of China, Beijing 100038, P. R. China
|
|
Abstract:
Influenced by their training corpora, different machine translation systems vary greatly in performance. To obtain higher-quality translations, system combination methods merge the outputs of multiple systems through statistical combination or neural network combination. This paper proposes a new multi-system translation combination method based on the Transformer architecture, which uses a multi-encoder to encode the source sentence and the translation result of each system, realizing both encoder combination and decoder combination. Experiments on a Chinese-English translation task show that the method gains 1.2-2.35 bilingual evaluation understudy (BLEU) points over the best single-system result, 0.71-3.12 BLEU points over the statistical combination method, and 0.14-0.62 BLEU points over the state-of-the-art neural network combination method. The results demonstrate the effectiveness of the proposed Transformer-based system combination method.
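The abstract only names the multi-encoder idea, so the following PyTorch sketch is an illustrative assumption rather than the authors' implementation: the source sentence and each single-system hypothesis get their own Transformer encoder, and the decoder cross-attends to a combined memory (here, a simple concatenation of the encoder outputs; the paper's actual encoder/decoder combination mechanisms, dimensions, and class names are not specified in this page). Positional encodings and the causal target mask are omitted for brevity.

# Minimal sketch of a multi-encoder system combination model (assumed design).
import torch
import torch.nn as nn

class MultiEncoderCombiner(nn.Module):
    def __init__(self, vocab_size=32000, d_model=512, nhead=8,
                 num_layers=6, num_systems=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        # one encoder for the source sentence plus one per single-system hypothesis
        self.encoders = nn.ModuleList(
            [nn.TransformerEncoder(enc_layer, num_layers) for _ in range(1 + num_systems)]
        )
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src, hyps, tgt):
        # src: (batch, src_len); hyps: list of (batch, hyp_len); tgt: (batch, tgt_len)
        memories = [self.encoders[0](self.embed(src))]
        for i, hyp in enumerate(hyps, start=1):
            memories.append(self.encoders[i](self.embed(hyp)))
        # "encoder combination" here: concatenate encoder outputs along the time axis
        # so the decoder's cross-attention sees the source and all hypotheses at once
        memory = torch.cat(memories, dim=1)
        dec = self.decoder(self.embed(tgt), memory)
        return self.out(dec)

if __name__ == "__main__":
    model = MultiEncoderCombiner()
    src = torch.randint(0, 32000, (2, 10))
    hyps = [torch.randint(0, 32000, (2, 12)) for _ in range(3)]
    tgt = torch.randint(0, 32000, (2, 9))
    print(model(src, hyps, tgt).shape)  # torch.Size([2, 9, 32000])

Other combination rules (e.g., averaging the encoder memories, or giving the decoder a separate cross-attention per encoder) fit the same interface; the concatenation above is only one plausible instantiation of encoder combination.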
|
|
|