Longyue Wang
Tencent AI Lab
Verified email at tencent.com
Title · Cited by · Year
Exploiting cross-sentence context for neural machine translation
L Wang, Z Tu, A Way, Q Liu
arXiv preprint arXiv:1704.04347, 2017
Cited by: 180 (2017)
UM-Corpus: A Large English-Chinese Parallel Corpus for Statistical Machine Translation
L Tian, DF Wong, LS Chao, P Quaresma, F Oliveira, L Yi, S Li, Y Wang, ...
LREC, 1837-1842, 2014
Cited by: 105 (2014)
Convolutional self-attention networks
B Yang, L Wang, D Wong, LS Chao, Z Tu
arXiv preprint arXiv:1904.03107, 2019
Cited by: 101 (2019)
Modeling recurrence for transformer
J Hao, X Wang, B Yang, L Wang, J Zhang, Z Tu
arXiv preprint arXiv:1904.03092, 2019
Cited by: 67 (2019)
Self-attention with structural position representations
X Wang, Z Tu, L Wang, S Shi
arXiv preprint arXiv:1909.00383, 2019
Cited by: 60 (2019)
Dynamic layer aggregation for neural machine translation with routing-by-agreement
ZY Dou, Z Tu, X Wang, L Wang, S Shi, T Zhang
Proceedings of the AAAI Conference on Artificial Intelligence 33 (01), 86-93, 2019
Cited by: 45 (2019)
Translating pro-drop languages with reconstruction models
L Wang, Z Tu, S Shi, T Zhang, Y Graham, Q Liu
Proceedings of the AAAI Conference on Artificial Intelligence 32 (1), 2018
Cited by: 39 (2018)
Understanding and improving lexical choice in non-autoregressive translation
L Ding, L Wang, X Liu, DF Wong, D Tao, Z Tu
arXiv preprint arXiv:2012.14583, 2020
Cited by: 36 (2020)
A novel approach to dropped pronoun translation
L Wang, Z Tu, X Zhang, H Li, A Way, Q Liu
arXiv preprint arXiv:1604.06285, 2016
Cited by: 35 (2016)
A systematic comparison of data selection criteria for SMT domain adaptation
L Wang, DF Wong, LS Chao, Y Lu, J Xing
The Scientific World Journal 2014, 2014
Cited by: 32 (2014)
Towards understanding neural machine translation with word importance
S He, Z Tu, X Wang, L Wang, MR Lyu, S Shi
arXiv preprint arXiv:1909.00326, 2019
Cited by: 31 (2019)
Automatic construction of discourse corpora for dialogue translation
L Wang, X Zhang, Z Tu, A Way, Q Liu
arXiv preprint arXiv:1605.06770, 2016
Cited by: 31 (2016)
Context-aware cross-attention for non-autoregressive translation
L Ding, L Wang, D Wu, D Tao, Z Tu
arXiv preprint arXiv:2011.00770, 2020
Cited by: 28 (2020)
Assessing the ability of self-attention networks to learn word order
B Yang, L Wang, DF Wong, LS Chao, Z Tu
arXiv preprint arXiv:1906.00592, 2019
Cited by: 25 (2019)
How does selective mechanism improve self-attention networks?
X Geng, L Wang, X Wang, B Qin, T Liu, Z Tu
arXiv preprint arXiv:2005.00979, 2020
Cited by: 24 (2020)
Self-attention with cross-lingual position representation
L Ding, L Wang, D Tao
arXiv preprint arXiv:2004.13310, 2020
Cited by: 23 (2020)
Rejuvenating low-frequency words: Making the most of parallel data in non-autoregressive translation
L Ding, L Wang, X Liu, DF Wong, D Tao, Z Tu
arXiv preprint arXiv:2106.00903, 2021
Cited by: 22 (2021)
Understanding and improving encoder layer fusion in sequence-to-sequence learning
X Liu, L Wang, DF Wong, L Ding, LS Chao, Z Tu
arXiv preprint arXiv:2012.14768, 2020
Cited by: 22 (2020)
Exploiting sentential context for neural machine translation
X Wang, Z Tu, L Wang, S Shi
arXiv preprint arXiv:1906.01268, 2019
Cited by: 22 (2019)
Pivot machine translation using Chinese as pivot language
CH Liu, CC Silva, L Wang, A Way
Machine Translation: 14th China Workshop, CWMT 2018, Wuyishan, China …, 2019
Cited by: 20 (2019)