Nan Duan
Senior Principal Research Manager, Microsoft Research
Verified email at microsoft.com
Title · Cited by · Year
CodeBERT: A pre-trained model for programming and natural languages
Z Feng, D Guo, D Tang, N Duan, X Feng, M Gong, L Shou, B Qin, T Liu, ...
arXiv preprint arXiv:2002.08155, 2020
Cited by 1815 · 2020
Unicoder-VL: A universal encoder for vision and language by cross-modal pre-training
G Li, N Duan, Y Fang, M Gong, D Jiang
Proceedings of the AAAI Conference on Artificial Intelligence 34 (07), 11336 …, 2020
Cited by 828 · 2020
GraphCodeBERT: Pre-training code representations with data flow
D Guo, S Ren, S Lu, Z Feng, D Tang, S Liu, L Zhou, N Duan, ...
arXiv preprint arXiv:2009.08366, 2020
Cited by 685 · 2020
CLIP4Clip: An empirical study of CLIP for end to end video clip retrieval and captioning
H Luo, L Ji, M Zhong, Y Chen, W Lei, N Duan, T Li
Neurocomputing 508, 293-304, 2022
Cited by 560 · 2022
CodeXGLUE: A machine learning benchmark dataset for code understanding and generation
S Lu, D Guo, S Ren, J Huang, A Svyatkovskiy, A Blanco, C Clement, ...
arXiv preprint arXiv:2102.04664, 2021
Cited by 519 · 2021
K-Adapter: Infusing knowledge into pre-trained models with adapters
R Wang, D Tang, N Duan, Z Wei, X Huang, G Cao, D Jiang, M Zhou
arXiv preprint arXiv:2002.01808, 2020
Cited by 483 · 2020
ProphetNet: Predicting future n-gram for sequence-to-sequence pre-training
W Qi, Y Yan, Y Gong, D Liu, N Duan, J Chen, R Zhang, M Zhou
arXiv preprint arXiv:2001.04063, 2020
Cited by 416 · 2020
Visual ChatGPT: Talking, drawing and editing with visual foundation models
C Wu, S Yin, W Qi, X Wang, Z Tang, N Duan
arXiv preprint arXiv:2303.04671, 2023
Cited by 404 · 2023
UniVL: A unified video and language pre-training model for multimodal understanding and generation
H Luo, L Ji, B Shi, H Huang, N Duan, T Li, J Li, T Bharti, M Zhou
arXiv preprint arXiv:2002.06353, 2020
Cited by 374 · 2020
Question generation for question answering
N Duan, D Tang, P Chen, M Zhou
Proceedings of the 2017 conference on empirical methods in natural language …, 2017
Cited by 323 · 2017
UniXcoder: Unified cross-modal pre-training for code representation
D Guo, S Lu, N Duan, Y Wang, M Zhou, J Yin
arXiv preprint arXiv:2203.03850, 2022
Cited by 310 · 2022
XGLUE: A new benchmark dataset for cross-lingual pre-training, understanding and generation
Y Liang, N Duan, Y Gong, N Wu, F Guo, W Qi, M Gong, L Shou, D Jiang, ...
arXiv preprint arXiv:2004.01401, 2020
Cited by 271 · 2020
Constraint-based question answering with knowledge graph
J Bao, N Duan, Z Yan, M Zhou, T Zhao
Proceedings of COLING 2016, the 26th international conference on …, 2016
Cited by 268 · 2016
ImageBERT: Cross-modal pre-training with large-scale weak-supervised image-text data
D Qi, L Su, J Song, E Cui, T Bharti, A Sacheti
arXiv preprint arXiv:2001.07966, 2020
Cited by 256 · 2020
Pretraining-based natural language generation for text summarization
H Zhang, J Xu, J Wang
arXiv preprint arXiv:1902.09243, 2019
Cited by 240 · 2019
NÜWA: Visual synthesis pre-training for neural visual world creation
C Wu, J Liang, L Ji, F Yang, Y Fang, D Jiang, N Duan
European conference on computer vision, 720-736, 2022
Cited by 224 · 2022
Building task-oriented dialogue systems for online shopping
Z Yan, N Duan, P Chen, M Zhou, J Zhou, Z Li
Proceedings of the AAAI Conference on Artificial Intelligence 31 (1), 2017
Cited by 218 · 2017
Unicoder: A universal language encoder by pre-training with multiple cross-lingual tasks
H Huang, Y Liang, N Duan, M Gong, L Shou, D Jiang, M Zhou
arXiv preprint arXiv:1909.00964, 2019
Cited by 216 · 2019
GraphCodeBERT: Pre-training code representations with data flow
D Guo, S Ren, S Lu, Z Feng, D Tang, S Liu, ..., D Drain, N Sundaresan, J Yin, D Jiang, M Zhou
9th International Conference on Learning Representations (ICLR), 2021
Cited by 202 · 2021