Marjan Ghazvininejad
Research Scientist, FAIR (Facebook AI Research)
Verified email at fb.com
Title | Cited by | Year
BART: Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension
M Lewis, Y Liu, N Goyal, M Ghazvininejad, A Mohamed, O Levy, ...
arXiv preprint arXiv:1910.13461, 2019
Cited by 5434 · 2019
Multilingual denoising pre-training for neural machine translation
Y Liu, J Gu, N Goyal, X Li, S Edunov, M Ghazvininejad, M Lewis, ...
Transactions of the Association for Computational Linguistics 8, 726-742, 2020
Cited by 997 · 2020
A knowledge-grounded neural conversation model
M Ghazvininejad, C Brockett, MW Chang, B Dolan, J Gao, W Yih, ...
Proceedings of the AAAI Conference on Artificial Intelligence 32 (1), 2018
Cited by 541 · 2018
Mask-predict: Parallel decoding of conditional masked language models
M Ghazvininejad, O Levy, Y Liu, L Zettlemoyer
arXiv preprint arXiv:1904.09324, 2019
Cited by 411 · 2019
Generating Topical Poetry
M Ghazvininejad, X Shi, Y Choi, K Knight
Empirical Methods in Natural Language Processing, 2016
Cited by 168 · 2016
Hafez: an Interactive Poetry Generation System
M Ghazvininejad, X Shi, J Priyadarshi, K Knight
Proceedings of the ACL Demo Track, 2017
Cited by 145 · 2017
Towards controllable story generation
N Peng, M Ghazvininejad, J May, K Knight
Proceedings of the First Workshop on Storytelling, 43-49, 2018
Cited by 129 · 2018
Pre-training via paraphrasing
M Lewis, M Ghazvininejad, G Ghosh, A Aghajanyan, S Wang, ...
Advances in Neural Information Processing Systems 33, 18470-18481, 2020
Cited by 117 · 2020
Non-autoregressive machine translation with disentangled context transformer
J Kasai, J Cross, M Ghazvininejad, J Gu
International conference on machine learning, 5144-5155, 2020
Cited by 95* · 2020
DeLighT: Deep and light-weight transformer
S Mehta, M Ghazvininejad, S Iyer, L Zettlemoyer, H Hajishirzi
arXiv preprint arXiv:2008.00623, 2020
Cited by 87 · 2020
Aligned cross entropy for non-autoregressive machine translation
M Ghazvininejad, V Karpukhin, L Zettlemoyer, O Levy
International Conference on Machine Learning, 3515-3523, 2020
Cited by 86 · 2020
Training on synthetic noise improves robustness to natural noise in machine translation
V Karpukhin, O Levy, J Eisenstein, M Ghazvininejad
arXiv preprint arXiv:1902.01509, 2019
Cited by 76 · 2019
Detecting hallucinated content in conditional neural sequence generation
C Zhou, G Neubig, J Gu, M Diab, P Guzman, L Zettlemoyer, ...
arXiv preprint arXiv:2011.02593, 2020
Cited by 68 · 2020
Improving zero and few-shot abstractive summarization with intermediate fine-tuning and data augmentation
AR Fabbri, S Han, H Li, H Li, M Ghazvininejad, S Joty, D Radev, ...
arXiv preprint arXiv:2010.12836, 2020
Cited by 58 · 2020
Semi-autoregressive training improves mask-predict decoding
M Ghazvininejad, O Levy, L Zettlemoyer
arXiv preprint arXiv:2001.08785, 2020
Cited by 50 · 2020
From local similarity to global coding: An application to image classification
A Shaban, HR Rabiee, M Farajtabar, M Ghazvininejad
Proceedings of the IEEE Conference on Computer Vision and Pattern …, 2013
Cited by 40 · 2013
A review on language models as knowledge bases
B AlKhamissi, M Li, A Celikyilmaz, M Diab, M Ghazvininejad
arXiv preprint arXiv:2204.06031, 2022
Cited by 33 · 2022
Prompting contrastive explanations for commonsense reasoning tasks
B Paranjape, J Michael, M Ghazvininejad, L Zettlemoyer, H Hajishirzi
arXiv preprint arXiv:2106.06823, 2021
Cited by 32 · 2021
Simple and effective retrieve-edit-rerank text generation
N Hossain, M Ghazvininejad, L Zettlemoyer
Proceedings of the 58th Annual Meeting of the Association for Computational …, 2020
Cited by 30 · 2020
Neural poetry translation
M Ghazvininejad, Y Choi, K Knight
Proceedings of the 2018 Conference of the North American Chapter of the …, 2018
Cited by 25 · 2018
Articles 1–20