Llion Jones
Verified email at google.com
Title · Cited by · Year
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Advances in neural information processing systems, 5998-6008, 2017
27165 · 2017
Natural questions: a benchmark for question answering research
T Kwiatkowski, J Palomaki, O Redfield, M Collins, A Parikh, C Alberti, ...
Transactions of the Association for Computational Linguistics 7, 453-466, 2019
533 · 2019
Tensor2tensor for neural machine translation
A Vaswani, S Bengio, E Brevdo, F Chollet, AN Gomez, S Gouws, L Jones, ...
arXiv preprint arXiv:1803.07416, 2018
396 · 2018
The best of both worlds: Combining recent advances in neural machine translation
MX Chen, O Firat, A Bapna, M Johnson, W Macherey, G Foster, L Jones, ...
arXiv preprint arXiv:1804.09849, 2018
306 · 2018
One model to learn them all
L Kaiser, AN Gomez, N Shazeer, A Vaswani, N Parmar, L Jones, ...
arXiv preprint arXiv:1706.05137, 2017
262 · 2017
Character-level language modeling with deeper self-attention
R Al-Rfou, D Choe, N Constant, M Guo, L Jones
Proceedings of the AAAI Conference on Artificial Intelligence 33 (01), 3159-3166, 2019
191 · 2019
Attention is all you need. arXiv 2017
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762, 2017
130 · 2017
Wikireading: A novel large-scale language understanding task over wikipedia
D Hewlett, A Lacoste, L Jones, I Polosukhin, A Fandrianto, J Han, ...
arXiv preprint arXiv:1608.03542, 2016
125 · 2016
Advances in neural information processing systems
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Proceedings of Machine Learning Research, 5998-6008, 2017
117 · 2017
Łukasz Kaiser, and Illia Polosukhin. 2017. Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez
Advances in neural information processing systems, 5998-6008, 2017
114* · 2017
Lingvo: a modular and scalable framework for sequence-to-sequence modeling
J Shen, P Nguyen, Y Wu, Z Chen, MX Chen, Y Jia, A Kannan, T Sainath, ...
arXiv preprint arXiv:1902.08295, 2019
99 · 2019
ProtTrans: towards cracking the language of Life's code through self-supervised deep learning and high performance computing
A Elnaggar, M Heinzinger, C Dallago, G Rihawi, Y Wang, L Jones, ...
arXiv preprint arXiv:2007.06225, 2020
78* · 2020
Attention is all you need. CoRR abs/1706.03762 (2017)
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762, 2017
78 · 2017
Accurate supervised and semi-supervised machine reading for long documents
D Hewlett, L Jones, A Lacoste, I Gur
Proceedings of the 2017 Conference on Empirical Methods in Natural Language …, 2017
20 · 2017
Byte-level machine reading across morphologically varied languages
T Kenter, L Jones, D Hewlett
Thirty-Second AAAI Conference on Artificial Intelligence, 2018
12 · 2018
Machine translation using neural network models
Z Chen, MR Hughes, Y Wu, M Schuster, X Chen, LO Jones, NJ Parmar, ...
US Patent App. 16/521,780, 2020
4 · 2020
Kaiser. 2017. Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez
arXiv preprint arXiv:1601.03317, 0
4
Attention-based sequence transduction neural networks
NM Shazeer, AN Gomez, LM Kaiser, JD Uszkoreit, LO Jones, NJ Parmar, ...
US Patent 10,452,978, 2019
2 · 2019
CodeTrans: Towards Cracking the Language of Silicon's Code Through Self-Supervised Deep Learning and High Performance Computing
A Elnaggar, W Ding, L Jones, T Gibbs, T Feher, C Angerer, S Severini, ...
arXiv preprint arXiv:2104.02443, 2021
1 · 2021
DF-Conformer: Integrated architecture of Conv-TasNet and Conformer using linear complexity self-attention for speech enhancement
Y Koizumi, S Karita, S Wisdom, H Erdogan, JR Hershey, L Jones, ...
arXiv preprint arXiv:2106.15813, 2021
— · 2021
Articles 1–20