Iulia Turc
Verified email at google.com
Cited by
Well-read students learn better: On the importance of pre-training compact models
I Turc, MW Chang, K Lee, K Toutanova
arXiv preprint arXiv:1908.08962, 2019
Well-read students learn better: The impact of student initialization on knowledge distillation
I Turc, MW Chang, K Lee, K Toutanova
arXiv preprint arXiv:1908.08962, 2019
Canine: Pre-training an efficient tokenization-free encoder for language representation
JH Clark, D Garrette, I Turc, J Wieting
arXiv preprint arXiv:2103.06874, 2021
The MultiBERTs: BERT reproductions for robustness analysis
T Sellam, S Yadlowsky, J Wei, N Saphra, A D'Amour, T Linzen, J Bastings, ...
arXiv preprint arXiv:2106.16163, 2021
High Performance Natural Language Processing
G Ilharco, C Ilharco, I Turc, T Dettmers, F Ferreira, K Lee
Proceedings of the 2020 Conference on Empirical Methods in Natural Language …, 2020
Revisiting the primacy of English in zero-shot cross-lingual transfer
I Turc, K Lee, J Eisenstein, MW Chang, K Toutanova
arXiv preprint arXiv:2106.16171, 2021