Nikolaos Pappas
Amazon (AWS AI Labs)
Verified email at amazon.com - Homepage
Title · Cited by · Year
Transformers are RNNs: Fast autoregressive transformers with linear attention
A Katharopoulos, A Vyas, N Pappas, F Fleuret
Thirty-seventh International Conference on Machine Learning (ICML), 2020
Cited by 1337 · 2020
Random feature attention
H Peng, N Pappas, D Yogatama, R Schwartz, NA Smith, L Kong
International Conference on Learning Representations (ICLR), 2021
Cited by 313 · 2021
Document-level neural machine translation with hierarchical attention networks
L Miculicich, D Ram, N Pappas, J Henderson
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2018
Cited by 304 · 2018
Deep encoder, shallow decoder: Reevaluating non-autoregressive machine translation
J Kasai, N Pappas, H Peng, J Cross, NA Smith
International Conference on Learning Representations (ICLR), 2021
Cited by 165 · 2021
Visual affect around the world: A large-scale multilingual visual sentiment ontology
B Jou, T Chen, N Pappas, M Redi, M Topkara, SF Chang
23rd ACM International Conference on Multimedia (MM), 159-168, 2015
Cited by 159 · 2015
Multilingual hierarchical attention networks for document classification
N Pappas, A Popescu-Belis
8th International Joint Conference on Natural Language Processing (IJCNLP), 2017
Cited by 140 · 2017
Sentiment analysis of user comments for one-class collaborative filtering over TED talks
N Pappas, A Popescu-Belis
36th International ACM SIGIR Conference on Research and Development in …, 2013
Cited by 117 · 2013
GILE: A generalized input-label embedding for text classification
N Pappas, J Henderson
Transactions of the Association for Computational Linguistics (TACL) 7, 139-155, 2019
Cited by 83 · 2019
Explaining the stars: Weighted multiple-instance learning for aspect-based sentiment analysis
N Pappas, A Popescu-Belis
Conference on Empirical Methods In Natural Language Processing (EMNLP), 455-466, 2014
Cited by 76 · 2014
Explicit document modeling through weighted multiple-instance learning
N Pappas, A Popescu-Belis
Journal of Artificial Intelligence Research (JAIR) 58, 591-626, 2017
Cited by 61 · 2017
Finetuning pretrained transformers into RNNs
J Kasai, H Peng, Y Zhang, D Yogatama, G Ilharco, N Pappas, Y Mao, ...
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021
Cited by 48 · 2021
Combining content with user preferences for TED lecture recommendation
N Pappas, A Popescu-Belis
11th International Workshop on Content Based Multimedia Indexing (CBMI), 2013
Cited by 41 · 2013
Integrating weakly supervised word sense disambiguation into neural machine translation
X Pu, N Pappas, J Henderson, A Popescu-Belis
Transactions of the Association for Computational Linguistics (TACL) 6, 635-649, 2018
Cited by 40 · 2018
Plug and play autoencoders for conditional text generation
F Mai, N Pappas, I Montero, NA Smith, J Henderson
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020
Cited by 32 · 2020
Adaptive sentiment-aware one-class collaborative filtering
N Pappas, A Popescu-Belis
Expert Systems with Applications (ESWA), 2015
Cited by 32 · 2015
Distinguishing the popularity between topics: A system for up-to-date opinion retrieval and mining in the web
N Pappas, G Katsimpras, E Stamatatos
14th International Conference on Intelligent Text Processing and …, 2013
Cited by 31 · 2013
Sentence bottleneck autoencoders from transformer language models
I Montero, N Pappas, NA Smith
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021
Cited by 26 · 2021
Combining content with user preferences for non-fiction multimedia recommendation: A study on TED lectures
N Pappas, A Popescu-Belis
Multimedia Tools and Applications (MTAP), 2015
Cited by 26 · 2015
Multi-factor segmentation for topic visualization and recommendation: the must-vis system
CA Bhatt, A Popescu-Belis, M Habibi, S Ingram, S Masneri, F McInnes, ...
21st ACM International Conference on Multimedia (MM), 365-368, 2013
Cited by 26 · 2013
Self-attentive residual decoder for neural machine translation
LM Werlen, N Pappas, D Ram, A Popescu-Belis
Proceedings of the 2018 Conference of the North American Chapter of the …, 2018
Cited by 25 · 2018
Articles 1–20