Kazuma Hashimoto
Salesforce Research
Verified email at salesforce.com - Homepage
Title · Cited by · Year
A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks
K Hashimoto, C Xiong, Y Tsuruoka, R Socher
Proceedings of the 2017 Conference on Empirical Methods in Natural Language …, 2017
333 · 2017
Tree-to-Sequence Attentional Neural Machine Translation
A Eriguchi, K Hashimoto, Y Tsuruoka
Proceedings of the 54th Annual Meeting of the Association for Computational …, 2016
189 · 2016
Simple Customization of Recursive Neural Networks for Semantic Relation Classification
K Hashimoto, M Miwa, Y Tsuruoka, T Chikayama
Proceedings of the 2013 Conference on Empirical Methods in Natural Language …, 2013
98 · 2013
Topic detection using paragraph vectors to support active learning in systematic reviews
K Hashimoto, G Kontonatsios, M Miwa, S Ananiadou
Journal of biomedical informatics 62, 59-65, 2016
66 · 2016
Task-Oriented Learning of Word Embeddings for Semantic Relation Classification
K Hashimoto, P Stenetorp, M Miwa, Y Tsuruoka
Proceedings of the Nineteenth Conference on Computational Natural Language …, 2015
55 · 2015
Jointly Learning Word Representations and Composition Functions Using Predicate-Argument Structures
K Hashimoto, P Stenetorp, M Miwa, Y Tsuruoka
Proceedings of the 2014 Conference on Empirical Methods in Natural Language …, 2014
38 · 2014
Learning to retrieve reasoning paths over Wikipedia graph for question answering
A Asai, K Hashimoto, H Hajishirzi, R Socher, C Xiong
ICLR 2020, 2020
30 · 2020
Neural Machine Translation with Source-Side Latent Graph Parsing
K Hashimoto, Y Tsuruoka
Proceedings of the 2017 Conference on Empirical Methods in Natural Language …, 2017
30 · 2017
Adaptive Joint Learning of Compositional and Non-Compositional Phrase Embeddings
K Hashimoto, Y Tsuruoka
Proceedings of the 54th Annual Meeting of the Association for Computational …, 2016
28 · 2016
Multilingual extractive reading comprehension by runtime machine translation
A Asai, A Eriguchi, K Hashimoto, Y Tsuruoka
arXiv preprint arXiv:1809.03275, 2018
19 · 2018
Find or classify? Dual strategy for slot-value predictions on multi-domain dialog state tracking
JG Zhang, K Hashimoto, CS Wu, Y Wan, PS Yu, R Socher, C Xiong
arXiv preprint arXiv:1910.03544, 2019
17 · 2019
Training a Joint Many-Task Neural Network Model using Successive Regularization
K Hashimoto, C Xiong, R Socher
US Patent App. 15/421,431, 2018
15 · 2018
Joint Many-Task Neural Network Model for Multiple Natural Language Processing (NLP) Tasks
K Hashimoto, C Xiong, R Socher
US Patent App. 15/421,407, 2018
13 · 2018
Character-based Decoding in Tree-to-Sequence Attention-based Neural Machine Translation
A Eriguchi, K Hashimoto, Y Tsuruoka
Proceedings of the 3rd Workshop on Asian Translation (WAT2016), 175-183, 2016
10 · 2016
Deep Neural Network Model for Processing Data Through Multiple Linguistic Task Hierarchies
K Hashimoto, C Xiong, R Socher
US Patent App. 15/421,424, 2018
9 · 2018
Domain Adaptation and Attention-Based Unknown Word Replacement in Chinese-to-Japanese Neural Machine Translation
K Hashimoto, A Eriguchi, Y Tsuruoka
Proceedings of the 3rd Workshop on Asian Translation (WAT 2016), 75-83, 2016
9 · 2016
Incorporating source-side phrase structures into neural machine translation
A Eriguchi, K Hashimoto, Y Tsuruoka
Computational Linguistics 45 (2), 267-292, 2019
7 · 2019
Learning Embeddings for Transitive Verb Disambiguation by Implicit Tensor Factorization
K Hashimoto, Y Tsuruoka
Proceedings of the 3rd Workshop on Continuous Vector Space Models and their …, 2015
7 · 2015
CO-Search: COVID-19 information retrieval with semantic search, question answering, and abstractive summarization
A Esteva, A Kale, R Paulus, K Hashimoto, W Yin, D Radev, R Socher
arXiv preprint arXiv:2006.09595, 2020
6 · 2020
Parallelizing and optimizing neural Encoder–Decoder models without padding on multi-core architecture
Y Qiao, K Hashimoto, A Eriguchi, H Wang, D Wang, Y Tsuruoka, K Taura
Future Generation Computer Systems 108, 1206-1213, 2020
5 · 2020
Articles 1–20