Kosuke Nishida
NTT Human Informatics Laboratories / The University of Tokyo
Verified email at hco.ntt.co.jp
Title
Cited by
Year
Answering while summarizing: Multi-task learning for multi-hop QA with evidence extraction
K Nishida, K Nishida, M Nagata, A Otsuka, I Saito, H Asano, J Tomita
arXiv preprint arXiv:1905.08511, 2019
135 · 2019
Multi-style generative reading comprehension
K Nishida, I Saito, K Nishida, K Shinoda, A Otsuka, H Asano, J Tomita
arXiv preprint arXiv:1901.02262, 2019
97 · 2019
Abstractive summarization with combination of pre-trained sequence-to-sequence and saliency models
I Saito, K Nishida, K Nishida, J Tomita
arXiv preprint arXiv:2003.13028, 2020
39 · 2020
Length-controllable abstractive summarization by guiding with summary prototype
I Saito, K Nishida, K Nishida, A Otsuka, H Asano, J Tomita, H Shindo, ...
arXiv preprint arXiv:2001.07331, 2020
26 · 2020
Unsupervised domain adaptation of language models for reading comprehension
K Nishida, K Nishida, I Saito, H Asano, J Tomita
arXiv preprint arXiv:1911.10768, 2019
21 · 2019
SlideVQA: A dataset for document visual question answering on multiple images
R Tanaka, K Nishida, K Nishida, T Hasegawa, I Saito, K Saito
Proceedings of the AAAI Conference on Artificial Intelligence 37 (11), 13636 …, 2023
10 · 2023
Towards interpretable and reliable reading comprehension: A pipeline model with unanswerability prediction
K Nishida, K Nishida, I Saito, S Yoshida
2021 International Joint Conference on Neural Networks (IJCNN), 1-8, 2021
9 · 2021
Question responding apparatus, question responding method and program
A Otsuka, K Nishida, I Saito, K Nishida, H Asano, J Tomita
US Patent App. 16/972,187, 2021
6 · 2021
Household energy consumption prediction by feature selection of lifestyle data
K Nishida, A Takeda, S Iwata, M Kiho, I Nakayama
2017 IEEE International Conference on Smart Grid Communications …, 2017
6 · 2017
Vector generation device, sentence pair learning device, vector generation method, sentence pair learning method, and program
K Nishida, K Nishida, H Asano, J Tomita
US Patent 11,893,353, 2024
5 · 2024
Task-adaptive pre-training of language models with word embedding regularization
K Nishida, K Nishida, S Yoshida
arXiv preprint arXiv:2109.08354, 2021
5 · 2021
Natural language inference with definition embedding considering context on the fly
K Nishida, K Nishida, H Asano, J Tomita
Proceedings of The Third Workshop on Representation Learning for NLP, 58-63, 2018
4 · 2018
Self-adaptive named entity recognition by retrieving unstructured knowledge
K Nishida, N Yoshinaga, K Nishida
arXiv preprint arXiv:2210.07523, 2022
3 · 2022
Answer generating device, answer learning device, answer generating method, and answer generating program
K Nishida, A Otsuka, K Nishida, H Asano, J Tomita, I Saito
US Patent App. 17/433,096, 2022
3 · 2022
Improving few-shot image classification using machine-and user-generated natural language descriptions
K Nishida, K Nishida, S Nishioka
arXiv preprint arXiv:2207.03133, 2022
2 · 2022
Text generation apparatus, text generation method, text generation learning apparatus, text generation learning method and program
I Saito, K Nishida, A Otsuka, K Nishida, H Asano, J Tomita
US Patent App. 17/435,018, 2022
2 · 2022
Text generation apparatus, text generation learning apparatus, text generation method, text generation learning method and program
I Saito, K Nishida, K Nishida, H Asano, J Tomita
US Patent App. 17/908,212, 2023
1 · 2023
Robust Text-driven Image Editing Method that Adaptively Explores Directions in Latent Spaces of StyleGAN and CLIP
T Baba, K Nishida, K Nishida
arXiv preprint arXiv:2304.00964, 2023
1 · 2023
Information processing apparatus, information processing method and program
K Nishida, K Nishida, I Saito, H Asano, J Tomita
US Patent App. 17/770,953, 2022
1 · 2022
Text generation apparatus, text generation learning apparatus, text generation method, text generation learning method and program
I Saito, K Nishida, A Otsuka, K Nishida, H Asano, J Tomita
US Patent App. 17/764,169, 2022
1 · 2022
Articles 1–20