Mikel Bober-Irizar
University of Cambridge & University of Surrey
Verified email at surrey.ac.uk - Homepage
Title · Cited by · Year
Deep Learning for Classical Japanese Literature
T Clanuwat, M Bober-Irizar, A Kitamoto, A Lamb, K Yamamoto, D Ha
NeurIPS 2018 Workshop on Machine Learning for Creativity and Design, 2018
Cited by 715 · 2018
KaoKore: A Pre-modern Japanese Art Facial Expression Dataset
Y Tian, C Suzuki, T Clanuwat, M Bober-Irizar, A Lamb, A Kitamoto
International Conference on Computational Creativity, 2020
Cited by 34 · 2020
Deep Architectures and Ensembles for Semantic Video Classification
EJ Ong, S Husain, M Bober-Irizar, M Bober
IEEE Transactions on Circuits and Systems for Video Technology, 2018
Cited by 25 · 2018
Architectural Backdoors in Neural Networks
M Bober-Irizar, I Shumailov, Y Zhao, R Mullins, N Papernot
Computer Vision and Pattern Recognition (CVPR) 2023, 2022
Cited by 16 · 2022
Cultivating DNN Diversity for Large Scale Video Labelling
M Bober-Irizar, S Husain, EJ Ong, M Bober
CVPR 2017, Youtube-8M Workshop, 2017
Cited by 11 · 2017
Single-cell subcellular protein localisation using novel ensembles of diverse deep architectures
SS Husain, EJ Ong, D Minskiy, M Bober-Irizar, A Irizar, M Bober
Communications Biology 6 (1), 489, 2023
Cited by 5* · 2023
Neural networks for abstraction and reasoning: Towards broad generalization in machines
M Bober-Irizar, S Banerjee
arXiv preprint arXiv:2402.03507, 2024
Cited by 3 · 2024
Progress and Results of Kaggle Machine Learning Competition for Kuzushiji Recognition
A Kitamoto, T Clanuwat, A Lamb, M Bober-Irizar
Proceedings of IPSJ SIG Computers and the Humanities Symposium 2019, (in press), 2019
Cited by 3 · 2019
Predicting the Ordering of Characters in Japanese Historical Documents
A Lamb, T Clanuwat, S Han, M Bober-Irizar, A Kitamoto
arXiv preprint arXiv:2106.06786, 2021
Cited by 1 · 2021
Learning to Localize Temporal Events in Large-scale Video Data
M Bober-Irizar, M Skalic, D Austin
International Conference on Computer Vision 2019, 3rd Youtube-8M Workshop, 2019
· 2019