Rathinakumar Appuswamy
Senior Research Scientist, IBM Almaden Research Center
Verified email at us.ibm.com - Homepage
Title
Cited by
Year
A million spiking-neuron integrated circuit with a scalable communication network and interface
PA Merolla, JV Arthur, R Alvarez-Icaza, AS Cassidy, J Sawada, ...
Science 345 (6197), 668-673, 2014
Cited by 4061, 2014
Backpropagation for energy-efficient neuromorphic computing
SK Esser, R Appuswamy, P Merolla, JV Arthur, DS Modha
Advances in neural information processing systems 28, 2015
Cited by 1146, 2015
Learned step size quantization
SK Esser, JL McKinstry, D Bablani, R Appuswamy, DS Modha
arXiv preprint arXiv:1902.08153, 2019
Cited by 643, 2019
Cognitive computing systems: Algorithms and applications for networks of neurosynaptic cores
SK Esser, A Andreopoulos, R Appuswamy, P Datta, D Barch, A Amir, ...
The 2013 International Joint Conference on Neural Networks (IJCNN), 1-10, 2013
Cited by 201, 2013
TrueNorth: Accelerating from zero to 64 million neurons in 10 years
MV DeBole, B Taba, A Amir, F Akopyan, A Andreopoulos, WP Risk, ...
Computer 52 (5), 20-29, 2019
Cited by 200, 2019
Deep neural networks are robust to weight binarization and other non-linear distortions
P Merolla, R Appuswamy, J Arthur, SK Esser, D Modha
arXiv preprint arXiv:1606.01981, 2016
Cited by 150, 2016
Network coding for computing: Cut-set bounds
R Appuswamy, M Franceschetti, N Karamchandani, K Zeger
IEEE Transactions on Information Theory 57 (2), 1015-1030, 2011
Cited by 113, 2011
Complete mutually orthogonal Golay complementary sets from Reed–Muller codes
A Rathinakumar, AK Chaturvedi
IEEE Transactions on Information Theory 54 (3), 1339-1346, 2008
Cited by 109, 2008
Real-time scalable cortical computing at 46 giga-synaptic OPS/watt with ~100× speedup in time-to-solution and ~100,000× reduction in energy-to-solution
AS Cassidy, R Alvarez-Icaza, F Akopyan, J Sawada, JV Arthur, ...
SC'14: Proceedings of the International Conference for High Performance …, 2014
Cited by 98, 2014
Discovering low-precision networks close to full-precision networks for efficient embedded inference
JL McKinstry, SK Esser, R Appuswamy, D Bablani, JV Arthur, IB Yildiz, ...
arXiv preprint arXiv:1809.04191, 2018
Cited by 97, 2018
TrueNorth ecosystem for brain-inspired computing: scalable systems, software, and applications
J Sawada, F Akopyan, AS Cassidy, B Taba, MV Debole, P Datta, ...
SC'16: Proceedings of the International Conference for High Performance …, 2016
Cited by 97, 2016
A new framework for constructing mutually orthogonal complementary sets and ZCZ sequences
R Appuswamy, AK Chaturvedi
IEEE Transactions on Information Theory 52 (8), 3817-3826, 2006
Cited by 94, 2006
Convolutional networks for fast, energy-efficient neuromorphic computing
SK Esser, PA Merolla, JV Arthur, AS Cassidy, R Appuswamy, ...
Proceedings of the National Academy of Sciences of the United States of …, 2016
Cited by 63, 2016
Massively parallel neural inference computing elements
R Appuswamy, JV Arthur, AS Cassidy, P Datta, SK Esser, MD Flickner, ...
US Patent 10,621,489, 2020
Cited by 53, 2020
Mutually orthogonal sets of ZCZ sequences
A Rathinakumar, AK Chaturvedi
Electronics Letters 40 (18), 1, 2004
Cited by 46, 2004
Computing linear functions by linear coding over networks
R Appuswamy, M Franceschetti
IEEE Transactions on Information Theory 60 (1), 422-431, 2013
Cited by 43, 2013
Linear codes, target function classes, and network computing capacity
R Appuswamy, M Franceschetti, N Karamchandani, K Zeger
IEEE Transactions on Information Theory 59 (9), 5741-5753, 2013
Cited by 35, 2013
Time and energy complexity of function computation over networks
N Karamchandani, R Appuswamy, M Franceschetti
IEEE Transactions on Information Theory 57 (12), 7671-7684, 2011
Cited by 34, 2011
10¹⁴
TM Wong, R Preissl, P Datta, M Flickner, R Singh, SK Esser, E McQuinn, ...
IBM Research Division, Research Report RJ10502, 13-15, 2012
Cited by 33, 2012
Scheduler for mapping neural networks onto an array of neural cores in an inference processing unit
P Datta, AS Cassidy, MD Flickner, H Penner, R Appuswamy, J Sawada, ...
US Patent App. 16/051,034, 2020
Cited by 30, 2020
Articles 1–20