Emre Neftci received his MSc degree in physics from EPFL in Switzerland and his Ph.D. in 2010 from the Institute of Neuroinformatics at the University of Zurich and ETH Zurich. He is currently an institute director at the Jülich Research Centre and Professor at RWTH Aachen. His current research explores the bridges between neuroscience and machine learning, with a focus on the theoretical and computational modeling of learning algorithms best suited to neuromorphic hardware and non-von Neumann computing architectures.

Prof. Dr. Emre Neftci
Director
Address
Dennewartstraße 25-27
Peter Grünberg Institut (PGI)
Neuromorphic Software Ecosystems (PGI-15)
Selected Research Publications
- J. Lohoff and E. Neftci. “Optimizing Automatic Differentiation with Deep Reinforcement Learning”. Advances in Neural Information Processing Systems. 2024.
- J. Finkbeiner, T. Gmeinder, M. Pupilli, A. Titterton, and E. Neftci. “Harnessing Manycore Processors with Distributed Memory for Accelerated Training of Sparse and Recurrent Models”. AAAI. 2024.
- K. M. Stewart and E. Neftci, “Meta-learning spiking neural networks with surrogate gradient descent,” Neuromorphic Computing and Engineering, 2022. This article demonstrates, for the first time, second-order meta-learning with SNNs, showing successful bi-level learning on an SNN model for few-shot learning with event-based datasets.
- F. Zenke and E. O. Neftci, “Brain-inspired learning on neuromorphic substrates,” Proceedings of the IEEE, 2021. DOI: 10.1109/JPROC.2020.3045625. In this article, we analyzed SNN learning rules through the lens of automatic differentiation and their implementation in neuromorphic hardware.
- E. O. Neftci, H. Mostafa, and F. Zenke, “Surrogate gradient learning in spiking neural networks: Bringing the power of gradient-based optimization to spiking neural networks,” IEEE Signal Processing Magazine, vol. 36, no. 6, pp. 51–63, Nov. 2019. DOI: 10.1109/MSP.2019.2931595. This seminal paper introduces gradient-based learning in SNNs (i.e., surrogate gradients) and shows how SNN training can be achieved using the tools of machine learning (see the sketch after this list).
- E. O. Neftci, “Data and power efficient intelligence with neuromorphic learning machines,” iScience, vol. 5, pp. 52–68, Jul. 2018, ISSN: 2589-0042. DOI: 10.1016/j.isci.2018.06.010. [Online]. Available: https://www.sciencedirect.com/science/article/pii/S2589004218300865. This paper is an important review of neuromorphic computing for continual learning, its role in practical applications, and a roadmap toward designing neuromorphic learning machines.
- J. Kaiser, H. Mostafa, and E. Neftci, “Synaptic plasticity dynamics for deep continuous local learning (DECOLLE),” Frontiers in Neuroscience, vol. 14, p. 424, 2020, ISSN: 1662-453X. DOI: 10.3389/fnins.2020.00424. [Online]. Available: https://www.frontiersin.org/article/10.3389/fnins.2020.00424. This paper demonstrates three important concepts: learning in SNNs with local losses, its implementation on GPUs, and its application to a convolutional SNN for classification on event-based datasets.
- K. Stewart, G. Orchard, S. B. Shrestha, and E. Neftci, “Online few-shot gesture learning on a neuromorphic processor,” IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 10, no. 4, pp. 512–521, Oct. 2020. DOI: 10.1109/JETCAS.2020.3032058. [Online]. Available: https://ieeexplore.ieee.org/abstract/document/9229141. This paper demonstrated transfer learning for few-shot learning on the Intel Loihi neuromorphic chip using pre-training. We demonstrated real-time, online learning of new body gestures using a neuromorphic vision sensor.
- K. Stewart, A. Danielescu, T. Shea, and E. Neftci, “Encoding event-based data with a hybrid SNN guided variational autoencoder in neuromorphic hardware,” in Neuro-Inspired Computational Elements Conference, 2022, pp. 88–97. This paper combines ANNs and SNNs into a hybrid network, garnering the “best of both worlds” to implement an event-driven variational autoencoder.
- E. Neftci, S. Das, B. Pedroni, K. Kreutz-Delgado, and G. Cauwenberghs, “Event-driven contrastive divergence for spiking neuromorphic systems,” Frontiers in Neuroscience, vol. 7, no. 272, Jan. 2014. DOI: 10.3389/fnins.2013.00272. This article demonstrated one of the first links between SNNs and machine learning. In particular, it showed learning for probabilistic inference using an SNN and neural Monte Carlo sampling.
- E. Neftci, J. Binas, U. Rutishauser, E. Chicca, G. Indiveri, and R. J. Douglas, “Synthesizing cognition in neuromorphic electronic systems,” PNAS, vol. 110, no. 37, pp. E3468–E3476, Jun. 2013. DOI: 10.1073/pnas.1212083110. This seminal work demonstrated how general-purpose computation in the form of finite state machines can be mapped onto mixed-signal neuromorphic hardware. Here, we solved a task of the type used to evaluate the cognitive abilities of animals.
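
As a rough illustration of the surrogate-gradient idea referenced above, the sketch below shows in PyTorch how a spike can be a hard threshold in the forward pass while the backward pass substitutes a smooth surrogate derivative, so standard backpropagation applies. The fast-sigmoid surrogate and the steepness value beta are illustrative assumptions, not the specific formulation of any single paper listed here.

```python
import torch


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, smooth surrogate derivative in the backward pass."""

    beta = 10.0  # illustrative steepness of the surrogate (an assumption, not a prescribed value)

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0.0).float()  # non-differentiable spike emission

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Fast-sigmoid surrogate derivative: 1 / (beta * |v| + 1)^2
        surrogate = 1.0 / (SurrogateSpike.beta * v.abs() + 1.0) ** 2
        return grad_output * surrogate


spike_fn = SurrogateSpike.apply

# Toy usage: gradients flow through the surrogate despite the hard threshold in forward.
v = torch.randn(8, requires_grad=True)  # membrane potentials
spikes = spike_fn(v)
spikes.sum().backward()
print(v.grad)
```

In practice this spike function would sit inside a leaky integrate-and-fire neuron model unrolled over time, letting the whole spiking network be trained with the same optimizers used for conventional deep networks.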