Scope of the Workshop

The importance and impact of Deep Learning and Deep Neural Network methodologies are by now
widely accepted. This exceptional success currently relies on special-purpose hardware
acceleration technologies (e.g., GPUs, TPUs) based primarily on conventional electronic hardware.
However, the architecture of biological neural systems fundamentally differs from that of von
Neumann processors. Inspired by biology, unconventional neuromorphic hardware based on
photonics, 3D integration, spintronics, and in-memory substrates and architectures is attracting
increasing attention as a way to implement Deep Learning algorithms. The common objective is to
leverage the inherent advantages of these substrates and architectures in terms of speed, power
consumption, latency, and scalability.

A significant effort is needed to increase the synergy between neuromorphic computing substrate
developments and Deep Learning concepts. This includes developing high-performance Deep Neural
Network topologies amenable to neuromorphic implementation, finding solutions to manage the
intrinsic physical noise of neuromorphic computation, and exploring learning solutions beyond
Backpropagation. Simultaneously, the inherent properties of neuromorphic substrates motivate
novel Deep Learning models and algorithms with intriguing possibilities, for example leveraging
the intrinsic continuous dynamics offered by photonics.

Topics of interest (not limited to)

This workshop intends to bring together researchers from different backgrounds (including, but
not limited to, Physics, Computer Science, and Engineering) to address the challenges posed by
developing Deep Learning on unconventional neuromorphic hardware. It aims to provide an ideal
platform for cross-pollination of views among the fields covered. Accordingly, we call for
contributions addressing (but not limited to) the following topics:

  • Deep Learning concepts for neuromorphic implementations, including
    ◦ Deep Neural Networks based on linear dynamics and/or partially untrained layers
    ◦ Neural ODEs and continuous-depth neural architectures
    ◦ Spiking Neural Networks
    ◦ Noise engineering (e.g., based on population coding)
    ◦ Learning in deep neural architectures beyond Backpropagation
  • Computational and neuromorphic concepts, including
    ◦ Analogue and distributed computing
    ◦ Quantum hardware reservoirs
  • Unconventional and next-generation hardware, including
    ◦ In-memory computing
    ◦ Massively parallel hardware networks
    ◦ 3D-integrated Neural Networks
    ◦ Photonic computing