Michael Jurado @ INRC – Enhancing Performance and Efficiency of SNNs

5 December, 2023 @ 08:00 - 09:00 PST

Title:
Enhancing Performance and Efficiency of SNNs: From Spike-Based Loss Improvements to Synaptic Sparsification Techniques.

Abstract:
The introduction of offline training capabilities like Spike Layer Error Reassignment in Time (SLAYER) and advancements in the probabilistic interpretation of Spiking Neural Network (SNN) outputs reinforce SNNs as a viable alternative to Artificial Neural Networks (ANNs). However, special care must be taken during Surrogate Gradient (SG) training to achieve the desired performance and efficiency. This talk will cover our recent work on improving spike-based loss functions for SNNs, as well as on sparsifying SNNs for low-cost, high-performance neuromorphic computing.

Spikemax was previously introduced as a family of differentiable loss methods that use windowed spike counts to form classification probabilities. We modify the Spikemax loss to use rates and a scaling parameter instead of counts, forming Scaled-Spikemax. Our mathematical analysis shows that an appropriate scaling term can yield less coarse probability outputs from the SNN and help smooth the gradient of the loss during training. Experimentally, we show that Scaled-Spikemax achieves faster training convergence than Spikemax and yields relative accuracy improvements of 4.2% and 9.9% on NMNIST and N-TIDIGITS18, respectively. We then extend Scaled-Spikemax to construct a spike-based loss function for multi-label classification called Spikemoid. The viability of Spikemoid is demonstrated via the first known multi-label classification results on N-TIDIGITS18 and on 2NMNIST, a novel variation of NMNIST that superimposes event-driven sensory data.
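As a rough sketch of how these two losses might look in practice: assuming Scaled-Spikemax takes a softmax over scaled spike rates and Spikemoid replaces the softmax with independent per-class sigmoids, a PyTorch-style implementation could read as follows. The function names, tensor shapes, and the `alpha` and `theta` parameters are illustrative assumptions, not the talk's exact formulation.

```python
import torch
import torch.nn.functional as F

def scaled_spikemax_loss(spikes: torch.Tensor, targets: torch.Tensor,
                         alpha: float = 1.0) -> torch.Tensor:
    """Single-label loss: softmax over scaled spike rates.

    spikes:  (batch, classes, time) spike tensor
    targets: (batch,) integer class labels
    alpha:   scaling term applied to the rates before the softmax
    """
    rates = spikes.mean(dim=-1)                     # spike rate per class, in [0, 1]
    return F.cross_entropy(alpha * rates, targets)  # cross-entropy of softmax(alpha * rates)

def spikemoid_loss(spikes: torch.Tensor, targets: torch.Tensor,
                   alpha: float = 1.0, theta: float = 0.0) -> torch.Tensor:
    """Multi-label extension: an independent sigmoid per class.

    targets: (batch, classes) float multi-hot labels in {0, 1}
    theta:   illustrative rate offset; the talk's exact parameterization may differ
    """
    rates = spikes.mean(dim=-1)
    return F.binary_cross_entropy_with_logits(alpha * (rates - theta), targets)
```

Under this reading, `alpha` controls how sharply the output probabilities saturate, which is one way a scaling term over real-valued rates can produce less coarse probability outputs than a softmax over integer spike counts.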

However, SNNs trained through SG methods often use dense or convolutional connections that are not always suitable for Loihi2. To minimize core usage and power consumption on chip, we employ synaptic pruning techniques as part of our SNN training pipelines. We demonstrate the effectiveness of synaptic pruning for ANN-to-SNN conversion of VGG16 on Loihi1, as well as for a lava-dl-trained SNN for the Intel DNS Challenge. The latter approach involved Gradual Magnitude Pruning (GMP) applied during SLAYER training, which reduced the memory footprint of the baseline SDNN by 50-75%. We also highlight infrastructure changes to netX which enable conversion of lava-dl-trained SNNs into sparsity-aware lava processes.
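For readers unfamiliar with GMP, the following is a minimal sketch of how magnitude pruning with the standard polynomial sparsity schedule (Zhu & Gupta, 2017) might be interleaved with SLAYER training. The schedule boundaries, the 0.75 target sparsity, and helpers such as `snn_layers` are hypothetical, not details confirmed by the talk.

```python
import torch

def gmp_sparsity(step: int, start: int, end: int,
                 final: float, init: float = 0.0) -> float:
    """Polynomial GMP schedule: ramp sparsity from init to final over [start, end]."""
    if step < start:
        return init
    if step >= end:
        return final
    progress = (step - start) / (end - start)
    return final + (init - final) * (1.0 - progress) ** 3

def magnitude_mask(weight: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Binary mask that zeroes the smallest-magnitude fraction of synapses."""
    k = int(sparsity * weight.numel())
    if k == 0:
        return torch.ones_like(weight)
    threshold = weight.abs().flatten().kthvalue(k).values
    return (weight.abs() > threshold).float()

# Applied after each optimizer step in the training loop, e.g.:
#   sparsity = gmp_sparsity(step, start=1_000, end=10_000, final=0.75)
#   for layer in snn_layers:  # hypothetical list of synapse layers
#       layer.weight.data *= magnitude_mask(layer.weight.data, sparsity)
```

Because the mask is recomputed as the schedule ramps up, low-magnitude synapses are removed gradually, letting the surviving weights adapt during training rather than pruning everything at once.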

If you are not yet a member of the INRC, please see the “Joining the INRC” link below.

Bio: Michael Jurado is a research engineer at the Georgia Tech Research Institute. He studied computer science at Georgia Tech and received his master’s degree in Machine Learning in 2022. Lately, Michael has been studying and developing neuromorphic algorithms for edge computing, and he is a regular contributor to the lava code base. In his free time, he likes to read and study languages.

For the recording and slides, see the full INRC Forum 2023 Schedule (accessible only to INRC Affiliates and Engaged Members).

If you are interested in becoming a member, here is the information about Joining the INRC.

Details

Date:
5 December, 2023
Time:
08:00 - 09:00 PST
Website:
https://intel-ncl.atlassian.net/wiki/spaces/INRC/blog/2023/11/02/2006974465/INRC+Forum+12+5+23+Michael+Jurado

Venue

Online