Brains@Bay Meetup – Alternatives to Backpropagation in Neural Networks (Nov 18, 2020)

To learn more about Brains@Bay, visit our Meetup page: https://www.meetup.com/Brains-Bay/

Brains@Bay Meetups focus on how neuroscience can inspire us to create improved artificial intelligence and machine learning algorithms. In this meetup, we discuss alternatives to backpropagation in neural networks.

From the neuroscience side, Prof. Rafal Bogacz (University of Oxford) discusses the viability of backpropagation in the brain and the relationship between predictive coding networks and backpropagation.
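For readers new to predictive coding, the toy Python/NumPy sketch below illustrates the basic idea behind this line of work: value nodes relax to minimize local prediction errors, and weights are then updated from purely local error-times-activity products, which has been shown to closely approximate backpropagation's updates. The layer sizes, activation function, and the `train_step` helper are illustrative assumptions, not code from the talk.

```python
import numpy as np

def f(x):          # activation function
    return np.tanh(x)

def df(x):         # its derivative
    return 1.0 - np.tanh(x) ** 2

# A minimal predictive coding network (toy setup, assumed for illustration):
# value nodes x[l], prediction errors e[l] = x[l+1] - W[l] @ f(x[l]).
# Inference relaxes the x's to reduce the total squared error;
# learning then updates each W[l] from purely local quantities.

rng = np.random.default_rng(0)
sizes = [4, 8, 3]                       # input, hidden, output widths
W = [rng.normal(0, 0.1, (sizes[l + 1], sizes[l])) for l in range(len(sizes) - 1)]

def train_step(x_in, target, n_relax=50, lr_x=0.1, lr_w=0.01):
    L = len(W)
    # initialize value nodes with a forward pass
    x = [x_in]
    for l in range(L):
        x.append(W[l] @ f(x[l]))
    x[-1] = target                      # clamp the output to the target

    # inference phase: relax hidden value nodes to reduce prediction errors
    for _ in range(n_relax):
        e = [x[l + 1] - W[l] @ f(x[l]) for l in range(L)]
        for l in range(1, L):           # input and output stay clamped
            grad = e[l - 1] - df(x[l]) * (W[l].T @ e[l])
            x[l] -= lr_x * grad

    # learning phase: each weight update uses only local pre/post activity
    e = [x[l + 1] - W[l] @ f(x[l]) for l in range(L)]
    for l in range(L):
        W[l] += lr_w * np.outer(e[l], f(x[l]))

# one illustrative update on random data
train_step(rng.normal(size=4), rng.normal(size=3))
```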

Then, Sindy Löwe (University of Amsterdam) presents her latest research on self-supervised representation learning, in which she shows that networks can learn by optimizing the mutual information between representations at each layer of a model, training each module in isolation.
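As a rough illustration of this layer-local training idea, the PyTorch sketch below trains a stack of small modules greedily: each module has its own contrastive (InfoNCE-style) loss and optimizer, and activations are detached before being passed on, so no gradient ever flows between modules. The architecture, loss details, and the `train_step` helper are assumptions for illustration and do not reproduce the Greedy InfoMax setup exactly.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Each module gets its own InfoNCE-style loss and optimizer; gradients never
# flow between modules because each one receives detached inputs.
modules = nn.ModuleList([
    nn.Sequential(nn.Linear(32, 64), nn.ReLU()),
    nn.Sequential(nn.Linear(64, 64), nn.ReLU()),
    nn.Sequential(nn.Linear(64, 64), nn.ReLU()),
])
optimizers = [torch.optim.Adam(m.parameters(), lr=1e-3) for m in modules]

def info_nce(z1, z2, temperature=0.1):
    """Contrastive loss: matching rows of z1/z2 are positives, the rest negatives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature            # (batch, batch) similarity matrix
    labels = torch.arange(z1.size(0))             # positive pairs sit on the diagonal
    return F.cross_entropy(logits, labels)

def train_step(view1, view2):
    """One update: every module is trained in isolation on its own loss."""
    h1, h2 = view1, view2
    for module, opt in zip(modules, optimizers):
        z1, z2 = module(h1), module(h2)
        loss = info_nce(z1, z2)
        opt.zero_grad()
        loss.backward()                           # gradients stay inside this module
        opt.step()
        # pass detached activations forward so no end-to-end gradient exists
        h1, h2 = z1.detach(), z2.detach()

# illustrative call with two random "views" of a batch of inputs
train_step(torch.randn(16, 32), torch.randn(16, 32))
```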

Finally, Jack Kendall (RAIN Neuromorphics) shows how equilibrium propagation can be used to train end-to-end analog networks, an approach that can guide the development of a new generation of ultra-fast, compact, and low-power neural networks that support on-chip learning.
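To give a feel for the learning rule itself (the analog-hardware mapping from the talk is not modeled), the NumPy sketch below runs a simplified form of equilibrium propagation on a tiny layered network: a free relaxation phase, a second phase with the output weakly nudged toward the target, and a contrastive, purely local weight update from the two fixed points. The network sizes, simplified dynamics, and the `relax`/`train_step` helpers are illustrative assumptions.

```python
import numpy as np

rho = lambda s: np.clip(s, 0.0, 1.0)       # hard-sigmoid "firing rate"

rng = np.random.default_rng(0)
sizes = [4, 8, 2]                           # input, hidden, output
W = [rng.normal(0, 0.1, (sizes[l], sizes[l + 1])) for l in range(2)]

def relax(x, s, target=None, beta=0.0, steps=100, dt=0.1):
    """Settle the hidden/output state toward a fixed point of the dynamics."""
    h, y = s
    for _ in range(steps):
        dh = -h + rho(x) @ W[0] + rho(y) @ W[1].T
        dy = -y + rho(h) @ W[1]
        if target is not None:
            dy += beta * (target - y)       # weak nudge toward the target
        h, y = h + dt * dh, y + dt * dy
    return h, y

def train_step(x, target, beta=0.5, lr=0.05):
    h0, y0 = np.zeros(sizes[1]), np.zeros(sizes[2])
    # free phase: inputs clamped, everything else settles
    h_free, y_free = relax(x, (h0, y0))
    # nudged phase: start from the free fixed point, nudge the output
    h_nudge, y_nudge = relax(x, (h_free, y_free), target=target, beta=beta)
    # contrastive, purely local weight updates from the two fixed points
    W[0] += (lr / beta) * (np.outer(rho(x), rho(h_nudge)) - np.outer(rho(x), rho(h_free)))
    W[1] += (lr / beta) * (np.outer(rho(h_nudge), rho(y_nudge)) - np.outer(rho(h_free), rho(y_free)))

# one illustrative update on random data
train_step(rng.uniform(size=4), np.array([1.0, 0.0]))
```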

Link to meetup: https://www.meetup.com/Brains-Bay/events/274459844/

0:00 Introduction
4:06 Rafal Bogacz (University of Oxford)
38:59 Sindy Löwe (University of Amsterdam)
1:11:02 Jack Kendall (RAIN Neuromorphics)
1:36:42 Discussion
– – – – –
Numenta is leading the new era of machine intelligence. Our deep experience in theoretical neuroscience research has led to tremendous discoveries on how the brain works. We have developed a framework called the Thousand Brains Theory of Intelligence that will be fundamental to advancing the state of artificial intelligence and machine learning. By applying this theory to existing deep learning systems, we are addressing today’s bottlenecks while enabling tomorrow’s applications.

