Talks

The Hebbian-Descent Learning Rule

Wednesday, June 19th, 2019, 2:00 pm – 2:30 pm


Speaker: Laurenz Wiskott (Ruhr-Universität Bochum)

Abstract: Neurobiological network models use various learning rules with different pros and cons. Popular learning rules include Hebbian learning and gradient descent. However, Hebbian learning has problems with correlated input data and does not profit from seeing training patterns several times. Gradient descent suffers from vanishing gradients for partially flat activation functions, especially in online learning. We analyze a variant, which we refer to as Hebbian-Descent, that addresses these problems by dropping the derivative of the activation function and by centering, i.e. keeping the neural activities mean-free. This leads to an update rule that is provably convergent, does not suffer from the vanishing gradient problem, can deal with correlated data, profits from seeing patterns several times, and enables successful online learning when centering is used.
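To illustrate the idea described in the abstract, here is a minimal sketch of a Hebbian-Descent-style update for a single-layer network: the activation-function derivative is dropped from the usual gradient-descent (delta-rule) update, and the inputs are centered, i.e. kept mean-free. The variable names, the sigmoid activation, the learning rate, and the running input mean mu are illustrative assumptions, not the speaker's exact formulation.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def hebbian_descent_step(W, b, x, t, mu, lr=0.1):
        """One online Hebbian-Descent-style update for input x and target t.

        W  : (n_out, n_in) weight matrix
        b  : (n_out,) bias vector
        x  : (n_in,) input pattern
        t  : (n_out,) target pattern
        mu : (n_in,) running mean of the inputs, used for centering
        """
        xc = x - mu                   # centering: mean-free input
        a = sigmoid(W @ xc + b)       # post-synaptic activity
        err = t - a                   # error term, activation derivative dropped
        W += lr * np.outer(err, xc)   # weight update: centered pre- times post-synaptic error
        b += lr * err                 # bias update
        return W, b

Because the error term is not multiplied by the derivative of the sigmoid, the update does not shrink toward zero when the unit saturates, which is the vanishing-gradient issue mentioned above; centering by mu is what handles correlated inputs and online learning in this sketch.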

Attachment: The Hebbian-Descent Learning Rule (PDF, 1.88 MB)