30. Learning Equations

Unsupervised learning

Outline

Learning in Rate Models

  1. PCA
    1. Correlation matrix $C_{ij} = \langle \xi^\mu_i \xi^\mu_j\rangle_\mu$, where $\xi^\mu_i$ is the $i$th component of a vector $\boldsymbol{\xi}^\mu$.
    2. Covariance matrix $V_{ij} = \left\langle (\xi^\mu_i - \langle \xi^\mu_i\rangle)( \xi^\mu_j - \langle \xi^\mu_j\rangle ) \right\rangle_\mu$.
    3. Principal components of the vectors are the eigenvectors of the covariance matrix $V$.
    4. The first principal component is the direction along which the variance is maximal (a numerical PCA sketch follows this outline).
  2. Evolution of synaptic weights
    1. At each time step $\mu$, a neuron receives inputs that are either 0's or 1's.
    2. At each time step, these inputs form an $N$-dimensional vector $\boldsymbol{\xi}^\mu$ ($N$ input neurons).
    3. Over a total of $p$ time steps, we have $p$ such $N$-dimensional vectors.
    4. At each time step, the weights change according to the Hebbian learning rule $\Delta w_i = \gamma\, \nu^{\text{post}} \nu^{\text{pre}}_i$, where $\gamma$ is the learning rate.
    5. Using a linear model of the postsynaptic rate, $\nu^{\text{post}} = \sum_j w_j \nu_j^{\text{pre}}$.
    6. Averaging over the input patterns relates the expected weight change to the correlation matrix, $\langle \Delta w_i \rangle = \gamma \sum_j C_{ij} w_j$, so the weight dynamics are governed by the eigenvalues and eigenvectors of $C$.
    7. The growth of the expected weight vector is eventually dominated by the first principal component, i.e., the eigenvector with the largest eigenvalue (see the Hebbian simulation sketch after this outline).
  3. Exponential growth means the weights blow up, which cannot happen in a working brain. A modified Hebbian learning rule is therefore needed; here we discuss normalization of the weights.
    1. Three key ideas:
      1. Normalize sum of weights or quadratic norm of weights;
      2. Multiplicative normalization or subtractive normalization (e.g., to keep $\sum_i w_i$ constant);
      3. Normalization may require non-local information, so the resulting rule is not purely local.
    2. Subtractive normalization: $\Delta w_i = \Delta \tilde w_i - \frac{1}{N} \sum_j \Delta \tilde w_j$, so that $\sum_i \Delta w_i = 0$.
    3. Multiplicative normalization: Oja's learning rule is an example. The natural choice is to divide the weight vector by its norm after each step; expanding this to first order in $\gamma$ gives Oja's rule, $\Delta w_i = \gamma\, \nu^{\text{post}} \left( \nu_i^{\text{pre}} - \nu^{\text{post}} w_i \right)$ (see the Oja sketch after this outline).
  4. Neurons of the visual system have receptive fields.
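
As a concrete illustration of the PCA items above, here is a minimal numerical sketch in Python (assuming numpy; the pattern count $p$, the dimension $N$, and the random binary patterns are invented for illustration). It builds the correlation matrix $C$ and the covariance matrix $V$ from a set of patterns $\boldsymbol{\xi}^\mu$ and extracts the first principal component as the leading eigenvector of $V$.

```python
import numpy as np

rng = np.random.default_rng(0)

# p binary input patterns xi^mu, each an N-dimensional vector (rows = patterns).
p, N = 500, 10
xi = (rng.random((p, N)) < 0.3).astype(float)

# Correlation matrix C_ij = <xi_i^mu xi_j^mu>_mu (average over patterns).
C = xi.T @ xi / p

# Covariance matrix V_ij = <(xi_i - <xi_i>)(xi_j - <xi_j>)>_mu.
mean = xi.mean(axis=0)
V = (xi - mean).T @ (xi - mean) / p

# Principal components are the eigenvectors of V; eigh sorts eigenvalues ascending.
eigvals, eigvecs = np.linalg.eigh(V)
first_pc = eigvecs[:, -1]  # direction along which the variance is maximal
print("largest eigenvalue:", eigvals[-1])
print("first principal component:", first_pc)
```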
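
The weight evolution in item 2 can be simulated directly. The following is a minimal sketch, not a derivation: it assumes numpy, a linear rate neuron $\nu^{\text{post}} = \sum_j w_j \nu^{\text{pre}}_j$, plain Hebbian updates, and the same invented binary patterns as above. It checks numerically that the unnormalized weight vector grows while aligning with the leading eigenvector of $C$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Binary input patterns (rows) and their correlation matrix C.
p, N = 500, 10
xi = (rng.random((p, N)) < 0.3).astype(float)
C = xi.T @ xi / p

# Plain Hebbian learning with a linear postsynaptic neuron:
#   nu_post = w . nu_pre,  Delta w_i = gamma * nu_post * nu_pre_i
gamma = 1e-3
w = rng.normal(scale=0.1, size=N)
for step in range(5000):
    nu_pre = xi[rng.integers(p)]   # one input pattern per time step
    nu_post = w @ nu_pre           # linear rate model
    w += gamma * nu_post * nu_pre  # Hebbian update, no normalization

# On average <Delta w> = gamma * C w, so w grows fastest along the
# leading eigenvector of C and eventually aligns with it.
eigvals, eigvecs = np.linalg.eigh(C)
leading = eigvecs[:, -1]
print("norm of w (blows up):", np.linalg.norm(w))
print("|cos(angle)| with leading eigenvector:", abs(w @ leading) / np.linalg.norm(w))
```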
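
For the multiplicative normalization in item 3.3, here is a minimal Oja's-rule sketch under the same assumed setup: the decay term proportional to $(\nu^{\text{post}})^2 w_i$ keeps $\lVert \mathbf{w} \rVert$ bounded near 1, while the weight vector still converges toward the leading eigenvector of the correlation matrix.

```python
import numpy as np

rng = np.random.default_rng(2)

p, N = 500, 10
xi = (rng.random((p, N)) < 0.3).astype(float)

gamma = 1e-2
w = rng.normal(scale=0.1, size=N)
for step in range(20000):
    nu_pre = xi[rng.integers(p)]
    nu_post = w @ nu_pre
    # Oja's rule: Hebbian term plus a decay that keeps the norm of w bounded.
    w += gamma * nu_post * (nu_pre - nu_post * w)

C = xi.T @ xi / p
eigvals, eigvecs = np.linalg.eigh(C)
leading = eigvecs[:, -1]
print("||w|| (stays near 1):", np.linalg.norm(w))
print("|cos(angle)| with leading eigenvector of C:", abs(w @ leading) / np.linalg.norm(w))
```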


Current Ref:

  • snm/30.learning-equations.md