
Radial Basis Function Training (RBF NN)

ELEC320 Lectures

Weight Update Function

\[\mathbf w=(\phi^T\phi+\lambda\mathbf I_{M\times M})^{-1}\phi^T\mathbf y\]

where:

  • $\phi$ is the design matrix of basis-function activations (one row per training sample, one column per basis function).
  • $\mathbf y$ is the vector of target outputs.
  • $\lambda$ is the user-defined regularisation parameter:
    • $\lambda=0$ indicates that the input data is completely trustworthy (no smoothing).
    • $\lambda\to\infty$ indicates that the dataset needs significant smoothing.
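
As a minimal sketch, the closed-form weight update above can be implemented directly with NumPy. The Gaussian basis functions, the choice of centres and width, and the helper names (`rbf_design_matrix`, `train_rbf_weights`) are illustrative assumptions, not part of the lecture notes:

```python
import numpy as np

def rbf_design_matrix(X, centres, width):
    """Gaussian basis activations: phi[n, m] = exp(-||x_n - c_m||^2 / (2 * width^2))."""
    # Pairwise squared distances between inputs and basis-function centres
    sq_dists = np.sum((X[:, None, :] - centres[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * width ** 2))

def train_rbf_weights(X, y, centres, width, lam=0.0):
    """Solve w = (phi^T phi + lambda * I)^{-1} phi^T y (regularised least squares)."""
    phi = rbf_design_matrix(X, centres, width)
    M = phi.shape[1]
    # Solving the linear system is numerically preferable to forming the explicit inverse
    return np.linalg.solve(phi.T @ phi + lam * np.eye(M), phi.T @ y)

# Example: fit a noisy sine curve with 10 Gaussian basis functions
rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 50)[:, None]
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(50)
centres = np.linspace(0.0, 1.0, 10)[:, None]
w = train_rbf_weights(X, y, centres, width=0.1, lam=1e-3)
y_pred = rbf_design_matrix(X, centres, 0.1) @ w
```

Increasing `lam` here smooths the fitted curve at the cost of fidelity to the training targets, matching the interpretation of $\lambda$ above.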

Radial Basis Functions vs. Multi-Layer Perceptrons

Universal Approximators
Given any function $f(x)=y$, either network type can be used to approximate it.
  • Radial basis function networks require a matrix inverse to be computed. This is slow and doesn't scale well to large datasets.
  • Although the weight update process in an MLP is lengthy, MLPs are preferred because their training can be parallelised for large datasets.