CS667
Soft Neurons and the LMS Algorithm
- Widrow's adaptive filter
- Noise cancellation problem
- FIR filters
- Adaptive FIR filters (see the LMS noise-cancellation sketch below)
- Minimization of the error energy
- ADALINE
- Connection between adaptive filter and perceptron
- Extension to function approximation
- Least mean square error criterion
- Here misclassification has no meaning - only an analog error signal
- Formal connection between PLA and LMS
- The mean square error criterion
- Derivation of LMS algorithm for linear neurons (worked out below)
- Limitations of linear neurons
- Linear transformations are too simple
- translations
- scalings
- rotations
- All linear layered networks are equivalent to a single layer (see the one-line argument below)
- XOR can be implemented with a simple two-layer nonlinear network (sketched below)
- Types of nonlinearities
- Hard limiting
- Sigmoid squashers
- logistic sigmoid
- tanh
- derivatives
- symmetric functions
- Multilayer networks
- Derivation of LMS algorithm for sigmoidal neurons (see the delta-rule sketch below)
- Gamba perceptron
- Multilayer perceptrons (MLP)
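Examples

The adaptive-filter thread above (FIR filters, LMS, noise cancellation) can be made concrete with a small simulation. The following is a minimal sketch, not Widrow's original formulation: a tap-delay-line FIR filter learns, via the LMS update, to predict the corrupting noise from a correlated reference input, so the residual error approximates the clean signal. All names and parameters (num_taps, mu, the synthetic signals) are illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic setup: the primary input carries signal + noise, and a
    # separate reference input carries a signal-free version of the noise.
    n = 5000
    t = np.arange(n)
    signal = np.sin(0.03 * t)                  # desired signal
    ref = rng.standard_normal(n)               # reference noise source
    noise = np.convolve(ref, [0.8, -0.3])[:n]  # noise reaching the primary input
    primary = signal + noise

    # Adaptive FIR filter: LMS adjusts the taps so the filter output
    # predicts the noise; the residual error is the cleaned signal.
    num_taps = 4
    w = np.zeros(num_taps)
    mu = 0.01                                  # LMS step size
    cleaned = np.zeros(n)

    for k in range(num_taps, n):
        x = ref[k - num_taps + 1:k + 1][::-1]  # tap-delay-line contents
        y = w @ x                              # current noise estimate
        e = primary[k] - y                     # error = signal estimate
        cleaned[k] = e
        w += 2 * mu * e * x                    # LMS weight update

    print("residual noise power:", np.mean((cleaned[1000:] - signal[1000:]) ** 2))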
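The delta-rule form of LMS for a linear neuron follows in three lines; the notation (weights w, input x, target d, learning rate eta) is generic rather than the lecture's own.

    \[
      y = w^{\top} x, \qquad
      E = \tfrac{1}{2}\,(d - y)^2, \qquad
      \frac{\partial E}{\partial w} = -(d - y)\,x
      \;\Longrightarrow\;
      \Delta w = \eta\,(d - y)\,x .
    \]

Gradient descent on the squared error therefore moves each weight in proportion to the error times the corresponding input, which is exactly the LMS update used in the filter sketch above.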
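The one-line argument for why layering linear neurons buys nothing: composing affine maps yields an affine map,

    \[
      y = W_2\,(W_1 x + b_1) + b_2 = (W_2 W_1)\,x + (W_2 b_1 + b_2),
    \]

so a single layer with weight matrix W2 W1 and bias W2 b1 + b2 computes the same function.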
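XOR, famously out of reach for a single layer, needs only two layers once a nonlinearity is added. A minimal sketch with hard-limiting (step) units follows; the particular weights (hidden units computing OR and NAND, an output unit computing their AND) are one standard choice among many.

    def step(v):
        # Hard-limiting nonlinearity: 1 if v >= 0, else 0.
        return 1 if v >= 0 else 0

    def xor_net(x1, x2):
        h1 = step(x1 + x2 - 0.5)    # hidden unit 1: OR(x1, x2)
        h2 = step(-x1 - x2 + 1.5)   # hidden unit 2: NAND(x1, x2)
        return step(h1 + h2 - 1.5)  # output unit: AND(h1, h2) = XOR(x1, x2)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", xor_net(a, b))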
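Extending LMS to a sigmoidal neuron adds one chain-rule factor: the derivative of the squashing function at the current net input. The sketch below uses the logistic sigmoid, whose derivative sigma'(net) = y(1 - y) appears in the update; the toy task (AND with analog targets) and all hyperparameters are illustrative.

    import numpy as np

    def sigmoid(v):
        return 1.0 / (1.0 + np.exp(-v))

    rng = np.random.default_rng(0)

    # Toy task: one sigmoidal neuron learning AND; the third input is a bias.
    X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
    d = np.array([0.0, 0.0, 0.0, 1.0])

    w = rng.normal(scale=0.1, size=3)
    eta = 0.5

    for epoch in range(2000):
        for x, target in zip(X, d):
            y = sigmoid(w @ x)
            # Chain rule: dE/dw = -(target - y) * sigma'(net) * x,
            # with sigma'(net) = y * (1 - y) for the logistic sigmoid.
            w += eta * (target - y) * y * (1 - y) * x

    print(np.round(sigmoid(X @ w), 2))  # approaches [0, 0, 0, 1]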
Assignment
- What is the connection between sigma(x) and tanh(x)?
- Derive the formulae for the derivatives of sigma(x) and tanh(x).