**MLP Topologies**

- Feedback, feedforward and recurrent
- Layered and general
- Hidden layers and units
**Capabilities of Binary Output MLPs**

- Binary inputs
- One hidden layer suffices
- 2^(N-1) units suffice (and are needed)
- Continuous inputs
- Open and closed areas
- Nonconvex regions
- Disjoint regions
- Steiner-Schläfli Theorem (proof)
- Cover's Theorem
- Making Holes
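The region results above can be made concrete: a single hidden layer of hard-limiting units, each realizing one half-plane, followed by an AND output unit, is active exactly inside a convex region. A minimal sketch (the triangle and the thresholds are illustrative choices, not from the notes):

```python
import numpy as np

def step(z):
    # hard-limiting activation: 1 if z > 0, else 0
    return (np.asarray(z) > 0).astype(float)

def triangle(x, y):
    # hidden layer: three half-planes whose intersection is the open
    # triangle with vertices (0,0), (1,0), (0,1)
    h = step([x, y, 1 - x - y])
    # output unit fires only when all three hidden units are active (AND)
    return step(h.sum() - 2.5)

print(triangle(0.2, 0.2))  # inside  -> 1.0
print(triangle(0.8, 0.8))  # outside -> 0.0
```

Any convex region cut out by k half-planes works the same way: k hidden units and an output threshold of k - 0.5.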
**Capabilities of Continuous Output MLPs**

- Two hidden layers
- Hilbert's 13th problem
- Arnold and Kolmogorov
- Universal approximation
- One hidden layer
- Limitations
- Approximations
- Cybenko's theorem
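Cybenko's theorem states that a single hidden layer of sigmoidal units can approximate any continuous function on a compact set to arbitrary accuracy. A numerical sketch (random hidden weights with least-squares output weights are my own illustrative choice, not the theorem's construction):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# target: a continuous function on a compact interval
x = np.linspace(0, 2 * np.pi, 200)[:, None]
y = np.sin(x).ravel()

# one hidden layer of 50 sigmoid units with random weights and biases;
# only the output weights are fitted (linear least squares), which
# already yields a close approximation
W = rng.normal(scale=2.0, size=(1, 50))
b = rng.normal(scale=2.0, size=50)
H = sigmoid(x @ W + b)          # hidden activations, shape (200, 50)
c, *_ = np.linalg.lstsq(H, y, rcond=None)

err = np.max(np.abs(H @ c - y))
print(f"max error: {err:.4f}")
```

Increasing the number of hidden units drives the error down, in line with the theorem; no second hidden layer is needed.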
**VC Dimension**
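For a single hard-limiting unit in the plane (a half-plane classifier), the VC dimension is 3: three points in general position can be shattered. A brute-force check, with hypothetically chosen points and a plain perceptron as the separability test:

```python
import numpy as np
from itertools import product

# three points in general position in R^2 (illustrative choice)
P = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])

def separable(labels, epochs=1000):
    # perceptron with bias; labels in {-1, +1}; converges iff the
    # labeling is realizable by a single threshold unit
    w = np.zeros(2)
    b = 0.0
    for _ in range(epochs):
        done = True
        for xp, t in zip(P, labels):
            if t * (w @ xp + b) <= 0:
                w += t * xp
                b += t
                done = False
        if done:
            return True
    return False

# all 2^3 = 8 labelings of the three points are realizable
shattered = all(separable(np.array(l)) for l in product([-1, 1], repeat=3))
print(shattered)  # prints True
```

With four points, some labeling (an XOR-style dichotomy) always fails, so one threshold unit cannot shatter them.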

**Assignment**

- Find a hard-limiting MLP that implements XOR of three inputs
- Is there any reason to use a single hidden layer with a single hidden unit?
- Is there any reason to use three hidden layers?
- For binary inputs and outputs and hard-limiting units, is there any reason to use a single hidden layer with two hidden units?
- Define the regions:

A) -1 < x < 0 AND 1 < y < 2

B) 1 < x < 2 AND -1 < y < 0

C) -1 < x < 0 AND -1 < y < 0

D) 1 < x < 2 AND 1 < y < 2

Find a hard-limiting MLP that is active in:

- A, B and C
- A, B and D
- A and B only
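To illustrate the construction style these exercises call for (without solving them), here is a hard-limiting MLP active in the union of two disjoint boxes. The boxes (0,1)×(0,1) and (2,3)×(2,3) are hypothetical, not the assignment's regions A-D: the first hidden layer realizes half-planes, the second ANDs them into boxes, and the output ORs the box detectors.

```python
import numpy as np

def step(z):
    # hard-limiting activation: 1 if z > 0, else 0
    return (np.asarray(z) > 0).astype(float)

def box_detector(x, y, x0, x1, y0, y1):
    # first hidden layer: four half-planes bounding the open box
    h = step([x - x0, x1 - x, y - y0, y1 - y])
    # second hidden layer unit: AND of the four half-planes
    return step(h.sum() - 3.5)

def union_of_boxes(x, y):
    # hypothetical boxes, not the assignment's regions A-D
    a = box_detector(x, y, 0, 1, 0, 1)
    b = box_detector(x, y, 2, 3, 2, 3)
    # output unit: OR (fires if at least one detector is active)
    return step(a + b - 0.5)

print(union_of_boxes(0.5, 0.5))  # inside first box  -> 1.0
print(union_of_boxes(2.5, 2.5))  # inside second box -> 1.0
print(union_of_boxes(1.5, 1.5))  # in neither box    -> 0.0
```

Changing the output weights and threshold selects other combinations of the detectors (e.g. a negative weight excludes a box), which is the lever the A/B/C/D variants above exercise.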