# Capabilities of Multilayer Perceptrons

1. MLP Topologies
• Feedback, feedforward and recurrent
• Layered and general
• Hidden layers and units

2. Capabilities of Binary Output MLPs
• Binary inputs
• One hidden layer suffices
• 2^(N-1) units suffice (and are needed)
• Continuous inputs
• Open and closed areas
• Nonconvex regions
• Disjoint regions
• Steiner–Schläfli Theorem (proof)
• Cover's Theorem
• Making Holes
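As a concrete illustration of the "one hidden layer suffices" claim for binary inputs, here is a minimal sketch (weights chosen by hand for illustration, not taken from the lecture) of a hard-limiting MLP computing XOR of N = 2 inputs with 2^(N-1) = 2 hidden units:

```python
def step(z):
    # hard-limiting (Heaviside) activation: 1 if z > 0, else 0
    return 1 if z > 0 else 0

def xor_mlp(x1, x2):
    # hidden layer: 2^(N-1) = 2 units for N = 2 binary inputs
    h1 = step(x1 + x2 - 0.5)   # fires when at least one input is 1 (OR)
    h2 = step(x1 + x2 - 1.5)   # fires only when both inputs are 1 (AND)
    # output unit: OR minus AND gives XOR
    return step(h1 - 2 * h2 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_mlp(a, b))
```

The same pattern (one hidden unit per minterm group, one output unit combining them) generalizes to any Boolean function of N binary inputs.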

3. Capabilities of Continuous Output MLPs
• Two hidden layers
• Hilbert's 13th problem
• Arnold and Kolmogorov
• Universal approximation
• One hidden layer
• Limitations
• Approximations
• Cybenko's theorem
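To make the universal-approximation idea concrete, here is a small numerical sketch (the interval and slope values are illustrative assumptions, not from the lecture): a single hidden layer with two sigmoid units approximates the indicator of the interval [0.3, 0.7], and sharpening the slope k drives the sup-norm error down away from the jumps, in the spirit of Cybenko's theorem.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def bump(x, k):
    # one hidden layer, two sigmoid units: a "soft" indicator of [0.3, 0.7]
    return sigmoid(k * (x - 0.3)) - sigmoid(k * (x - 0.7))

def max_error(k, n=1000):
    # sup-norm error against the true indicator, measured on a grid
    err = 0.0
    for i in range(n + 1):
        x = i / n
        if abs(x - 0.3) < 0.05 or abs(x - 0.7) < 0.05:
            continue  # exclude small neighbourhoods of the discontinuities
        target = 1.0 if 0.3 < x < 0.7 else 0.0
        err = max(err, abs(bump(x, k) - target))
    return err

print(max_error(50), max_error(500))  # error shrinks as the slope grows
```

Sums of such shifted bumps approximate any continuous function on a compact interval, which is the one-dimensional core of the universal approximation argument.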

4. VC Dimension
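As a quick empirical companion to this topic (a brute-force check over a small integer weight grid, not a proof): a single hard-limiting unit in the plane shatters 3 points in general position, but cannot realize the XOR labeling of 4 points, consistent with the VC dimension of 2-D linear threshold units being 3.

```python
from itertools import product

def realizable(points, labels):
    # brute-force search for a hard-limiting unit with integer weights/bias;
    # output = 1 iff w1*x + w2*y + b > 0
    for w1, w2, b in product(range(-3, 4), repeat=3):
        if all((1 if w1 * x + w2 * y + b > 0 else 0) == t
               for (x, y), t in zip(points, labels)):
            return True
    return False

# 3 points in general position: every one of the 8 labelings is realizable
three = [(0, 0), (1, 0), (0, 1)]
shattered = all(realizable(three, labs) for labs in product((0, 1), repeat=3))
print("3 points shattered:", shattered)

# the XOR labeling of 4 points is not linearly separable
four = [(0, 0), (1, 1), (1, 0), (0, 1)]
print("XOR labeling realizable:", realizable(four, (1, 1, 0, 0)))
```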

Assignment

1. Find a hard-limiting MLP that implements XOR of three inputs
2. Is there any reason to use a single hidden layer with a single hidden unit?
3. Is there any reason to use three hidden layers?
4. For binary inputs and outputs and hard-limiting units, is there any reason to use a single hidden layer with two hidden units?
5. Define the regions
A) -1 < x < 0 AND 1 < y < 2
B) 1 < x < 2 AND -1 < y < 0
C) -1 < x < 0 AND -1 < y < 0
D) 1 < x < 2 AND 1 < y < 2
Find a hard-limiting MLP which is active in:
1. A, B and C
2. A, B and D
3. A and B only
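A hedged building-block sketch relevant to these exercises (the weights are illustrative, and this detects only region A, it is not a solution to the combined-region questions): each rectangle is the AND of four half-planes, so one hard-limiting unit per inequality plus one AND output unit detects a single box.

```python
def step(z):
    # hard-limiting activation: 1 if z > 0, else 0
    return 1 if z > 0 else 0

def in_box_A(x, y):
    # region A: -1 < x < 0 AND 1 < y < 2, as an AND of four half-planes
    h1 = step(x + 1)      # x > -1
    h2 = step(-x)         # x < 0
    h3 = step(y - 1)      # y > 1
    h4 = step(2 - y)      # y < 2
    # output unit fires only when all four half-plane units fire
    return step(h1 + h2 + h3 + h4 - 3.5)

print(in_box_A(-0.5, 1.5), in_box_A(0.5, 0.5))  # prints: 1 0
```

Unions of such box detectors (an OR output unit over several AND units) give MLPs active in any combination of the regions A–D.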