58072 Neural Networks - Sharif University of Technology
Assignment 1
Use MATLAB NEURAL NETWORKS TOOLBOX or the Neunet (Desire) system to develop several variations of single-layer feed-forward networks that are able to learn sets of patterns. A set of patterns has been learned when the total sum of squared errors (TSS) reaches zero. Note: it may not be possible to learn a particular set of patterns!
1. The basic single-layer feed-forward (SLFF) program uses a threshold activation function and the Delta learning rule. Try to learn (a) the AND patterns, (b) the OR patterns, (c) the XOR patterns.
2. Change the threshold activation function into a linear activation function. Then, repeat the three parts of Question 1 using the linear activation function.
3. Change the linear activation function into a sigmoid activation function. Then, repeat the three parts of Question 1 using the sigmoid activation function.
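As a point of reference for these three items, the threshold-unit Delta (perceptron) rule can be sketched outside the toolbox. The following is an illustrative NumPy version, not the toolbox or Neunet code; the batch update, learning rate, and epoch limit are all assumptions:

```python
import numpy as np

def train_delta(X, T, lrate=0.5, epochs=100):
    """Delta-rule training of a single threshold unit (bias included)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])    # append a constant bias input
    w = np.zeros(Xb.shape[1])
    for epoch in range(epochs):
        y = (Xb @ w > 0).astype(float)           # threshold activation
        err = T - y
        if (err ** 2).sum() == 0:                # TSS reached zero
            return w, epoch
        w += lrate * Xb.T @ err                  # batch Delta update
    return w, None                               # TSS never reached zero

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
AND = np.array([0, 0, 0, 1], float)
OR  = np.array([0, 1, 1, 1], float)
XOR = np.array([0, 1, 1, 0], float)
for name, T in (("AND", AND), ("OR", OR), ("XOR", XOR)):
    w, t = train_delta(X, T)
    print(name, "learned at epoch" if t is not None else "never learned", t)
```

AND and OR are linearly separable, so the rule converges; XOR is not, so the TSS can never reach zero no matter how long training runs.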
Questions
1. What effect does the activation function have on the ability of the network to learn the patterns?
2. What is the effect of removing the Bias term?
3. What is the effect of changing the learning rate (Lrate)? Does changing the learning rate make it possible to learn some patterns that could not otherwise be learned? Does changing the learning rate make it impossible to learn some patterns that could otherwise be learned?
Hand in a listing of each of the programs used in the assignment. Include in your report the value of the time variable, t, when the TSS reached zero; if the TSS never reached zero, explain why the patterns could not be learned. Your report should be brief, but the different aspects of each case should be explained clearly and completely.
Assignment 2
Use MATLAB NEURAL NETWORKS TOOLBOX or the Neunet (Desire) system to study learning in single-layer feed-forward networks.
1. Use a linear activation function and the Hebbian learning rule for an SLFF network, and attempt to learn the following two sets of patterns. Why can the patterns be learned, or why can they not be learned?
          Input    Output
Set (1)   1 0 0    0 1
          0 1 0    1 0
          0 0 1    1 1

          Input    Output
Set (2)   1 0 0    0 1
          0 1 0    1 0
          1 1 1    1 1
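Why the two sets behave differently can be checked directly: one-shot Hebbian learning stores W as the sum of output-input outer products, and linear recall is exact only when the input vectors are orthonormal. A hypothetical NumPy check (not the Neunet code):

```python
import numpy as np

def hebbian(X, Y):
    """One-shot Hebbian rule: W = sum over patterns of y_p x_p^T."""
    return Y.T @ X

Y = np.array([[0, 1], [1, 0], [1, 1]], float)            # same targets in both sets
X1 = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]], float)  # Set (1): orthonormal inputs
X2 = np.array([[1, 0, 0], [0, 1, 0], [1, 1, 1]], float)  # Set (2): (1,1,1) overlaps the others

W1, W2 = hebbian(X1, Y), hebbian(X2, Y)
print(np.allclose(X1 @ W1.T, Y))   # linear recall is exact for Set (1)
print(np.allclose(X2 @ W2.T, Y))   # cross-talk corrupts recall for Set (2)
```

Because (1,1,1) has a nonzero dot product with the other two inputs, its stored association bleeds into their recall, which is the cross-talk the question is probing.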
2. Repeat Question 1 using the linear activation function, but with the Delta learning rule instead of the Hebbian rule (recall that you also used the Delta rule in the first assignment). Why can the patterns be learned, or why can they not be learned?
3. Use a sigmoid activation function and the Delta learning rule. Try to learn (a) the AND patterns, (b) the OR patterns. What is the effect of the Temperature parameter on the number of epochs required to learn the patterns (i.e. the speed of learning)?
4. Use a stochastic activation function and the Delta learning rule. Try to learn (a) the AND patterns, (b) the OR patterns. What is the effect of the Temperature parameter on the number of epochs required to learn the patterns (i.e. the speed of learning)?
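For items 3 and 4, the Temperature parameter T is assumed here to rescale the net input: the deterministic unit computes f(net) = 1/(1 + e^(-net/T)), and the stochastic unit fires with that value as its probability. A small sketch shows why lower T gives a steeper activation and hence larger Delta-rule gradients:

```python
import numpy as np

def sigmoid(net, T=1.0):
    """Logistic activation with temperature: steeper as T decreases."""
    return 1.0 / (1.0 + np.exp(-net / T))

def stochastic(net, T=1.0, rng=np.random.default_rng(0)):
    """Stochastic unit: outputs 1 with probability sigmoid(net, T)."""
    return float(rng.random() < sigmoid(net, T))

# The slope of the sigmoid at net = 0 is 1/(4T),
# so error gradients scale inversely with temperature.
for T in (0.5, 1.0, 2.0):
    h = 1e-6
    slope = (sigmoid(h, T) - sigmoid(-h, T)) / (2 * h)
    print(T, slope)
```

This is only a sketch of the assumed parameterization; check how your program actually applies T before drawing conclusions about learning speed.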
Assignment 3
Use MATLAB NEURAL NETWORKS TOOLBOX or the Neunet (Desire) system to develop and study the learning process in Back-Propagation networks.
1. Train the 2-2-1 network to learn the XOR patterns.
2. Modify the 2-2-1 network so that only one hidden unit is used.
3. Train the 2-2-1 network to learn the XOR patterns when the Bias term is fixed at zero. (Note that you must modify a few statements in the program to accomplish this.)
4. Do not randomize the weights before training the 2-2-1 network. (Remove the statements that initialize the weights to random values; the weights will then default to zero.)
5. Modify the program to include a Temperature parameter. Then, train the 2-2-1 network with several different Temperature parameter values.
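For orientation, the 2-2-1 training loop can be sketched in NumPy (an illustrative stand-in for the toolbox/Neunet program; the initial weight range, learning rate, and epoch count are assumptions). Note for item 4 that starting both weight layers at zero gives the two hidden units identical updates, so they can never differentiate:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_xor_221(seed=0, lrate=0.5, epochs=5000):
    """Batch back-propagation for a 2-2-1 network with bias weights."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
    T = np.array([[0], [1], [1], [0]], float)
    Xb = np.hstack([X, np.ones((4, 1))])          # constant bias input
    W1 = rng.uniform(-1, 1, (3, 2))               # input(+bias) -> hidden
    W2 = rng.uniform(-1, 1, (3, 1))               # hidden(+bias) -> output
    tss0 = None
    for _ in range(epochs):
        H = sigmoid(Xb @ W1)                      # forward pass
        Hb = np.hstack([H, np.ones((4, 1))])
        Y = sigmoid(Hb @ W2)
        err = T - Y
        if tss0 is None:
            tss0 = float((err ** 2).sum())        # TSS before training
        d2 = err * Y * (1 - Y)                    # output-layer delta
        d1 = (d2 @ W2[:2].T) * H * (1 - H)        # hidden-layer delta (bias row dropped)
        W2 += lrate * Hb.T @ d2                   # batch weight updates
        W1 += lrate * Xb.T @ d1
    return tss0, float((err ** 2).sum())

tss_start, tss_end = train_xor_221()
print(tss_start, "->", tss_end)
```

Rerunning with different seeds (or with zero initial weights) is a quick way to explore Questions 2, 4, and 7 below the toolbox level.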
Questions
1. How many epochs are required to learn the patterns correctly?
2. Does the system converge to the same set of weights after each learning session?
3. What is the effect of the learning rate (Lrate) on the speed of learning?
4. What is the effect of setting the initial weights to zero?
5. Can the Network learn the patterns without a Bias term?
6. What is the effect of the Temperature parameter on the speed of learning?
7. Did you encounter any local minima for which there were global minima of zero?
Assignment 4
Use MATLAB NEURAL NETWORKS TOOLBOX or the Neunet (Desire) system to develop and study the learning process in Back-Propagation networks. The patterns to be learned are the mapping from θ to cos(θ). The patterns are generated by a loop in the program that varies θ from 0 to 2π in steps of 2π/Npat. The output values are rescaled to the range [0,1] instead of [-1,1].
1. Train a 1-x-1 network to learn the patterns.
2. Modify the program so that a different activation function is used. For example, try one of the following: bipolar hidden and output units (tanh function), or linear output units.
3. Test the ability of the network to generalize by presenting a set of Test patterns that the network was not trained with.
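The training patterns themselves are easy to generate; a NumPy equivalent of the θ loop (Npat = 20 is an arbitrary choice here):

```python
import numpy as np

Npat = 20
theta = np.arange(Npat + 1) * 2 * np.pi / Npat   # theta from 0 to 2*pi in steps of 2*pi/Npat
targets = (np.cos(theta) + 1.0) / 2.0            # rescale cos from [-1, 1] into [0, 1]
```

The affine rescaling (y + 1)/2 is what keeps the targets inside the output range of a logistic unit; held-out θ values between the training steps make a natural test set for item 3.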
Questions
1. How many epochs are required to learn the patterns reasonably well (TSS ≤ 0.01)?
2. How many hidden units are required? (There will be a range of hidden units that provide similar results.)
3. What is the effect of changing the activation function?
4. How well is the network able to generalize?
5. Are there any other changes that could be made to reduce the final TSS (as opposed to the speed of learning)?
Assignment 5
Use MATLAB NEURAL NETWORKS TOOLBOX or the Neunet (Desire) system to learn patterns using competitive networks. The patterns to be learned are alphabetic characters on a 5x5 grid (for an example, see page 297 of your text).
1. Train the competitive network to learn the patterns using a conscience (crit=0).
2. Train the competitive network to learn the patterns without using a conscience (crit<0).
3. Optionally, train the competitive network to learn the patterns using pseudo-Art competition (crit>0).
4. Optionally, train a Back-Propagation network to learn the patterns. (You would have to copy the patterns into one of the Back-Propagation programs.)
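The core of the basic (no-conscience) competitive network is a winner-take-all update; the conscience and pseudo-ART variants add a win-count bias or a vigilance test on top of it. A toy NumPy sketch on two clusters (initializing the prototypes from the first k inputs is an assumption made here to avoid dead units):

```python
import numpy as np

def train_competitive(X, k, lrate=0.2, epochs=50):
    """Basic winner-take-all competitive learning, no conscience."""
    W = X[:k].copy()                              # init prototypes from first k inputs
    for _ in range(epochs):
        for x in X:
            winner = np.argmax(W @ x)             # unit with the largest net input
            W[winner] += lrate * (x - W[winner])  # move only the winner toward x
    return W

# Two well-separated clusters; each prototype should settle on one of them
X = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 1.0],
              [0.9, 0.1, 0.0, 0.0],
              [0.0, 0.0, 0.1, 0.9]])
W = train_competitive(X, k=2)
print(W)
```

With the 5x5 character grids, each input is a flattened 25-element vector and k is the number of categories you want the network to discover.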
Questions
1. Can the conscience-based competitive network learn the patterns correctly? If so, how many training epochs are required? If not, why not?
2. Can the Basic competitive network learn the patterns correctly? If so, how many training epochs are required? If not, why not?
3. What is the effect of the learning rate (Lrate) on the speed of learning? What is the effect of the Decay parameter on the learning speed?
4. What is the effect of using large initial weights on the learning process?
Assignment 6
Use MATLAB NEURAL NETWORKS TOOLBOX or the Neunet (Desire) system to develop and study Recurrent neural networks.
1. Run the Basic Recurrent Network.
2. Add some additional patterns to the network and rerun the processing.
3. Change the Recall function from a Threshold function to a Stochastic function. Gradually decrease the Temperature parameter during recall.
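A minimal Hopfield-style sketch of threshold recall (an illustrative stand-in for the Basic Recurrent Network program; the ±1 coding and asynchronous update order are assumptions). It also shows why inverse patterns appear: a stored pattern and its bitwise inverse are equally stable under this dynamics:

```python
import numpy as np

def store(patterns):
    """Hebbian storage: W = sum of outer products, zero diagonal."""
    P = np.array(patterns, float)
    W = P.T @ P
    np.fill_diagonal(W, 0.0)                      # no self-connections
    return W

def recall(W, x, sweeps=5):
    """Asynchronous threshold recall, sweeping the units in order."""
    x = np.array(x, float)
    for _ in range(sweeps):
        for i in range(len(x)):
            x[i] = 1.0 if W[i] @ x >= 0 else -1.0
    return x

p1, p2 = [1, 1, -1, -1], [1, -1, 1, -1]
W = store([p1, p2])
print(recall(W, p1))                    # a stored pattern is a stable state
print(recall(W, [-1, -1, 1, 1]))        # its inverse is equally stable
```

A stochastic recall function replaces the hard threshold with a probabilistic one controlled by the Temperature, which is what lets the network escape shallow local minima before T is lowered.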
Questions
1. What is the distribution of correct patterns, inverse patterns, and local minima for each of the three programs above?