Adaptive Process
The final process in the self-organized formation of a feature map is the synaptic adaptive process. For the network to be self-organizing, the synaptic weight vector wj of neuron j in the network is required to change in relation to the input vector x.
The question is how to make this change. In Hebb's postulate of learning, a synaptic weight is increased upon the simultaneous occurrence of presynaptic and postsynaptic activities.
Such a rule is well suited to associative learning. For the type of unsupervised learning considered here, however, the Hebbian hypothesis in its basic form is unsatisfactory, for the following reason:
Changes in connectivity occur in one direction only, which eventually drives all the synaptic weights into saturation. To overcome this problem, the Hebbian hypothesis is modified by including a forgetting term −g(yj)wj, where wj is the synaptic weight vector of neuron j and g(yj) is some positive scalar function of the response yj. The only requirement imposed on the function g(yj) is that the constant term in the Taylor series expansion of g(yj) be zero, so that:
g(yj) = 0 for yj = 0        Eqn (18)
The significance of this requirement will become apparent momentarily. Given such a function, the change to the weight vector of neuron j in the lattice can be expressed as:
Δwj = η yj x − g(yj) wj        Eqn (19)
where η is the learning-rate parameter of the algorithm.
The first term on the right-hand side of Eq. 19 is the Hebbian term and the second term is the forgetting term. To satisfy the requirement of Eq. 18, a linear function is selected for g(yj), as shown by:
g(yj) = η yj        Eqn (20)
Eq. 19 can be simplified further by setting
yj = hj,i(x)        Eqn (21)
Using Eqs. 19, 20, and 21, we obtain
Δwj = η hj,i(x) (x − wj)        Eqn (22)
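To spell out the intermediate step: substituting Eqs. 20 and 21 into Eq. 19 gives Δwj = η hj,i(x) x − η hj,i(x) wj, and factoring out η hj,i(x) yields Eq. 22.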
Finally, using a discrete-time formalism, given the synaptic weight vector wj(n) of neuron j at time n, the updated weight vector wj(n + 1) at time n + 1 is defined by:
wj(n + 1) = wj(n) + η(n) hj,i(x)(n) [x − wj(n)]        Eqn (23)
which is applied to all neurons in the lattice that lie inside the topological neighborhood of the winning neuron i. Eq. 23 has the effect of moving the synaptic weight vector wi of the winning neuron i (and, to a lesser degree, those of its lattice neighbors) toward the input vector x. The algorithm therefore leads to a topological ordering of the feature map in the input space, in the sense that neurons that are adjacent in the lattice tend to have similar synaptic weight vectors. Eq. 23 is the desired formula for computing the synaptic weights of the feature map.
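To make the update rule concrete, the following minimal Python sketch applies Eq. 23 once to every neuron in the lattice. It assumes a Gaussian form for the neighborhood function hj,i(x) and exponentially decaying schedules for η(n) and the neighborhood width, which are common choices not fixed by the derivation above; the names som_update, positions, and sigma are likewise illustrative rather than taken from the text.

import numpy as np

def som_update(weights, positions, x, eta, sigma):
    """One application of Eq. 23 to all neurons in the lattice.

    weights   : (num_neurons, dim) array holding wj(n) for each neuron j
    positions : (num_neurons, 2) array of lattice coordinates
    x         : (dim,) input vector
    eta       : learning-rate parameter η(n)
    sigma     : width of the assumed Gaussian neighborhood hj,i(x)(n)
    """
    # Competitive step: the winning neuron i(x) has the weight vector
    # closest to the input x (minimum Euclidean distance).
    i_win = np.argmin(np.linalg.norm(weights - x, axis=1))

    # Cooperative step: Gaussian topological neighborhood centred on the
    # winner, computed from squared lattice distances (assumed form).
    d2 = np.sum((positions - positions[i_win]) ** 2, axis=1)
    h = np.exp(-d2 / (2.0 * sigma ** 2))  # hj,i(x) for every neuron j

    # Adaptive step, Eq. 23: wj(n+1) = wj(n) + η(n) hj,i(x)(n) [x − wj(n)]
    return weights + eta * h[:, np.newaxis] * (x - weights)

A hypothetical training loop on a 10 × 10 lattice with three-dimensional inputs might then look like:

rng = np.random.default_rng(0)
positions = np.array([(r, c) for r in range(10) for c in range(10)], dtype=float)
weights = rng.random((100, 3))
for n, x in enumerate(rng.random((1000, 3))):
    eta = 0.1 * np.exp(-n / 1000.0)    # assumed decay schedule for η(n)
    sigma = 3.0 * np.exp(-n / 1000.0)  # assumed shrinking neighborhood width
    weights = som_update(weights, positions, x, eta, sigma)

Because h is largest at the winner and falls off with lattice distance, each update pulls the winner's weight vector strongly toward x and its lattice neighbors more weakly, which is what produces the topological ordering described above.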