- Poster presentation
- Open Access
The Hopfield-like neural network with governed ground state
BMC Neuroscience volume 14, Article number: P257 (2013)
Using a vector $\mathbf u = (u_1, u_2, \ldots, u_N)$, $\sum_{i=1}^N u_i^2 = 1$, let us construct a matrix $\hat J = (J_{ij})$, $J_{ij} = -(1-\delta_{ij})\,u_i u_j$, where $\delta_{ij}$ is the Kronecker delta. We define a Hopfield-like neural network with a connection matrix proportional to $\hat J$ and thresholds proportional to the coordinates $u_i$: $\theta_i = h u_i$. The real quantities $w$ and $h$ are our free parameters. The dynamics of the network is defined by the equation $s_i(t+1) = \operatorname{sgn}\big( w \sum_{j=1}^N J_{ij} s_j(t) - h u_i \big)$, where $s_i(t) = \pm 1$ are the binary coordinates of the configuration vector $\mathbf s(t)$ describing the state of the network at the given time $t$. Fixed points of the network are local minima of the energy $E(\mathbf s) = -\frac{w}{2}(\mathbf s, \hat J \mathbf s) + h(\mathbf u, \mathbf s)$. The configurations providing the global minimum are called the ground state. It is the ground state that is usually associated with the memory of the network. It turns out that to a considerable extent the ground state of our network can be governed by the parameters $\mathbf u$, $w$ and $h$. The point is that the energy is defined in full by the scalar product of the vectors $\mathbf u$ and $\mathbf s$: since $(\mathbf s, \hat J \mathbf s) = 1 - N x^2$, where $x = \cos(\widehat{\mathbf u, \mathbf s}) = (\mathbf u, \mathbf s)/\sqrt N$, we have $E = \frac{w}{2}(N x^2 - 1) + h \sqrt N\, x$. Then the number of different values of the energy is equal to the number of different values of the cosine $x$ when $\mathbf s$ ranges over all $2^N$ configurations. Let us arrange the different values of the cosine in decreasing order, starting the numeration from 0: $x_0 > x_1 > \ldots > x_K$. The set of all configurations for which the cosine is equal to $x_k$ we define as the class $\Sigma_k$: $\Sigma_k = \{\mathbf s : \cos(\widehat{\mathbf u, \mathbf s}) = x_k\}$. It is easy to see that for each $k$ the equalities $x_{K-k} = -x_k$ and $\Sigma_{K-k} = -\Sigma_k$ are fulfilled. The following statement is true:
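A minimal numerical sketch of this construction, assuming connections $wJ_{ij}$ with $J_{ij} = -(1-\delta_{ij})u_iu_j$, thresholds $hu_i$, and a normalized vector $\mathbf u$ (the concrete $N$, $\mathbf u$, $w$, $h$ and all helper names below are illustrative choices, not part of the original). It checks that the energy of a configuration depends on it only through the cosine between $\mathbf u$ and $\mathbf s$:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 8
u = rng.normal(size=N)
u /= np.linalg.norm(u)                   # normalization: sum_i u_i^2 = 1

w, h = 3.0, -1.0                         # free parameters (illustrative values)
J = -(np.outer(u, u) - np.diag(u**2))    # J_ij = -(1 - delta_ij) u_i u_j

def energy(s):
    """Hopfield energy: connections w*J_ij, thresholds h*u_i."""
    return -0.5 * w * (s @ J @ s) + h * (u @ s)

def cosine(s):
    """Cosine of the angle between u and a +-1 configuration s."""
    return (u @ s) / np.sqrt(N)

# The energy depends on s only through the cosine x = cos(u, s):
for _ in range(5):
    s = rng.choice([-1.0, 1.0], size=N)
    x = cosine(s)
    assert np.isclose(energy(s), 0.5 * w * (N * x**2 - 1.0) + h * np.sqrt(N) * x)
```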
Theorem. Let $h < 0$ be fixed. As $w$ increases beginning from the initial value equal to 0, the ground state of the network coincides in consecutive order with the classes $\Sigma_k$: $k = 0, 1, 2, \ldots$. The transition from $\Sigma_{k-1}$ to $\Sigma_k$ takes place at the critical point $w_k = \dfrac{-2h}{\sqrt N\,(x_{k-1} + x_k)}$, and while $w_k < w < w_{k+1}$ the ground state of the network is the class $\Sigma_k$. The transitions cease when the denominator of the expression for $w_k$ becomes negative: if $\hat k$ is the first index for which $x_{\hat k - 1} + x_{\hat k} < 0$, then for all $w > w_{\hat k - 1}$ the ground state remains the class $\Sigma_{\hat k - 1}$.
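For small $N$ this kind of statement can be checked by exhaustive enumeration. A sketch under the same assumptions as above (connections $wJ_{ij}$ with $J_{ij} = -(1-\delta_{ij})u_iu_j$, thresholds $hu_i$, $h<0$; the concrete vector $\mathbf u$ is an arbitrary choice), which sweeps $w$ and records which class $\Sigma_k$ holds the minimal energy:

```python
import itertools
import numpy as np

N = 6
u = np.array([3.0, 2.0, 2.0, 1.0, 1.0, 1.0])
u /= np.linalg.norm(u)
h = -1.0

configs = np.array(list(itertools.product([-1.0, 1.0], repeat=N)))
cosines = configs @ u / np.sqrt(N)
# distinct cosine values in decreasing order: x_0 > x_1 > ... > x_K
xs = np.unique(np.round(cosines, 12))[::-1]

def ground_class(w):
    """Index k of the class Sigma_k of minimal energy for a given w."""
    E = 0.5 * w * (N * xs**2 - 1.0) + h * np.sqrt(N) * xs
    return int(np.argmin(E))

visited = []
for w in np.linspace(0.0, 60.0, 6001):
    k = ground_class(w)
    if not visited or visited[-1] != k:
        visited.append(k)

# predicted critical points w_k = -2h / (sqrt(N) (x_{k-1} + x_k));
# the sequence stops once the denominator x_{k-1} + x_k turns negative
w_crit = [-2.0 * h / (np.sqrt(N) * (xs[k - 1] + xs[k]))
          for k in range(1, len(xs)) if xs[k - 1] + xs[k] > 0]
print(visited)          # [0, 1, 2, 3, 4, 5]: classes visited in order
print(np.round(w_crit, 3))
```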
In large part this theorem allows one to regulate the ground state of the network. Let us examine the $N$-dimensional hypercube whose side length is 2 and whose center is at the origin of coordinates. The configurations $\mathbf s$ coincide with the vertices of the hypercube. Possible symmetric directions of the hypercube have to be chosen as the vectors $\mathbf u$. For each choice of $\mathbf u$ the configurations are distributed symmetrically around this vector. Each such symmetric set of configurations is one of the classes $\Sigma_k$, and using the theorem one can make it the ground state of the network. In particular, we can construct a ground state containing a very large number of configurations. If the nonzero components of the vector $\mathbf u$ are equal in modulus, then for each $w$ the only fixed points of the network are the configurations of its ground state. The classification of all possible applications of this Theorem is not yet finished.
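This effect can be sketched for one symmetric direction, the main diagonal $\mathbf u = (1,\ldots,1)/\sqrt N$, under the same assumed formulation as above (the values of $N$, $w$, $h$ are illustrative; $w$ is taken past the last critical point, so the ground state is the highly degenerate class with cosine 0):

```python
import itertools
import numpy as np

N = 6
u = np.ones(N) / np.sqrt(N)      # a symmetric direction: all |u_i| equal
w, h = 3.0, -1.0                 # w chosen beyond the last critical point
J = -(np.outer(u, u) - np.diag(u**2))

configs = np.array(list(itertools.product([-1.0, 1.0], repeat=N)))
E = np.array([-0.5 * w * (s @ J @ s) + h * (u @ s) for s in configs])

# the degenerate ground state: all configurations with cosine 0
ground = configs[np.isclose(E, E.min())]
print(len(ground))               # 20 = C(6,3) configurations share the minimum

def is_fixed(s):
    """s is a fixed point iff every spin agrees with its local field."""
    field = w * (J @ s) - h * u
    return np.all(np.sign(field) == s)

fixed = [s for s in configs if is_fixed(s)]
print(len(fixed))                # the fixed points coincide with the ground state
```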
Computer simulations show that the basins of attraction of such fixed points are very small. This is not surprising: the number of fixed points is very large, and the volume of each basin of attraction is of the order of the volume of the hypersphere divided by the number of fixed points.
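The smallness of the basins can be illustrated by relaxing random initial configurations of the same illustrative network as above (again, all concrete values are assumptions for the sketch):

```python
import numpy as np

rng = np.random.default_rng(1)

N = 6
u = np.ones(N) / np.sqrt(N)
w, h = 3.0, -1.0
J = -(np.outer(u, u) - np.diag(u**2))

def relax(s):
    """Asynchronous dynamics s_i -> sgn(w (J s)_i - h u_i) until a fixed point."""
    s = s.copy()
    while True:
        changed = False
        for i in range(N):
            si = np.sign(w * (J[i] @ s) - h * u[i])
            if si != s[i]:
                s[i] = si
                changed = True
        if not changed:
            return s

# with many degenerate ground-state configurations, each one attracts only
# a small fraction of random starts
counts = {}
for _ in range(2000):
    s = rng.choice([-1.0, 1.0], size=N)
    key = tuple(relax(s))
    counts[key] = counts.get(key, 0) + 1
print(len(counts), max(counts.values()) / 2000)
```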
The work was supported by the Russian Foundation for Basic Research (grant 12-07-00259).
Cite this article
Litinskii, L.B., Malsagov, M.Y. The Hopfield-like neural network with governed ground state. BMC Neurosci 14, P257 (2013) doi:10.1186/1471-2202-14-S1-P257