Travelling Salesman Problem

  1. CLICK inside to stop
  2. CLICK again to reset and start

Left Side:

Legend:
  • The red path should be the shortest path that reaches all towns.
  • A-B%, where A is the town number and B is the percentage of all training presentations in which this town has been presented to the network.
  • A-B%, where A is the neuron number and B is the percentage of all training presentations in which this neuron has been chosen as the reference (winning) neuron.
Right Side:

Legend:
  • Still to be implemented.

ALGORITHMS ...

The main idea of Kohonen neural networks is to let the network organise itself. To do this, we present patterns continuously and in random order until stability is reached.

Such networks are composed of two groups of neurons.
In the first group, each neuron is connected to every neuron of the group (including itself), and the weight value depends on the distance between the neurons. The weight r(i,j) between neurons i and j is given by:

            r(i,j) = exp( -dist(i,j)^2 / (2 * theta) )
where dist(i,j) is the Euclidean distance between neurons i and j, and theta is a learning factor (> 0).
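For concreteness, the lateral weight can be computed as in the following sketch. The method name neighbourhoodWeight and the x/y position arrays are illustrative choices of mine, not taken from the applet:

    // Sketch: the lateral weight r(i,j) between two first-group neurons,
    // computed exactly as in the formula above. Each neuron i is assumed to
    // carry a position (x[i], y[i]); theta must be > 0.
    static double neighbourhoodWeight(double[] x, double[] y,
                                      int i, int j, double theta) {
        double dx = x[i] - x[j];
        double dy = y[i] - y[j];
        double distSquared = dx * dx + dy * dy;         // dist(i,j)^2 (Euclidean)
        return Math.exp(-distSquared / (2.0 * theta));  // r(i,j)
    }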

To enter data into the network we need another layer of neurons (the second group). This layer does not follow the same topological rules as the first group: each of its neurons is connected to every neuron of the first group. I will write w for the weights that connect the two groups.
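In code, one possible (purely illustrative) layout for these two groups is a handful of arrays; the field names below are mine, and the applet's internal representation may differ:

    // Illustrative data layout for the two groups of neurons.
    class RingNetwork {
        double[] ringX, ringY;  // fixed positions of the k first-group (ring) neurons
        double[][] w;           // w[i][0], w[i][1]: weights from the two input neurons
                                //   (town coordinates Xi, Yi) to ring neuron i
        double theta;           // factor used in the lateral weights r(i,j)
        double phi;             // learning rate used in the update rule of step 4 below
    }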

The main steps are as follows (a code sketch of the training loop follows the list):

  1. Place the neurons of the first group in the desired final pattern (each neuron of the first group has position attributes X and Y).
    In TSP the final solution is a ring that connects all towns, and this ring should be as short as possible. We therefore place k neurons uniformly in a ring formation, where k is the number of neurons in the first group (k >= number of towns).

    The input of the network will be the coordinates of the towns, so the second group contains 2 neurons, Xi and Yi (n = 2).

    Initialize all w weights with random values.

  2. Take a random town and put its coordinates into the input neurons (the Xi and Yi values).
  3. Find the neuron j for which:
            (Xi - w_xj)^2 + (Yi - w_yj)^2   is minimal
    
    
    where w_xj is the weight between input neuron Xi and ring neuron j (j belongs to the first group), and likewise for w_yj.
    General formula:
            sum over n of (N_n - w_nj)^2   is minimal,   where N_n is the value of the n-th input neuron
    
  4. Modify all w weights with:
            w_xi = w_xi + phi * r(i,j) * (Xi - w_xi)

    and:

            w_yi = w_yi + phi * r(i,j) * (Yi - w_yi)
    
    For all neurons i in the first group, where phi is a learning rate parameter.
    General formula:
    
            w_ni = w_ni + phi * r(i,j) * (N_n - w_ni)
    
    For all neurons i in the first group and for all n in the second group.
  5. Decrease theta and phi, then recalculate the r weights.
  6. Go to step 2 and repeat until the network stabilises.
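
Putting the steps together, here is a minimal, self-contained Java sketch of the training loop. It is only an illustration of the algorithm above: the class and variable names, the example town coordinates, the starting values of theta and phi, and the multiplicative decay factors used in step 5 are all my own assumptions, not values taken from the applet.

    import java.util.Random;

    // Minimal sketch of the ring-based Kohonen network for TSP described above.
    // All names and numeric constants here are illustrative assumptions.
    public class KohonenTspSketch {

        // Lateral weight r(i, winner) between ring neuron i and the winning neuron
        // (same formula as in the ALGORITHMS section; theta > 0).
        static double r(double[] ringX, double[] ringY, int i, int winner, double theta) {
            double dx = ringX[i] - ringX[winner];
            double dy = ringY[i] - ringY[winner];
            return Math.exp(-(dx * dx + dy * dy) / (2.0 * theta));
        }

        // One pass over steps 2-4 for a single randomly chosen town.
        static void trainStep(double[] townX, double[] townY,   // town coordinates (network input)
                              double[] ringX, double[] ringY,   // fixed ring positions from step 1
                              double[][] w,                     // w[i][0] ~ w_xi, w[i][1] ~ w_yi
                              double theta, double phi, Random rnd) {
            int k = w.length;

            // Step 2: pick a random town and present its coordinates to the input neurons.
            int t = rnd.nextInt(townX.length);
            double xi = townX[t], yi = townY[t];

            // Step 3: find the ring neuron j whose incoming weights are closest to (Xi, Yi).
            int winner = 0;
            double best = Double.MAX_VALUE;
            for (int j = 0; j < k; j++) {
                double dx = xi - w[j][0];
                double dy = yi - w[j][1];
                double d = dx * dx + dy * dy;          // (Xi - w_xj)^2 + (Yi - w_yj)^2
                if (d < best) { best = d; winner = j; }
            }

            // Step 4: pull every ring neuron's weights towards the town, scaled by
            // the learning rate phi and the lateral weight r(i, winner).
            for (int i = 0; i < k; i++) {
                double ri = r(ringX, ringY, i, winner, theta);
                w[i][0] += phi * ri * (xi - w[i][0]);  // w_xi update
                w[i][1] += phi * ri * (yi - w[i][1]);  // w_yi update
            }
        }

        public static void main(String[] args) {
            // Tiny usage example: 4 towns on a unit square, k = 8 ring neurons.
            double[] townX = {0.1, 0.9, 0.9, 0.1};
            double[] townY = {0.1, 0.1, 0.9, 0.9};
            int k = 8;
            double[] ringX = new double[k], ringY = new double[k];
            double[][] w = new double[k][2];
            Random rnd = new Random(42);

            // Step 1: place the k ring neurons uniformly on a circle and
            // initialise the w weights with random values.
            for (int i = 0; i < k; i++) {
                double a = 2.0 * Math.PI * i / k;
                ringX[i] = Math.cos(a);
                ringY[i] = Math.sin(a);
                w[i][0] = rnd.nextDouble();
                w[i][1] = rnd.nextDouble();
            }

            double theta = 1.0, phi = 0.5;             // assumed starting values
            for (int iter = 0; iter < 10000; iter++) {
                trainStep(townX, townY, ringX, ringY, w, theta, phi, rnd);
                theta *= 0.999;                        // step 5: decay schedule is a guess
                phi *= 0.9995;
            }

            // Reading the ring neurons in order (and mapping each town to its
            // nearest neuron) gives an approximate tour.
            for (int i = 0; i < k; i++) {
                System.out.printf("neuron %d: w = (%.3f, %.3f)%n", i, w[i][0], w[i][1]);
            }
        }
    }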

SOURCES ...

BUGS & QUESTIONS ...

