
Other Approaches to Solve the Traveling Salesperson Problem

The following describes a few other methods for solving the traveling salesperson problem.

Anzai’s Presentation

Yuichiro Anzai describes the Hopfield network for the traveling salesperson problem in a slightly different way. For one thing, a global inhibition term is not used. A threshold value is associated with each neuron, added to the activation, and taken as the average of A1 and A2, using our earlier notation. The energy function is formulated slightly differently, as follows:

E = A1 Σi (Σk xik - 1)² + A2 Σi (Σk xki - 1)² + A4 Σk Σj≠k Σi≠k dkj xki (xj,i+1 + xj,i-1)

The first term is 0 if the sum of the outputs is 1 in each column. The same is true for the second term with regard to rows.

The output is calculated using a parameter λ, here called the reference activation level, as:

xij = (1 + tanh(aij/λ))/2

The parameters used are A1 = 1/2, A2 = 1/2, A4 = 1/2, Δt = 1, τ = 1, and λ = 1. An attempt is made to solve the problem for a tour of 10 cities. The solution obtained is not crisp; that is, instead of exactly one 1 in each row and each column, there are values of varying magnitude, with one dominating value in each row and column. The dominating value is taken to be part of the solution.
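
To make the output calculation concrete, the following is a minimal C++ sketch (not code from this book's listings; the function name and array types are assumptions) of converting the activations aij into outputs xij with the tanh-based formula above, using λ = 1 as in the example:

#include <cmath>
#include <cstddef>
#include <vector>

// Sketch: compute x_ij = (1 + tanh(a_ij / lambda)) / 2 for every neuron,
// where lambda is the reference activation level (1 in the example above).
std::vector<std::vector<double> > computeOutputs(
    const std::vector<std::vector<double> >& a, double lambda = 1.0) {
    std::vector<std::vector<double> > x(a.size());
    for (std::size_t i = 0; i < a.size(); ++i) {
        x[i].resize(a[i].size());
        for (std::size_t j = 0; j < a[i].size(); ++j)
            x[i][j] = (1.0 + std::tanh(a[i][j] / lambda)) / 2.0;
    }
    return x;
}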

Kohonen’s Approach for the Traveling Salesperson Problem

Kohonen’s self-organizing maps can be used for the traveling salesperson problem. We summarize the discussion of this approach described in Eric Davalo’s and Patrick Naim’s work. Each city considered for the tour is referenced by its x and y coordinates. To each city there corresponds a neuron. The neurons are placed in a single array, unlike the two-dimensional array used in the Hopfield approach. The first and the last neurons in the array are considered to be neighbors.

There is a weight vector for each neuron, and it, too, has two components. The weight vector of a neuron is the neuron's image in the map that is to self-organize. There are as many input vectors as there are cities, and the coordinate pair of a city constitutes an input vector. The neuron whose weight vector is closest to the input vector is selected, and the weights of neurons in a neighborhood of the selected neuron are modified; the others are not. A gradually decreasing scale factor is used for the modification of weights.
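
As a rough illustration of the selection step (a sketch under assumed names, not the implementation given later in this book), the neuron whose two-component weight vector is closest to a city's coordinate pair can be found as follows:

#include <cstddef>
#include <limits>
#include <vector>

struct Point { double x, y; };   // a city's coordinates, or a neuron's weights

// Return the index of the neuron whose weight vector is closest
// (in squared Euclidean distance) to the given city's coordinates.
std::size_t findWinner(const std::vector<Point>& weights, const Point& city) {
    std::size_t winner = 0;
    double best = std::numeric_limits<double>::max();
    for (std::size_t j = 0; j < weights.size(); ++j) {
        double dx = weights[j].x - city.x;
        double dy = weights[j].y - city.y;
        if (dx * dx + dy * dy < best) {
            best = dx * dx + dy * dy;
            winner = j;
        }
    }
    return winner;
}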

One neuron is created first, with 0 for both components of its weight vector. Other neurons are created one at a time, at each iteration of learning, and neurons may also be destroyed. Creating a neuron corresponds to provisionally adding a city to the final tour list, and destroying a neuron corresponds to provisionally dropping a city from that list. This prevents a neuron from being assigned to two inputs (two cities), and likewise prevents two neurons from being assigned to the same input.

As the input vectors are presented to the network, a neuron is created if, while still unselected, it falls in the neighborhood of the closest (selected) neuron twice. A created neuron is destroyed if it is not selected for weight modification in three consecutive iterations while others are being modified.

That a tour of short distance results from this network operation is plausible, since the closest neurons are selected. The reported experimental results are very promising: the computation time is small, and the solutions obtained are close to the optimal, if not the optimal tour itself. As noted before, the traveling salesperson problem is NP-complete, so fast approaches that lead to good suboptimal solutions should be accepted.

Algorithm for Kohonen’s Approach

A gain parameter λ and a scale factor q are used while modifying the weights. A value between 0.02 and 0.2 was used for q in previous examples. The distance of a neuron from the selected neuron is defined to be an integer between 0 and n – 1, where n is the number of cities for the tour. These distances are therefore not necessarily the actual distances between the cities, though they could be made representative of them in some way; one such attempt is described in the following discussion of the C++ implementation. This distance is denoted by dj for neuron j. A squashing function similar to the Gaussian density function is also used.
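
The two ingredients just mentioned, the integer distance dj and the squashing function, might be sketched in C++ as follows. This is only an illustration: treating the array of neurons as a ring for the distance, and using the 1/√(2π) factor of the Gaussian density in the squashing function, are assumptions rather than details taken from the referenced paper.

#include <cmath>
#include <cstdlib>

// Assumed choice of d_j: index distance between neuron j and the selected
// neuron, with the n neurons treated as a ring (first and last are neighbors).
int ringDistance(int j, int winner, int n) {
    int d = std::abs(j - winner);
    return (d < n - d) ? d : n - d;
}

// Squashing function similar to the Gaussian density:
// g(lambda, d) = exp(-d*d / lambda) / sqrt(2*pi).
// The 1/sqrt(2*pi) factor is an assumption modeled on the Gaussian density.
double squash(double lambda, double d) {
    const double root2pi = std::sqrt(2.0 * 3.14159265358979323846);
    return std::exp(-d * d / lambda) / root2pi;
}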

The details of the algorithm are in a paper referenced in Davalo's work. The steps of the algorithm, to the extent given by Davalo, are listed below; a small C++ sketch of one such iteration follows the list:

  Find the weight vector for which the distance from the input vector is the smallest
  Modify the weights using
wjnew = wjold + (Inew - wjold) g(λ, dj), where g(λ, dj) = exp(-dj²/λ)/√(2π)
  Reset λ as λ (1 - q)
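
Putting the three steps together for one presented city, a minimal sketch (again with assumed names, the ring-based dj, and the 1/√(2π) factor noted above) might look like this:

#include <algorithm>
#include <cmath>
#include <cstdlib>
#include <limits>
#include <vector>

struct Point { double x, y; };   // two-component weight vector or city coordinates

// One presentation of a city to the ring of n neurons: find the winner,
// pull each neuron's weights toward the city by g(lambda, d_j), then
// reduce lambda by the scale factor q.
void presentCity(std::vector<Point>& w, const Point& city,
                 double& lambda, double q) {
    const double root2pi = std::sqrt(2.0 * 3.14159265358979323846);
    const int n = static_cast<int>(w.size());

    // Step 1: winner = neuron with weight vector closest to the input vector.
    int winner = 0;
    double best = std::numeric_limits<double>::max();
    for (int j = 0; j < n; ++j) {
        double dx = w[j].x - city.x, dy = w[j].y - city.y;
        if (dx * dx + dy * dy < best) { best = dx * dx + dy * dy; winner = j; }
    }

    // Step 2: w_j_new = w_j_old + (input - w_j_old) * g(lambda, d_j).
    for (int j = 0; j < n; ++j) {
        int dj = std::min(std::abs(j - winner), n - std::abs(j - winner));
        double g = std::exp(-static_cast<double>(dj) * dj / lambda) / root2pi;
        w[j].x += (city.x - w[j].x) * g;
        w[j].y += (city.y - w[j].y) * g;
    }

    // Step 3: reset lambda as lambda * (1 - q).
    lambda *= (1.0 - q);
}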

