Java Kohonen Neural Network Library – bug

posted by Karsten

I’ve been preparing a three-week seminar on AI for my day job at Victoria University of Wellington. In this seminar I want the students to see a few different AI methods in action before they code up some small AI examples of their own.

One of these examples is a Kohonen network for organising colours. Instead of coding it from scratch I decided to use the Java Kohonen Neural Network Library, which can be found at http://jknnl.sourceforge.net/. It provides all the functionality I need for my seminar, but when I had built the examples the code simply didn’t work. None of the neurons in the “brain” changed values, even though with debug output turned on the library was clearly changing the values internally. The setup I was using was a network with the WTALearningFunction, similar to the example on the library website.

My initial thought was that I was using the library incorrectly; however, I couldn’t find any other internal data structure storing the neurons. I then went into the source of the library to investigate, and the code seemed correct. Only when I realised that the method changing the weights was operating on a cloned copy of the neuron weights did I spot that the changes were only stored locally!
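The pitfall can be sketched in plain Java. The class, field, and method names below are illustrative only, not jKNNL’s actual API: in Java, `clone()` on an array produces a fresh copy, so updating the clone leaves the neuron’s own weights untouched unless the result is written back.

```java
import java.util.Arrays;

// Illustrative sketch of the bug pattern (not jKNNL's actual API).
class Neuron {
    double[] weights = {0.5, 0.5};

    // Buggy update: operates on a clone, so the changes are lost.
    void buggyUpdate(double[] input, double rate) {
        double[] w = weights.clone();
        for (int i = 0; i < w.length; i++) {
            w[i] += rate * (input[i] - w[i]);
        }
        // Missing write-back: the neuron still holds the old weights.
    }

    // Fixed update: store the adjusted weights back on the neuron.
    void fixedUpdate(double[] input, double rate) {
        double[] w = weights.clone();
        for (int i = 0; i < w.length; i++) {
            w[i] += rate * (input[i] - w[i]);
        }
        weights = w; // the one-line fix: write the result back
    }
}

public class CloneBugDemo {
    public static void main(String[] args) {
        Neuron a = new Neuron();
        a.buggyUpdate(new double[]{1.0, 0.0}, 0.5);
        System.out.println(Arrays.toString(a.weights)); // unchanged: [0.5, 0.5]

        Neuron b = new Neuron();
        b.fixedUpdate(new double[]{1.0, 0.0}, 0.5);
        System.out.println(Arrays.toString(b.weights)); // updated: [0.75, 0.25]
    }
}
```

This also explains why the debug output looked fine: the cloned array really did receive the new values, they just never reached the network’s own state.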

To rectify this I had to add the following line at line 261 in WTALearningFunction.java:

I have sent an error report to the project team. I probably spent more time debugging this issue than hard-coding an example would have taken, but at least there is an output for the community 😉


One Response to “Java Kohonen Neural Network Library – bug”

  1. Karsten says:

    I just found another bug. In getBestNeuron(double[] vector) of WTMLearningFunction (and possibly other classes as well), the return statement should be:

    return bestNeuron+1;

    The issue is that the library uses zero-based storage internally but (oddly) one-based numbering outwards. This conversion was not applied in this function, so the changes are applied to the wrong neurons, resulting in weird networks…
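The mismatch described above can be sketched in plain Java (the names are illustrative, not jKNNL’s actual classes): internal arrays count neurons from 0 while the public interface counts from 1, so a lookup function that returns the raw array index shifts every subsequent update by one neuron.

```java
// Illustrative sketch (not jKNNL's actual API): internal storage is
// zero-based, but the public interface addresses neurons from 1.
public class IndexBaseDemo {
    // Internal, zero-based weight storage for three neurons.
    static double[][] weights = {{0.1}, {0.2}, {0.3}};

    // Public lookup: callers pass one-based neuron numbers.
    static double[] getNeuronWeights(int oneBasedNumber) {
        return weights[oneBasedNumber - 1];
    }

    // Buggy best-neuron search: returns the raw zero-based index.
    static int getBestNeuronBuggy() {
        return 2; // suppose the neuron at array index 2 wins
    }

    // Fixed: convert to the one-based convention before returning.
    static int getBestNeuronFixed() {
        return 2 + 1; // i.e. "return bestNeuron + 1;"
    }

    public static void main(String[] args) {
        // The buggy value, fed back through the one-based lookup,
        // touches the wrong neuron (array index 1 instead of 2).
        System.out.println(getNeuronWeights(getBestNeuronBuggy())[0]); // 0.2
        System.out.println(getNeuronWeights(getBestNeuronFixed())[0]); // 0.3
    }
}
```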
