In SOM learning, the shrinking of the neighborhood size and the decay of the update strength usually follow predefined schedules, i.e. they depend only on the current update step.
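For instance, a classic SOM might decay both quantities exponentially with the step count alone. A minimal sketch (the exponential decay form and the constants are illustrative assumptions, not a fixed convention):

```python
def som_schedules(t, t_max, lr0=0.5, lr_end=0.01, sigma0=5.0, sigma_end=0.5):
    """Typical predefined SOM schedules: both values depend only on the step t."""
    frac = t / t_max
    lr = lr0 * (lr_end / lr0) ** frac              # decaying learning rate
    sigma = sigma0 * (sigma_end / sigma0) ** frac  # shrinking neighborhood radius
    return lr, sigma
```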

In the PLSOM (parameterless SOM) algorithm, the update strength depends on the distance between a data point and the best-matching unit's weight vector, i.e. the quantization error. A large distance, indicating that the data point is badly represented by the SOM, leads to a stronger update than a small distance. The distance is scaled relative to the largest quantization error encountered so far.
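A minimal sketch of one such update step, assuming a Gaussian neighborhood and a single width parameter `beta` (both illustrative choices; the PLSOM paper defines its own neighborhood scaling functions):

```python
import numpy as np

def plsom_step(weights, grid, x, r_prev, beta=2.0):
    """One PLSOM-style update.

    weights: (n_nodes, dim) node weight vectors
    grid:    (n_nodes, 2) node coordinates on the map
    x:       (dim,) input data point
    r_prev:  largest quantization error seen so far
    """
    # Quantization error: distance from x to the best-matching unit.
    dists = np.linalg.norm(weights - x, axis=1)
    bmu = np.argmin(dists)
    e = dists[bmu]

    # Scale the error by the largest quantization error encountered so far.
    r = max(r_prev, e)
    eps = e / r if r > 0 else 0.0  # in [0, 1]; large for badly represented inputs

    # Update strength and neighborhood width both grow with eps
    # (Gaussian neighborhood used here only for illustration).
    grid_dist2 = np.sum((grid - grid[bmu]) ** 2, axis=1)
    h = np.exp(-grid_dist2 / (beta * eps + 1e-12) ** 2)
    weights += eps * h[:, None] * (x - weights)
    return weights, r
```

Note that the scaled error `eps`, not the step count, drives both the update strength and the neighborhood width.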

PLSOM reduces the number of parameters of the SOM algorithm from four to two.

PLSOM overreacts to outliers: a data point that is very unrepresentative of the data as a whole produces a large quantization error and therefore a near-maximal update, changing the network more strongly than it should.

PLSOM2 addresses the problem of PLSOM overreacting to outliers.