Reference: "Adding a Conscience to Competitive Learning"

Duane DeSieno, "Adding a Conscience to Competitive Learning", Proceedings of the IEEE International Conference on Neural Networks, San Diego, CA, July 1988, pp. 117-124, doi:10.1109/icnn.1988.23839.
@inproceedings{desieno-1988,
    address = {San Diego, CA},
    abstract = {There are a number of neural networks that self-organize on the basis of what has come to be known as Kohonen learning. The author introduces a modification of Kohonen learning that provides rapid convergence and improved representation of the input data. In many areas of pattern recognition, statistical analysis, and control, it is essential to form a nonparametric model of a probability density function p(x). The purpose of the improvement to Kohonen learning presented is to form a better approximation of p(x). Simulation results are presented to illustrate the operation of this competitive learning algorithm.},
    author = {DeSieno, Duane},
    booktitle = {Proceedings of the IEEE International Conference on Neural Networks},
    doi = {10.1109/icnn.1988.23839},
    institution = {HNC Inc., San Diego, CA, USA},
    keywords = {learning, som},
    month = jul,
    pages = {117--124},
    publisher = {IEEE},
    title = {Adding a Conscience to Competitive Learning},
    url = {http://dx.doi.org/10.1109/icnn.1988.23839},
    year = {1988}
}

One way of evening out the distribution of SOM units in data space is to use a "conscience": each unit keeps a running estimate of how often it wins the BMU competition, which increases every time the unit is BMU and decays whenever it is not. Units with a high win frequency are then handicapped in subsequent winner selection, so they become less likely to be chosen as BMU and under-used units get a chance to win.
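
Below is a minimal sketch of this conscience mechanism in the spirit of DeSieno's frequency-sensitive bias, applied to plain competitive learning rather than a full SOM. The function name and the parameters `beta` (the frequency-update rate, DeSieno's B) and `bias_factor` (the bias strength, DeSieno's C) are illustrative choices, not values taken from the paper.

```python
import numpy as np


def conscience_competitive_learning(data, n_units=10, epochs=20,
                                    lr=0.1, beta=1e-4, bias_factor=10.0,
                                    seed=0):
    """Competitive learning with a conscience-style bias (sketch).

    Each unit i keeps a win-frequency estimate p[i]; during winner
    selection a bias bias_factor * (1/n_units - p[i]) is subtracted
    from the squared distance, so frequently winning units are
    penalized and rarely winning units get a boost.
    """
    data = np.asarray(data, dtype=float)
    rng = np.random.default_rng(seed)
    # initialise unit weights on randomly chosen input samples
    weights = data[rng.choice(len(data), size=n_units, replace=False)].copy()
    p = np.full(n_units, 1.0 / n_units)  # win-frequency estimates

    for _ in range(epochs):
        for x in rng.permutation(data):
            dist = np.sum((weights - x) ** 2, axis=1)
            bias = bias_factor * (1.0 / n_units - p)
            winner = np.argmin(dist - bias)  # conscience-biased selection
            # move only the winning unit towards the input
            weights[winner] += lr * (x - weights[winner])
            # the winner's frequency estimate rises, all others decay
            indicator = np.zeros(n_units)
            indicator[winner] = 1.0
            p += beta * (indicator - p)
    return weights, p
```

Note that the bias only affects which unit is selected as winner; the weight update itself is the ordinary competitive-learning step. With `bias_factor = 0` the code reduces to plain winner-take-all learning, while larger values push each unit toward winning roughly 1/n_units of the time, which is what evens out the unit distribution.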