Reference: "Noise-enhanced clustering and competitive learning algorithms"

Noise-enhanced clustering and competitive learning algorithms. Neural Networks, Vol. 37 (January 2013), pp. 132–140, doi:10.1016/j.neunet.2012.09.012. By Osonde Osoba and Bart Kosko.

@article{osoba2013noise,
    abstract = {Noise can provably speed up convergence in many centroid-based clustering algorithms. This includes the popular k-means clustering algorithm. The clustering noise benefit follows from the general noise benefit for the expectation–maximization algorithm because many clustering algorithms are special cases of the expectation–maximization algorithm. Simulations show that noise also speeds up convergence in stochastic unsupervised competitive learning, supervised competitive learning, and differential competitive learning.},
    author   = {Osoba, Osonde and Kosko, Bart},
    doi      = {10.1016/j.neunet.2012.09.012},
    issn     = {0893-6080},
    journal  = {Neural Networks},
    keywords = {clustering, competitive-learning, noise},
    month    = jan,
    pages    = {132--140},
    title    = {Noise-enhanced clustering and competitive learning algorithms},
    volume   = {37},
    year     = {2013}
}


Noise can provably speed up convergence in many centroid-based clustering algorithms.

k-means is a special case of the expectation–maximization (EM) algorithm, so the general EM noise benefit carries over to k-means clustering.
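The noise-injection idea can be illustrated with k-means itself. The sketch below adds zero-mean Gaussian noise to the samples during the assignment step and anneals it away over iterations, so the final centroids are those of ordinary k-means. This is only a minimal illustration of the idea, not the paper's NEM-based algorithm; the noise scale and decay schedule are arbitrary choices.

```python
import numpy as np

def noisy_kmeans(X, k, iters=50, noise_scale=0.5, decay=0.9, seed=0):
    """k-means with annealed additive sample noise (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    # Initialize centroids from k distinct random samples.
    centroids = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for t in range(iters):
        # Assignment step on noisy copies of the data; the noise
        # standard deviation decays geometrically (annealing).
        Xn = X + rng.normal(0.0, noise_scale * decay**t, size=X.shape)
        d2 = ((Xn[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
        labels = d2.argmin(axis=1)
        # Update step: each centroid moves to the mean of its cluster
        # (computed on the clean data).
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels
```

With well-separated clusters the annealed noise perturbs early assignments, which can help the centroids escape poor initializations before the schedule cools to plain k-means.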

"Stochastic competitive learning behaves as a form of adaptive quantization": the adapted centroids distribute themselves in the data space so as to minimize the quantization error under the chosen distance metric.
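This adaptive-quantization view can be made concrete with a minimal winner-take-all sketch: on each sample, only the nearest ("winning") centroid moves toward it, and the quantization error is the mean squared distance from each sample to its nearest centroid. The learning-rate schedule and data here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def quantization_error(X, centroids):
    """Mean squared distance from each sample to its nearest centroid."""
    d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    return d2.min(axis=1).mean()

def competitive_learning(X, k, epochs=20, lr=0.1, seed=0):
    """Unsupervised winner-take-all competitive learning (sketch).

    The decaying learning rate is an arbitrary illustrative choice.
    """
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for epoch in range(epochs):
        eta = lr / (1 + epoch)  # decaying learning rate
        for x in X[rng.permutation(len(X))]:
            # Winner: centroid closest to the current sample.
            w = ((centroids - x) ** 2).sum(axis=1).argmin()
            # Only the winner moves toward the sample.
            centroids[w] += eta * (x - centroids[w])
    return centroids
```

Running this on clustered data drives the centroids toward the cluster means, which is exactly the configuration that minimizes the Euclidean quantization error.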