# Show Tag: empirical-bayes

Chen et al. presented a system that uses a SOM (self-organizing map) to cluster states. After learning, each SOM unit is extended with a histogram counting how often that unit was the BMU (best-matching unit) while the input belonged to each of a number of known states $$C=\{c_1,c_2,\dots,c_n\}$$.

The system is used in robot soccer. Each class is connected to an action. Actions are chosen by finding the BMU in the net and selecting the action connected to its most likely class.

In an unsupervised, online phase, these histograms are updated in a reinforcement-learning fashion: whenever the selected action leads to success, the bin of the BMU's histogram corresponding to its most likely class is incremented; otherwise it is decremented.
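The mechanism above can be sketched in a few lines. This is not code from Chen et al.'s paper; the class names, the step size, and the squared-Euclidean BMU metric are assumptions for illustration.

```python
# Sketch (assumptions, not the paper's implementation): a SOM unit carrying
# a per-class histogram, with BMU lookup, action selection, and the
# reinforcement-style histogram update described above.

class SOMUnit:
    def __init__(self, weights, classes):
        self.weights = weights                  # codebook vector of the unit
        self.hist = {c: 1.0 for c in classes}   # counts per known class

    def most_likely_class(self):
        return max(self.hist, key=self.hist.get)

def bmu(units, x):
    """Best-matching unit: smallest squared Euclidean distance to input x."""
    return min(units, key=lambda u: sum((w - xi) ** 2
                                        for w, xi in zip(u.weights, x)))

def select_action(units, x, action_of_class):
    """Choose the action connected to the BMU's most likely class."""
    return action_of_class[bmu(units, x).most_likely_class()]

def update_histogram(unit, success, step=1.0):
    """Increment (on success) or decrement the bin of the most likely class."""
    c = unit.most_likely_class()
    unit.hist[c] = max(unit.hist[c] + (step if success else -step), 0.0)
```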

Predictive coding can implement the EM algorithm.

Empirical Bayes methods estimate the prior from the data.

More formally, they choose some parametric form for the prior and estimate an optimal set of parameters $\theta_{opt}$ by optimization: $$\theta_{opt} = \operatorname{arg\,max}_\theta\prod_n\int P_\theta(x)\,P(m_n\mid x)\;dx,$$ for measurements $m_n$ and possible latent-variable values $x$.
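A minimal worked instance of this optimization, assuming a Gaussian prior $P_\theta(x)=\mathcal{N}(x;\mu,\tau^2)$ and a Gaussian likelihood $P(m_n\mid x)=\mathcal{N}(m_n;x,\sigma^2)$ with known noise $\sigma$: the integral then evaluates to $\mathcal{N}(m_n;\mu,\tau^2+\sigma^2)$, so the maximizing $\theta=(\mu,\tau^2)$ has a closed form and no numerical optimizer is needed.

```python
# Sketch: empirical Bayes for a Gaussian prior N(mu, tau^2) over x and a
# Gaussian likelihood N(m_n; x, sigma^2) with known sigma (an assumed model,
# chosen because the marginal likelihood is N(m_n; mu, tau^2 + sigma^2)).

def empirical_bayes_gaussian(measurements, sigma):
    n = len(measurements)
    mu = sum(measurements) / n                      # MLE of the prior mean
    var = sum((m - mu) ** 2 for m in measurements) / n
    tau2 = max(var - sigma ** 2, 0.0)               # prior explains the rest
    return mu, tau2
```

The `max(..., 0.0)` clamp handles the case where the sample variance is smaller than the known noise variance, in which case the best non-negative prior variance is zero.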

In predictive coding, a model iterates the following steps:

• assume values for latent variables,
• predict sensory input (through a generative model),
• observe prediction error,
• adapt assumptions to minimize the error.

The EM algorithm is an iterative procedure that solves a simplified version of the empirical Bayes problem.
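As a concrete instance, here is EM for a two-component 1-D Gaussian mixture with unit variances: the mixture weight and component means play the role of the prior parameters, and each iteration alternates between inferring posterior responsibilities (E-step) and re-maximizing the marginal likelihood (M-step). The unit-variance restriction and initialization scheme are simplifying assumptions.

```python
# Sketch: EM for a two-component 1-D Gaussian mixture with unit variances
# (a simplified, assumed setting; only the means mu1, mu2 and weight pi
# are estimated from the data).
import math

def em_gmm2(data, mu1, mu2, iters=50):
    pi = 0.5
    for _ in range(iters):
        # E-step: posterior responsibility of component 1 for each point
        r = []
        for m in data:
            p1 = pi * math.exp(-0.5 * (m - mu1) ** 2)
            p2 = (1 - pi) * math.exp(-0.5 * (m - mu2) ** 2)
            r.append(p1 / (p1 + p2))
        # M-step: re-estimate parameters from the responsibilities
        n1 = sum(r)
        mu1 = sum(ri * m for ri, m in zip(r, data)) / n1
        mu2 = sum((1 - ri) * m for ri, m in zip(r, data)) / (len(data) - n1)
        pi = n1 / len(data)
    return mu1, mu2, pi
```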

Friston's predictive coding model predicts a hierarchical cortical system.

A weakness of empirical Bayes is that the prior which explains the data best is "not necessarily the one that leads to the best estimator".