Show Tag: noise

Beck et al. reinterpret Osborne et al.'s experiments, arguing that it is more likely that sensory estimation in this task is suboptimal (thereby amplifying variability due to external and internal noise) than that the amounts of internal noise in the perceptual and motor systems are vastly different.

Osborne et al. modeled the performance of monkeys in a visual smooth pursuit task. According to their model, variability in this task is due mostly to estimation errors, not to motor errors.

Beck et al. acknowledge that the task in Osborne et al.'s experiments was highly artificial and that the brain circuits involved in smooth pursuit are probably optimized for more natural tasks.

Noise can improve convergence in clustering algorithms.
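
A minimal sketch of how this can work (the annealing schedule and all parameters are hypothetical, purely for illustration): adding Gaussian noise to the centroid updates of k-means, and decaying it to zero, can shake the centroids out of poor local optima before the algorithm settles.

```python
import numpy as np

def noisy_kmeans(X, k, iters=100, noise0=0.5, seed=0):
    """k-means with annealed Gaussian noise on the centroid updates.

    The noise schedule is hypothetical and illustrative: early
    perturbations can move centroids out of poor local optima, and
    the noise decays to zero so the algorithm still converges.
    """
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)].astype(float)
    labels = np.zeros(len(X), dtype=int)
    for t in range(iters):
        # assign each point to its nearest centroid
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # recompute centroids with noise that decays linearly to zero
        sigma = noise0 * (1.0 - t / iters)
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0) + rng.normal(0.0, sigma, X.shape[1])
    return centroids, labels

# hypothetical usage on three synthetic 2-D clusters
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(mu, 0.3, (50, 2)) for mu in ([0, 0], [3, 3], [0, 3])])
centroids, labels = noisy_kmeans(X, k=3)
print(centroids)
```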

If an MLP fails to approximate a certain function, this can be due to

  • an inadequate learning procedure,
  • an inadequate number of hidden units (not layers),
  • noise.

In principle, a three-layer feedforward network (i.e., one with a single hidden layer) should be capable of approximating any continuous function, per the universal approximation theorem.
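
As an illustration of this property, here is a minimal sketch (the hyperparameters are arbitrary choices, not from any source) of a network with one hidden layer of tanh units fitted to a 1-D function by gradient descent:

```python
import numpy as np

rng = np.random.default_rng(0)

# target: a smooth 1-D function on [-pi, pi]
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(X)

# one hidden layer of tanh units; "three layers" = input, hidden, output
H = 30
W1 = rng.normal(0.0, 1.0, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.1, (H, 1)); b2 = np.zeros(1)

lr = 0.01
for step in range(20000):
    h = np.tanh(X @ W1 + b1)           # hidden activations
    pred = h @ W2 + b2                 # linear output layer
    err = pred - y                     # d(0.5 * MSE) / d(pred)
    # backpropagation through the two weight layers
    dW2 = h.T @ err / len(X); db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h**2)
    dW1 = X.T @ dh / len(X); db1 = dh.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print("final MSE:", float((err**2).mean()))  # should be small
```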

Neurons' activities are most informative about the value of a stimulus property in the regions where their tuning functions are maximal or have maximal slope.

Which of the two regions is more informative depends on the variability (noise) of the neurons' responses.
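
This can be made precise with Fisher information: for a neuron whose mean response follows a tuning function $f(s)$, two standard noise models give

$$ J(s) = \frac{f'(s)^2}{\sigma^2} \quad \text{(additive Gaussian noise, fixed variance $\sigma^2$)}, \qquad J(s) = \frac{f'(s)^2}{f(s)} \quad \text{(Poisson noise)}. $$

Under both models the slope $f'(s)$ carries the information, but Poisson-like noise penalizes regions of high mean activity, so the most informative part of the tuning curve shifts with the noise model.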

Dávila-Chacón et al. show that the Liu et al. model of natural binaural sound source localization can be transferred to the Nao robot, where it shows significant resilience to noise.

Their system can localize sounds with a spatial resolution of 15 degrees.

The binaural sound source localization system based on the Liu et al. model does not on its own perform satisfactorily on the iCub because the robot's ego noise is greater than that of the Nao (~60 dB compared to ~40 dB).

Neural responses to the same stimulus are noisy.

Noise can be beneficial in learning.

Noisy neuronal responses can improve information transmission in populations (especially when neurons are threshold-like).
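
A minimal sketch of this stochastic-resonance-style effect (all parameters hypothetical): a subthreshold sine wave produces no output at all from noiseless threshold units, while moderate noise lets the population firing rate track the stimulus.

```python
import numpy as np

rng = np.random.default_rng(0)

# weak sinusoidal stimulus that stays below the firing threshold
t = np.linspace(0, 1, 1000)
signal = 0.3 * np.sin(2 * np.pi * 5 * t)   # amplitude 0.3 < threshold 0.5
theta = 0.5

def population_rate(noise_sd, n_neurons=500):
    """Fraction of threshold units active at each time step."""
    noise = rng.normal(0.0, noise_sd, (n_neurons, len(t)))
    spikes = (signal + noise) > theta
    return spikes.mean(axis=0)

for sd in [0.0, 0.2, 1.0]:
    r = population_rate(sd)
    # correlation between stimulus and population rate as a crude
    # readout of how much stimulus information gets through
    c = np.corrcoef(signal, r)[0, 1] if r.std() > 0 else 0.0
    print(f"noise sd={sd:.1f}  signal/rate correlation={c:.2f}")
```

With zero noise the correlation is zero (the signal never crosses threshold); with moderate noise the population rate follows the stimulus closely.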

Hunsberger et al. suggest that neural heterogeneity and response stochasticity both decorrelate and linearize population responses and thus improve transmission of information.

One function, or what Santangelo and Macaluso call "the key element", of selective attention is filtering out distractors, i.e., noise filtering.

GTM uses the EM algorithm to fit the adaptive parameters $\mathbf{W}$ and $\beta$ of a constrained mixture-of-Gaussians model to the data.

The constrained mixture-of-Gaussians model consists of a set $\{\mathbf{x}_i\}$ of points in latent space, which are mapped into data space via a generalized linear model $\mathbf{y}(\mathbf{x}; \mathbf{W}) = \mathbf{W}\phi(\mathbf{x})$, plus the inverse variance $\beta$ of the Gaussian noise model.
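
Concretely, following the standard GTM formulation, a data point $\mathbf{t} \in \mathbb{R}^D$ is modeled as an equally weighted mixture of $K$ Gaussians centered on the images of the latent points:

$$ p(\mathbf{t} \mid \mathbf{W}, \beta) = \frac{1}{K} \sum_{i=1}^{K} \left( \frac{\beta}{2\pi} \right)^{D/2} \exp\!\left( -\frac{\beta}{2} \left\lVert \mathbf{W}\phi(\mathbf{x}_i) - \mathbf{t} \right\rVert^2 \right). $$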

Simulations can lead researchers to postulate unrealistically reliable sensor data or actuation.

Noise in real experiments can make dynamics more stable.

Embodied robots confront the full complexity of sensing and acting in the real world, a complexity that is absent from simple models and simulations.

Beck et al. argue that sub-optimal computations in biological and artificial neural networks can amplify behavioral and perceptual variability caused by internal and external noise.

Beck et al. argue that sub-optimal computations are a greater cause of behavioral and perceptual variability than internal noise.

Optimal operations are often not feasible for complex tasks, for two reasons (illustrated below):

  • the generative models necessary for optimal estimation are too complex and require too much knowledge to construct,
  • applying these models is far too computationally intensive.
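
One way to state the second point: optimal estimation of a stimulus $s$ from responses $\mathbf{r}$ requires marginalizing the generative model over all nuisance variables $\boldsymbol{\eta}$,

$$ p(s \mid \mathbf{r}) \propto p(s) \int p(\mathbf{r} \mid s, \boldsymbol{\eta})\, p(\boldsymbol{\eta})\, d\boldsymbol{\eta}, $$

an integral (or sum) whose cost typically grows exponentially with the number of nuisance variables.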

There seems to be a linear relationship between the mean and the variance of neural responses in cortex. This resembles a Poisson distribution, in which the variance equals the mean; in biology, however, the proportionality constant does not seem to be one.
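
In symbols, with Fano factor $F$:

$$ \operatorname{Var}[r] \approx F\, \mathbb{E}[r], \qquad F = 1 \text{ for a Poisson process, } F \neq 1 \text{ in cortex in general.} $$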

The pure efficient coding hypothesis does not take into account noise, which may corrupt signals.

Bauer et al. present a SOM variant which learns the variance of different sensory modalities (assuming Gaussian noise) to model multi-sensory integration in the SC.

Bauer and Wermter present an ANN algorithm which takes from the self-organizing map (SOM) algorithm the ability to learn a latent variable model from its input. They extend the SOM algorithm so that it learns about the distribution of noise in the input and computes probability density functions over the latent variables. The algorithm represents these probability density functions using population codes, and it does so with very few a priori assumptions about the noise distribution.

Bauer and Wermter use their proposed algorithm to model multi-sensory integration in the SC. They show that it can learn to integrate noisy multi-sensory information near-optimally, and that it reproduces the spatial register of sensory maps, the spatial principle, the principle of inverse effectiveness, and near-optimal audio-visual integration in object localization.
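
For reference, the normative benchmark such models are compared against is reliability-weighted cue combination, i.e. maximum-likelihood integration of independent Gaussian cues. A minimal sketch with hypothetical numbers:

```python
import numpy as np

def integrate(estimates, variances):
    """Maximum-likelihood fusion of independent Gaussian cues:
    weight each cue by its inverse variance (its reliability)."""
    inv_var = 1.0 / np.asarray(variances, dtype=float)
    w = inv_var / inv_var.sum()
    fused = np.dot(w, estimates)
    fused_var = 1.0 / inv_var.sum()
    return fused, fused_var

# hypothetical auditory and visual location estimates (degrees)
fused, var = integrate(estimates=[12.0, 8.0], variances=[4.0, 1.0])
print(f"fused estimate = {fused:.2f} deg, variance = {var:.2f}")
```

With these numbers the more reliable visual cue dominates (fused estimate 8.8), and the fused variance (0.8) is lower than that of either cue alone, which is the signature of near-optimal integration.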