
G. Computational systems neuroscience



A sequential Monte Carlo approximation algorithm made it possible to apply this model to larger datasets. We have published these results in a proceedings article and presented them at multiple international and Hungarian conferences. The scaling up made possible by the approximation revealed that the selection of episodes from the contents of episodic memory to be integrated into the statistical model is critical. Consequently, we have developed a procedure for optimising this selection; however, it requires further analysis.
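To make the flavour of this approximation concrete, the following minimal sketch implements a generic sequential Monte Carlo (particle filter) update over candidate hypotheses as episodes arrive one at a time. The toy one-parameter "structure", the likelihood function and the resampling threshold are illustrative assumptions and do not reproduce the published algorithm.

```python
# Minimal sketch of a sequential Monte Carlo (particle filter) update over
# candidate model structures as new episodes arrive one by one.
# All names (likelihood, structure, ess_threshold) are illustrative
# placeholders, not the published algorithm.
import numpy as np

rng = np.random.default_rng(0)

def likelihood(structure, episode):
    """Placeholder: probability of an episode under a candidate structure.

    Here a 'structure' is just a Bernoulli rate; real structures would be
    richer (e.g. graphs over latent variables)."""
    return structure if episode == 1 else 1.0 - structure

def smc_update(particles, weights, episode, ess_threshold=0.5):
    """One SMC step: reweight particles by the new episode, resample if needed."""
    weights = weights * np.array([likelihood(p, episode) for p in particles])
    weights /= weights.sum()
    # Resample when the effective sample size collapses.
    ess = 1.0 / np.sum(weights ** 2)
    if ess < ess_threshold * len(particles):
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

# Usage: track a posterior over the toy 'structure' parameter from a stream of episodes.
particles = rng.uniform(0, 1, size=200)           # initial hypotheses
weights = np.full(200, 1.0 / 200)
for episode in rng.binomial(1, 0.8, size=50):     # simulated episode stream
    particles, weights = smc_update(particles, weights, episode)
print("posterior mean:", np.sum(weights * particles))
```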

It would be an important extension of our model to show that episodic memory enables structure learning in the more realistic situation where episodes are compressed in a non-invertible way. To model this compression, we propose an encoding in which the brain uses the latent state variables of a hierarchical probabilistic model of the environment, prioritising higher-level variables over lower-level details. This yields predictions that diverge from current accounts of episodic reconstruction and can be tested experimentally.
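A minimal sketch of the proposed encoding, under strong simplifying assumptions: an episode generated by a two-level Gaussian model is compressed by storing only its higher-level latent variable, and the discarded low-level details are re-generated from the model at recall, biasing reconstructions toward the gist. The model and variable names are illustrative only.

```python
# Hedged sketch of non-invertible compression in a hierarchical generative
# model: only the high-level latent is stored; details are regenerated at
# recall. The two-level Gaussian model is an illustrative stand-in.
import numpy as np

rng = np.random.default_rng(1)

def generate_episode():
    high = rng.normal(0.0, 1.0)             # high-level latent (the "gist")
    low = rng.normal(high, 0.3, size=10)    # low-level details conditioned on it
    return high, low

def encode(high, low):
    # Non-invertible compression: only the high-level variable is stored.
    return high

def recall(stored_high):
    # Reconstruction: details are re-sampled from the generative model,
    # so recalled episodes are systematically biased toward the gist.
    return rng.normal(stored_high, 0.3, size=10)

high, low = generate_episode()
reconstruction = recall(encode(high, low))
print("original details:     ", np.round(low, 2))
print("reconstructed details:", np.round(reconstruction, 2))
```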


Contribution of firing rate nonlinearity to optimal cortical computations. — Processing of visual information in the brain is performed in a hierarchical structure. The role of a cortical area that forms a level of the processing hierarchy can be phrased as the re-representation of information in a form that facilitates the recognition and classification of higher-level, more complex features. To achieve these complex representations the nervous system utilizes non-linear transformations at all levels. Previous studies have characterized this process in terms of the full extent of information available at each level. We instead put the emphasis on the structure of the information, considering easily decodable information to be the more relevant quantity. We quantify this by the linear decodability of information, since a linear decoder can plausibly be realized by a neuronal layer at subsequent levels of the hierarchy.
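The sketch below illustrates, on synthetic data, how linear decodability can be quantified in practice: a linear readout is fitted from population responses to a stimulus feature and its held-out performance is reported. The random tuning model and the least-squares readout are illustrative assumptions, not the analysis used in the study.

```python
# Quantifying 'linearly decodable information' on synthetic population data:
# fit a linear readout of a stimulus feature and report its held-out R^2.
import numpy as np

rng = np.random.default_rng(2)

# Synthetic population: 50 neurons with random linear tuning to a scalar stimulus.
n_neurons, n_trials = 50, 1000
stimulus = rng.uniform(-1, 1, size=n_trials)
tuning = rng.normal(0, 1, size=n_neurons)
responses = np.outer(stimulus, tuning) + rng.normal(0, 0.5, (n_trials, n_neurons))

# Split into train/test and fit a linear decoder by least squares.
train, test = slice(0, 800), slice(800, None)
w, *_ = np.linalg.lstsq(responses[train], stimulus[train], rcond=None)
prediction = responses[test] @ w

# Fraction of stimulus variance captured by the linear readout.
r2 = 1 - np.var(stimulus[test] - prediction) / np.var(stimulus[test])
print(f"linearly decodable variance (R^2): {r2:.2f}")
```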

Recently, several studies have shown that the structure and the statistical characteristics of the nervous system adapt to the statistics of the environment. The subject of our investigation is the adaptation of dynamics at the cellular level, which we examined through the effect of the so-called firing rate nonlinearity on the quality of information that can be decoded from a population of neurons. A critical feature of sensory neurons is their mixed sensitivity: a neuron is sensitive to multiple features of the stimulus, which in the case of V1 simple cells comprise orientation, phase, spatial frequency and contrast. These mixed sensitivities pose a critical challenge for decodability: when some of these parameters are unknown, areas downstream in the processing hierarchy need to integrate over them. We point out, however, that when such integration occurs, linear decoding becomes ineffective: orientation cannot be decoded while other parameters are unknown. In our investigations we have shown that in the absence of unknown parameters decoding from the membrane potential is effective and no nonlinearity is required. However, in the presence of any of these so-called nuisance parameters a nonlinearity is necessary for efficient decoding. At higher levels of the computational hierarchy the number of nuisance parameters grows, and therefore the importance of the nonlinearity becomes even more pronounced.
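The following toy simulation illustrates the argument: with the grating phase acting as an unknown nuisance parameter, a linear readout of membrane-potential-like responses is at chance for orientation, whereas the same readout applied after a rectifying-squaring nonlinearity recovers it. The simple-cell model and parameter values are assumptions made for illustration only.

```python
# Toy demonstration: linear decoding of orientation fails when phase is an
# unknown nuisance parameter, but succeeds after a firing-rate-like
# nonlinearity (rectification + squaring). Illustrative model, not the study's.
import numpy as np

rng = np.random.default_rng(3)
n_cells, n_trials, n_train = 100, 4000, 3000

pref_ori = rng.uniform(0, np.pi, n_cells)         # preferred orientations
pref_phase = rng.uniform(0, 2 * np.pi, n_cells)   # preferred phases

labels = rng.integers(0, 2, n_trials)             # two stimulus orientations
theta = labels * (np.pi / 2)                      # 0 or 90 degrees
phase = rng.uniform(0, 2 * np.pi, n_trials)       # unknown nuisance parameter

# Membrane potential of a simple cell: orientation tuning times phase tuning.
tuning = np.exp(2.0 * (np.cos(2 * (theta[:, None] - pref_ori)) - 1))
v = tuning * np.cos(phase[:, None] - pref_phase) + 0.1 * rng.normal(size=(n_trials, n_cells))

def linear_decoder_accuracy(x, y):
    """Fit a least-squares linear readout of the class label and test it held out."""
    w, *_ = np.linalg.lstsq(x[:n_train], 2.0 * y[:n_train] - 1.0, rcond=None)
    return np.mean((x[n_train:] @ w > 0) == (y[n_train:] == 1))

print("linear readout of membrane potential:", linear_decoder_accuracy(v, labels))
print("linear readout after nonlinearity:   ", linear_decoder_accuracy(np.maximum(v, 0) ** 2, labels))
```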

Beyond demonstrating the necessity of a nonlinearity in the processing, we have also shown quantitatively that the form of the nonlinearity implemented by cortical neurons is optimal for decoding. Based on the form of the firing rate nonlinearity, we derived a prediction for the optimal firing threshold of V1 neurons that could be contrasted with intracellular measurements from V1 simple cells, and we found a qualitative match between predicted and measured thresholds. Our results were presented as a poster at an international conference and in Hungary, as well as at a prestigious international conference in the US; a publication is in preparation. In the future, we plan to study the consistency between the shapes of nonlinearities at adjacent hierarchical levels on the basis of our normative linear decoding principle, and to extend our investigations to higher levels, with more realistic non-local decoding tasks connected to pattern recognition.

Disentangling learning-dependent and learning-independent processes in human implicit learning. — Investigating human learning and decision making in dynamical environments in a general setting could allow one to understand the common principles relating intuitive physics, natural language understanding and theory of mind. Higher-level representations in temporal domains could then be measured for each individual.

We contributed to developing and improving methods for inferring human representations.

Identifying the model forms individuals use during a learning task requires a large number of data points, since the relevant representations live in high-dimensional spaces. The generative process of behavioural responses is, however, highly confounded with learning-independent effects. We developed a method for segregating the variation in response time measurements related to such confounds from the variation induced by learning. Our analysis led us to conclude that the confounds may have a much larger effect on response times than learning itself, rendering filtering, or some other form of accounting for the confounds, essential for inference. We demonstrated that the method developed in this study can increase the predictive power of a learning-based model. Our work was presented at two international conferences and is now under review at a journal.
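As a rough illustration of the logic, the sketch below simulates log response times composed of a slow learning-independent trend and a small learning-related component, and shows that regressing out the slow trend before fitting a learning-signal regressor increases its apparent predictive power. The simulated data, the polynomial detrending and the variable names are illustrative assumptions, not the method of the study.

```python
# Separating slow, learning-independent trends in response times from the
# trial-by-trial variation a learning model tries to explain (illustrative only).
import numpy as np

rng = np.random.default_rng(4)
n_trials = 2000
t = np.arange(n_trials) / n_trials                     # normalised trial index

learning_signal = rng.normal(0, 1, n_trials)           # stand-in for a per-trial learning-model predictor
confound = 0.8 * np.exp(-4.0 * t)                      # slow, learning-independent speed-up
log_rt = 6.0 + confound + 0.1 * learning_signal + 0.2 * rng.normal(size=n_trials)

def fit_and_residual(x, y):
    """Least-squares fit of y on regressors x (with intercept); return fit and residuals."""
    X = np.column_stack([np.ones(len(y)), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ beta, y - X @ beta

def r2(x, y):
    """Fraction of variance in y explained by a linear fit on x."""
    _, resid = fit_and_residual(x, y)
    return 1 - np.var(resid) / np.var(y)

# Predictive power of the learning signal on the raw log response times...
r2_raw = r2(learning_signal, log_rt)

# ...and after regressing out a slow polynomial trend over trials (a proxy for the confounds).
trend = np.column_stack([t, t**2, t**3])
_, detrended = fit_and_residual(trend, log_rt)
r2_clean = r2(learning_signal, detrended)

print(f"R^2 of the learning signal: raw {r2_raw:.3f}, after detrending {r2_clean:.3f}")
```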

Grants

“Momentum” Program of the HAS (G. Orbán, 2012-2017)

NAP-B National Programme for Brain Research (G. Orbán, 2015-)

International cooperation

University of Cambridge (Cambridge, UK), M. Lengyel

University of California, Los Angeles (Los Angeles, CA, USA), P. Golshani

Columbia University (New York, NY, USA), A. Losonczy

Central European University (Budapest), J. Fiser

Ernst Strüngmann Institute (Frankfurt, Germany), W. Singer, A. Lazar

Publications

Article

1. Orbán G, Berkes P, Fiser J, Lengyel M: Neural variability and sampling-based probabilistic representations in the visual cortex. NEURON 92(2): 530-543 (2016)

Conference proceeding

2. Nagy DG, Orbán G: Episodic memory as a prerequisite for online updates of model structure. In: Proc. 38th Annual Meeting of the Cognitive Science Society, Philadelphia, Pennsylvania, USA, 10-08-2016 – 13-08-2016. Eds.: Papafragou A, Grodner D, Mirman D, Trueswell J, Cognitive Science Society, 2016. pp. 2699-2704. (ISBN:978-0-9911967-3-9)

See also: R-E.12

