Department of Data-driven Analysis of Biological Networks (Wibral)
I am a physicist working on neuroscience puzzles using information-theoretic methods. I am fascinated by information processing in the neocortex, where one highly conserved circuit scheme seems to support an enormous variety of functions: signal detection, object perception, various forms of short- and long-term memory, executive functions, and motor control. To understand how one circuit (perhaps with small variations) can support all these apparently different functions, we have to understand their hidden commonalities, that is, the underlying common algorithm that operates on neural data (which fortunately have the unified format of spiking neural activity).
The language describing such a common algorithm must almost necessarily be more abstract than the language used to talk about individual functions like object detection in vision, or remembering an episode from the organism's past. I think that information theory provides the necessary level of abstraction to tackle the puzzle of cortical circuit function.
In particular, the framework of local information dynamics developed by Lizier is a promising approach here. Local information dynamics provides the tools to measure the essential component operations on information: the storage, transfer, and modification of information. In principle, these components can be measured locally in space and time while a computation unfolds.
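To make the idea of a local, time-resolved measure concrete, here is a minimal plug-in estimator sketch for local active information storage on a discrete toy time series. This is an illustrative example only, not the group's actual analysis pipeline; the function name, the history length `k`, and the periodic toy series are all assumptions made for the sketch.

```python
from collections import Counter
from math import log2

def local_active_information_storage(x, k=2):
    """Plug-in estimate of local active information storage,
    a(n) = log2( p(x_n | x_{n-k..n-1}) / p(x_n) ),
    for each time step n >= k of a discrete series x."""
    past_next = Counter()   # counts of (past window, next symbol) pairs
    past = Counter()        # counts of past windows
    nxt = Counter()         # counts of next symbols
    samples = []
    for i in range(k, len(x)):
        window = tuple(x[i - k:i])
        samples.append((window, x[i]))
        past_next[(window, x[i])] += 1
        past[window] += 1
        nxt[x[i]] += 1
    total = len(samples)
    # One local value per time step: log2( p(next|past) / p(next) ).
    return [
        log2((past_next[(w, s)] / past[w]) / (nxt[s] / total))
        for (w, s) in samples
    ]

# A perfectly periodic series is maximally predictable from its past,
# so the average local storage approaches 1 bit for this binary series.
x = [0, 1] * 50
ais = local_active_information_storage(x, k=1)
print(sum(ais) / len(ais))
```

Note that the average of the local values recovers the usual (average) active information storage; the local values themselves can also be negative at single time steps, flagging moments where the past is misinformative about the next observation.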
We have successfully used measures of local information dynamics, such as local active information storage and local transfer entropy, to test predictive coding theories, to investigate altered information processing in schizophrenia and autism, and to obtain a better understanding of the information processing changes underlying the loss of consciousness in anesthesia.
At present, my group works on more advanced information measures that are tailored to detect predictive processing in a hypothesis-free way from neural data, on novel types of cortex-inspired deep neural networks with local information-theoretic goal functions, and on the development and application of partial information decomposition to neural data.
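As a toy illustration of what partial information decomposition quantifies, the sketch below computes the classic decomposition of a binary XOR target into redundant, unique, and synergistic information, using the I_min redundancy measure of Williams and Beer. This is only one of several proposed redundancy measures, and the example is a self-contained assumption-laden sketch rather than the measure the group necessarily uses on neural data.

```python
from itertools import product
from math import log2

# Joint distribution p(x1, x2, s) for s = x1 XOR x2 with uniform inputs.
p = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product([0, 1], repeat=2)}

def marginal(dist, axes):
    out = {}
    for k, v in dist.items():
        key = tuple(k[a] for a in axes)
        out[key] = out.get(key, 0.0) + v
    return out

def mutual_info(dist, axes_x, axes_s):
    px, ps = marginal(dist, axes_x), marginal(dist, axes_s)
    pxs = marginal(dist, axes_x + axes_s)
    mi = 0.0
    for k, v in pxs.items():
        if v > 0:
            x, s = k[:len(axes_x)], k[len(axes_x):]
            mi += v * log2(v / (px[x] * ps[s]))
    return mi

def i_min(dist, sources, target_axes):
    """Williams-Beer redundancy: sum_s p(s) * min_i I_spec(s; X_i)."""
    ps = marginal(dist, target_axes)
    red = 0.0
    for s, p_s in ps.items():
        specs = []
        for axes_x in sources:
            px = marginal(dist, axes_x)
            pxs = marginal(dist, axes_x + target_axes)
            spec = 0.0
            for x, p_x in px.items():
                p_joint = pxs.get(x + s, 0.0)
                if p_joint > 0:
                    # specific information source X_i carries about s
                    spec += (p_joint / p_s) * (log2(p_joint / p_x) - log2(p_s))
            specs.append(spec)
        red += p_s * min(specs)
    return red

# PID terms for XOR: neither source alone tells us anything about s,
# yet together they determine it fully, so all information is synergy.
red = i_min(p, sources=[(0,), (1,)], target_axes=(2,))
i1 = mutual_info(p, (0,), (2,))
i2 = mutual_info(p, (1,), (2,))
i12 = mutual_info(p, (0, 1), (2,))
syn = i12 - i1 - i2 + red
print(red, i1 - red, i2 - red, syn)  # 0.0 0.0 0.0 1.0
```

The XOR case shows why a decomposition beyond classical mutual information is needed: pairwise mutual informations are all zero, yet the joint mutual information is a full bit, and PID attributes that bit entirely to synergy.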