Welcome to the visual processing lab

We aim to understand how neural networks perform critical computations. In sensory systems, a variety of computations extract information from the environment to guide behavior. Our understanding of these processes remains fragmentary: in some systems, specific neurons have been identified that respond to distinct sensory cues; in others, specific behavioral outputs are known, or computational models exist that predict physiology or behavior. We seek a complete understanding of how neurons acquire specific physiological properties, how they are organized into circuits and how these circuits guide distinct behaviors.

Animals ranging from insects to humans use visual information, especially motion cues, to navigate through the environment, capture prey, or escape predators. Because motion vision requires circuits to integrate visual information over both space and time, it has long been considered a paradigmatic computation for understanding brain function, and models that describe how motion information can be extracted have existed for more than 50 years. However, the neural circuits that implement these models are still incompletely understood. Moreover, many molecular and cellular mechanisms are known to regulate synaptic activity or modulate cellular properties in identified neurons, but they have only rarely been linked to specific, behaviorally relevant computations.

My lab aims to forge this link by studying motion detection in a genetic model organism, the fruit fly Drosophila, in which motion-guided behaviors have been studied in detail and described computationally. We use cell biological and genetic approaches to manipulate critical neurons in motion-detecting circuits. In combination with physiology and quantitative behavioral analysis, we hope to identify the mechanisms by which a nervous system integrates molecular, cellular and circuit processes to compute behaviorally critical outputs from specific inputs.

Our work is supported by:
