The goal of my research has been to investigate the neural mechanisms underlying learning and decision-making, and their disturbance in addiction, aging, and schizophrenia. Specifically, I record single-unit activity from various brain regions as rats perform a variety of cognitive tasks (e.g., reversal, delay discounting, stop-signal, set-shifting, conflict) and evaluate loss of function after pharmacological manipulation. I completed my dissertation work in the lab of Dr. Carl Olson in the Department of Neuroscience (CNUP) and the Center for the Neural Basis of Cognition (CNBC) at the University of Pittsburgh. There, I recorded from single neurons in several areas of primate frontal and medial cortex during performance of reward-based saccade tasks. After graduating, I accepted a position as a post-doctoral fellow on the Cellular and Integrative Neuroscience Post-Doctoral Training Grant at the University of Maryland Medical School under the advisement of Dr. Geoff Schoenbaum. There I continued my work on issues pertaining to value-guided decision-making; most of these studies used a novel choice task that I developed in Dr. Schoenbaum's lab. During my time at the School of Medicine, I complemented my skills as a behavioral electrophysiologist with the ability to conduct lesion/inactivation experiments in order to behaviorally validate neural correlates. I also acquired the skills necessary to examine the impact that cocaine, schizophrenia-like brain alterations (the NVHL model), and aging have on behavior, the firing of single units, and local field potentials. I have continued this work as an assistant and associate professor at the University of Maryland, College Park, as part of the Department of Psychology and the Program in Neuroscience and Cognitive Science. Recently, I have adopted fast-scan cyclic voltammetry (FSCV) in the lab under the guidance of Joe Cheer as part of a seed grant awarded to us by the University of Maryland.
Contribution to Science
In general, my research has focused on circuits critical for reinforcement learning, reward-guided decision-making, and executive control, and how these circuits are disrupted in animal models of addiction, schizophrenia, and aging. Improving our understanding of the neural mechanisms underlying these functions has provided a better working knowledge of how we learn and behave normally, as well as what changes with mental illness and age. My contributions to the field are broken down into the following six sections. I have 60 publications, 4,035 citations, and an h-index of 32 (Google Scholar).
1. Encoding of Value and Motivation: I began my research career focusing on questions related to understanding what is genuinely signaled by neurons that fire strongly to environmental stimuli predicting more valued reward. At the time, several brain areas had been described as encoding the economic value of cues that predicted reward. These signals were thought to be important in guiding decisions based on anticipated outcomes. We contributed to this field by describing activity in several areas of frontal cortex that increased firing when monkeys were trained to expect a high- versus low-value reward (Roesch et al., 2003; 2005a,b; 2007). Surprisingly, we found that this value encoding (higher firing when the monkey was working for a better reward) was more prominent in areas strongly affiliated with the motor system (e.g., premotor cortex) than in areas more strongly affiliated with the limbic/reward system (e.g., orbitofrontal cortex). This observation made us realize that "value encoding" might not genuinely reflect the value of the expected reward, but rather the motivated effort that the animal puts forth while working for more valued rewards. Indeed, monkeys performed the task faster and with higher accuracy when promised a more valued reward. To resolve this issue, we designed a novel task in which we manipulated motivation independently of reward value (Roesch et al., 2004; 2007). Consistent with our hypothesis, we found that reward-related activity in limbic and motor regions reflected two extremes of a continuum: one in orbitofrontal cortex that represented the value of the predicted reward, and another, manifested in premotor cortex, that reflected the degree of motivation driven by the value of the reward. These results have called into question what information is actually encoded in many other brain areas reported to exhibit reward-related activity.
This work continues to be an important focus of my research today (e.g., Bissonette et al., 2013; 2014).
Roesch, M.R. & Olson, C.R. (2004) Neuronal activity related to reward value and motivation in primate frontal cortex. Science 304(5668):307-10.
Roesch, M.R. & Olson, C.R. (2007) Neural activity related to anticipated reward: Does it represent value or reflect motivation? Ann N Y Acad Sci. Invited Review. [See this article for a review of my primate work]
Bissonette, G.B., Burton, A.C., Gentry, R.N., Goldstein, B.L., Hearn, T.N., Barnett, B.R., Kashtelyan, V. & Roesch, M.R. (2013) Separate populations of neurons in ventral striatum encode value and motivation. PLoS One 8:e64673.
Bissonette, G.B., Gentry, R.N., Padmala, S., Pessoa, L. & Roesch, M.R. (2014) Impact of appetitive and aversive outcomes on brain responses: linking the animal and human literatures. Front Syst Neurosci 8:24.
2. Reward Predictions, Prediction Errors, and Attention for Learning: Since the beginning of my career I have carefully dissected encoding related to reward-guided decision-making and reinforcement learning across several brain areas. I developed a novel odor-guided decision-making task that varies the value of reward by making the reward delivered at the end of each behavioral trial larger or more immediate across several trial blocks. This task has allowed us to address several issues related to the encoding of reward prediction errors, reward predictions, attention, and the different types of associative encoding that govern choice and instrumental behavior. The paradigm has been very fruitful: to date, it has been the topic of over 25 publications. With it, we have shown that orbitofrontal cortex and ventral striatum signal reward expectancies, and that dopamine neurons and neurons in basolateral amygdala signal signed and unsigned errors in reward prediction, respectively. Error signaling by dopamine neurons is dependent on input from orbitofrontal cortex and appears to represent the theoretical teaching signal put forth by Rescorla and Wagner. Signals in basolateral amygdala, on the other hand, correspond to the attentional signals proposed by Pearce and Hall, increasing firing to both unexpected reward delivery and unexpected reward omission. We suspect that these signals inform mechanisms in anterior cingulate cortex that increase attention on trials after reward contingencies are violated so that learning can occur. Interestingly, recent work suggests that these two signals influence each other rather than guiding behavior through independent parallel pathways.
Roesch, M.R., Taylor, A.R. & Schoenbaum, G. (2006) Encoding of time-discounted rewards in orbitofrontal cortex is independent of value representation. Neuron 51:509-20.
Roesch, M.R.*, Calu, D.J.* & Schoenbaum, G. (2007) Dopamine neurons in rat ventral tegmental area encode the more valuable option when deciding between differently sized and delayed rewards. Nat Neurosci 10(12):1615-24.
Roesch, M.R.*, Takahashi, Y.K.*, Stalnaker, T.A., Haney, R.Z., Calu, D.J., Taylor, A.R., Burke, K.A. & Schoenbaum, G. (2009) The orbitofrontal cortex is necessary for learning! Neuron 62(2):269-80.
Roesch, M.R., Esber, G.R., Li, J., Daw, N.D. & Schoenbaum, G. (2012) Surprise! Neural correlates of Pearce-Hall and Rescorla-Wagner coexist in the brain. Eur J Neurosci 35(7):1190-200. Invited Review.
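The distinction between these two learning signals can be made concrete with a toy update rule. The sketch below is purely illustrative (it is not code from any of these studies, and the learning-rate parameters are arbitrary): the Rescorla-Wagner error is signed, like dopamine firing, while the Pearce-Hall quantity tracks the absolute (unsigned) error, like the attentional signal in basolateral amygdala.

```python
def rescorla_wagner(value, reward, alpha=0.1):
    """Signed prediction error (dopamine-like): positive for unexpected
    reward, negative for unexpected omission; drives the value update."""
    delta = reward - value              # signed error
    return value + alpha * delta, delta

def pearce_hall(attention, delta, gamma=0.3):
    """Unsigned error (amygdala-like attention signal): increases after
    any surprise, whether the outcome was better or worse than expected."""
    return (1 - gamma) * attention + gamma * abs(delta)

# Toy simulation: a cue is rewarded three times, then reward is omitted.
value, attention = 0.0, 0.0
for reward in [1, 1, 1, 0]:
    value, delta = rescorla_wagner(value, reward)
    attention = pearce_hall(attention, delta)
# On the omission trial, delta is negative (signed, dopamine-like),
# while attention stays elevated (unsigned, amygdala-like).
```

The key point of the sketch is that the same surprising omission produces opposite-signed dopamine-like errors but a positive attention signal, mirroring the dissociation described above.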
3. Animal Models of Addiction, Schizophrenia, and Aging: When I began working with animal models of addiction, it was known that addicts and drug-exposed animals had decision-making deficits in reversal-learning tasks and in more complex 'gambling' variants that require flexible behavior. We furthered this field by showing that cocaine-exposed rats were hypersensitive to changes in expected reward size and delay to reward (Roesch et al., 2007). Although these behavioral deficits were well known, it was still unclear which neural signals were affected. Across several publications, we showed that chronic cocaine exposure impacted several nodes in the circuit critical for these behaviors. We demonstrated that flexible encoding of outcomes during decision-making in orbitofrontal cortex and basolateral amygdala was impaired after drug exposure. This miscoding was critical to the expression of the reversal-learning deficit, as demonstrated by 'fixing' the impaired behavior of cocaine-treated animals with lesions of the amygdala. In a separate experiment, we showed that cocaine exposure also reduced the degree and flexibility of cue-evoked firing in ventral striatum while enhancing cue-evoked firing in dorsal striatum, consistent with the idea that long-term drug exposure makes behavior more habit-like. Most recently, we have shown that lesions of ventral striatum can also enhance stimulus-response encoding in dorsal striatum. Together, these results provide neurophysiological evidence that cocaine exposure causes behaviorally relevant changes in the processing of associative information, which are critical for the decline in behavioral flexibility observed after chronic drug exposure.
In addition to addiction, we have examined how aging, schizophrenia, and ADHD might disrupt activity in the circuit described above. We have shown that aging reduces attention-for-learning signals and reward-related signals in amygdala (Roesch, 2012) and orbitofrontal cortex (Roesch, 2012), and that prefrontal (Gruber, 2010) and amygdala (Roesch, 2015) function are disrupted in an animal model of schizophrenia (NVHL). We have also started to examine the impact that prenatal nicotine exposure (a model of ADHD) has on neural circuits critical for impulse control, learning, and attention. To this end, we have developed several new behavioral paradigms, including attentional set-shifting and stop-signal paradigms, which are commonly used in human work and are disrupted in many psychiatric disorders.
Schoenbaum, G., Roesch, M.R. & Stalnaker, T.A. (2006) Orbitofrontal cortex, decision-making and drug addiction. Trends Neurosci 29:116-24. Invited Review.
Roesch, M.R., Takahashi, Y., Gugsa, N., Bissonette, G.B. & Schoenbaum, G. (2007) Previous cocaine exposure makes rats hypersensitive to both delay and reward magnitude. J Neurosci 27:245-50.
Stalnaker, T.A., Takahashi, Y., Roesch, M.R. & Schoenbaum, G. (2008) Neural substrates of cognitive inflexibility after chronic cocaine exposure. Neuropharmacology. [See this article for a review of my addiction work]
Burton, A.C., Bissonette, G.B., Lichtenberg, N.T., Kashtelyan, V. & Roesch, M.R. (2014) Ventral striatum lesions enhance stimulus and response encoding in dorsal striatum. Biol Psychiatry.
4. Conflict and Response Inhibition: In addition to reward-related functions, I have an interest in how the brain detects competition between two competing behavioral actions (conflict; cognitive control) and how unwanted behavior is inhibited. My first contribution to this field came in graduate school, where I published a paper with Kae Nakamura and Carl Olson showing that signals in anterior cingulate cortex (ACC) and the supplementary eye field appear to reflect the manifestation of conflict (i.e., competing directional signals) rather than conflict monitoring. I continued examining this issue with Geoff Schoenbaum as a post-doc, writing several papers arguing that orbitofrontal cortex (OFC) does not encode conflict or response inhibition, as suggested by many in the field. As PI of my own lab, I have continued to address this question head on, showing that ACC and basolateral amygdala (ABL) are modulated by errors and attention during learning. Further, we have recently developed a stop-signal task to characterize firing in dorsal striatum (DS), OFC, and medial prefrontal cortex (mPFC). We have shown that activity in DS reflects the manifestation of conflict and the miscoding of directional signals, that OFC signals conflict adaptation (i.e., increased executive control under heightened conflict), and that mPFC monitors the degree of conflict after decisions are made. Lastly, we have shown that prenatal nicotine exposure makes rats impulsive and disrupts the firing of mPFC neurons during performance of this task.
Bryden, D.W., Johnson, E.E., Tobia, S.C., Kashtelyan, V. & Roesch, M.R. (2011) Attention for learning signals in anterior cingulate cortex. J Neurosci 31(50):18266-74.
Bryden, D.W., Burton, A.C., Kashtelyan, V., Barnett, B.R. & Roesch, M.R. (2012) Response inhibition signals and miscoding of direction in dorsomedial striatum. Front Integr Neurosci 6:69.
Bryden, D.W. & Roesch, M.R. (2015) Executive control signals in orbitofrontal cortex during response inhibition. J Neurosci 35(16):6394-6400.
Bryden, D.W., Burton, A.C., Barnett, B.R., Cohen, V.J., Hearn, T.N., Jones, E.A., Kariyil, R.J., Kunin, A., In Kwak, S., Lee, J., Lubinski, B.L., Rao, G.K., Zhan, A. & Roesch, M.R. (2016) Prenatal nicotine exposure impairs executive control signals in medial prefrontal cortex. Neuropsychopharmacology 41(3):716-25.
5. Social Recognition of Reward and Distress: Understanding how the well-being of others affects our behavior is critical, but vastly understudied. Several disorders are characterized by an inability to recognize others' mental states (e.g., psychopathy, autism). Human imaging work has provided clues regarding the core structures involved in this process, including the amygdala, striatum, and orbitofrontal cortex; however, detailed work in animals at the single-unit and neurotransmitter level has not yet occurred. The computational basis of this process is critical for understanding the fundamental mechanisms that are necessary and sufficient for these behaviors. Given the current absence of work modeling these mechanisms in animals at the neuronal level, we have started to address this issue in the lab. Our first contribution to this field was to show for the first time that dopamine (DA) release is modulated by delivery of reward to a conspecific. Our data show that animals display a mixture of affective states during observation of conspecific reward, first exhibiting increases in appetitive ultrasonic vocalizations (USVs; 50 kHz calls), then exhibiting increases in aversive calls (22 kHz). Like USVs, DA signals were modulated by delivery of reward to the conspecific, mapping onto the emotional state associated with the conspecific receiving reward. Our results demonstrate that the positive and negative states associated with conspecific reward delivery modulate DA signals related to learning in social situations.
Kashtelyan, V., Lichtenberg, N.T., Chen, M.L., Cheer, J.F. & Roesch, M.R. (2014) Observation of reward delivery to a conspecific modulates dopamine release in ventral striatum. Curr Biol 24(21):2564-2568.
6. Neurophysiology of Rule Switching in the Corticostriatal Circuit: The ability to adjust behavioral responses to cues in a changing environment is crucial for survival. Activity in medial prefrontal cortex (mPFC) is thought both to represent the rules that guide behavior and to detect and resolve conflicts between rules when contingencies change. While lesion and pharmacological studies have supported a crucial role for mPFC in this type of set-shifting, how mPFC represents current rules or detects and resolves conflict between them remains unclear. Meanwhile, medial dorsal striatum (mDS) receives major projections from mPFC, and its neural activity is closely linked to action selection, making mDS a potential major player in enacting rule-guided action policies. However, exactly what is signaled by mPFC, and how this impacts neural signals in mDS, is not well known. Using a set-shifting task that we developed, we have shown that inactivation of mDS impairs the ability to shift to a new rule and increases the number of regressive errors. While recording in mDS, we identified neurons modulated by direction whose activity reflected the conflict between competing rule information. We also showed that a subset of these neurons were rule-selective, and that the conflict between competing rule cues was resolved as behavioral performance improved. Other mDS neurons were modulated by rule, but not direction; these neurons became selective before behavioral performance accurately reflected the current rule. In mPFC, we have shown that the activity of single neurons represents distinct rules. Further, we showed increased firing on high-conflict trials in a separate population of mPFC neurons, and reduced firing in both populations was associated with poor performance.
Moreover, activity in both populations increased firing during the outcome epoch when reward was delivered on correct trials and decreased firing when reward was omitted on incorrect trials. In addition, outcome firing was modulated by the current rule and by the degree of conflict associated with the previous decision. These results promote a greater understanding of the roles that mPFC and mDS play in switching between rules.
Bissonette, G.B. & Roesch, M.R. (2016) Neurophysiology of rule switching in the corticostriatal circuit. Neuroscience. 2016 Feb 3. pii: S0306-4522(16)00105-6.
Bissonette, G.B. & Roesch, M.R. (2015) Neural correlates of rules and conflict in medial prefrontal cortex during decision and feedback epochs. Front Behav Neurosci 9:266.
Bissonette, G.B. & Roesch, M.R. (2015) Rule encoding in dorsal striatum impacts action selection. Eur J Neurosci 42(8):2555-67.
Bissonette, G.B., Powell, E.M. & Roesch, M.R. (2013) Neural structures underlying set-shifting: roles of medial prefrontal cortex and anterior cingulate cortex. Behav Brain Res 250:91-101.
If you are interested in joining the lab please contact Matt Roesch at mroesch [at] umd.edu.
Please visit our website at...
- Cognitive and Neural Systems (CNS)
PhD, Neuroscience, Department of Neuroscience and the Center for the Neural Basis of Cognition, University of Pittsburgh