- 1 Research Clinic on Gambling Disorders, Aarhus University Hospital, Aarhus, Denmark
- 2 Center of Functionally Integrative Neuroscience, Aarhus University, Aarhus, Denmark
- 3 Division on Addiction, Cambridge Health Alliance, Cambridge, MA, USA
- 4 Department of Psychiatry, Harvard Medical School, Harvard University, Cambridge, MA, USA
Gambling disorder is characterized by persistent and recurrent maladaptive gambling behavior, which leads to clinically significant impairment or distress. The disorder is associated with dysfunctions in the dopamine system, which codes reward anticipation and outcome evaluation. Reward anticipation refers to dopaminergic activation prior to reward, while outcome evaluation refers to dopaminergic activation after the reward. This article reviews evidence of dopaminergic dysfunctions in reward anticipation and outcome evaluation in gambling disorder from two vantage points: a model of reward prediction and reward prediction error by Wolfram Schultz et al. and a model of “wanting” and “liking” by Terry E. Robinson and Kent C. Berridge. Both models offer important insights into the study of dopaminergic dysfunctions in addiction, and implications for the study of dopaminergic dysfunctions in gambling disorder are suggested.
Neurobiological Underpinnings of Reward Anticipation and Outcome Evaluation in Gambling Disorder
Gambling disorder is characterized by persistent and recurrent maladaptive gambling behavior, which leads to clinically significant impairment or distress (American Psychiatric Association [DSM 5], 2013). Gambling disorder was recently reclassified from “pathological gambling” (an impulse control disorder) to a “behavioral addiction” grouped with the substance-related and addictive disorders, a change that emphasizes the association between gambling disorder and other types of addiction.
Gambling disorder is associated with dysfunctions in the dopamine system. The dopamine system is sensitive to behavioral stimulation related to monetary reward, particularly in the ventral striatum (Koepp et al., 1998; Delgado et al., 2000; Breiter et al., 2001; de la Fuente-Fernández et al., 2002; Zald et al., 2004). Dopaminergic dysfunctions in the ventral striatum are linked to gambling disorder (Reuter et al., 2005; Abler et al., 2006; Linnet et al., 2010, 2011a,b, 2012; van Holst et al., 2012; Linnet, 2013).
The dopamine system codes reward anticipation and outcome evaluation. Reward anticipation refers to dopaminergic activation prior to reward, while outcome evaluation refers to dopaminergic activation after the reward. This article reviews evidence on dopaminergic dysfunctions in reward anticipation and outcome evaluation in gambling disorder from two vantage points: a model of reward prediction and reward prediction error by Schultz et al. (Fiorillo et al., 2003; Schultz, 2006; Tobler et al., 2007; Schultz et al., 2008), and a model of “wanting” and “liking” by Robinson and Berridge (Robinson and Berridge, 1993, 2000, 2003, 2008; Berridge and Aldridge, 2008; Berridge et al., 2009). It is suggested that gambling disorder may provide a “model disorder” of addiction for the two approaches, because it is not confounded by the ingestion of exogenous substances.
The ventral striatum and the nucleus accumbens (NAcc) play a central role in both models, which is consistent with findings of dopamine dysfunctions in the ventral striatum in gambling disorder. Therefore, this review focuses on the ventral striatum in relation to gambling disorder. Other relevant areas include the prefrontal cortex (e.g., the orbitofrontal cortex) and other parts of the basal ganglia (e.g., the putamen and the caudate nucleus).
Reward Prediction and Reward Prediction Error
Reward prediction refers to the anticipation of reward, while reward prediction error refers to outcome evaluation. Reward prediction and reward prediction error are associated with learning the reward properties of stimuli. According to Wolfram Schultz (2006), reward prediction and reward prediction error derive from Kamin’s blocking rule (Kamin, 1969), which suggests that a reward that is fully predicted does not contribute to learning: a stimulus that can be entirely predicted contains no new information, and the reward prediction error is therefore zero. Rescorla and Wagner formalized this principle in the Rescorla-Wagner learning rule (Rescorla and Wagner, 1972), which states that learning slows progressively as the reinforcer becomes more fully predicted.
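As a concrete illustration, the Rescorla-Wagner rule updates the associative strength of a stimulus in proportion to the prediction error between the reward received and the reward predicted. The minimal sketch below (with an illustrative learning rate and reward magnitude, not parameters from the cited studies) shows how the prediction error, and hence learning, shrinks toward zero as the reward becomes fully predicted:

```python
# Minimal sketch of the Rescorla-Wagner learning rule (illustrative values).
# The associative strength V of a stimulus is updated in proportion to the
# prediction error (reward received minus reward predicted), so learning
# slows as the reward becomes better predicted and stops once it is fully
# predicted (Kamin's blocking rule: zero prediction error, no new learning).

def rescorla_wagner_update(v, reward, learning_rate=0.3):
    """One learning trial; returns the updated strength and the prediction error."""
    prediction_error = reward - v
    return v + learning_rate * prediction_error, prediction_error

v = 0.0  # initial associative strength of the stimulus
for trial in range(1, 11):
    v, error = rescorla_wagner_update(v, reward=1.0)
    print(f"trial {trial:2d}: prediction error = {error:.3f}, V = {v:.3f}")
```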
In random binary outcome conditions (e.g., reward vs. no reward), the expected value (EV) is the average value that can be expected from a given stimulus, and it is a linear function of reward probability. In contrast, uncertainty, which can be defined as the variance (σ²) of the probability distribution (Schultz et al., 2008), is the mean squared deviation from the EV and follows an inverted U-shaped function of reward probability, peaking at P = 0.5. Midbrain and striatal dopamine coding of EV and uncertainty follows linear and quadratic functions of reward prediction that resemble these mathematical expressions (Fiorillo et al., 2003; Preuschoff et al., 2006; Schultz, 2006). The dopamine system also codes deviations of the outcome from the reward prediction, i.e., the reward prediction error: “…dopamine neurons emit a positive signal (activation) when an appetitive event is better than predicted, no signal (no change in activity) when an appetitive event occurs as predicted, and a negative signal (decreased activity) when an appetitive event is worse than predicted…[and] dopamine neurons show bidirectional coding of reward prediction errors, following the equation Dopamine response = Reward occurred − Reward predicted” (Schultz, 2006, pp. 99–100).
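These relationships can be written out for a simple binary gamble. The sketch below (a minimal illustration with an arbitrary reward magnitude, not data from the cited studies) computes the EV, the variance and the bidirectional reward prediction error for the reward probabilities used by Fiorillo et al. (2003), showing that EV rises linearly with probability, that uncertainty peaks at P = 0.5, and that the positive prediction error on rewarded trials is largest when the reward was least expected:

```python
# Expected value (EV), uncertainty (variance) and reward prediction error (RPE)
# for a binary gamble that pays `magnitude` with probability p and nothing
# otherwise. The magnitude of 1.0 is an arbitrary illustrative choice.

MAGNITUDE = 1.0

def expected_value(p, magnitude=MAGNITUDE):
    return p * magnitude                                   # linear in p

def variance(p, magnitude=MAGNITUDE):
    ev = expected_value(p, magnitude)
    # mean squared deviation from EV; simplifies to p * (1 - p) * magnitude**2,
    # an inverted U that peaks at p = 0.5 and is zero at p = 0 and p = 1
    return p * (magnitude - ev) ** 2 + (1 - p) * (0.0 - ev) ** 2

def prediction_error(reward_occurred, p, magnitude=MAGNITUDE):
    return reward_occurred - expected_value(p, magnitude)  # reward occurred - reward predicted

for p in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"p = {p:4.2f}  EV = {expected_value(p):.2f}  "
          f"variance = {variance(p):.4f}  "
          f"RPE if rewarded = {prediction_error(MAGNITUDE, p):+.2f}  "
          f"RPE if unrewarded = {prediction_error(0.0, p):+.2f}")
```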
Fiorillo et al. (2003) investigated dopamine activation during reward prediction and reward prediction error in relation to EV and uncertainty (i.e., variance in outcome). In the study, two monkeys were exposed to stimuli with varying reward probabilities (P = 0, P = 0.25, P = 0.5, P = 0.75 and P = 1.0). The rate of anticipatory licking and the activation of dopamine neurons in the ventral midbrain (areas A8, A9 and A10) were recorded. Dopaminergic coding of reward prediction was measured as a phasic signal immediately after stimulus presentation, while coding of reward prediction error was measured as a phasic signal immediately after the outcome of the stimulus (reward or no reward). Dopaminergic coding of uncertainty was measured as a sustained signal from stimulus presentation to outcome.
The authors reported three main results. First, the reward probabilities of stimuli were correlated with the anticipatory licking rate and the anticipatory phasic dopamine response. This suggests that the reward probability reinforced the dopaminergic activation and the behavioral response. Second, the sustained dopamine response toward uncertainty followed the properties of variance, i.e., it was largest toward stimuli with 50% reward probability (P = 0.5), smaller toward stimuli with P = 0.75 and P = 0.25, and smallest toward stimuli with P = 1.0 and P = 0.0. Third, rewarded stimuli with lower reward probability had a larger phasic dopamine response following the reward, which suggests a larger positive reward prediction error signal; rewarded stimuli with higher reward probability had a smaller phasic dopamine response following the reward, which suggests a smaller reward prediction error signal.
Neurobiological studies of gambling in humans support the evidence of reward prediction and reward prediction error. Abler et al. (2006) used functional magnetic resonance imaging (fMRI) to investigate reward prediction and reward prediction error in an incentive task where participants were shown five figures associated with different reward probabilities (P = 0.0, P = 0.25, P = 0.50, P = 0.75, and P = 1.0). The results showed a significant anticipatory blood oxygen level dependent (BOLD) activation in the NAcc, which was proportional to the reward probability. Furthermore, there was a significant interaction between outcome and BOLD activation in the NAcc, where the BOLD activation was higher when low probability stimuli were rewarded, and lower when high probability stimuli were rewarded.
Preuschoff et al. (2006) used a card guessing task to investigate the relationship between risk and uncertainty in relation to anticipated reward. The task used 10 cards numbered 1 to 10, from which two cards were drawn in succession. Before the second card was drawn, participants had to guess whether the first card would be higher or lower than the second. The results showed that reward probability was linearly associated with the immediate BOLD activation: higher reward probability was associated with a higher immediate anticipatory BOLD signal, and lower reward probability with a lower immediate anticipatory BOLD signal. In contrast, uncertainty showed an inverted U-shaped relation with the late BOLD activation: the highest anticipatory BOLD signals were seen around maximum uncertainty (P = 0.5), and the lowest around maximum certainty (P = 1.0 and P = 0.0).
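To make the range of reward probabilities concrete, the sketch below assumes (as a simplification of the task described above, not a detail reported by Preuschoff et al.) that once the first card is shown, the win probability of a “first card is higher” guess is determined by how many of the nine remaining cards are lower than it:

```python
# Illustrative sketch of how a card guessing task spans the probability range
# (a simplifying assumption about the task structure, not the exact design of
# Preuschoff et al., 2006). With cards 1-10 drawn without replacement, a guess
# that the first card is HIGHER than the second wins with probability
# (first_card - 1) / 9 once the first card has been revealed.

def win_probability(first_card, guess_first_higher=True):
    lower = first_card - 1            # remaining cards below the first card
    higher = 10 - first_card          # remaining cards above the first card
    return (lower if guess_first_higher else higher) / 9

def uncertainty(p):
    return p * (1 - p)                # variance of the binary win/lose outcome

for card in range(1, 11):
    p = win_probability(card)
    print(f"first card = {card:2d}: P(win) = {p:.2f}, uncertainty = {uncertainty(p):.3f}")
```

Under this assumption, the win probability spans 0.0 to 1.0 across first cards, while the uncertainty term peaks for mid-range cards, mirroring the dissociation between the immediate (probability-related) and late (uncertainty-related) anticipatory signals described above.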
Neurobiological studies support the notion of dopaminergic dysfunctions of reward anticipation in gambling disorder. van Holst et al. (2012) compared 15 gambling disorder sufferers with 16 healthy controls in an fMRI study of reward anticipation in a card guessing task. Gambling disorder sufferers showed significantly increased BOLD activation in the bilateral ventral striatum and the left orbitofrontal cortex toward gain-related EV, suggesting increased BOLD activation during reward anticipation. No group differences in BOLD activation were found for outcome evaluation. Linnet et al. (2012) compared 18 gambling disorder sufferers and 16 healthy controls in a positron emission tomography (PET) study using the Iowa Gambling Task (IGT). Striatal dopamine release in gambling disorder sufferers showed a significant inverted U-shaped relation with the probability of advantageous IGT performance: sufferers whose performance was closest to maximum outcome uncertainty (P = 0.5) had larger dopamine release than those whose performance approached certain gains (P = 1.0) or certain losses (P = 0.0). This is consistent with the notion of dopaminergic coding of uncertainty. No interaction between dopamine release and uncertainty was found among healthy controls, which could suggest a stronger reinforcement of gambling behavior among gambling disorder sufferers. Therefore, dopaminergic anticipation of reward and uncertainty in gambling disorder might represent a dysfunctional form of reward anticipation that reinforces gambling behavior despite losses.
For outcome evaluation, the evidence suggests a blunted dopamine response in gambling disorder sufferers. Reuter et al. (2005) compared 12 gambling disorder sufferers with 12 healthy controls in a card guessing task. Gambling disorder sufferers showed a significantly lower BOLD response in the ventral striatum toward winning than healthy controls. Furthermore, gambling disorder sufferers showed a significant negative correlation between BOLD activation and severity of gambling symptoms, which suggests a blunted outcome evaluation in gambling disorder.
One of the limitations of the reward prediction and reward prediction error model is that it is not a theory of addiction or gambling disorder, per se. In other words, while the increased dopaminergic activation toward uncertainty might be a central mechanism in the reinforcement of gambling behavior, it does not explain why some individuals become addicted to gambling, while others do not. In contrast, the incentive-sensitization model suggests that addictive behavior is associated with a combination of dopaminergic reinforcement and changes to the dopamine system (sensitization) following repeated drug exposure.
Incentive-Sensitization Model of “Wanting” and “Liking”
Terry E. Robinson and Kent C. Berridge (Robinson and Berridge, 1993, 2000, 2003, 2008; Berridge and Aldridge, 2008; Berridge et al., 2009) have proposed an incentive-sensitization model, which distinguishes pleasure (“liking”) from incentive salience (“wanting”) in addiction. “Wanting” is associated with anticipation of reward, while “liking” is associated with outcome evaluation.
The incentive-sensitization model focuses on the dopamine system as a core neurobiological basis of addiction. The ventral striatum and its main component, the NAcc, are associated with addiction. Changes in the dopamine system following drug exposure render the brain circuits hypersensitive or “sensitized” to drugs and drug cues. Sensitization from repeated drug exposure may also occur at the level of psychomotor or locomotor activity. Sensitization is linked with increased incentive salience, the cognitive process associated with drug-seeking and drug-taking behavior. Incentive salience (“wanting”) refers to a motivational state, which can be conscious or unconscious, goal-oriented or non-goal-oriented, and pleasurable or non-pleasurable:
“The quotation marks around the term “wanting” serve as caveat to acknowledge that incentive salience means something different from the ordinary common language sense of the word wanting. For one thing, “wanting” in the incentive salience sense need not have a conscious goal or declarative target…. Incentive salience is separable from beliefs and declarative goals that constitute cognitive aspects of “wanting”” (Berridge and Aldridge, 2008, pp. 8–9).
Incentive salience (“wanting”) increases after repeated exposure to drugs and drug cues, while pleasure (“liking”) remains the same or decreases over time. The incentive-sensitization model of “wanting” and “liking” offers an explanation for the apparent paradox that individuals with substance use disorder have an increased desire for drugs despite getting less pleasure from taking them. Incentive “hotspots” have been identified in the NAcc: activation in the medial NAcc shell is distinctly associated with “liking”, while activation throughout the NAcc (particularly around the ventral pallidum) is associated with “wanting” (Berridge et al., 2009).
Incentive sensitization defines the relationship between incentive salience and sensitization. Incentive salience must be coupled with sensitization to account for addictive behavior: an increase in dopamine binding does not define incentive sensitization, but an increase in dopamine binding in relation to particular drug cues does; locomotor activity does not indicate incentive sensitization, but running around to get drugs does; psychomotor preoccupation does not indicate incentive sensitization, but an obsession with taking drugs does. Therefore, simple reinforcement of behavior is insufficient to account for addictive behavior.
“The central idea is that addictive drugs enduringly alter NAcc-related brain systems that mediate a basic incentive-motivational function, the attribution of incentive salience. As a consequence, these neural circuits may become enduringly hypersensitive (or “sensitized”) to specific drug effects and to drug-associated stimuli (via activation by S-S associations). The drug-induced brain change is called neural sensitization. We proposed that this leads psychologically to excessive attribution of incentive salience to drug-related representations, causing pathological “wanting” to take drugs” (Robinson and Berridge, 2003, p. 36).
Berridge and Aldridge (2008) provide an example of the incentive-sensitization approach to research in addiction. In this approach, animals are trained under two conditions: first, the animals are conditioned to work (press a lever) for rewards (e.g., food pellets), and must keep working to earn rewards. In a separate training session, the animals receive rewards without having to work for them, and each reward is paired with an auditory tone cue lasting 10–30 s, which serves as the conditioned stimulus (CS+). After training, the animals are tested in an extinction paradigm in which “wanting” is measured as the number of lever presses the animal is willing to perform without receiving a reward. Since the animals receive no rewards, the “wanting” measure is not confounded by consumption of the reward. The key feature of the paradigm is to test changes in behavior when the conditioned auditory stimulus is introduced during different drug-induced states. In a series of studies, Wyvell and Berridge (2000, 2001) showed that rats receiving amphetamine microinjections in the NAcc shell made significantly more lever presses when the conditioned auditory stimulus was introduced than rats receiving saline microinjections. In a related experiment, Wyvell and Berridge (2000, 2001) found that measures of “liking” (facial reactions to receiving a sugar reward) did not differ between animals receiving saline and amphetamine microinjections. These findings suggest that amphetamine is associated with increased cue-triggered “wanting”, but not with increased pleasure (“liking”) from receiving the reward.
The incentive-sensitization model’s suggestions of increased “wanting” and decreased “liking” in addiction are consistent with the findings from the gambling disorder literature of increased dopamine activation to anticipated reward (Fiorillo et al., 2003; Abler et al., 2006; Preuschoff et al., 2006; Linnet et al., 2011a, 2012) and blunted dopamine activation to outcome of reward (Reuter et al., 2005). These findings suggest that dopaminergic dysfunctions toward anticipated rewards, rather than actual rewards, reinforce gambling behavior among gambling disorder sufferers. The sensitization of the dopamine system toward anticipated rewards rather than incurred rewards can explain why gambling disorder sufferers continue gambling despite losses, and might play a central role in the formation of erroneous perceptions about the likelihood of winning from gambling (Benhsain et al., 2004).
One of the limitations of the incentive-sensitization model is that individuals with substance use disorder have lower dopamine release and lower dopamine receptor availability despite having increased incentive-sensitization:
“However, it must be acknowledged that the current literature contains conflicting results about brain dopamine changes in addicts. For example, it has been reported that detoxified cocaine addicts actually show a decrease in evoked dopamine release rather than the sensitized increase described above…. Another finding in humans that seems inconsistent with sensitization is that cocaine addicts are reported to have low levels of striatal dopamine D2 receptors even after long abstinence…. This suggests a hypodopaminergic state rather than a sensitized state” (Robinson and Berridge, 2008, p. 3140).
While lower binding potentials are reported in substance use disorders, there is no evidence of decreased binding potentials in the gambling disorder literature (Linnet, 2013). Therefore, gambling disorder might serve as a “model” disorder for the incentive-sensitization model, as gambling is not confounded by the ingestion of exogenous substances.
Implications of Reward Anticipation and Outcome Evaluation in Gambling Disorder
The models by Schultz et al. and Robinson and Berridge provide important insights into the study of gambling disorder. The reward prediction and reward prediction error model by Schultz et al. offers an explanation for the behavioral reinforcement of reward anticipation in addiction, while the incentive-sensitization model by Robinson and Berridge explains the mechanisms of “wanting” and “liking” in addiction. At the same time, gambling disorder may serve as a “model” disorder for addressing certain aspects of the two models.
First, the lower levels of binding potentials reported in substance use disorder are not seen in gambling disorder (Linnet et al., 2010, 2011a,b, 2012; Clark et al., 2012; Boileau et al., 2013). This might suggest that incentive sensitization can occur independently of baseline dopamine binding, which supports the incentive-sensitization model.
Second, while the studies by Fiorillo et al. (2003) and Preuschoff et al. (2006) support the notion of sustained anticipatory dopamine activation toward uncertainty, more research is needed to determine whether or not this mechanism is associated with dopaminergic dysfunctions in gambling disorder.
Third, the gambling disorder literature suggests increased brain activation toward reward anticipation and blunted activation toward outcome evaluation. This is consistent with the incentive-sensitization model’s suggestion of increased “wanting” but decreased “liking” in addiction and the notion of sustained anticipatory dopamine activation in reward prediction. Dopaminergic dysfunction in reward anticipation might constitute a common mechanism of addiction, because it occurs in the absence of reward. Therefore, reward anticipation may have a similar (dys)function, whether the reward is food, drugs or gambling. Further studies should address reward anticipation and outcome evaluation in gambling disorder.
Conflict of Interest Statement
The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Acknowledgments
This study was supported by funding from the Danish Agency for Science, Technology and Innovation (grant numbers 2049-03-0002, 2102-05-0009, 2102-07-0004, 10-088273 and 12-130953) and from the Ministry of Health (grant numbers 1001326 and 121023).
References
Abler, B., Walter, H., Erk, S., Kammerer, H., and Spitzer, M. (2006). Prediction error as a linear function of reward probability is coded in human nucleus accumbens. Neuroimage 31, 790–795. doi: 10.1016/j.neuroimage.2006.01.001
American Psychiatric Association [DSM 5]. (2013). Diagnostic and Statistical Manual of Mental Disorders: DSM 5. 5th Edn. Washington, DC: American Psychiatric Publishing.
Benhsain, K., Taillefer, A., and Ladouceur, R. (2004). Awareness of independence of events and erroneous perceptions while gambling. Addict. Behav. 29, 399–404. doi: 10.1016/j.addbeh.2003.08.011
Berridge, K. C., and Aldridge, J. W. (2008). Decision utility, the brain and pursuit of hedonic goals. Soc. Cogn. 26, 621–646. doi: 10.1521/soco.2008.26.5.621
Berridge, K. C., Robinson, T. E., and Aldridge, J. W. (2009). Dissecting components of reward: ‘liking’, ‘wanting’, and learning. Curr. Opin. Pharmacol. 9, 65–73. doi: 10.1016/j.coph.2008.12.014
Boileau, I., Payer, D., Chugani, B., Lobo, D., Behzadi, A., Rusjan, P. M., et al. (2013). The D2/3 dopamine receptor in pathological gambling: a positron emission tomography study with [11c]-(+)-propyl-hexahydro-naphtho-oxazin and [11c]raclopride. Addiction 108, 953–963. doi: 10.1111/add.12066
Breiter, H. C., Aharon, I., Kahneman, D., Dale, A., and Shizgal, P. (2001). Functional imaging of neural responses to expectancy and experience of monetary gains and losses. Neuron 30, 619–639. doi: 10.1016/s0896-6273(01)00303-8
Clark, L., Stokes, P. R., Wu, K., Michalczuk, R., Benecke, A., Watson, B. J., et al. (2012). Striatal dopamine D2/D3 receptor binding in pathological gambling is correlated with mood-related impulsivity. Neuroimage 63, 40–46. doi: 10.1016/j.neuroimage.2012.06.067
de la Fuente-Fernández, R., Phillips, A. G., Zamburlini, M., Sossi, V., Calne, D. B., Ruth, T. J., et al. (2002). Dopamine release in human ventral striatum and expectation of reward. Behav. Brain Res. 136, 359–363. doi: 10.1016/s0166-4328(02)00130-4
Delgado, M. R., Nystrom, L. E., Fissell, C., Noll, D. C., and Fiez, J. A. (2000). Tracking the hemodynamic responses to reward and punishment in the striatum. J. Neurophysiol. 84, 3072–3077.
Fiorillo, C. D., Tobler, P. N., and Schultz, W. (2003). Discrete coding of reward probability and uncertainty by dopamine neurons. Science 299, 1898–1902. doi: 10.1126/science.1077349
Kamin, L. J. (1969). “Selective association and conditioning,” in Fundamental Issues in Instrumental Learning, eds N. J. Mackintosh and W. K. Honig (Halifax, N.S.: Dalhousie University Press), 42–64.
Koepp, M. J., Gunn, R. N., Lawrence, A. D., Cunningham, V. J., Dagher, A., Jones, T., et al. (1998). Evidence for striatal dopamine release during a video game. Nature 393, 266–268. doi: 10.1038/30498
Linnet, J. (2013). The Iowa Gambling Task and the three fallacies of dopamine in gambling disorder. Front. Psychol. 4:709. doi: 10.3389/fpsyg.2013.00709
Linnet, J., Møller, A., Peterson, E., Gjedde, A., and Doudet, D. (2011a). Dopamine release in ventral striatum during Iowa Gambling Task performance is associated with increased excitement levels in pathological gambling. Addiction 106, 383–390. doi: 10.1111/j.1360-0443.2010.03126.x
Linnet, J., Møller, A., Peterson, E., Gjedde, A., and Doudet, D. (2011b). Inverse association between dopaminergic neurotransmission and Iowa Gambling Task performance in pathological gamblers and healthy controls. Scand. J. Psychol. 52, 28–34. doi: 10.1111/j.1467-9450.2010.00837.x
Linnet, J., Mouridsen, K., Peterson, E., Møller, A., Doudet, D., and Gjedde, A. (2012). Striatal dopamine release codes uncertainty in pathological gambling. Psychiatry Res. 204, 55–60. doi: 10.1016/j.pscychresns.2012.04.012
Linnet, J., Peterson, E. A., Doudet, D., Gjedde, A., and Møller, A. (2010). Dopamine release in ventral striatum of pathological gamblers losing money. Acta Psychiatr. Scand. 122, 326–333. doi: 10.1111/j.1600-0447.2010.01591.x
Preuschoff, K., Bossaerts, P., and Quartz, S. R. (2006). Neural differentiation of expected reward and risk in human subcortical structures. Neuron 51, 381–390. doi: 10.1016/j.neuron.2006.06.024
Rescorla, R. A., and Wagner, A. R. (1972). “A theory of Pavlovian conditioning: variations in the effectiveness of reinforcement and nonreinforcement,” in Classical Conditioning II: Current Research and Theory, eds A. H. Black and W. F. Prokasy (New York: Appleton-Century-Crofts), 64–99.
Reuter, J., Raedler, T., Rose, M., Hand, I., Gläscher, J., and Büchel, C. (2005). Pathological gambling is linked to reduced activation of the mesolimbic reward system. Nat. Neurosci. 8, 147–148. doi: 10.1038/nn1378
Robinson, T. E., and Berridge, K. C. (1993). The neural basis of drug craving: an incentive-sensitization theory of addiction. Brain Res. Brain Res. Rev. 18, 247–291. doi: 10.1016/0165-0173(93)90013-p
Robinson, T. E., and Berridge, K. C. (2000). The psychology and neurobiology of addiction: an incentive-sensitization view. Addiction 95(Suppl. 2), S91–S117. doi: 10.1046/j.1360-0443.95.8s2.19.x
Robinson, T. E., and Berridge, K. C. (2003). Addiction. Annu. Rev. Psychol. 54, 25–53. doi: 10.1146/annurev.psych.54.101601.145237
Robinson, T. E., and Berridge, K. C. (2008). Review. The incentive sensitization theory of addiction: some current issues. Philos. Trans. R. Soc. Lond. B Biol. Sci. 363, 3137–3146. doi: 10.1098/rstb.2008.0093
Schultz, W. (2006). Behavioral theories and the neurophysiology of reward. Annu. Rev. Psychol. 57, 87–115. doi: 10.1146/annurev.psych.56.091103.070229
Schultz, W., Preuschoff, K., Camerer, C., Hsu, M., Fiorillo, C. D., Tobler, P. N., et al. (2008). Explicit neural signals reflecting reward uncertainty. Philos. Trans. R. Soc. Lond. B Biol. Sci. 363, 3801–3811. doi: 10.1098/rstb.2008.0152
Tobler, P. N., O’Doherty, J. P., Dolan, R. J., and Schultz, W. (2007). Reward value coding distinct from risk attitude-related uncertainty coding in human reward systems. J. Neurophysiol. 97, 1621–1632. doi: 10.1152/jn.00745.2006
van Holst, R. J., Veltman, D. J., Büchel, C., van den Brink, W., and Goudriaan, A. E. (2012). Distorted expectancy coding in problem gambling: is the addictive in the anticipation? Biol. Psychiatry 71, 741–748. doi: 10.1016/j.biopsych.2011.12.030
Wyvell, C. L., and Berridge, K. C. (2000). Intra-accumbens amphetamine increases the conditioned incentive salience of sucrose reward: enhancement of reward “wanting” without enhanced “liking” or response reinforcement. J. Neurosci. 20, 8122–8130.
Wyvell, C. L., and Berridge, K. C. (2001). Incentive sensitization by previous amphetamine exposure: increased cue-triggered “wanting” for sucrose reward. J. Neurosci. 21, 7831–7840.
Keywords: anticipation, reward prediction error, reward prediction, incentive salience, dopamine, gambling disorder, pathological gambling
Citation: Linnet J (2014) Neurobiological underpinnings of reward anticipation and outcome evaluation in gambling disorder. Front. Behav. Neurosci. 8:100. doi: 10.3389/fnbeh.2014.00100
Received: 02 January 2014; Accepted: 10 March 2014;
Published online: 25 March 2014.
Edited by:
Bryan F. Singer, University of Michigan, USA
Reviewed by:
Alexis Faure, Centre Neurosciences Paris Sud (CNPS), CNRS, France
Jonathan David Morrow, University of Michigan, USA
Copyright © 2014 Linnet. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Jakob Linnet, Research Clinic on Gambling Disorders, Aarhus University Hospital, Nørrebrogade 44, Building 30, DK-8000 Aarhus C, Denmark e-mail: linnet@cfin.au.dk; jakolinn@rm.dk; jlinnet@mac.com