
How does the brain influence moral decisions?

Updated: Feb 24

By Alissa Sofia Maria Bocance


The social intuitionist theory (Haidt, 2001) connects studies on automaticity (Bargh and Chartrand, 1999) with insights from neuroscience and evolutionary psychology. The cognitive control and conflict theory (Greene et al., 2004) suggests that emotional brain regions promote one response while cognitive areas support an alternative one (Kahneman and Frederick, 2007; McClure et al., 2007). According to the cognitive and emotional integration theory, behavioral choices cannot be neatly divided into cognitive and emotional decisions, and challenging contextual scenarios can further complicate the process of choosing how to act (Gottfried, 1999; Moll et al., 2003).

Several approaches for investigating morality have been established, ranging from moral versus non-moral contexts to moral dilemmas (Young and Dungan, 2012). A situation is termed a moral dilemma when every possible action breaches a binding moral principle (Thomson, 1985). Historically, two primary distinctions have been drawn: (1) personal versus impersonal dilemmas and judgments (Greene et al., 2004); and (2) utilitarian versus non-utilitarian moral judgments (Brink, 1986). These distinctions have given rise to various experimental paradigms, the best known being the trolley problem (Thomson, 1985) and the footbridge scenario (Navarrete et al., 2012). In both, the decision is either to save five individuals by sacrificing one or to allow five to perish while one lives (Hauser, 2006; Greene, 2007). However, the footbridge scenario meets the definition of a personal dilemma, whereas the trolley problem does not (for comprehensive analyses of comparable moral dilemmas, see Greene et al., 2004; Koenigs et al., 2007; Decety et al., 2011; Pujol et al., 2011).

Alternative tasks that subject morality to experimental investigation use written statements or images (Greene et al., 2001; Harenski and Hamann, 2006) or rely on scales and surveys to evaluate moral conduct from a clinical perspective (see Rush et al., 2008 for a review).

Morality is supported by a very intricate neural network, and this section provides an overview of the primary brain regions and their associated circuitry. The moral brain encompasses a vast functional network that includes numerous structures, many of which overlap with areas regulating other behavioral functions. We will examine them in this sequence: the frontal lobe, the parietal lobe, the temporal lobe and insula, followed by the subcortical structures. The orbital and ventromedial prefrontal cortices are involved in moral decisions influenced by emotions, whereas the dorsolateral prefrontal cortex seems to regulate their response; the anterior cingulate cortex may mediate between these competing processes. The orbitofrontal cortex (OFC) is linked to morality and has been connected to the online representation of rewards and punishments (O'Doherty et al., 2001; Shenhav and Greene, 2010). The right medial OFC shows activation during passive observation of moral stimuli as opposed to non-moral ones (Harenski and Hamann, 2006), whereas activation of the left OFC has been associated with the processing of emotionally significant statements carrying moral value (Moll et al., 2002). The anterior cingulate cortex (ACC) plays a role in error detection (Shackman et al., 2011) and becomes active when individuals make a utilitarian response (Young and Koenigs, 2007). The ACC, along with other regions, has been associated with theory of mind (ToM) and self-referential activities (Frith, 2001), and it plays a role in monitoring moral conflicts (Greene et al., 2004, p. 391).

The inferior parietal region is mainly associated with working memory and cognitive control, so its recruitment during moral processing may reflect cognitive engagement (Greene et al., 2004; Harenski et al., 2008; Cáceda et al., 2011). Together with the posterior part of the superior temporal sulcus (STS), which we review below, it appears to support the perception and representation of social information that is crucial for inferring others' beliefs and intentions (Allison et al., 2000) and for representing personhood (Greene and Haidt, 2002). The temporoparietal junction (TPJ) plays a key role in moral intuition and in attributing beliefs to others during moral processing (Young and Saxe, 2008; Harada et al., 2009; Young et al., 2010; Moor et al., 2011; Young and Dungan, 2012).

The temporal lobe is one of the main regions activated during ToM tasks (Völlm et al., 2006; Ciaramidaro et al., 2007; Muller et al., 2010), and structural abnormalities within this area have been related to psychopathy (Blair, 2010; Pujol et al., 2011). The anterior/middle temporal gyrus is related to moral judgment (Moll et al., 2001; Greene et al., 2004; Harenski and Hamann, 2006). The posterior cingulate cortex processes personal memory, self-awareness, and emotionally salient stimuli (Sestieri et al., 2011), and it is one of the regions that show greater engagement in personal than in impersonal dilemmas (Funk and Gazzaniga, 2009). Its activation has been related to social ability (Greene et al., 2004), empathy (Völlm et al., 2006), and forgiveness (Farrow et al., 2001), and it can predict the magnitude of the punishments applied in criminal scenarios (Buckholtz et al., 2008). In addition, the limbic system, which includes structures such as the amygdala and the hippocampus, influences decision-making by processing emotions and memories; there is no doubt that emotions strongly shape our decisions, sometimes leading to choices that appear irrational.

Overall, the neural underpinnings of morality are not yet well understood. Researchers in moral neuroscience have tried to identify specific structures and processes that shed light on how morality works. The orbital and ventromedial prefrontal cortices are implicated in emotionally driven moral decisions, the dorsolateral prefrontal cortex appears to moderate their response, and these competing processes may be mediated by the anterior cingulate cortex. Parietal and temporal structures play important roles in representing others' beliefs and intentions, and the insular cortex engages during empathic processes. Other regions seem to play a more complementary role. Morality is supported not by a single circuit or structure but by several circuits that overlap with other complex processes. Identifying the core features of morality and moral-related processes is still needed, and neuroscience, in conjunction with moral psychology, can provide meaningful insights to delineate the boundaries of morality.



References

  1. Allison, T., et al. (2000). Social perception from visual cues: Role of the STS region. Trends in Cognitive Sciences, 4(7), 267–278.

  2. Bargh, J. A., & Chartrand, T. L. (1999). The unbearable automaticity of being. American Psychologist, 54(7), 462–479.

  3. Blair, R. J. R. (2010). Neuroimaging of psychopathy and antisocial behavior: A targeted review. Current Psychiatry Reports, 12(1), 76–82.

  4. Brink, D. O. (1986). Utilitarian morality and the personal point of view. The Journal of Philosophy, 83(8), 417–438.

  5. Buckholtz, J. W., et al. (2008). The neural correlates of third-party punishment. Neuron, 60(5), 930–940.

  6. Cáceda, R., et al. (2011). Paradoxical lack of emotional bias in moral decision-making in major depression. Psychiatry Research: Neuroimaging, 191(1), 76–82.

  7. Ciaramidaro, A., et al. (2007). The intentional network: How the brain represents and acts on intentionality. NeuroImage, 34(2), 1319–1328.

  8. Decety, J., et al. (2011). The contribution of emotion and cognition to moral sensitivity: A neurodevelopmental study. Cerebral Cortex, 21(4), 1066–1073.

  9. Farrow, T. F. D., et al. (2001). Investigating the functional anatomy of empathy and forgiveness. NeuroReport, 12(11), 2433–2438.

  10. Frith, C. D. (2001). Mind and brain: Neuroscience impacts on philosophy. Mind & Language, 16(2), 1–22.

  11. Funk, C. M., & Gazzaniga, M. S. (2009). The functional lateralization of decision-making in the human brain. Journal of Cognitive Neuroscience, 21(1), 1–9.

  12. Gottfried, J. A. (1999). Functional neuroimaging of odor identification and contextual processing. Proceedings of the National Academy of Sciences USA, 96(7), 4935–4940.

  13. Greene, J. D., et al. (2001). An fMRI investigation of emotional engagement in moral judgment. Science, 293(5537), 2105–2108.

  14. Greene, J. D., et al. (2004). The neural bases of cognitive conflict and control in moral judgment. Neuron, 44(2), 389–400.

  15. Greene, J. D. (2007). The secret joke of Kant’s soul. Moral Psychology, 3(1), 35–79.

  16. Greene, J. D., & Haidt, J. (2002). How (and where) does moral judgment work? Trends in Cognitive Sciences, 6(12), 517–523.

  17. Harada, T., et al. (2009). Neural correlates of theory of mind and empathy. Brain Research, 1281(1), 132–141.

  18. Harenski, C. L., & Hamann, S. (2006). Neural correlates of regulating negative emotions related to moral violations. NeuroImage, 30(1), 313–324.

  19. Hauser, M. D. (2006). Moral minds: How nature designed our universal sense of right and wrong. New York: Harper Collins.

  20. Kahneman, D., & Frederick, S. (2007). Frames and brains: Elicitation and control of response tendencies. Trends in Cognitive Sciences, 11(2), 45–46.

  21. Koenigs, M., et al. (2007). Damage to the prefrontal cortex increases utilitarian moral judgments. Nature, 446(7138), 908–911.

  22. McClure, S. M., et al. (2007). A dual-systems perspective on addiction: Contributions from neuroimaging and cognitive neuroscience. Annals of the New York Academy of Sciences, 1104(1), 62–78.

  23. Moll, J., et al. (2002). The neural correlates of moral sensitivity: A functional magnetic resonance imaging investigation. Nature Neuroscience, 6(3), 799–804.

  24. Moll, J., et al. (2003). The moral affiliations of disgust: A functional MRI study. Cognitive and Behavioral Neurology, 16(1), 68–78.

  25. Moor, B. G., et al. (2011). Social rejection and reasoning brain regions in moral judgment. Social Cognitive and Affective Neuroscience, 7(3), 287–297.

  26. Navarrete, C. D., et al. (2012). The footbridge dilemma in moral judgment. Emotion, 12(1), 131–136.

  27. O'Doherty, J. P., et al. (2001). Representation of reward value in the human orbitofrontal cortex. Nature Neuroscience, 4(1), 95–102.

  28. Pujol, J., et al. (2011). Overlapping neural correlates of moral judgment and psychopathy. Social Cognitive and Affective Neuroscience, 7(4), 395–406.

  29. Rush, B., et al. (2008). Measurement of moral development in psychology. Psychological Assessment, 20(4), 354–366.

  30. Shackman, A. J., et al. (2011). The integration of negative affect, pain, and cognitive control in the cingulate cortex. Nature Reviews Neuroscience, 12(3), 154–167.

  31. Shenhav, A., & Greene, J. D. (2010). Moral judgments recruit domain-general valuation mechanisms to integrate representations of probability and magnitude. Neuron, 67(4), 667–677.

  32. Thomson, J. J. (1985). The trolley problem. Yale Law Journal, 94(6), 1395–1415.

  33. Völlm, B. A., et al. (2006). The neural basis of theory of mind and empathy: Meta-analytic findings. NeuroImage, 29(4), 1173–1184.

  34. Young, L., & Dungan, J. (2012). Where in the brain is morality? Science, 336(6081), 1390–1392.

  35. Young, L., & Koenigs, M. (2007). Investigating emotion in moral cognition: A review of evidence from functional neuroimaging and neuropsychology. British Medical Bulletin, 84(1), 69–79.

  36. Young, L., & Saxe, R. (2008). The neural basis of belief encoding and integration in moral judgment. NeuroImage, 40(4), 1912–1920.

  37. Young, L., et al. (2010). The neural basis of moral judgments about harm. Neuron, 65(6), 845–851.




 
 
 
