- By Frédéric Canini and Damien Claverie (Revue Défense Nationale, 2023/HS4)
- Translated by Mohamed SAKHRI
Ethical challenges are situations in which an individual must choose between several options, none of which is satisfactory: every option violates one or more of the ethical values on which the individual stands. The ethical dilemma arises from this conflict of choices and from the necessity of making a decision. Yet while no good choice may be available, some options are worse than others and may leave lasting consequences. This definition excludes pseudo-ethical dilemmas, for which a good solution exists but is not favored by the environment (Kvalnes, 2019).
The self-referential nature of ethical decision-making calls for defining ethics as a way of behaving that is consistent with one's perception of oneself. This definition is rooted in the Greek word "èthos," which designates both custom and an individual's habitual way of being, that is, their psychology (Cassin, 2019). Ethics thus means being in accordance both with oneself and with societal customs, the implicit and explicit foundations of the society in which we live. This ambiguity opens the door to potential ethical dissonances.
Examples of Ethical Dilemmas
Work on ethics, which historically derived from the branch of philosophy dealing with morality ("ethikê"), is now supplemented by developments in experimental philosophy, psychology, and neuroscience. Philosophical reflection is thus accompanied by an experimental foundation (Knobe et al., 2012). The result is a set of models of ethical dilemmas drawn from field observations, thought experiments, and even laboratory experiments, including brain exploration with functional imaging.
The trolley dilemma is a classic model proposed by the British philosopher Philippa Foot to illustrate the obstetric dilemma of "saving the mother or the child": "You are driving a runaway trolley and come to a fork in the tracks, but there are men working on both the left and the right track" (Foot, 1967). The problem has since been made more complex (Andrade, 2019): "One person is working on the right track, but five people are working on the left track. These people are far away but identifiable, and it is certain that the trolley will kill anyone it encounters. The subject, positioned on a bridge above the switch, can choose which track the trolley will take. However, if the trolley is heading toward the track with five people, the subject can push a person off the bridge, whose fall will certainly stop the trolley." Each situation presents a dilemma, since none of the available options is satisfactory. A utilitarian decision emerges, which minimizes the number of deaths, alongside a deontological response, which holds that one person cannot be sacrificed. In every case, contextual elements are crucial.
Stanley Milgram's experiment provides another example of an ethical dilemma, even though it was originally designed to study obedience to authority (Milgram, 1974; Terestchenko & Fazzi, 2013). "A subject is recruited to take part in a scientific experiment. They assume the role of an experimenter under the control of an authority figure. They put a series of questions to a student and deliver an electric shock for each incorrect answer, the shock's intensity increasing with the number of errors. The student is an actor who deliberately gives wrong answers and simulates pain in proportion to the shock's intensity. The experiment ends as soon as the subject refuses to continue or halts it themselves." The ethical dilemma is the following: stop and betray the commitment to participate, or continue and inflict excessive harm on the student. The dilemma is all the more significant in that it recurs with each increase in shock intensity. Before stopping, the subject negotiates alternative solutions with the student (encouraging them, prompting answers) or with the authority figure (asking permission to stop) and displays stress reactions that are sometimes intense. The authority figure opposes stopping and maintains the dilemma until the subject rebels. This widely replicated model helps clarify the critical role of context, the impact of personality, and the social interactions at play. It also raises ethical questions of its own: the subject believes in the scenario and may suffer, even if they can discuss it with Milgram's team after the experiment.
All ethical dilemmas rest on impersonal conflicts (the trolley) or personal conflicts (having to kill in order to save oneself), on violations of social norms, or on conflicts of interest. On the basis of these categories and of field observations, new dilemmas better suited to military contexts can be created, whether for combatants (e.g., deciding whether to shoot a child soldier, or risking one individual to save the collective) or for military healthcare personnel, who face a dual loyalty (physician vs. military) and a dual purpose (the individual vs. the collective).
The Brain’s Response
Neuroscience offers functional insights into the experience of these ethical dilemmas. The mechanisms involved, however, are complex (Figure 1), resembling a vast matrix in which time (fast vs. slow responses) and encoding pathways (coding of emotion, rationality, conflict, decision-making, etc.) interact. This temporal dichotomy has been theorized as "experience vs. rationality" (Epstein, 2003), as ventral vs. dorsal brain pathways (LeDoux, 1998), and more generally as "System 1 vs. System 2" (Kahneman, 2016).
The rapid responses provided by experience, or System 1, involve unconscious, spontaneous decisions that consume few psychological resources and are associative, but context-dependent and unreliable. They correspond to implicit cognition, with its cognitive biases and prejudices, as well as to habits, expertise, and intuition. In the field of ethics, intuition is defined as a fast, unconscious, integrative decision, emotionally marked but generating little stress (Hodgkinson et al., 2008). It detects coherence in the context and leads to a response deemed appropriate on the basis of prior evaluation. While it does not, strictly speaking, handle the ethical conflict itself, it represents the first step toward it. Intuition engages numerous brain areas, both subcortical (amygdala) and cortical (the insula, which integrates bodily information; the medial prefrontal, bilateral ventrolateral, and left dorsolateral prefrontal areas, which integrate emotional and rational information) (Forbes & Grafman, 2010). Intuition is therefore a fusion of emotional and rational information. It nonetheless differs from slower processes in that it involves the inferior parietal and superior temporal cortices (Ilg et al., 2007), and from decisions triggered by prior exposure to information, which are also unconscious but in which the temporo-occipital complex is deactivated (Zander et al., 2016).
The slow responses provided by rationality, or System 2, are largely conscious and context-independent; they are rule-guided, elaborated, analytical, and explicit, which makes them more resource-intensive. Managing the ethical conflict involves a reasoning phase that engages the right cingulate and temporal gyri, followed by a moral decision-making phase ("right vs. wrong") involving the anterior cingulate areas and the medial frontal gyrus (Bryant et al., 2016). Rational information is typically processed through a dorsal route that activates the dorsolateral prefrontal cortex, while emotional information is carried mainly by a ventral route activating the ventromedial prefrontal cortex (Forbes & Grafman, 2010). Depending on which pathway is activated more strongly, the decision may be utilitarian, prioritizing the common good over individual interests (dorsal pathway), or altruistic or egocentric, focusing on the individual, whether another person or oneself (ventral pathway).
Systems 1 and 2 operate in a coordinated manner, which is hardly surprising given the number of brain structures at work in each modality (Forbes & Grafman, 2010). Thus, prosocial attitudes (implicit) facilitate altruistic behaviors, while sensitivity to injustice (explicit) promotes positive comments about altruistic conduct: spontaneous behavior and comments made retrospectively share the same prosocial tone in ethical responses (Chen et al., 2022). The question is how System 2 comes to be favored over System 1 during decision-making. In the trolley model, as long as the dilemma has not been engaged or integrated by the individual, System 1 proposes a behavior (if there are workers on the left track, slow down or go right). If the context proves incompatible (the brakes do not work AND there are also workers on the right track), a conflict arises, leading to an intense conscious emotion, heightened vigilance, the emergence of stress, and the mobilization of System 2. This transition could be triggered by very early unconscious emotions that mobilize the body and thereby provide complementary information to the brain, in what are known as somatic markers (Bechara & Damasio, 2005). In this way, a strategy beneficial to the individual may be set in motion before they are even conscious of it (Bechara et al., 1997).
The Individual’s Response
Our understanding of how the brain operates during ethical dilemmas remains, in the end, rather simplistic, since it rests on thought experiments or laboratory conditions and on simple paradigms that dissect the context. What about real life, where altruistic acts actually take place?
In the United States, acts of self-sacrifice are recognized by the awarding of the Carnegie Hero Medal (Rand & Epstein, 2014). The vast majority of the behaviors that earned this distinction resulted from rapid decisions: System 1 was at work. Deliberate reflection, evidently, was not what produced these altruistic decisions. Under pressure, some people follow their intuition and act, while others become trapped in the dilemma and act far less often, if at all.
What happens in a life-threatening situation such as a shipwreck? Who are the survivors, those who reach the means of rescue first? In most cases, the highest survival rates go to the crew, then to the captain, followed by male passengers, then female passengers, and finally children (Elinder & Erixson, 2012). The immediate conclusion might be that the survival instinct prevails. The Titanic (1912) is an exception; it has been compared with the Lusitania, another ship that sank, in 1915. On the Lusitania, which went down in 18 minutes, self-centered behaviors emerged, whereas on the Titanic, which sank over more than two hours, social behaviors were observed (Frey et al., 2010). We therefore cannot rule out the emergence of social behaviors, whether through the expression of implicit social norms or through emotional interactions with other survivors leading to conscious sacrificial conduct. One must bear in mind that these shipwreck situations are marked by intense stress, which modifies behavior. Since exposure to a laboratory stressor tends to reduce utilitarian responses (Youssef et al., 2012), it would be consistent to suggest that intense stress promotes survival-oriented reactions. All these responses, however, should be understood within a temporal dynamic.
The third example is Milgram's experiment, a paradigm in which the individual does not risk their life but is committed precisely because of the deception to which they are subjected. The finding is clear: as the dilemma becomes more pronounced, the emotional response and the accompanying stress grow stronger, increasing the likelihood that the experiment will be halted. The utilitarian logic favoring science and/or the logic of obedience to a norm then fades away.
The Aftermath of the Ethical Dilemma
Once the decisive moment of ethical decision-making has passed, what remains? What are the psychological consequences of a poorly managed ethical dilemma? An act performed against an individual's deep wishes, for whatever reason, and especially if repeated in an occupational context, can leave a mark in the brain: a memory of the event. At best this memory is an unpleasant recollection, at worst a mental pathology. These aftereffects vary in severity and type, forming a true spectrum that ranges from vulnerability to maladaptation ("moral distress") and to illness ("moral injury"), all linked to the stress that accompanies exposure to ethical dilemmas (Papazoglou & Chopko, 2017).
Chronic exposure to low-intensity ethical dilemmas tends to lead to occupational maladaptation, compassion fatigue, and burnout. Exposure to a single life-altering event tends instead to induce shame and guilt, often associated with anxiety or depression, and post-traumatic stress disorder (PTSD). These consequences of a demanding profession, combined with repeated low-intensity ethical dilemmas or rare but dramatic incidents, call for the follow-up of personnel, especially when the profession is structured around re-exposure to the same contexts (Task Group HFM-179, 2018).
References
- Andrade Gabriel, « Medical Ethics and the Trolley Problem », Journal of Medical Ethics and History of Medicine, 12(3), 2019, p. 1-15. http://dx.doi.org/10.18502/jmehm.v12i3.766.
- Bechara Antoine & Damasio Antonio R., « The Somatic Marker Hypothesis : a Neural Theory of Economic Decision », Games and Economic Behavior, 52(2), 2005, p. 336-372. https://doi.org/10.1016/j.geb.2004.06.010.
- Bechara Antoine, Damasio Hanna, Tranel Daniel & Damasio Antonio R., « Deciding Advantageously before Knowing the Advantageous Strategy », Science, 275(5304), 1997, p. 1293-1295.
- Bryant Douglas J., Wang F., Deardeuff Kelley, Zoccoli Emily & Nam Chang S., « The Neural Correlates of Moral Thinking : a Meta-Analysis », International Journal of Computational & Neural Engineering, 3(2), 2016, p. 28-39. https://doi.org/10.19070/2572-7389-160005.
- Cassin Barbara (dir.), Vocabulaire européen des philosophies. Dictionnaire des intraduisibles, Éditions du Seuil/Le Robert, 2019, 1,563 pages.
- Chen Chenyi, Martínez Roger Marcelo, Chen Yu-Chun, Fan Yang-Teng & Cheng Yawei, « The Neural Mediators of Moral Attitudes and Behaviors », Behavioural Brain Research, 430, 2022, 113934.
- Elinder Mikael & Erixson Oscar, « Gender, Social Norms, and Survival in Maritime Disasters », Proceedings of the National Academy of Sciences of the United States of America, 109(33), 2012, p. 13220-13224.
- Epstein Seymour, « Cognitive-Experiential Self-Theory of Personality », in Millon Theodore & Lerner Melvin J. (eds), Handbook of Psychology. Vol. 5 : Personality and Social Psychology, Hoboken, New Jersey : John Wiley & Sons, 2003, p. 159-184.
- Foot Philippa, « The Problem of Abortion and the Doctrine of the Double Effect », Oxford Review, 5, 1967, p. 5-15.
- Forbes Chad E. & Grafman Jordan, « The Role of the Human Prefrontal Cortex in Social Cognition and Moral Judgment », Annual Review of Neuroscience, 33, 2010, p. 299-324.
- Frey Bruno S., Savage David A. & Torgler Benno, « Interaction of Natural Survival Instincts and Internalized Social Norms Exploring the Titanic and Lusitania Disasters », Proceedings of the National Academy of Sciences of the United States of America, 107(11), 2010, p. 4862-4865. https://doi.org/10.1073/pnas.0911303107.
- Hodgkinson Gerard P., Langan-Fox Janice & Sadler-Smith Eugene, « Intuition : a Fundamental Bridging Construct in the Behavioural Sciences », British Journal of Psychology, 99, 2008, p. 1-27.
- Ilg Ruediger, Vogeley Kai, Goschke Thomas, Bolte Annette, Shah Jon N., Pöppel Ernst & Fink Gereon R., « Neural Processes Underlying Intuitive Coherence Judgments as Revealed by fMRI on a Semantic Judgment Task », NeuroImage, 38(1), 2007, p. 228-238. https://doi.org/10.1016/j.neuroimage.2007.07.014.
- Kahneman Daniel, Système 1 / Système 2 : Les deux vitesses de la pensée, translated by Raymond Clarinard, Flammarion, 2016, 706 pages.
- Knobe Joshua, Buckwalter Wesley, Nichols Shaun, Robbins Philip, Sarkissian Hagop & Sommers Tamler, « Experimental Philosophy », Annual Review of Psychology, 63, 2012, p. 81-99.
- Kvalnes Øyvind, Moral Reasoning at Work. Rethinking Ethics in Organizations. Switzerland : Palgrave Macmillan, 2019. 145 pages. ISBN 978-3-030-15190-4.
- LeDoux Joseph, The Emotional Brain. The Mysterious Underpinnings of Emotional Life, London : Weidenfeld & Nicolson, 1998, 384 pages. ISBN 0-75380-670-3.
- Milgram Stanley, Obedience to Authority. An Experimental View, New York : Harper & Row, 1974, 224 pages. ISBN 0-06-131983-X.
- Papazoglou Konstantinos & Chopko Brian, « The Role of Moral Suffering (Moral Distress and Moral Injury) in Police Compassion Fatigue and PTSD : an Unexplored Topic », Frontiers in Psychology, 8(1999), 2017. https://doi.org/10.3389/fpsyg.2017.01999.
- Rand David G. & Epstein Ziv G., « Risking Your Life without a Second Thought : Intuitive Decision-Making and Extreme Altruism », PLoS ONE, 9(10), 2014, e109687. https://doi.org/10.1371/journal.pone.0109687.
- Task Group HFM-179, Moral Decisions and Military Mental Health, 2018, AC/323(HFM-179)TP/718.
- Terestchenko Michel, Fazzi Mariane & Milgram Stanley, Expérience sur l’obéissance et la désobéissance à l’autorité, La Découverte, 2013.
- Youssef Farid F., Dookeeram Karine, Basdeo Vasant, Francis Emmanuel, Doman Mekaeel, Mamed Danielle, Maloo Stefan, Degannes Joel, Dobo Linda, Ditshotlo Phatsimo & Legall George, « Stress Alters Personal Moral Decision Making », Psychoneuroendocrinology, 37(4), 2012, p. 491-498.
- Zander Thea, Horr Ninja K., Bolte Annette & Volz Kirsten G., « Intuitive Decision Making as a Gradual Process : Investigating Semantic Intuition-Based and Priming-Based Decisions with fMRI », Brain and Behavior, 6(1), 2016, e00420. http://dx.doi.org/10.1002/brb3.420.