We typically invoke the golden rule when making moral decisions: do unto others as you would have them do unto you. But why we make such judgments has been a topic of intense discussion. Are we motivated by guilt, because we don’t want to feel awful about disappointing the other person? Or by fairness, because we seek to avoid inequitable outcomes?
According to a Radboud University-Dartmouth College study on moral decision-making and cooperation, some people rely on both guilt and fairness and may switch between moral rules depending on the circumstances. The findings challenge prior research in economics, psychology, and neuroscience, which has generally assumed that people are driven by a single moral principle that remains consistent over time. The research was published in Nature Communications.
“Our research shows that when it comes to moral action, people may not always follow the golden rule. While most people care about others, some may engage in what we call ‘moral opportunism,’ in which they still want to appear moral but also want to maximize their own gain,” said Jeroen van Baar, a postdoctoral research associate in Brown University’s department of cognitive, linguistic, and psychological sciences, who began this research while a visiting scholar at Dartmouth from Radboud University’s Donders Institute for Brain, Cognition, and Behavior.
“Because our daily circumstances tend to stay the same, we may not recognize in everyday life that our morals are context-dependent. In new circumstances, however, we may discover that the moral rules we thought we’d always follow are actually quite malleable,” explained Luke J. Chang, an assistant professor of psychological and brain sciences at Dartmouth College and director of the Computational Social Affective Neuroscience Laboratory (Cosan Lab). “This has enormous repercussions when one considers how our moral conduct could change in new settings, such as during war,” he noted.
To study moral decision-making in the setting of reciprocity, the researchers created a modified trust game called the Hidden Multiplier Trust Game, which allowed them to classify decisions made in reciprocating trust as a function of an individual’s moral strategy. With this method, the team could determine which type of moral strategy a study participant was employing: inequity aversion (where people reciprocate to seek fairness in outcomes), guilt aversion (where people reciprocate to avoid feeling guilty), greed, or moral opportunism (a new strategy the team identified, where people switch between inequity aversion and guilt aversion depending on which will serve their interests best). The researchers also investigated the patterns of brain activity linked with these moral strategies and constructed a computational model of moral strategy that can be used to explain how people behave in the game.
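The distinction between these strategies can be sketched with simple utility functions: an inequity-averse trustee is penalized for gaps between the two players' payoffs, while a guilt-averse trustee is penalized for returning less than the partner expects. Everything below, including the utility forms, the parameters `theta` and `phi`, and the payoff numbers, is an illustrative assumption, not the study's actual computational model.

```python
# Illustrative sketch of two moral strategies in a trust-game setting.
# The utility forms and parameter values are assumptions for exposition,
# not the model used in the Radboud-Dartmouth study.

def inequity_aversion_utility(own_payoff, other_payoff, theta=0.6):
    """Utility falls with the payoff gap between the two players."""
    return own_payoff - theta * abs(own_payoff - other_payoff)

def guilt_aversion_utility(own_payoff, returned, expected_return, phi=2.0):
    """Utility falls when the return falls short of the partner's expectation."""
    return own_payoff - phi * max(0, expected_return - returned)

def best_return(endowment, multiplied, utility):
    """Pick the returned amount that maximizes a given moral utility."""
    best, best_u = 0, float("-inf")
    for returned in range(multiplied + 1):
        own = endowment + multiplied - returned  # trustee keeps the rest
        u = utility(own, returned)
        if u > best_u:
            best, best_u = returned, u
    return best

# The trustee keeps an endowment of 10, and the investment was multiplied
# to 20. An inequity-averse trustee equalizes payoffs; a guilt-averse
# trustee meets the partner's (assumed) expectation of getting 10 back.
fair = best_return(10, 20, lambda own, r: inequity_aversion_utility(own, r))
guilt = best_return(10, 20,
                    lambda own, r: guilt_aversion_utility(own, r,
                                                          expected_return=10))
print(fair)   # -> 15 (equalizes both players' payoffs at 15)
print(guilt)  # -> 10 (matches the partner's expected return)
```

A morally opportunistic player, in this sketch, would simply evaluate both utilities and act on whichever yields the higher personal payoff in the current round, which is why the two strategies can produce identical behavior in some contexts and diverge in others.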
The findings show for the first time that the inequity aversion and guilt aversion strategies are based on distinct patterns of brain activity, even though the two strategies can produce the same behavior. For participants who were morally opportunistic, the study found that their brain patterns switched between the two moral strategies across different contexts. “Our findings show that people may make judgments based on a variety of moral principles, and that some people are far more flexible and will apply different principles depending on the situation,” Chang added. “This might explain why people we like and respect occasionally do things that we find morally objectionable.”