SED Task 9. Moral emotions
Learning goals
1. Cognitive theories of moral development (Piaget and Kohlberg)
2. Difference between gut feelings and moral convictions
3. Discrepancy between theoretical predictions and moral development in real life
4. Development of morality
5. Neural underpinnings of morality
Haidt
The key factor that catalyzed the new synthesis was the "affective revolution" of the 1980s
and the increase in research on emotion that followed the "cognitive revolution" of the 1960s
and 1970s. I describe three principles, each more than 100 years old, that were revived during
the affective revolution. Each principle links together insights from several fields, particularly
social psychology, neuroscience, and evolutionary theory. I conclude with a fourth principle
that I believe will be the next step in the synthesis.
Principle 1: Intuitive Primacy (but Not Dictatorship)
The human mind is composed of an ancient, automatic, and very fast affective system and a
phylogenetically newer, slower, and motivationally weaker cognitive system. Brains are
always and automatically evaluating everything they perceive, and higher-level human
thinking is preceded, permeated, and influenced by affective reactions (simple feelings of like
and dislike) which push us gently (or not so gently) toward approach or avoidance.
Evolutionary approaches to morality generally suggest affective primacy. Most propose that
the building blocks of human morality are emotional; language and the ability to
engage in conscious moral reasoning came much later, perhaps only in the past 100,000
years, so the neural mechanisms for emotion are probably still stronger than the more
recently added mechanisms for reasoning.
Criticism of the sharp contrast between "affect" and "cognition" led to the Social Intuitionist Model:
moral psychology consists of two kinds of cognition, moral intuition and moral reasoning.
Moral intuition refers to fast, automatic, and (usually) affect-laden processes in which an
evaluative feeling of good-bad or like-dislike (about the actions or character of a person)
appears in consciousness without any awareness of having gone through steps of search,
weighing evidence, or inferring a conclusion. Moral reasoning, in contrast, is a controlled and
"cooler" (less affective) process; it is conscious mental activity that consists of transforming
information about people and their actions in order to reach a moral judgment or decision.
People generally begin reasoning by setting out to confirm their initial hypothesis. Areas of
the medial prefrontal cortex, including the ventromedial prefrontal cortex and the medial frontal
gyrus, are important for moral judgment. These areas appear to be crucial for integrating affect
(including expectations of reward and punishment) into decisions and plans. Other important
areas include the amygdala and the frontal insula. These areas seem to be involved in
sounding a kind of alarm and pushing subsequent processing in a particular direction. Affective
reactions push, but they do not absolutely force. People thinking about difficult moral
dilemmas exhibit increased activity in the anterior cingulate cortex, a brain region that
responds to internal conflict. Subjects who go against their emotional instinct exhibit increased
activity in the dorsolateral prefrontal cortex, suggesting that they do additional processing and
override their initial flash of horror.