NUDGING
1. Core definition
A nudge subtly steers behaviour in a predictable direction without forbidding any options or
significantly changing financial incentives.
It exploits predictable responses rooted in how the human mind processes information.
The Theoretical Foundations of Choice Architecture
• Bounded Rationality: Nudging builds on Herbert Simon’s idea that people have limited
information-processing capacity. Instead of evaluating all options to maximise utility,
individuals rely on mental shortcuts and simplified decision strategies.
• Full Rationality: Full rationality reflects the normative assumption in classical economics
that decision-makers consider all available options and their attributes in order to choose the
option that maximises utility.
Shortcuts Used to Perceive the World
These heuristics influence how we interpret external stimuli and patterns:
• Cheerleader Effect: Refers to the tendency for individuals to appear more attractive when
viewed in a group than when seen alone. This occurs because the visual system averages
facial features within a group, creating a more favourable overall impression.
• Clustering Illusion: The tendency to overinterpret small streaks, runs, or clusters in random
data as meaningful patterns. It reflects our natural inclination to impose structure on
information—linked to Gestalt principles such as proximity and similarity—even when no
real pattern exists.
Shortcuts Used to Perceive and Judge Ourselves
These heuristics relate to how we value our own efforts and predict our own behaviour:
• IKEA Effect: The IKEA effect refers to the tendency for people to place a higher value on
products or outcomes they have partially created themselves. Personal effort increases
perceived worth, even when the final product is objectively similar to one made by someone
else.
• Planning Fallacy: The planning fallacy describes the tendency to underestimate the time
required to complete a task. People are overly optimistic about their own efficiency, even
when they have past experience showing that similar tasks took longer.
• The Rationality Wars: A central debate in behavioural science contrasts two perspectives:
◦ Heuristics and Biases (Kahneman & Tversky): This approach emphasises how intuitive
thinking can lead to systematic errors, such as the conjunction fallacy. Heuristics are seen as
cognitive shortcuts that save effort but reduce accuracy.
◦ Ecological Rationality (Gigerenzer): This perspective argues that heuristics can be
highly effective, especially in “large world” environments characterised by uncertainty and
incomplete information. In such contexts, simple rules can outperform complex statistical
models.
• Bias-Variance Dilemma: This dilemma helps explain why simple heuristics sometimes
outperform complex models.
◦ Simple heuristics (e.g., the “nine-month hiatus rule”) have high bias—they are rigid and
ignore many variables—but zero variance, meaning they do not fluctuate based on small or
misleading data samples.
◦ Complex models incorporate more information but often suffer from high variance,
making their predictions unstable and less accurate in uncertain environments.
• Effort-Accuracy Trade-off: Traditionally, heuristics were viewed as a compromise: they
reduce cognitive effort but at the cost of accuracy. This aligns with the Heuristics and Biases
tradition, where shortcuts are seen as sources of predictable error.
• Ecological Rationality: Gigerenzer’s work challenges the idea that heuristics are merely
second-best. In uncertain environments, simple heuristics can be more accurate than complex
models because they are better adapted to the structure of the environment. This phenomenon
is known as the less-is-more effect: using less information can sometimes lead to better
decisions.
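The conjunction fallacy mentioned under Heuristics and Biases can be made concrete with a few lines of probability arithmetic. The numbers below are purely illustrative (the classic "Linda problem" assigns no actual probabilities); the point is structural: a conjunction can never be more probable than either of its components.

```python
# Illustrative (made-up) probabilities for a Linda-style scenario.
p_teller = 0.05                  # P(Linda is a bank teller)
p_feminist_given_teller = 0.40   # P(feminist | bank teller)

# P(teller AND feminist) = P(teller) * P(feminist | teller)
p_conjunction = p_teller * p_feminist_given_teller

print(f"{p_conjunction:.2f}")  # 0.02 -- necessarily <= p_teller
```

Experimental participants nonetheless often rate the conjunction as more likely, because the added detail makes the description feel more representative; this is exactly the kind of systematic intuitive error the Heuristics and Biases programme documents.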
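The bias-variance argument can also be checked with a small simulation. The sketch below uses an assumed toy estimation task (guessing an unknown quantity from three noisy observations), not any of the heuristics named above: a fixed rule that always guesses 0 is biased but has zero variance, while the sample mean is unbiased but fluctuates with every tiny sample, so the fixed rule ends up with the lower mean squared error.

```python
import random

random.seed(0)

TRUE_MEAN, SIGMA = 0.5, 2.0   # unknown quantity and noise level
N_TRAIN, TRIALS = 3, 20000    # tiny samples, many repetitions

fixed_sq_err = 0.0   # fixed rule: always guess 0 (high bias, zero variance)
fitted_sq_err = 0.0  # fitted rule: guess the sample mean (no bias, high variance)

for _ in range(TRIALS):
    sample = [random.gauss(TRUE_MEAN, SIGMA) for _ in range(N_TRAIN)]
    sample_mean = sum(sample) / N_TRAIN
    fixed_sq_err += (0.0 - TRUE_MEAN) ** 2
    fitted_sq_err += (sample_mean - TRUE_MEAN) ** 2

print(fixed_sq_err / TRIALS)   # 0.25 exactly: the squared bias of the fixed rule
print(fitted_sq_err / TRIALS)  # close to SIGMA**2 / N_TRAIN = 1.33...
```

Shrinking the noise or enlarging the sample reverses the ranking, which is the ecological-rationality point: whether the simple rule wins depends on the structure of the environment, not on the rule alone.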
2. Dual-Process Theories: System 1 and System 2
The traditional foundation of nudging relies on a distinction between two modes of thinking:
• System 1 (Intuition): Fast, automatic, effortless, associative, and often emotionally
charged. It is the "default" mode of operation.
• System 2 (Reasoning): Slower, serial, effortful, and consciously monitored; it is flexible
and potentially rule-governed.
However, modern research challenges the rigidity of this divide. Traits such as “unconscious,”
“unintentional,” and “efficient” do not consistently cluster together, making the strict System
1 vs. System 2 separation what some scholars call a “mythical number two.”