Unique Number:
Due date: 20 August 2025
DISCLAIMER & TERMS OF USE
Educational Aid: These study notes are intended to be used as educational resources and should not be seen as a
replacement for individual research, critical analysis, or professional consultation. Students are encouraged to perform
their own research and seek advice from their instructors or academic advisors for specific assignment guidelines.
Personal Responsibility: While every effort has been made to ensure the accuracy and reliability of the information in
these study notes, the seller does not guarantee the completeness or correctness of all content. The buyer is
responsible for verifying the accuracy of the information and exercising their own judgment when applying it to their
assignments.
Academic Integrity: It is essential for students to maintain academic integrity and follow their institution's policies
regarding plagiarism, citation, and referencing. These study notes should be used as learning tools and sources of
inspiration. Any direct reproduction of the content without proper citation and acknowledgment may be considered
academic misconduct.
Limited Liability: The seller shall not be liable for any direct or indirect damages, losses, or consequences arising from
the use of these notes. This includes, but is not limited to, poor academic performance, penalties, or any other negative
consequences resulting from the application or misuse of the information provided.
For additional support: +27 81 278 3372
SECTION 1
ARTIFICIAL INTELLIGENCE, POLITICS AND DISCOURSE.
1. Core Concept
o Intersection of AI technologies, political processes, and communication
o Examined through Critical Discourse Analysis (CDA) to uncover hidden power relations and ideologies
2. Artificial Intelligence
o Technological Forms: machine learning, natural language processing, generative AI
o Applications in Politics: data analytics, campaign targeting, automated content generation
o Risks: algorithmic bias, surveillance, manipulation of public opinion (a toy bias-measurement sketch follows below)
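As an illustration only (not part of the original notes), the short Python sketch below computes one simple bias measure, the demographic parity gap, for a hypothetical political ad-targeting model. The data, group labels, and function name are all invented for the example.

# Toy illustration: demographic parity gap for a hypothetical ad-targeting model.
# All numbers and group labels below are made up for demonstration.

def demographic_parity_gap(predictions, groups):
    """Difference in positive-prediction rates between two groups.

    predictions: list of 0/1 model decisions (1 = voter targeted with the ad)
    groups:      list of group labels, one per prediction ("A" or "B")
    """
    rate = {}
    for g in ("A", "B"):
        decisions = [p for p, grp in zip(predictions, groups) if grp == g]
        rate[g] = sum(decisions) / len(decisions)  # assumes both groups appear
    return abs(rate["A"] - rate["B"])

# Hypothetical model outputs for ten voters from two demographic groups.
preds  = [1, 1, 1, 0, 1, 0, 0, 1, 0, 0]
groups = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

gap = demographic_parity_gap(preds, groups)
print(f"Demographic parity gap: {gap:.2f}")  # group A: 0.80, group B: 0.20 -> gap 0.60

A gap near 0 would mean both groups are targeted at similar rates; the invented numbers here give a gap of 0.60, i.e. group A is targeted far more often than group B, which is the kind of discriminatory outcome the "Risks" bullet refers to.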
3. Politics
o Policy-Making: AI regulation, ethical guidelines, transparency
o Power Structures: how AI reinforces or challenges existing hierarchies
o Global Governance: geopolitics of AI development and control
4. Discourse
o Political Messaging: framing of AI in public debates, speeches, media narratives
o Media Representation: portrayal of AI in news, policy documents, and social platforms
o Public Perception: influence of language on trust or fear of AI
5. Critical Discourse Analysis (CDA) Lens
o Power and Ideology: who controls AI narratives and for whose benefit
o Framing and Agenda Setting: how discourse shapes AI policy priorities (a toy framing-count sketch follows after this list)
o Inclusion vs. Exclusion: whose voices are amplified or silenced in AI debates
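To make the "Framing and Agenda Setting" point slightly more concrete, here is a toy, corpus-style sketch (not from the original notes) that counts frame-related words in two invented headlines. Real CDA is qualitative and goes far beyond word counts; both the headlines and the frame vocabularies are assumptions made purely for the example.

# Toy illustration: crude word-count comparison of two invented AI headlines.
from collections import Counter
import re

# Hypothetical headlines taking different frames on the same AI policy story.
headlines = {
    "outlet_1": "AI breakthrough promises growth, innovation and opportunity",
    "outlet_2": "AI surveillance threat raises fears of bias and job losses",
}

# Assumed frame vocabularies, chosen only for this example.
frames = {
    "opportunity": {"breakthrough", "growth", "innovation", "opportunity"},
    "threat": {"surveillance", "threat", "fears", "bias", "losses"},
}

for outlet, text in headlines.items():
    words = Counter(re.findall(r"[a-z]+", text.lower()))  # lowercase word counts
    for frame, vocab in frames.items():
        score = sum(words[w] for w in vocab)  # Counter returns 0 for absent words
        print(f"{outlet}: {frame} frame score = {score}")

Running it scores outlet_1 high on the "opportunity" frame and outlet_2 high on the "threat" frame, showing how even crude counts can surface which frame a text leans on, and which agenda it foregrounds.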
6. Key Themes
o Bias and Fairness: uncovering discriminatory outcomes in AI tools used in political contexts
o Manipulation and Misinformation: deepfakes, automated propaganda, echo chambers
o Ethics and Accountability: discourse around moral responsibility for AI’s societal impacts
7. Outcomes of CDA Approach
o Reveals hidden assumptions in political AI discourse
o Helps design fairer, more transparent AI policies
o Equips the public to evaluate AI narratives critically