online) → before 2008 (social media: not only products, but also people are online) → after 2008 (apps, merging of
online and offline) → recent (smart devices)
• algorithm: encoded procedures for transforming input data into desired outputs, based on specified calculations
*power of algorithms (desired outputs):
1. prioritization (making an ordered list)
^emphasize certain things at the expense of others
^ e.g. Google PageRank
2. classification (picking a category)
3. association (finding links)
4. filtering (what is important)
^including or excluding information
^ e.g. Instagram feed
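A toy illustration of two of these powers on a small feed; the posts, engagement scores, and "blocked" flag below are all invented for the sketch:

```python
# Toy sketch of filtering (include/exclude information) and
# prioritization (making an ordered list) over a feed of posts.
posts = [
    {"id": 1, "score": 0.9, "blocked": False},
    {"id": 2, "score": 0.4, "blocked": True},
    {"id": 3, "score": 0.7, "blocked": False},
]

visible = [p for p in posts if not p["blocked"]]                # filtering
feed = sorted(visible, key=lambda p: p["score"], reverse=True)  # prioritization

print([p["id"] for p in feed])  # [1, 3]
```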
• types of algorithms:
1. rule-based algorithm: based on a set of rules or steps
*IF ‘condition’ THEN ‘result’
*quick and easy to follow, but only applicable to the specified conditions
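A minimal sketch of such an IF 'condition' THEN 'result' rule set (the temperature thresholds are invented for illustration):

```python
# Rule-based algorithm: a fixed set of IF 'condition' THEN 'result'
# rules; only the explicitly specified conditions are handled,
# everything else falls through to the final branch.
def describe_temperature(celsius):
    if celsius < 0:        # IF condition THEN result
        return "freezing"
    elif celsius < 20:
        return "mild"
    else:
        return "warm"

print(describe_temperature(-5))  # freezing
print(describe_temperature(25))  # warm
```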
2. machine learning algorithm (supervised or unsupervised)
*algorithms that learn by themselves
i. supervised learning: learn from labeled data, data is paired with correct output
ii. unsupervised learning: unlabeled data, find patterns
iii. semi-supervised learning: a combination of supervised and unsupervised
iv. reinforcement learning: an agent learning to make decisions by interacting with environment
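The supervised case can be sketched with a toy 1-nearest-neighbour rule (the data points and labels below are invented): the algorithm is given inputs paired with their correct output, then predicts the label of the closest known example.

```python
# Supervised learning sketch: each training input is paired with
# its correct output (a label). Prediction here is a simple
# 1-nearest-neighbour lookup over the labeled data.
labeled = [(1.0, "spam"), (1.2, "spam"), (5.0, "ham"), (5.5, "ham")]

def predict(x):
    # pick the label of the closest labeled example
    closest = min(labeled, key=lambda pair: abs(pair[0] - x))
    return closest[1]

print(predict(1.1))  # spam
print(predict(5.2))  # ham
```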
• recommender systems: algorithms that provide suggestions for content that is most likely of interest to a
particular user
*rationale: avoid choice overload, maximize user relevance, increase work efficiency
*three types of recommender systems:
1. content-based filtering: the algorithm recommends items that are similar to the ones that the user
liked in the past (based on the similarity of items)
2. collaborative filtering: algorithms suggest recommendations to the user based on items that
other users with similar tastes liked in the past
3. hybrid filtering: algorithms combine features from both content-based and collaborative
systems, and usually with other additional elements (e.g. demographics)
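The first two types can be sketched on toy data (the items, genre tags, and user likes below are all invented):

```python
# Toy sketch of content-based vs. collaborative filtering.
item_features = {            # item -> genre tags (content)
    "A": {"action"},
    "B": {"action", "scifi"},
    "C": {"romance"},
}
user_likes = {"alice": {"A"}, "bob": {"A", "B"}, "carol": {"C"}}

def content_based(user):
    # recommend unseen items whose tags overlap with items the user liked
    liked_tags = set().union(*(item_features[i] for i in user_likes[user]))
    return sorted(i for i in item_features
                  if i not in user_likes[user] and item_features[i] & liked_tags)

def collaborative(user):
    # "similar" users share at least one liked item; recommend what they liked
    similar = [u for u in user_likes
               if u != user and user_likes[u] & user_likes[user]]
    if not similar:
        return []
    recs = set().union(*(user_likes[u] for u in similar)) - user_likes[user]
    return sorted(recs)

print(content_based("alice"))   # ['B']  (B shares the "action" tag with A)
print(collaborative("alice"))   # ['B']  (bob, who also liked A, liked B)
```

A hybrid system would merge both scores, usually with extra signals such as demographics.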
• algorithmic appreciation: people rely more on advice from algorithms than from other people, despite having no
insight into the algorithm’s process
*automation bias: blind faith in information from computers
• algorithmic aversion: tendency to prefer human judgments over algorithmic decisions, even when the human
decisions are clearly inferior
*less tolerance for errors from algorithms than from humans
*explanation: people are averse because they don’t understand the algorithmic process
*algorithmic anxiety: lack of control and uncertainty over algorithms create anxiety
• algorithmic persuasion: any deliberate attempt by a persuader to influence the beliefs, attitudes and
behaviors of people through online communication that is mediated by algorithms
• algorithmic persuasion framework:
• data:
*step: input in the model
1. first-party data: owned or collected by the sender of the persuasion attempt itself
2. second-party data: owned by a collaborating party
3. third-party data: collected by companies that are not directly involved in the primary process
*another classification:
1. explicit data: disclosed by the users
2. implicit data: collected without the users’ awareness
• algorithms: the techniques, persuader objectives and biases
*four techniques: prioritization, classification, association, and filtering
*persuader objectives: cognitive, affective and behavioral influence
*algorithmic bias: developers unconsciously program their biases into algorithms; algorithms are created
for a purpose (e.g. persuade, increase value and capital, nudging,…)
• persuasion attempt: how persuasive communication manifests itself
Context: health, advertising, politics, etc
Nature: paid (sponsored content); Organic (algorithm of the platform)
Medium: devices and platforms
Modality: text (search engine), audio, video, images
• explanations of the persuasion process (mechanisms)
*step: persuasion process in the model
1. Relevance: align with interest of the recipients (tailored messages)
2. Reduction: reduce a large corpus of content into a smaller consideration set to avoid choice overload
*reduce options → increase favorability
3. Social norms: what similar others do (collaborative filtering)
4. Automation bias: people are persuaded because they trust machines
5. Reinforcement: show what the users like to see → reinforce pre-existing attitudes and beliefs
• persuasion effects:
*intended effect: desired by the persuader who exposed users to the algorithm
*unintended effect: undesired…
• computational politics (Tufekci, 2014): applying computational methods to large datasets derived from online
and offline data sources for conducting outreach, persuasion and mobilization
*make political communication increasingly personalized
• examples of computational politics:
1. social media platforms: targeted advertising; news feed algorithm; trending topics
2. search engines: search results ranking; autocomplete suggestions
3. video platforms (e.g. youtube): recommendation
4. news aggregation platforms: story selection; personalized news digest
5. messaging apps: group recommendations (suggest groups or channels based on user contacts)
• political bots on social media: social media accounts equipped with algorithms that post messages of their own
accord (mimicking human users)
*to manipulate the public opinion
*by spreading propaganda in support of OR against particular issues or people
*steps to work:
i. write or access a pre-made script for a bot
ii. automatically set up an account
iii. mimic an actual person
iv. crawl through content and scan posts on the social network (observe the environment)
v. post content to engage with human users
vi. act together in networks of bots
• 6 intertwined contributing dynamics (driving forces) to computational politics (Tufekci, 2014):
1. Big data
* digital mediation → increase in depth and reach of data on each individual
*telescope and microscope
*explicit or implicit
2. Emerging computational methods (allow for modeling of specific individuals)
*development in storage and database system
*extraction of semantic information