Summary: Law, Technology and Society (LTS) - Notes

Law, Technology and Society (LTS) summary of the 10 weeks.

Week 1

Technology as a regulatory target

A 2007 forum hosted by TELOS at King's College London divided technology regulation into two parts:

• Technologies used as means of regulation (e.g., CCTV)
• Regulation of technologies to manage risks and perceived harms

Varied interpretations of ‘technology regulation’
stem from different understandings of the terms ‘technology’ and ‘regulation’.

'Regulation'
The term 'regulation' has multiple meanings in the literature:
• Promulgation of binding rules
• Any deliberate state influence
• All forms of social or economic influence
Some scholars focus on government regulation (limiting their study scope to regulation by government or government agents), while others adopt a decentred approach considering broader influences on action (Julia Black's approach to regulation, which moves beyond the state as the sole regulator and includes other modalities of regulation).
Koops: "the intentional influencing of someone's or something's behaviour"
Brownsword and Goodwin (adopting Julia Black's definition): "the sustained and focussed attempt to alter the behaviour of others according to standards or goals with the intention of producing a broadly identified outcome or outcomes, which may involve mechanisms of standard-setting, information-gathering and behaviour modification"
'Regulation' can be both broader and narrower than 'law', offering a more flexible framework for analysis:
• Captures 'soft law'.
• Includes more distributed means of control and can encompass unintentional influences like market forces.
• Better explains various influences, including funding priorities and professional standards.
• Enables discussions about the advantages of decentred, less formal rulemaking in controlling rapidly evolving technological practices.

'Technology'
Typically, 'technology' refers to current cutting-edge developments like biotechnology, ICTs, neurotechnology, and nanotechnology.
Koops: "the broad range of tools and crafts that people use to change or adapt to their environment"
Klang: "both the purposeful activity and results of the transformation or manipulation of natural resources and environments in order to satisfy human needs or goals" (but focuses on 'disruptive' technologies)
Discussions about regulating specific new technologies (e.g., nanomedicine, social media privacy) are commonly considered part of 'technology regulation'.
Technology regulation scholarship tends to focus on new technologies (which explains why it doesn't encompass all regulation). It focuses on how regulators should handle new technological fields. Koops explains that familiar technologies (e.g., cars, boilers, or building construction) usually fall within the scope of existing legislation or regulatory instruments, unlike radically new technologies: innovative technologies raise more regulatory questions than non-innovative ones.

Technology regulation often focuses on addressing real or potential environmental, health, or social harms resulting from technological artifacts and processes.

Regulatory approaches

Direct regulation: prohibiting specific artifacts or processes.
Indirect regulation: influencing designers and users through subtle means, such as mandating courses in engineering degrees or providing incentives for safety innovation.

The target of technology regulation is complex, depending on how ‘technology’ is defined:
• It may be limited to ‘tools and crafts’ or encompass all ‘means’.
• One approach visualizes a network of actors (e.g., politicians, engineers) and objects (technological artifacts) influencing each other. This network approach makes it challenging to distinguish 'technology regulation' from regulation more broadly, as most regulation aims to influence people, things, and relationships.
In fact, focusing on 'technology' as a regulatory target may be less fruitful than recognizing the complex relationships between law, regulation, technology, and society. Lyria Bennett Moses suggests looking at the socio-technical landscape rather than addressing 'technology' as a regulatory target.
The LTS model illustrates the complex relationships between new technologies, societal issues, and regulatory interventions. Emerging technologies often create new concerns and issues, while regulation and fundamental values both constrain and facilitate technological development. (Interaction model: the interconnectedness of regulation, technological developments, and societal values (normative outlooks). Changes in one area affect the others.)
E.g., Waymo has unveiled a design for a fully autonomous robotaxi based on a Zeekr minivan that lacks a steering wheel, pedals, and other traditional driver controls (though current prototypes still include human controls). This vehicle is designed to operate entirely without human intervention, relying on advanced AI and sensor systems for navigation and decision-making. Real-world testing on public roads is necessary for Autonomous Vehicles (AVs) to ensure they can handle unexpected obstacles. Article 8(1) of the Vienna Convention on Road Traffic traditionally required that every moving vehicle have a driver who is able to control the vehicle at all times, posing a significant obstacle to the deployment of fully autonomous vehicles. To address this challenge, amendments to the Vienna Convention have been proposed and implemented, allowing driving tasks to be transferred to the vehicle itself, as long as the automated systems can be overridden or switched off by the driver.
E.g., George Hotz's Comma One was a device designed to retrofit existing cars with semi-autonomous driving capabilities, similar to Tesla's Autopilot, but at a much lower cost. The device aimed to provide features like lane centering and adaptive cruise control to a wider range of vehicles. However, the National Highway Traffic Safety Administration (NHTSA) expressed concerns about the product's safety for customers and other road users. The NHTSA sent a letter to Comma.ai requesting detailed information about the Comma One's operation and safety measures. The agency demanded answers to specific questions about the device's design, testing practices, and safety trials, and threatened daily fines of up to $21,000 if the company failed to respond by the given deadline. Rather than comply with the NHTSA's requests, Hotz decided to cancel the Comma One project entirely.
E.g., in the case of genetically modified organisms (GMOs), there have been concerns about potential allergic reactions, toxicity, and long-term health effects of consuming them. GMOs also pose risks such as gene flow (outcrossing), where modified genes spread to wild plants or non-GMO crops, potentially disrupting ecosystems and biodiversity. As a response, EU law requires all GMO foods to be traceable and labeled, ensuring transparency for consumers and enabling coexistence between GM and non-GM crops through measures like buffer zones.


Law, Technology and Society (LTS) model
(work-in-progress model)

Technology
Two conflicting approaches:

a. Focusing on a specific instance of technology (e.g., Google AVs).
(Drawback: may focus on coincidental features)
b. Addressing a broad category (e.g., self-driving vehicles in general).
(Drawback: risks becoming too abstract)
• Determine the specific technology of focus.
• Identify its relevant characteristics.
• Analyze the interests at stake or being promoted.

Issue

Addressing issues raised by technological development:
• Potential risks
E.g., AVs may face ethical dilemmas, such as choosing between hitting a child or an elderly couple in unavoidable accident scenarios.
In addressing potential risks, this stage considers ethical dilemmas, such as the trolley problem for AVs, which implicitly draws on ethical frameworks like utilitarianism (weighing outcomes) and rights-based approaches (considering the rights of the individuals involved).
• Manifest problems
E.g., existing issues like AVs causing accidents on public roads

• What problem/risk?
Identify the specific challenges or dangers posed by the technology.
(Identify relevant features of new technologies that may create issues.)
E.g., for self-driving cars, the absence of a human driver is a key feature that
raises legal and safety concerns.
• Why?

Understand the reasons behind these challenges.
• Which stakeholders?
Determine which groups (e.g., consumers, businesses, governments) are affected. Consider their roles in defining problems or setting agendas.
E.g., examining who the relevant stakeholders are in the AV ecosystem and recognizing the diverse range of stakeholders involved, including manufacturers, regulators, and the public.
Different stakeholders may identify problems based on their perspectives and interests. Regulators, industry players, and consumers often have varying concerns about new technologies.
E.g., manufacturers may focus on innovation, efficiency, and market opportunities, while consumers often prioritize usability, safety, and personal data protection.

• Assessing what existing laws and regulations say about the technology and
associated problems.
Once an issue is identified, it is assessed against current regulations:
• Are there gaps (lacunæ) in existing laws?
• Is enforcement lacking?
• Are there unintended consequences of current rules?
This step evaluates whether the current regulatory framework adequately addresses the identified issue.

Intervention

This stage focuses on regulatory intervention when gaps or shortcomings in regulation are found:
• Developing new regulations or modifying existing ones.
• Deciding who regulates, why, when, and what aspects of the technology should be regulated. Which combination of means should be used (e.g., law, norms, architecture, markets)?

Ethics and science and technology studies (STS) tools for problem analysis, as
well as regulatory toolboxes for analyzing and developing interventions (e.g., big
data analytics, sandboxes), are important frameworks for addressing ethical and
regulatory challenges in emerging technologies.
Interventions can take various forms:
• Soft modality
Norms, market mechanisms, architectural changes.
• Hard modality
Formal laws and enforcement mechanisms.

However, not every risk warrants intervention due to the tension
between regulation and individual freedoms.

Is it necessary?
• Are the issues or harms serious?
• Does intervention unduly restrict our freedoms?
• Are the restrictions proportionate to the threat of harm?

Is it justified?
• What are the grounds for intervention?

Regulators must justify intervention based on three categories:
a) Market failure (American perspective)
b) Human rights protection (European perspective)
c) Conflict resolution
The category that justifies intervention can provide guidance on how to regulate.

Feedback Loop
Interventions influence how new technologies are integrated into society. This loop ensures that ongoing technological advancements are continuously monitored and regulated in alignment with fundamental societal values.

Flawed Law Syndrome
The tendency to quickly label existing laws as outdated or inadequate when new technologies emerge, prompting calls for immediate legal reform (the desire to fix perceived problems by changing the law rather than exploring other solutions). However, acting too quickly to create new regulations can lead to poorly designed laws that may not effectively address the issues at hand.

Developers are typically aware of some rules and obligations but are not legal experts. As a result, they may unintentionally create technologies that are non-compliant with existing regulations. Thus, in some instances, industries themselves complain about regulations, typically claiming they are either too restrictive or unclear.
