The Digital Services Act as an Online Constitutional Framework
INTRODUCTION
The 2022 Digital Services Act (Regulation (EU) 2022/2065) sets out the basic legal framework for digital services in the EU. It regulates the responsibilities of online intermediaries, that is, all the actors positioned between users of digital services and the producers of digital content. The DSA offers an example of the constitutionalization of the legal order.
The constitutionalization of the legal order refers to the process by which constitutional principles, and in particular fundamental rights, increasingly permeate and shape legislative frameworks, administrative and judicial practices, and private interactions. It embodies a paradigm shift in which all legal norms are informed and assessed in light of constitutional standards.
The DSA exemplifies this trend, explicitly embedding EU fundamental rights standards, particularly those outlined in
the Charter of Fundamental Rights of the European Union (CFR), into the governance framework for digital services.
This integration embodies the constitutionalization of digital governance by institutionalizing safeguards that uphold and monitor adherence to fundamental rights such as freedom of expression, the protection of personal data, and non-discrimination.
Several features illustrate this constitutionalization. Art. 37 requires periodic independent audits of very large online platforms and search engines, explicitly assessing compliance with the Regulation's obligations, including the platforms' adherence to fundamental rights as delineated in the CFR. Additionally, art. 14 requires providers of intermediary services to disclose, in their terms and conditions, information about their content moderation policies, tools and procedures, thereby fostering accountability in the exercise of platform governance. Moreover, art. 26 requires clear identification and transparency in online advertising, aiming to protect individuals' rights to privacy and data protection and to ensure informed consent. Another critical element is the crisis protocols outlined in art. 48, which are specifically designed to address extraordinary situations affecting public security or health; these protocols must incorporate safeguards to prevent adverse impacts on fundamental rights. The explicit incorporation of fundamental rights standards into enforcement mechanisms is also evident in art. 51, which emphasizes proportionality and due process in investigations and enforcement actions.
INTERMEDIARY LIABILITY
Intermediaries are the rule-takers under the DSA. They function as the connective tissue of the Internet, facilitating
communication between content creators (senders) and end-users (receivers). Their primary role is to transmit,
store or make third-party content available, without being its author. The DSA identifies four categories of
intermediaries: mere conduit, caching, hosting and online platforms. Each category is subject to distinct
responsibilities.
| Type | Function | Examples |
| --- | --- | --- |
| Mere conduit | Transmit data without altering or storing it | Vodafone, Orange, Telefónica |
| Caching | Temporarily store data to optimize delivery speed | Cloudflare, Akamai |
| Hosting | Store user-generated content without active control | Dropbox, GitHub, Mediafire, forums |
| Online platforms | Host and organize public user interactions, displaying content through an algorithm | Facebook, YouTube, eBay, TikTok… |
Mere conduit providers are like digital highways. They transmit data from one point to another but do not
inspect or modify it. For instance, when a message is sent via WhatsApp, or a website is accessed, Internet
providers simply forward data packets.
Caching services temporarily store content that is frequently accessed. A content delivery network like Cloudflare stores parts of websites, especially videos, scripts, or images, closer to users so that pages load faster (a minimal sketch of this mechanism follows below).
Hosting services store data uploaded by users, such as documents, images, or websites. Dropbox holds files; GitHub stores code repositories. The provider does not typically know or control the legality of the stored content unless it is notified.
Online platforms do more than host: they actively curate and present content. For example, YouTube
doesn’t just store videos: it recommends, ranks, and organizes them through algorithms. This increases
their role (and, consequently, their responsibility) in shaping online communication.
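To make the caching mechanism concrete, here is a minimal, hypothetical sketch of an edge cache in Python (the EdgeCache class and its methods are illustrative assumptions, not drawn from the DSA or any real CDN): content is stored automatically and temporarily, and a purge method mirrors the duty to act upon notice discussed under art. 5 below.

```python
import time

class EdgeCache:
    """Toy model of a CDN edge cache: it keeps copies of origin content
    for a limited time (TTL) so repeated requests are served locally."""

    def __init__(self, fetch_from_origin, ttl_seconds=60):
        self.fetch = fetch_from_origin      # callable: url -> content
        self.ttl = ttl_seconds
        self.store = {}                     # url -> (content, expiry time)

    def get(self, url):
        entry = self.store.get(url)
        if entry is not None and entry[1] > time.time():
            return entry[0]                 # cache hit: no trip to origin
        content = self.fetch(url)           # cache miss: fetch and store
        self.store[url] = (content, time.time() + self.ttl)
        return content

    def purge(self, url):
        """Expeditious removal of a flagged copy, as art. 5 DSA expects
        once the provider is notified that cached content is illegal."""
        self.store.pop(url, None)

# Usage: the cache sits between users and a (here simulated) origin server.
cache = EdgeCache(fetch_from_origin=lambda url: f"<contents of {url}>")
cache.get("https://example.org/video.mp4")    # miss: fetched from origin
cache.get("https://example.org/video.mp4")    # hit: served from the cache
cache.purge("https://example.org/video.mp4")  # acting on a notice
```

The legally relevant design feature is that storage here is automatic, temporary, and purely for efficiency of onward transmission, which is precisely the profile that keeps a caching provider within the art. 5 exemption.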
The DSA reflects the principle that liability must be proportionate to effective control: the more influence an
intermediary exerts over content, the more responsibility it bears.
Art. 4 – Mere Conduit
Intermediaries that only transmit or enable access to communications networks are not liable for the content they carry. This protects the neutrality of network providers and ensures they are not forced to become censors. It aligns with art. 10 ECHR, which guarantees the freedom to receive and impart information. However, they can be subject to judicial decisions ordering them to block access to another intermediary or to content being transmitted.
Art. 5 – Caching
Caching providers are exempt from liability as long as they store data automatically, temporarily, and for efficiency
purposes. They must, however, act promptly if notified that the cached content is illegal.
Example: Cloudflare and La Liga
In 2023, La Liga requested the removal of illegal live-streaming links to football matches, which were being cached
and served by Cloudflare. Although Cloudflare was not hosting the infringing content itself, it was facilitating fast
access to it. La Liga argued that Cloudflare could not remain neutral once notified of the infringement. The case
illustrates how caching services may lose their liability exemption under art. 5 DSA if they fail to act promptly after being made aware of illegal content.
In a significant recent development, the Commercial Court No. 6 of Barcelona authorized La Liga to require Internet Service Providers (that is, conduits) directly to block IP addresses used by platforms shielded by Cloudflare. This is problematic, since Cloudflare serves many different websites from a single IP address; as a result, an online shop in Spain can see its online services disrupted without ever having been involved in the dispute.
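The scale of this collateral damage can be illustrated with a short script. The sketch below (the domain names are hypothetical placeholders) resolves a list of hostnames and groups them by IP address; wherever several unrelated sites share one address, an IP-level block would take all of them offline at once.

```python
import socket
from collections import defaultdict

# Hypothetical, unrelated domain names; in practice, many independent
# sites behind a CDN such as Cloudflare resolve to the same shared IPs.
domains = ["shop.example", "blog.example", "news.example"]

sites_per_ip = defaultdict(list)
for name in domains:
    try:
        ip = socket.gethostbyname(name)   # ordinary DNS lookup
        sites_per_ip[ip].append(name)
    except socket.gaierror:
        continue                          # name does not resolve

for ip, names in sites_per_ip.items():
    if len(names) > 1:
        print(f"Blocking {ip} would cut off all of: {', '.join(names)}")
```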
Art. 6 – Hosting
Hosting providers are exempt from liability unless they:
- have actual knowledge of illegal content and do not remove it, or
- are aware of facts that make the illegality apparent and fail to act.
This exemption strikes a balance: providers are not expected to proactively monitor content but must act when put
on notice. Example: If a user uploads hate speech to a web forum, the host is not liable unless it is notified and fails
to remove the content in a timely way.
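The structure of this test can be captured in a simple decision rule. The sketch below is only a schematic model of the art. 6 conditions (real assessments turn on facts and context, not on booleans), but it shows how knowledge and inaction combine to defeat the exemption.

```python
def hosting_provider_liable(actual_knowledge: bool,
                            aware_of_facts: bool,
                            acted_expeditiously: bool) -> bool:
    """Schematic model of the art. 6 DSA test: the exemption is lost
    only when the provider knew (or should have realized from the
    circumstances) that content was illegal and then failed to act."""
    knows_or_should_know = actual_knowledge or aware_of_facts
    return knows_or_should_know and not acted_expeditiously

# A forum notified of hate speech that removes it promptly stays exempt:
assert hosting_provider_liable(True, False, True) is False
# The same forum ignoring the notice loses the exemption:
assert hosting_provider_liable(True, False, False) is True
# A host never put on notice keeps the exemption in any event:
assert hosting_provider_liable(False, False, False) is False
```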
No General Monitoring Obligation
Under art. 7 of the DSA, service providers acting as conduits, caching providers, or hosts may choose to remove illegal content on a voluntary basis. Such action, however, does not entail the loss of the liability exemptions discussed above. Their voluntary decision to intervene does not mean that they assume a general obligation to monitor content or to engage in proactive surveillance of the information they transmit or store. These providers retain their liability exemption even when they take voluntary action, unless they are specifically notified of illegal content. All of this serves freedom of information, freedom of expression, and the right to access the Internet.
Further reinforcing this principle, art. 8 of the DSA prohibits any requirement for intermediaries to conduct general monitoring of all content. This provision serves as a safeguard for users, protecting them from indiscriminate surveillance.