Consent and Data in Social Media: Looking Behind Privacy
- Human Rights Research Center
Author: Rodrigo Pina, LLM
August 28, 2025
[Image source: World Wide Web Foundation]
1. Introduction
Here is a fact, not a secret, and for most people, neither new nor surprising: whenever we use Instagram, X, TikTok, or any other social media platform, we agree to their terms and conditions, including their privacy policies. We consent to surrender our personal information — our preferences, associations, movements, and behaviors — in exchange for access to the platform. Whether we think of this as a service or, more fittingly, an entry into a space, the result is the same: our data is harvested to fuel a tailored, engagement-maximizing experience, ultimately using our attention to sell ad space to advertisers.
This is the foundation of the social media business model. It is widely known and thoroughly documented. Scholars, journalists, and activists have raised awareness of it for years. Books have been written, laws proposed, and dinner-table conversations filled with jokes, or paranoia, about our phones listening to us. We are no longer in the dark.
And yet, something lingers beneath the surface. Despite widespread awareness and the legal requirement for consent, public discomfort persists. People continue to express unease, “they’re spying on us…they’re selling our identities,” even as they return to the very platforms they critique. This persistent contradiction points to a deeper tension: If we understand what's happening and still consent to it, why does it continue to feel wrong?
Most regulatory efforts address this tension through the lens of privacy rights and market logic, which place individual autonomy at their core. This assumes that if users are adequately informed and freely consent, then the transaction is legitimate. But this framework may be misframing the problem.
The discomfort surrounding platform data practices reveals a deeper normative conflict that cannot be solved through an individual logic. At stake are not simply the nuances of how individuals can protect their data, but how we, collectively, define public space in the digital age. When access to the digital sphere, the very infrastructure of public discourse and visibility, is conditioned on economic exchange, we are no longer negotiating over personal information. We are negotiating over the terms of political existence.
From this perspective, the current regulatory impasse, most visibly illustrated by the European Union’s (EU) resistance to the Meta business model, is not a matter of legal inconsistency. It is a symptom of a broader political crisis: the commodification of public life and the enclosure of democratic space by privately governed infrastructures.
This paper does not seek to resolve this crisis with a new legal formula. Rather, it aims to make visible the submerged logic behind it: that what is at stake is not privacy per se, but the conditions of worldhood—of being able to appear, act, and be recognized in a shared, non-commodified space.
The issue can be broken into three parts. The first traces the regulatory development of EU data protection law through the case study of Meta’s attempt to comply with the General Data Protection Regulation (GDPR). The second analyzes the contradictions and limitations in how European courts and regulators have addressed Meta’s model. The third reframes the debate using Hannah Arendt’s political theory, arguing that individual consent is insufficient to safeguard democratic spaces and suggesting a shift from the language of individual consent to a collective politics of access, one that treats digital platforms not just as services, but as spaces essential to democratic life and the formation of collective agency.
2. The “Pay or Okay” Controversy
Since the GDPR came into force in the EU, companies like Meta have engaged in an ongoing regulatory dance with EU institutions,[1] trying to reconcile data-driven business models with the legal boundaries of personal data processing.[2] At the heart of this controversy is a basic principle: under the GDPR, the collection and use of personal data must be lawful.[3] This means data can only be processed on specific legal grounds, such as contractual necessity, legitimate interest, or the informed and voluntary consent of the individual.[4] Without a legal basis, data processing is unlawful.
Meta, whose core revenue comes from behavioral advertising[5]—ads tailored through extensive tracking of user activity—has tested the limits of each of these legal bases. Initially, it claimed that personalized advertising was contractually necessary for delivering its services. However, the European Data Protection Board (EDPB) and Ireland’s Data Protection Commission (DPC) concluded otherwise. In December 2022, through binding decisions, they ruled that personalized advertising is not necessary for the performance of the user-platform contract.[6]
Meta then invoked “legitimate interest,” arguing that tracking-based advertising was commercially essential and broadly expected by users. But this too was deemed insufficient. The EDPB found that the intrusion into user privacy outweighed Meta’s commercial aims, particularly given its dominant market position and the scale of its data collection.[7] This left Meta with only one option: to obtain users’ explicit, informed, and freely given consent.
But the conditions for valid consent under the GDPR are demanding. It must be specific, unambiguous, and freely given.[8] The law explicitly prohibits “take it or leave it” approaches, where access to a service is made contingent on agreeing to data processing.[9] The aim is to ensure real autonomy: users must be able to refuse consent without being penalized or excluded.[10]
A 2023 ruling by the Court of Justice of the European Union (CJEU) added further nuance. In a case brought by Germany’s competition authority, the court held that a company’s market dominance does not automatically invalidate consent, but it does raise serious concerns about whether consent is truly voluntary. The court suggested that users must be given a meaningful alternative, potentially a paid version of the service, if they choose not to consent.[11]
This became the legal opening for Meta’s current “pay or okay” model: users are asked to either consent to personalized ads or pay for an ad-free experience.[12] On the surface, this offers a choice and may appear to satisfy the GDPR requirements. But it has triggered sharp regulatory and political responses.[13]
In April 2024, the EDPB issued a formal opinion warning that in most cases, requiring users to either pay or surrender their data does not amount to freely given consent.[14] The Board argued that large platforms, due to their centrality in public life and their concentration of power, must provide a third option: a free version of the service that does not rely on behavioral advertising. Only then can users make an unpressured, autonomous decision.
This so-called “third option” might include services supported by contextual advertising, where ads are based on the content being viewed rather than the user’s personal profile. It could also involve general, non-tracking-based advertising aligned with topics selected by the user. The point is not to eliminate advertising altogether, but to ensure that users can access digital services without being forced to choose between payment and tracking.
The European Commission also initiated investigations under the Digital Markets Act (DMA), a regulation that imposes heightened obligations on so-called “gatekeepers,” large online platforms with entrenched market positions.[15] Under the DMA, gatekeepers must obtain users’ freely given and informed consent before combining personal data across different services, and must offer an equivalent alternative that uses less personal data if consent is refused.[16]
In June 2025, the Commission fined Meta €200 million for failing to comply with these requirements.[17] Although Meta later introduced a modified version of its model, allegedly relying on less intrusive data practices, the Commission’s decision pertains to the earlier implementation of the “pay or okay” model, and the updated version remains under ongoing investigation.
Whether Meta’s model can survive this evolving regulatory landscape remains uncertain. Courts may again be asked to clarify the meaning of “free choice,” or determine what constitutes an “appropriate fee.” But the larger shift is already underway. Binary consent models are no longer seen as sufficient.
[Image source: noyb.eu]
3. The Market Economy Contradiction
At the core of the GDPR’s prohibition of “take it or leave it” practices lies a principled commitment to protecting individual autonomy.[18] But when viewed through the lens of contractual exchange and market logic, this commitment creates a profound contradiction, one that destabilizes the very foundations of voluntary transactions in liberal economies.
Platforms like Meta operate on a simple premise: users receive access to valuable digital services—social networking, communication tools, curated content—in exchange for personal data. If data is understood as a form of currency,[19] this is not unlike other familiar transactions. A service is rendered and compensation is provided.[20] No cash changes hands, but the relationship is grounded in mutual benefit and consent.[21] This model reflects the basic logic of most sectors in liberal market economies, where private actors strike agreements under transparent terms and each party derives value from the exchange.[22]
Yet under the dominant interpretation of the GDPR, especially Recital 43 and subsequent opinions by the EDPB, this logic is fundamentally disrupted. Regulators argue that platforms must provide access to services even when users refuse to share the data that sustains those services financially. In effect, companies are compelled to give away the product of their infrastructure and investment without receiving either money or data in return. This is the equivalent, in economic terms, of mandating that a restaurant serve meals to patrons who refuse to pay.
This contradiction becomes sharper when viewed in light of the CJEU’s ruling, which held that dominant platforms like Meta may satisfy GDPR requirements by offering users a choice: consent to behavioral advertising or pay a subscription fee.[23] At first glance, this model appears to reconcile personal autonomy with commercial necessity. But this framing misconstrues the very nature of genuine consent by reducing it to the mere availability of a paid alternative. Freely given consent does not depend on the existence of multiple payment paths. The legitimacy of consent is not secured by offering a user the option to pay instead of surrendering their data, but by ensuring that any decision is made transparently, without deception or coercion. The presence of alternative forms of compensation, whether financial or data-based, does not in itself make a choice genuinely voluntary. Just as a restaurant is not required to accept dishwashing as proof that its patrons are freely choosing to dine, a platform need not provide a cash-based subscription model to validate the authenticity of user consent.
This is precisely what makes the CJEU’s 2023 ruling so revealing. By holding that dominant platforms like Meta could comply with the GDPR by offering users a binary choice, the Court attempts to reconcile two irreconcilable demands: the GDPR’s strict standard for uncoerced consent and the economic logic of contract-based exchange. Rather than confront the fundamental contradiction, the Court reframes the issue as one of payment optionality. This preserves the letter of the law while avoiding the political implications of a regulatory stance that, if taken to its logical end, would dismantle the business model of an entire sector. The result is a legal fiction that treats market-style options as evidence of freedom, while sidestepping the deeper structural constraints that shape those choices.
This unusually high standard for consent creates a regulatory asymmetry that is virtually unprecedented in liberal, market-based societies. In no other domain of economic or legal life are individuals so extensively shielded from their own capacity to enter into binding agreements.[24] Western legal systems routinely recognize that rights, though fundamental, are not absolute.[25] They are relational, contextual, and often transactional.[26] Individuals voluntarily limit their rights every day through contracts, employment, tenancy, service agreements, and even recreational activities. These limitations are not seen as violations but as exercises of contractual autonomy: decisions to exchange some degree of liberty, privacy, or control for access to goods, services, or opportunities.
In the case of employment contracts, workers routinely consent to highly intrusive terms, including monitoring of communications, restrictions on speech, and control over schedules and behavior, all in exchange for a wage. Rental agreements limit how tenants may use a property. Loan contracts impose strict financial obligations over years or decades. None of these arrangements require the state to verify whether each alternative path was perfectly equal. The test is far more modest. Was the agreement entered into knowingly, voluntarily, and with clarity? If so, the law defers to the judgment of the individual.[27]
Why, then, is privacy treated differently, held to a dramatically higher standard that borders on paternalism? The answer often invoked is that privacy is uniquely intimate, closely tied to identity and autonomy. This argument confuses the function of privacy with its form. Privacy is not an untouchable right held in isolation. It is a structural value designed to support broader ends, chief among them, personal freedom.[28] Like other rights grounded in autonomy, its force lies precisely in the individual’s ability to waive, trade, or modify it under conditions of informed choice.
Indeed, privacy is perhaps the most consent-dependent of all rights. It exists not as a shield against all observation or data processing, but as a domain of control: the ability to decide what to disclose, to whom, and under what terms. The right to privacy does not collapse simply because one chooses to share information. On the contrary, that choice is what constitutes privacy. When individuals voluntarily agree to exchange personal data, whether for convenience, entertainment, or access to a service, they are not being exploited. They are exercising the very agency privacy was meant to protect.[29]
Consider the cultural phenomenon of reality television, programs like Big Brother, where participants voluntarily give up virtually all aspects of privacy, agreeing to be filmed 24/7 and having their behavior publicly broadcast. These are extreme, total invasions of privacy, yet they are socially and legally accepted because the participants have consented. If such profound exposure can be legitimate through voluntary agreement, why should a user’s decision to share behavioral data in exchange for a free digital service be deemed inherently suspect?
The GDPR’s approach, by contrast, is averse to data commodification[30] and treats consent in the data economy as presumptively compromised. It imposes a burden of justification on any data-based exchange, assuming exploitation unless proven otherwise, despite recognizing virtually no comparable constraints in other areas of economic or contractual life. This double standard not only undermines the coherence of contractual logic, but risks infantilizing individuals by shielding them from the consequences of their own reasoned decisions. It replaces the right to choose with a regime of suspicion, in which data exchanges are acceptable only when they conform to an abstract, regulator-defined model of autonomy.
4. Privatization of the Public Space
These arguments are not meant to protect corporate interests but to expose a deeper contradiction in contemporary regulation. The issue is not regulatory overreach in defending privacy, but that regulators remain confined within a normative framework of individual rights and market logic, one increasingly inadequate to the problem at hand. If the prohibition of “take it or leave it” practices holds any coherence, it lies beyond that framework, rooted instead in a political intuition. Platforms are no longer private actors offering optional services, but infrastructures of social life itself. These are public spaces where agency is exercised and constructed. As such, they ought to be held collectively and cannot be governed solely by market rules.
This tension reveals more than concern for individual consent. It exposes an unarticulated discomfort with the commodification of spaces historically public—managed by political communities or the state—that have now been seized by private entities. The real problem then, is not that platforms violate privacy per se, but that they commodify access to the digital public sphere, the space where we form opinions, negotiate meaning, engage politically, and construct collective life. What is being privatized is not merely data, but the conditions of appearance, the very infrastructure through which we become social and political beings. By resisting the exchange of access for data or money, regulators defend something deeper than individual privacy: the conviction that certain spaces must remain outside market logic altogether.
Such intuition cannot be explained by familiar frameworks of data protection law—informational self-determination, individual autonomy, or contractual fairness—all of which presume the individual as the sole unit of analysis and the voluntariness of market contracts. These assumptions fail when services become the de facto infrastructure of collective life. These are the arenas where agency is constructed and freedom realized.[31] The fundamental contradiction arises from imposing economic logic on what has effectively become a political space, a transformation most clearly illuminated by Hannah Arendt’s analysis.

In The Human Condition, Arendt, drawing from the experience of the classical Greek polis, distinguishes between two spheres: the private and the public. The private realm is the domain of necessity and survival, where human beings live as animal laborans, bound to biological processes, engaged in repetitive labor, and absorbed in life’s material reproduction, as much as every other species on earth. The public realm, by contrast, is the space of action, where individuals appear before others through speech and deed, revealing their unique identities and contributing to a shared world. In this space, humans exist not merely as living beings, but as zoon politikon, political beings capable of initiating new beginnings and co-creating the common world through plurality and deliberation.[32]
Modernity, according to Arendt, collapses this distinction. With the rise of what she calls the social sphere, the boundaries between the private and the public become blurred. Biological necessity migrates into the public realm, and the management of life, once the concern of households, becomes the business of states. Politics becomes a kind of administration, tasked with the large-scale coordination of life processes, health, welfare, and labor. The state becomes, in Arendtian terms, the “head of the household” writ large, the patriarch of a vast family.[33]
This shift has profound consequences. The space of action is replaced by behavior, and freedom gives way to necessity. The public realm is no longer where we appear as individuals among equals, but where we are managed as a mass. The rise of the economic and social sciences, which seek to study, predict, and optimize human behavior, reflects this transformation. Human beings are no longer understood primarily as political agents capable of beginning anew, but as biological and psychological systems to be regulated, measured, and controlled, much like nature itself.[34]
The Industrial Revolution was both a symptom and an accelerant of this trend. It applied the organizational rationality of the political realm to that of labor, transforming necessity into a large-scale system of production. As society became increasingly governed by economic logic, the public sphere was subordinated to the imperatives of growth, consumption, and administration. What was once the site of freedom became a site of function.
While Arendt could not have foreseen the full extent of this transformation, its most complete expression lies in the age of data-driven digital platforms. Social media does not merely reflect social life; it organizes and conditions it. These platforms have unprecedented access to the micro-behaviors of users. They can see patterns of attention, affect, and engagement. They do not simply observe the public sphere; they effectively own it, reorganizing it around our animal impulse to behave, not our human capacity to act.
They have become the architects of both private and public space. Informational capitalism colonizes the sphere of labor by extracting data from every gesture, while algorithmic curation accordingly manipulates the sphere of appearance, shaping what is seen and said. Genuine capacity for action is supplanted by a highly engineered impulse to behave. What appears as free expression is often the product of subtle nudges, predictive analytics, and attention-maximizing design.
Though privately owned, these platforms now occupy the role once held by the agora.[35] They are no longer mere communication tools but the very conditions of communication, architectures of public appearance that invade the private sphere while simultaneously dissolving the integrity of the public one. In this way, platforms are not simply market actors but sovereign administrators of a privatized public sphere. They control access, visibility, and relevance, wielding a privileged epistemic position, knowing more about our behavioral tendencies than we do ourselves.
What emerges is a totalizing social logic that collapses the distinction between public and private. Privacy is not merely diminished, the boundary dissolves entirely, replaced by a space where both the private and public spheres are commodified, undermining the conditions for the existence of freedom.
This is where the category mistake occurs: to treat access to digital platforms as a consumer choice, dependent on individual will, is to misrecognize the nature of the space itself. This is not about subscribing to a service. It is about the terms under which one becomes visible in the world where agency can be formed, and that world is already shaped by behavioral prediction and monetization. As a result, the conditions under which political freedom can emerge are no longer neutral. They are engineered.
Herein lies the deeper problem: if the very space in which political freedom is to be exercised and developed is governed by mechanisms of behavioral prediction and monetization, then freedom itself becomes conditional. Privacy is meant to secure autonomy and agency, yet it is made contingent on consent, despite the fact that genuine consent presupposes the very freedom that privacy is meant to protect. Autonomy is not a pre-given attribute. It is cultivated through participation in a shared world. But if that world is structured by commodification, then the freedom to consent is already compromised.[36]
Arendt insisted that freedom is not something we carry with us into the world, ready to be expressed. It is something that emerges through action in a public space. If that space is commodified, then the conditions for freedom’s emergence are undermined. The very idea of meaningful consent, let alone political agency, is eroded. What follows is that privacy, and the autonomy it aims to protect, cannot be meaningfully safeguarded through the individualist frameworks of rights or market exchange. It requires more than informed consent. It requires a prior guarantee that the space in which autonomy is formed remains collectively held, shielded from commodification, and governed not as a marketplace, but as a political commons.[37]
This helps explain the unease regulators express around data-driven social media models. Under current legal paradigms, consent that is freely given, specific, and informed renders data processing lawful. Yet even when these formal requirements are met, regulators often resist models that make access to social life conditional on data extraction. This inconsistency is not a failure of legal interpretation. It is evidence of a deeper normative conflict, one that cannot be captured by individualist logic alone. What the law intuits, but struggles to name, is that what is at stake is not merely data, but worldhood, not just autonomy, but the space in which autonomy becomes possible.
This is why the rejection of “take it or leave it” practices and the “pay or okay” model cannot be fully explained within existing rights frameworks and market logic, which presuppose some level of individual autonomy or agency. It reflects a submerged political judgment that some spaces cannot be bought and sold because they belong to no one and everyone. That they are the very ground of our being together and of our agency. The public realm, in Arendt’s terms, is not a resource to be accessed or traded. It is a common world, sustained not by transactions, but by appearances—by the presence of individuals acting, speaking, and disclosing themselves to one another as equals.
What courts and regulators sense, but do not yet say, is that the question is no longer whether users consent to platform access. It is whether the conditions under which freedom becomes possible are being systematically dismantled. In this light, the contradictions of European data law with market-economy logic are not anomalies; they are signals. They mark the limits of a legal order that continues to treat public life as a commodity, even as it senses that something irreducible is slipping away.
5. Conclusion
The regulatory confrontation with Meta’s data-driven model presents a surface-level paradox. The company offers users a formal choice, either pay for an ad-free service or consent to data collection in exchange for free access. Within the logic of the market, this appears to satisfy the core tenets of fairness, autonomy, and informed consent. And yet, European regulators have resisted this model, effectively requiring Meta to provide access without conditioning it on either payment or personal data. From a market perspective, this seems inconsistent, perhaps even excessive.
But this apparent contradiction only arises if we insist on framing the issue within the boundaries of market logic. Once we recognize that platforms like Meta no longer function as ordinary service providers, but as infrastructural spaces where public life unfolds, the regulatory stance begins to make coherent political sense. These platforms are not simply products to be purchased or declined. They are the arenas in which visibility, participation, and collective life are constituted—spaces where agency is formed and enacted. When access to such spaces is commodified, the very conditions that make freedom and public life possible are placed at risk.
Seen this way, the resistance to “pay or okay” models is not a regulatory overstep, but a political intuition. It signifies that some spaces must remain outside of market terms altogether. What is being defended is not just privacy in the narrow sense, but the public good it aims to protect: the public realm itself, as the infrastructure of appearance and action. The regulatory discomfort, though often couched in the vocabulary of rights and consent, ultimately points toward this deeper concern: that if the terms of access to public space are defined by economic exchange, then autonomy itself becomes contingent, and the space of freedom contracts.
This reframing also helps us return to the question posed at the outset: if we understand what’s happening, why do we still consent to it? Why does it still feel wrong? The answer, in Arendtian terms, lies in the erosion of the very space where genuine action might take place. Behavior persists in place of action not because we are incapable of agency, but because the structures that cultivate and support that agency have been reconfigured to extract value, not enable freedom. Consent, under such conditions, is less a choice than a symptom of a constrained political horizon.
Ultimately, the contradiction is not within the regulatory response; it is within the logic that assumes these platforms can be governed solely as markets. Once that assumption is set aside, the normative core of current regulatory decisions comes into view. The legal reasoning may still be developing, but the political judgment is already active: some spaces should not be bought. They must remain open not because of tradition, nor efficiency, but because they are the grounds of democratic life itself.
Glossary
Agency: The capacity to act intentionally and meaningfully within a shared world. In this context, agency refers to more than individual decision-making, it involves the ability to participate freely in shaping one's environment and collective life.
Agora: The ancient Greek public space for political deliberation.
Algorithmic Curation: The process by which digital platforms use algorithms to selectively organize, prioritize, and present content to users. Rather than reflecting neutral or organic activity, algorithmic curation shapes what is visible, relevant, and engaging based on platform-driven goals, often optimizing for attention, engagement, or profit. In doing so, it influences perception, discourse, and behavior in subtle but powerful ways.
Behavioral Prediction / Behavioral Analytics: Techniques that use user data and algorithms to forecast behavior, often applied in ad targeting and content personalization.
Commodification: The process of turning non-market goods, such as personal data or democratic discourse, into commercial products for exchange or profit.
Contractual Autonomy: The principle that individuals can freely and voluntarily enter into binding agreements, assuming informed and uncoerced participation.
Data as Currency: The idea that individuals pay for digital services not with money, but with personal data, which becomes a medium of exchange.
Democratic Discourse: The collective communication and debate through which citizens engage with shared political, social, and ethical issues in a democratic society.
Epistemic: Relating to knowledge, understanding, or the conditions under which something is known. In this context, epistemic refers to the unique position digital platforms occupy in shaping what is seen, known, and valued in the public sphere.
Greek Polis: An ancient Greek city-state that served as the foundational model of political life in classical philosophy.
Individual Rights Framework: A legal and philosophical approach focused on protecting individual autonomy, dignity, and freedom, often through enforceable personal rights.
Informational Capitalism: An economic system in which information, particularly personal data, becomes a primary source of value extraction and profit. In this context, informational capitalism refers to the way digital platforms monetize users’ behaviors, emotions, and interactions by turning them into data to be analyzed, predicted, and sold, thereby extending capitalist logics into the intimate and communicative spheres of life.
Informational Self-Determination: The right of individuals to control the collection, use, and dissemination of their personal data; foundational in European data protection law.
Liberal Market Economy: An economic system grounded in private ownership, free markets, and minimal state intervention, emphasizing individual freedom and contractual relations.
Micro-behaviors: Small, often unconscious actions or patterns, such as clicks, scrolls, pauses, likes, or facial expressions, that reveal user preferences, attention, and emotional states.
Normative Conflict: A clash between competing moral or social norms that creates tension or ambiguity in decision-making. In the context of social media and data consent, normative conflict arises when individual autonomy collides with broader societal values.
Nudges: Subtle design choices or cues intended to steer behavior in a particular direction without restricting options or altering incentives. In the context of digital platforms, nudges often appear as interface features, notifications, or content placement that guide user attention and choices, shaping engagement in ways aligned with platform interests, often without users’ conscious awareness.
Plurality: The condition of human diversity and the existence of multiple distinct individuals who appear before one another as equals in the public realm.
Political Commons / Public Good: Resources that benefit all members of society, like clean air or open discourse, and are meant to be preserved collectively rather than privatized.
Public Discourse: The open exchange of ideas, opinions, and information among members of a society, typically occurring in spaces accessible to all.
Sovereign: Possessing ultimate authority or control over a domain. In this context, sovereign describes how digital platforms, despite being private entities, exercise governing power over key aspects of public life, such as access to information, visibility, and communication.
Tracking-Based Advertising: A system of delivering personalized ads by collecting and analyzing users’ behavior across platforms and services.
Worldhood: A concept referring to the shared, structured context in which people act, relate, and make meaning, shaped today by digital infrastructures.
Zoon Politikon: An Aristotelian concept that defines humans as inherently political beings, whose nature is fulfilled through participation in public life.
Footnotes
[1] See Andrew Folks, 'Pay, OK or a Third Way: Context, Analysis from the EDPB's Opinion' (International Association of Privacy Professionals, 23 May 2024).
[2] What might seem like a European dispute is in fact globally significant. Through the Brussels Effect, EU regulatory standards tend to radiate outward, reshaping global digital practices. The GDPR is not only a benchmark for data protection in Europe, it has inspired legislation in Brazil (LGPD), California (CCPA), and beyond. In this context, the unfolding dispute over Meta’s business model is less a regional technicality than a case study in how digital economies are governed.
[5] On how these platforms' business models generate revenue from data collection and targeted advertising, see Michael Veale and Frederik Zuiderveen Borgesius, ‘Adtech and Real-Time Bidding under European Data Protection Law’ (2022) 23 German Law Journal 226.
[6]European Data Protection Board, Binding Decision 4/2022 on the dispute submitted by the Irish SA on Meta Platforms Ireland Limited and its Facebook service (Art. 65 GDPR) (EDPB, 5 December 2022).
[7] European Data Protection Board, Urgent Binding Decision 01/2023 on Meta Platforms Ireland Limited (Art. 66(2) GDPR) (EDPB, 27 April 2023).
[9] Recital 43 (2), GDPR. See also Recital 42 (5).
[10] See European Data Protection Board, ‘Guidelines 05/2020 on Consent under Regulation 2016/679’ (25 May 2020) EDPB 10/EN.
[11] Case C‑252/21 Meta Platforms Inc and Others v Bundeskartellamt ECLI:EU:C:2023:537.
[12] Meta, ‘Facebook and Instagram to Offer Subscription for No Ads in Europe’ (Meta Newsroom, 2024).
[13] Civil society organizations have strongly criticized this approach, disparagingly referring to it as the “pay or okay” model. See NOYB, ‘28 NGOs Urge EU DPAs to Reject “Pay or Okay” Model by Meta’ (noyb, 2024).
[14] European Data Protection Board, Opinion 08/2024 on Valid Consent in the Context of Consent or Pay Models Implemented by Large Online Platforms (EDPB, April 2024).
[15] European Commission, 'Commission opens non-compliance investigations against Alphabet, Apple and Meta under the Digital Markets Act' (European Commission, 25 March 2024).
[17] European Commission, ‘Commission Finds Apple and Meta in Breach of the Digital Markets Act’ (23 April 2025).
[18] See European Data Protection Board, ‘Guidelines 05/2020 on Consent under Regulation 2016/679’ (25 May 2020) EDPB 10/EN.
[19] The commodification of personal data is highly contested, despite being the de facto economic model underpinning much of the digital economy, as demonstrated by Oladapo Agboola, ‘Personal Data as Currency: Navigating the Economics Behind Free Digital Services’ (21 April 2025) SSRN. Numerous scholars have critiqued this paradigm, as shown in Salomé Viljoen, ‘Data as Property?’ Phenomenal World (16 October 2020). By referencing this critical scholarship, I acknowledge the significant challenges posed to the commodification framework. Nevertheless, in this section, I engage with the “data-as-currency” logic to examine its appeal within liberal market traditions. In the following section, however, I argue against the normative desirability of such commodification.
[20] See Linus J Hoffmann, ‘Commodification Beyond Data: Regulating the Separation of Information from Noise’ (2023) 2 European Law Open 424, which offers a compelling account of the actual value digital platforms provide in today’s information-saturated environment. The author argues that the core service offered by companies like Meta and Google is not merely data insights to other companies, but the technological separation of valuable information from digital noise. Through search engines, recommender systems, and targeted advertising, these platforms help users navigate the overwhelming flow of online content.
[21] This logic of voluntary exchange—foundational to liberal market economies—maintains that private actors freely enter into contracts under transparent and consensual terms, each deriving mutual benefit. This principle is not merely theoretical; it underpins the institutional design of Western democracies, particularly in the economic domain. Classical western thinkers such as Adam Smith and Milton Friedman articulated this vision in which economic freedom—rooted in individual consent, private property, and contractual autonomy—serves as a cornerstone of personal liberty and democratic governance.
[22]These ideals remain embedded in the legal and cultural fabric of Western liberal democracies today. For instance, the enforceability of private contracts is protected under constitutional and civil law frameworks across jurisdictions like the US and the EU. The liberal tradition underpins key doctrines such as the freedom of contract and the minimal state interference in private economic arrangements, upheld in decisions like Lochner v New York 198 US 45 (1905). Similarly, the EU’s internal market, while more protective of consumers, is structured explicitly on the free movement of goods, services, capital, and persons—principles that reflect the same commitment to voluntary exchange and economic self-determination.
[23] Case C‑252/21 Meta Platforms Inc and Others v Bundeskartellamt ECLI:EU:C:2023:537.
[24] While it is true that contractual freedom is limited across numerous areas of law, such as consumer protection, employment, family law, and financial regulation, these limitations typically target specific terms or the effects of contracts, rather than challenging the core nature of contract formation itself. For example, in employment law, statutory protections, such as minimum wage requirements, working hour limits, and health and safety regulations, may override the specific terms of an employment contract. However, the contract itself remains valid and enforceable. In these cases, the law restricts what can be agreed upon, not whether parties are permitted to enter into a contract in the first place.
[25] This principle is explicitly embedded in the European Convention on Human Rights (ECHR), where many rights, such as freedom of expression (Article 10), the right to privacy (Article 8), and freedom of religion (Article 9), include internal limitation clauses. These provisions allow for interference with a right when it is deemed “necessary in a democratic society” for purposes such as the protection of national security, public order, health, morals, or the rights and freedoms of others. Similarly, the Charter of Fundamental Rights of the European Union affirms that fundamental rights may be subject to limitations, provided such limitations are “provided for by law,” “respect the essence of those rights,” and comply with the principle of proportionality (Article 52(1)). This principle, which requires a balancing of rights against competing rights or interests, is central to European fundamental rights jurisprudence. In U.S. constitutional law, rights are also subject to limitation, albeit through a different doctrinal structure. While the U.S. Constitution does not include explicit limitation clauses, the courts have developed a robust jurisprudence applying tiered levels of scrutiny depending on the nature of the right and the context in which it is exercised. For example, freedom of speech under the First Amendment enjoys strong protection, but is not absolute. Exceptions exist for incitement to violence, defamation, and obscenity, as established in landmark cases such as Brandenburg v. Ohio (1969) and Miller v. California (1973).
[26] Karl Polanyi’s The Great Transformation highlights how legal institutions were instrumental in commodifying labor, land, and money, transforming them into "fictitious commodities" to enable the rise of market capitalism. This transformation was not spontaneous but legally constructed. Processes like enclosure, wage labor, and the emergence of private property relied on new legal norms and doctrines to enable exchange and accumulation. Julie Cohen, in Between Truth and Power, argues that law plays a foundational but often unacknowledged role in shaping the economic and ideological architectures of liberal democracies. Legal systems not only facilitate market expansion and neoliberal governance but also reshape themselves to accommodate new market logics, such as informational capitalism. Rights, within this framework, are not static protections but flexible instruments, routinely invoked, traded, or limited within legal processes that reflect shifting economic rationalities. These perspectives illustrate that rights in liberal democracies are deeply embedded in broader systems of exchange, commodification, and governance, making them not only relational but often inherently transactional.
[27] See Yulie Foka-Kavalieraki and Aristides N. Hatzis, ‘The Foundations of a Market Economy: Contract, Consent, Coercion’ (2009) 8(1) European View 29. The article provides a compelling argument that imperfections, which are inherent in markets and consequently in contracts, do not diminish the fundamental importance of contracts and consent in facilitating economic transactions. A common criticism of market economies is that many transactions involve some level of coercion, challenging the notion of freely consented contracts. The authors address this critique by distinguishing between different types of coercion. They argue that coercion in contract formation only applies when a party is forced to accept a choice they do not want, rather than simply facing a difficult choice. By clarifying this distinction, the article defends the legitimacy of contracts as expressions of genuine consent and essential building blocks of market economies.
[28] Despite the ECHR's recognition of the right to privacy as a condition for the development and fulfilment of personality, deeply tied to human dignity, as seen in cases such as Reklos and Davourlis v. Greece (2009) EMLR 290 and Burghartz v. Switzerland (1994) ECHR, the foundational conception of privacy has long centered on autonomy. Daniel J. Solove, in A Brief History of Information Privacy Law, traces this back to the notion of privacy as the "right to be let alone," primarily understood as a safeguard against government intrusion into individual life. Giovanni De Gregorio, in Digital Constitutionalism in Europe, builds on this autonomy-oriented foundation by showing how data protection, in the European framework, evolves from this negative liberty into a positive right, designed not only to shield individuals from interference but to empower them with control over their personal information in the digital age. Data protection thus emerges as a constitutional response to the threats of automation and algorithmic governance, reinforcing personal autonomy through the principle of informational self-determination.
[29] In Why Privacy Matters, Neil Richards offers a compelling account of privacy as a matter of degree, rather than an absolute state. He defines privacy as “the degree to which human information is neither known nor used,” emphasizing that most personal information exists on a continuum between complete secrecy and full public exposure. This framework challenges binary assumptions that once information is shared, it ceases to be private. Instead, Richards underscores that privacy is not eliminated through disclosure but is shaped by the context, audience, and scope of that disclosure. Richards’ account supports a conception of privacy as fundamentally grounded in control and consent. Privacy is best understood not as total invisibility or isolation, but as the ability to determine what information is shared, with whom, and under what conditions. It is this selective sharing, this exercise of judgment over dissemination, that constitutes the very practice of privacy. Sharing information does not waive the right to privacy; rather, the choice to share is what affirms that right.
[30] Bart Custers and Gianclaudio Malgieri, ‘Priceless Data: Why the EU Fundamental Right to Data Protection is at Odds with Trade in Personal Data’ (2022) 45 Computer Law & Security Review 105683 clearly demonstrate how EU data protection law resists the commodification of personal data. They argue that treating personal data as a tradable good, as is common in many digital business models, is fundamentally incompatible with the EU framework, where data protection is an inalienable fundamental right under both the Charter and the GDPR. Unlike commodities, personal data cannot be owned or permanently transferred; even when users consent to processing, they retain the right to withdraw that consent at any time. This undermines the stability of data-as-payment models and introduces legal uncertainty, highlighting the tension between EU fundamental rights and data market practices.
[31] Rachel Griffin, in ‘Rethinking Rights in Social Media Governance: Human Rights, Ideology and Inequality’ (2023) 2(1) European Law Open 30, 30–56, similarly critiques the limitations of individualistic logic, focusing on platform regulation. She argues that rights-based frameworks centered on the individual cannot address the structural and collective dimensions of harm inherent in social media governance. This critique can be extended to data protection law, where dominant concepts such as informational self-determination and contractual fairness presume voluntariness and individual agency, assumptions that falter when digital platforms become essential infrastructures of public life. Just as Griffin calls for moving beyond individual rights to confront platform power, a similar shift is necessary in data protection to reckon with the structural conditions that shape access to and agency within digital public spaces.
[32] Hannah Arendt, The Human Condition (2nd edn, University of Chicago Press 1998) 22–31
[33] Ibid 28–58
[34] Ibid 42–46
[35] The analogy to the agora reflects a growing recognition, both legal and philosophical, that digital platforms function not merely as commercial service providers but also as de facto public spaces. Courts and regulators have increasingly acknowledged that platforms mediate essential forms of democratic participation, communication, and visibility. In the U.S., this recognition appeared in dicta from cases such as Packingham v. North Carolina, where the Supreme Court described social media as “the modern public square.” In the European context, the Digital Services Act (DSA) formalizes a shift in regulatory framing: it begins to treat platforms not only as economic actors but as infrastructures of collective life. These are spaces that shape discourse, identity, and social reality. The DSA, while still operating within market-based logics, introduces elements of public accountability, risk mitigation, and democratic oversight, signaling a more collective logic of governance.
However, this recognition remains uneven across legal domains. While platform regulation has begun to conceptualize platforms as conditions of shared worldhood, privacy and data protection law remains tethered to an individualist paradigm. It treats privacy primarily as a personal entitlement, an informational shield, rather than as a structural precondition for political agency. This creates a normative and conceptual mismatch. As Arendt argued, the public and private spheres are co-constitutive: freedom in the public realm depends on a protected private realm where individuals can retreat, reflect, and form themselves. Without such a protected sphere, the capacity for meaningful appearance and political action is diminished. Thus, while both platform regulation and data protection aim to safeguard agency, only the former begins to reimagine the conditions under which agency can emerge. The result is a fragmented regulatory landscape, one that intuits the political stakes of digital infrastructures but lacks an integrated theory of freedom capable of addressing them.
[36] Francesco Pira, ‘Disinformation a Problem for Democracy: Profiling and Risks of Consensus Manipulation’ (2023) 8 Frontiers in Sociology. Pira shows how individuals increasingly operate within digital echo chambers that prioritize emotionalism, confirmation bias, and identity tailoring over deliberative engagement. This analysis supports the argument above by demonstrating how commodified information environments erode the shared, trustworthy spaces necessary for genuine political agency and consent. Pira’s analysis can also be seen as a digital evolution of Noam Chomsky’s Manufacturing Consent thesis, in which Chomsky exposed how traditional mass media structures serve elite interests through subtle framing and agenda-setting; Pira shows that social media extends this manipulation through algorithmic personalization and behavioral profiling.
This erosion of agency is confirmed by Franziska Zimmer, Wolfgang G Stock and Katrin Scheibe, ‘Fake News in Social Media: Bad Algorithms or Biased Users?’ (2019) 7(2) Journal of Information Science Theory and Practice 40, who show that algorithmic filter bubbles do not operate in isolation but amplify users' own cognitive biases, leading to echo chambers where falsehoods are reinforced rather than questioned. In such spaces, the very conditions for reflective judgment and dissent are weakened, directly compromising the individual’s capacity for autonomous political reasoning. David Lauer, ‘Facebook’s Ethical Failures Are Not Accidental; They Are Part of the Business Model’ (2021) 1 AI and Ethics 395 adds that these dynamics are not accidental but intrinsic to platform business models like Facebook’s, which profit from outrage, division, and disinformation. The monetization of attention incentivizes engagement-maximizing algorithms that distort informational environments, making it structurally difficult for users to encounter balanced perspectives or exercise informed consent.
[37] Salomé Viljoen’s thesis in Data as Property? resonates closely with the claim that freedom must be constituted through collective governance of the spaces where autonomy is formed. Like the critique that commodified digital infrastructures erode the very conditions for meaningful consent and agency, Viljoen argues that data about individuals should not be treated as personal property subject to market exchange, but as a democratic resource requiring collective governance. Her concept of “data egalitarianism” calls for legal regimes that recognize the relational nature of data and its population-scale impact, proposing democratic, not individualistic, structures to govern how data is collected, used, and distributed. This reorientation shifts the goal from protecting individual liberty to enabling the positive conditions of freedom through shared obligation, reinforcing the idea that autonomy cannot thrive in privatized, commodified public spheres.
Sources
Books
Arendt H, The Human Condition (2nd edn, University of Chicago Press 1998)
Chomsky N and Herman ES, Manufacturing Consent: The Political Economy of the Mass Media (Pantheon Books 1988)
Cohen JE, Between Truth and Power: The Legal Constructions of Informational Capitalism (Oxford University Press 2019)
Friedman M, Capitalism and Freedom (University of Chicago Press 1962)
Polanyi K, The Great Transformation: The Political and Economic Origins of Our Time (Beacon Press 2001)
Richards NM, Why Privacy Matters (Oxford University Press 2023)
Smith A, The Wealth of Nations (Penguin Classics 1999)
Journal Articles
Agboola O, ‘Personal Data as Currency: Navigating the Economics Behind Free Digital Services’ (21 April 2025) SSRN https://ssrn.com/abstract=5227471 accessed 1 July 2025
Custers B and Malgieri G, ‘Priceless Data: Why the EU Fundamental Right to Data Protection is at Odds with Trade in Personal Data’ (2022) 45 Computer Law & Security Review 105683 https://www.sciencedirect.com/science/article/pii/S0267364922000309 accessed 1 July 2025
De Gregorio G, ‘Digital Constitutionalism, Privacy and Data Protection’ in Digital Constitutionalism in Europe: Reframing Rights and Powers in the Algorithmic Society (Cambridge University Press 2022) 216–72 https://www.cambridge.org/core/books/digital-constitutionalism-in-europe/digital-constitutionalism-privacy-and-data-protection/E1725A06254D721E8E5D1D6B461CFAA2#chapter accessed 1 July 2025
Foka-Kavalieraki Y and Hatzis AN, ‘The Foundations of a Market Economy: Contract, Consent, Coercion’ (2009) 8(1) European View 29 https://doi.org/10.1007/s12290-009-0081-y accessed 30 June 2025
Griffin R, ‘Rethinking Rights in Social Media Governance: Human Rights, Ideology and Inequality’ (2023) 2(1) European Law Open 30–56 https://www.cambridge.org/core/journals/european-law-open/article/rethinking-rights-in-social-media-governance-human-rights-ideology-and-inequality/7DF50DD0BD3466FF3BD86909A2A6437A#fn66 accessed 30 June 2025
Hoffmann LJ, ‘Commodification Beyond Data: Regulating the Separation of Information from Noise’ (2023) 2 European Law Open 424 https://www.cambridge.org/core/journals/european-law-open/article/commodification-beyond-data-regulating-the-separation-of-information-from-noise/1B901FED81ACFDA726D24406F408DF65 accessed 30 June 2025
Lauer D, ‘Facebook’s Ethical Failures Are Not Accidental; They Are Part of the Business Model’ (2021) 1 AI and Ethics 395 https://doi.org/10.1007/s43681-021-00068-x accessed 30 June 2025
Pira F, ‘Disinformation a Problem for Democracy: Profiling and Risks of Consensus Manipulation’ (2023) 8 Frontiers in Sociology https://doi.org/10.3389/fsoc.2023.1150753 accessed 1 July 2025
Solove DJ, ‘A Brief History of Information Privacy Law’ in Proskauer on Privacy, PLI (2006) https://scholarship.law.gwu.edu/cgi/viewcontent.cgi?article=2076&context=faculty_publications accessed 29 June 2025
Veale M and Zuiderveen Borgesius F, ‘Adtech and Real-Time Bidding Under European Data Protection Law’ (2022) 23 German Law Journal 226 https://www.cambridge.org/core/journals/german-law-journal/article/adtech-and-realtime-bidding-under-european-data-protection-law/017F027B4E78EBCAE1DCBC1E12B93B9D accessed 30 June 2025
Viljoen S, ‘Data as Property?’ (Phenomenal World, 16 October 2020) https://www.phenomenalworld.org/analysis/data-as-property/ accessed 29 June 2025
Zimmer F, Stock WG and Scheibe K, ‘Fake News in Social Media: Bad Algorithms or Biased Users?’ (2019) 7(2) Journal of Information Science Theory and Practice 40 https://doi.org/10.1633/JISTaP.2019.7.2.4 accessed 30 June 2025
Cases
Brandenburg v Ohio, 395 US 444 (1969) https://supreme.justia.com/cases/federal/us/395/444/ accessed 1 July 2025
Burghartz v Switzerland (1994) ECHR 22 https://hudoc.echr.coe.int/rus#{%22itemid%22:[%22001-57865%22]} accessed 1 July 2025
Case C‑252/21 Meta Platforms Inc and Others v Bundeskartellamt ECLI:EU:C:2023:537 https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex:62021CJ0252 accessed 1 July 2025
Lochner v New York, 198 US 45 (1905) https://supreme.justia.com/cases/federal/us/198/45/ accessed 1 July 2025
Miller v California, 413 US 15 (1973) https://supreme.justia.com/cases/federal/us/413/15/ accessed 1 July 2025
Packingham v North Carolina, 582 US 98 (2017) https://globalfreedomofexpression.columbia.edu/cases/packingham-v-state-north-carolina/ accessed 1 July 2025
Reklos and Davourlis v Greece (2009) EMLR 290 https://hudoc.echr.coe.int/eng#{%22itemid%22:[%22001-90617%22]} accessed 1 July 2025
Legislation and Legal Instruments
Charter of Fundamental Rights of the European Union [2012] OJ C 326/391 https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:C2012/326/02 accessed 1 July 2025
European Convention on Human Rights (ECHR) (opened for signature 4 November 1950, entered into force 3 September 1953) https://www.echr.coe.int/documents/d/echr/convention_ENG accessed 1 July 2025
Regulation (EU) 2016/679 (General Data Protection Regulation) [2016] OJ L119/1 https://gdpr-info.eu/ accessed 30 June 2025
Regulation (EU) 2022/1925 (Digital Markets Act) [2022] OJ L265/1 https://eur-lex.europa.eu/eli/reg/2022/1925/oj/eng accessed 30 June 2025
US Constitution amend I https://constitution.congress.gov/constitution/amendment-1/ accessed 1 July 2025
Policy and Institutional Materials
European Commission, ‘Commission Finds Apple and Meta in Breach of the Digital Markets Act’ (23 April 2025) https://ec.europa.eu/commission/presscorner/detail/en/ip_25_2341 accessed 1 July 2025
European Commission, ‘Commission Opens Non-Compliance Investigations Against Alphabet, Apple and Meta Under the Digital Markets Act’ (25 March 2024) https://ec.europa.eu/commission/presscorner/detail/en/ip_24_1687 accessed 1 July 2025
European Data Protection Board, Binding Decision 4/2022 on the Dispute Submitted by the Irish SA on Meta Platforms Ireland Limited and Its Facebook Service (Art 65 GDPR) (5 December 2022) https://edpb.europa.eu/our-work-tools/decisions/binding-decisions/4-2022-dispute-submitted-irish-sa-meta-platforms_en accessed 1 July 2025
European Data Protection Board, Guidelines 05/2020 on Consent under Regulation 2016/679 (25 May 2020) EDPB 10/EN https://edpb.europa.eu/sites/default/files/files/file1/edpb_guidelines_202005_consent_en.pdf accessed 1 July 2025
European Data Protection Board, Opinion 08/2024 on Valid Consent in the Context of Consent or Pay Models Implemented by Large Online Platforms (April 2024) https://edpb.europa.eu/our-work-tools/our-documents/opinion-art-64/opinion-082024-valid-consent-context-consent-or-pay_en accessed 1 July 2025
Other Online Sources and Commentary
Folks A, ‘Pay, OK or a Third Way: Context, Analysis from the EDPB's Opinion’ (IAPP, 23 May 2024) https://iapp.org/news/a/pay-ok-or-a-third-way-context-analysis-from-the-edpb-s-opinion accessed 30 June 2025
Meta, ‘Facebook and Instagram to Offer Subscription for No Ads in Europe’ (Meta Newsroom, 2024) https://about.fb.com/news/2024/02/facebook-and-instagram-subscription-no-ads-europe/ accessed 1 July 2025
NOYB, ‘28 NGOs Urge EU DPAs to Reject “Pay or Okay” Model by Meta’ (2024) https://noyb.eu/en/28-ngos-urge-eu-dpas-reject-pay-or-okay-model-meta accessed 1 July 2025