Data centers are the invisible backbone of the new economy. When we talk about artificial intelligence, the focus often falls on algorithms, predictive capabilities, and impacts on the job market. Yet little is said about the physical infrastructure that sustains this intelligence: structures that consume energy, store sensitive data, and concentrate informational power at a scale that challenges traditional legal logic. It is precisely at this point that Bill 3018/2024 seeks to intervene, proposing a regulatory framework for AI data centers, with an emphasis on security, privacy, transparency, and sustainability. At first glance, it seems like a rational and well-intentioned response to technological advancement. But through the lens of Law and Economics, what is at stake is far more than a technical matter: it is the battle over the incentives that will shape the future of digital innovation.
The first relevant aspect of the bill lies in its attempt to correct a clear market failure: the asymmetry between the social costs of poor data governance and the private benefits captured by operators. When personal data is leaked, manipulated, or used to train biased algorithms, it is rarely the data center operator who bears the consequences; the burden falls on society, on the citizen whose autonomy is compromised without ever noticing. As Steven Shavell points out, legal intervention is economically justified when it creates mechanisms for internalizing negative externalities. That is precisely what Article 3 of the bill attempts to do by requiring robust cybersecurity, algorithmic traceability, and transparency in data use. Regulation imposes a cost that forces private agents to account for the side effects of their activities, and that is the heart of any efficient regulation: ensuring that those who profit from risk also bear its costs.
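To make the economic point concrete, a minimal formal sketch of Shavell's argument may help, using the textbook notation of his model of precaution (the symbols c, p, and h are mine, not the bill's). An operator choosing a level of care $x$ bears private cost $c(x)$, while the expected harm to third parties is $p(x)\,h$; left to itself, the firm minimizes only $c(x)$ and underinvests in care. A rule that charges the operator for the harm it generates changes its objective to

$$\min_x \; \big[\, c(x) + p(x)\,h \,\big],$$

which is precisely the social cost of the activity, so the privately optimal level of care coincides with the socially optimal one. That is what "internalizing the externality" means in operational terms.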
However, the requirement of algorithmic transparency raises a significant technical and economic dilemma. By obligating operators to disclose information about how their AI algorithms function, the bill touches on one of the most sensitive issues of our time: the opacity of deep neural network models. It is not always possible to explain in human terms how a system arrived at a particular conclusion, and when it is, doing so often requires disclosing valuable trade secrets. A generic demand for transparency, without appropriate technical mediation, may discourage investment or result in mere box-ticking exercises, with standardized and ineffective reports. Law and Economics warns of this risk: rules that attempt to resolve everything through top-down mandates, without considering compliance costs, can be economically inefficient, creating incentives for evasion, litigation, or outright regulatory paralysis.
Another sensitive point is energy efficiency. The popular image of AI as a “clean,” cloud-based solution contrasts with the physical reality of data centers: structures that demand colossal amounts of energy, especially during the training of large language models. Article 5 of the bill gets it right by including clear requirements for energy efficiency and environmental sustainability, but the real challenge is structural: even as equipment becomes more efficient, the rebound effect (the Jevons paradox) can nullify these gains, since greater efficiency lowers the marginal cost of processing and stimulates excessive use. The Law and Economics response would be to introduce dynamic pricing and incentive mechanisms, such as progressive taxation based on energy intensity per terabyte processed, or subsidies granted only to data centers powered by renewable sources, as the sketch below illustrates. Without such tools, we are left with a well-intentioned legal command that falls short in the face of the economic logic of private maximization.
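To show why such a mechanism changes behavior at the margin, here is a minimal sketch in Python of a progressive tax on energy intensity combined with a renewable-sourcing discount. The brackets, rates, and discount rule are hypothetical placeholders chosen for illustration, not figures from Bill 3018/2024 or any existing tariff.

```python
# Minimal sketch of a progressive energy-intensity tax with a renewable
# discount. All brackets, rates, and the discount rule are hypothetical
# placeholders, not figures from Bill 3018/2024.

def energy_intensity_tax(kwh_consumed: float,
                         terabytes_processed: float,
                         renewable_share: float = 0.0) -> float:
    """Tax a data center on kWh per terabyte processed, progressively.

    Each additional kWh/TB of intensity falls into a marginally higher
    bracket, so efficiency gains always reduce the bill; a renewable
    share discounts the total, rewarding clean sourcing as well.
    """
    intensity = kwh_consumed / terabytes_processed  # kWh per TB

    # Hypothetical brackets: (upper bound in kWh/TB, rate per kWh).
    brackets = [(100.0, 0.01), (300.0, 0.03), (float("inf"), 0.08)]

    tax, lower = 0.0, 0.0
    for upper, rate in brackets:
        if intensity > lower:
            taxed_span = min(intensity, upper) - lower
            tax += taxed_span * rate * terabytes_processed
        lower = upper

    # Discount proportional to the share of renewable energy used.
    return tax * (1.0 - renewable_share)


# Same workload, different profiles: the inefficient fossil-heavy
# center pays far more than the efficient renewable-powered one.
print(energy_intensity_tax(400_000, 1_000, renewable_share=0.1))  # 13500.0
print(energy_intensity_tax(150_000, 1_000, renewable_share=0.9))  # 250.0
```

Under this toy schedule, an inefficient, fossil-heavy operator faces a steeply higher bill for the same workload than an efficient, renewable-powered one; the price signal, rather than the legal command alone, is what pushes private maximization toward the public goal.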
The requirement for clear data governance, including the appointment of a Data Protection Officer and periodic impact assessments, aligns with what Richard Posner might call “pricing informational risk.” By mandating that companies disclose and internalize the cost of collecting and processing data, the State forces operators to confront the dilemma of trust, arguably the scarcest commodity of the digital age. However, formal compliance alone does not guarantee effectiveness. As I demonstrate in my book, rules only produce real effects when paired with reputational structures and compatible incentives. For this reason, the regulation of data centers must be accompanied by reputational mechanisms, such as public rankings, best-practice certifications, and disclosure obligations that directly influence investor and consumer decision-making.
In sum, Bill 3018/2024 has notable merits and should be understood as the first step in building an institutional framework for AI data centers. But if we want this framework to be functional, it must be designed as a system of incentives. It is not enough to impose obligations; behaviors must be shaped. This demands dialogue with industry actors, rigorous economic analysis, and the courage to experiment with dynamic regulatory formats. As Guido Calabresi taught, good law is not that which forbids everything, but that which offers pathways for private interest to align with the public good. Artificial intelligence will redefine markets, democracies, and human relationships. If we want this transformation to be ethical, safe, and efficient, we must begin at the beginning: with infrastructure. Regulating data centers is regulating the future.