Building programmable commons

ABSTRACT Before rushing into regulating order-enabling computational technologies, is there a different way to think about them? What if power over what can be computed translates to power over what can be decided? Can we then shape policies for the management of technologies that do not just 'take' power, but make it? By reviewing early work of network theorists and Internet scholars as well as literature on the governance of the commons, this paper argues that beyond markets, states, and their hybrids, and beyond private property and public sector regimes, there exists political space for social practices and transformative legal interventions that can give shape to radically different institutional actions for the management of the world's infocomputational resources. Programmable commons and the public value of programmability are thus introduced as parts of a broader political project that aspires to democratise access to, and management of, these resources. By drawing on the history of a family of commons, namely intellectual commons, infrastructure commons, and global commons, this paper explores the material form and impact of infocomputational technologies and presents a blend of bottom-up and top-down initiatives for their commons-based organisation.


Introduction
Commons have histories rooted in social processes; histories that emerged from altruistic patterns of collaborative action and social practices that connect to and amplify the public value emanating from a place or space deemed capable of benefitting the many. Ever since its development, the analytical toolkit of the commons and its theoretical outreach have been …

The structure of the paper proceeds as follows. The paper starts with a selected overview of the scholarship and policy debate on data commons, followed by a descriptive section on the concept of programmable commons. After that, it explores 'neighbouring' categories of commons (intellectual commons, infrastructure commons, and global commons) to draw lessons and vocabularies from their histories and development while mapping their characteristics onto the contemporary legal and political context of the digital economy. Sections 4 and 5 develop the idea of programmable commons as an alternative modus operandi for the governance of our digital world and explain what law and civil society can do to start seeing them.

Modern approaches to digital commons
During the last decade, the rapid expansion of data-extractive technologies and business models motivated scholars to theorise and provide recommendations for non-commodified alternatives to the data economy. These alternatives can take various forms. Helpfully, in a recent report for the Panel for the Future of Science and Technology of the EU (STOA), Solano and others divided existing data governance arrangements into categories based on their purpose, beneficiaries, value, and the tools used to pool data. Accordingly, data commons can take the form of public data trusts, data collaboratives, data cooperatives, data (semi)commons, indigenous data sovereignty, or personal data sovereignty.6 Recent scholarship has been focusing on the theoretical foundations of digital commons and practical ways for their materialisation by adopting a data-centric approach. In this direction, for example, Birkinbine observed the porous relationship between the open source and commercial spheres and offered an account of digital commons that incorporates a structural critique of capitalism to move beyond a 'politics of provision'.7 In parallel, building on Ostrom's institutional framework for commons-based management, Morell presented a roadmap for the bottom-up creation of data commons by highlighting 'the need for community control over the collaborative process of building the common-pool resource'.8 Moving from theory to practice, Bria calls us to draw inspiration from movements advocating for collective management of public resources (ie water and electricity) and mobilise for commons-based governance of, and public investment in, data-intensive infrastructures for cities and governments.9
In a similar context, by operationalising some of these ideas into (legal) reality, Delacroix and Lawrence challenged contractual approaches to data management and US-oriented accounts of 'information fiduciaries' and argued for data management through 'data trusts', whose goal would be to aggregate data and centralise its management under a non-corporate scheme, the trust.10 Meanwhile, recent EU legislative and political initiatives have been experimenting with commons-like terminology and provisions. The Data Governance Act (DGA) dedicates an entire chapter to 'Data Altruism' organisations tasked to 'support purposes of general interest'.11 The DGA along with the proposed Data Act aspire to materialise the EU's vision for the creation of common EU data spaces to promote innovation and job growth and make high-value datasets available for the common good.12 In the same spirit, the European Commission has recently published its proposal for a European Health Data Space Regulation, whose goal is to establish for the first time the legal and infrastructural framework for the interoperability of electronic health record systems and the secondary use of health data within the EU, in order to decentralise health data exchange and facilitate the pooling of health data within and across Member States.13 The proposal also builds on the DGA to include provisions for the implementation of data altruism in health.
Although, in theory, these initiatives are inspired by and, in turn, aim at contributing to some sort of commons-based data management, their actual relevance to commons is unclear. Fundamentally, patterns of altruistic behaviour can only result from bottom-up processes of engagement among different actors rather than be imposed by legal mandate. Law can indeed play its part in boosting the dynamics of social cooperation by generating rights and entitlements to acknowledge and encourage altruistic behaviour, but this is not the direction the EU has taken. For example, instead of developing or supporting existing entities authentically committed to data altruistic practices, the DGA introduces commercially-oriented data intermediaries, thereby favouring corporate-like schemes that 'simply provide an alternative to current market practice with a platformized business model'.14 Equally, faced with the challenge of bringing 'data altruism' from theory to practice, the EU started from the top by introducing new, and often overlapping, authorities and deadlines instead of drawing inspiration from and supporting the needs of communities that, driven by genuine altruism, have managed to build and maintain their own data spaces in the past.15 Finally, none of the provisions for common European data spaces seems to challenge the power of the existing private data spaces. Indicatively, although the material scope of the Data Act would offer the ideal opportunity to legislate a right to access privately-held data for purposes of public interest, the proposal seems to limit such scope only to reasons of 'public emergency'.16
More importantly, shifting power dynamics merely through data governance reforms would not necessarily mend the power asymmetries in the political economy of AI. Regardless of the amount of data one can access, power imbalances will persist as long as such access is 'given' by an entity that also happens to have ownership over the computational means that can collect and process data at scale.
In other words, although data is power, power is not only data.In the technology realm, alongside data power, there are at least three other forms of power that are normatively significant yet usually overlooked by mainstream law and policy discourse: infrastructural power, organisational/logistical power, and value-chain power.
Infrastructural power refers to Big Tech's ownership over the means of computational production (i.e. operating systems, cloud environments, or sensor networks) and the centralisation of power 'by virtue of dominating access to data, storage space, computational power to process them, [and] financial resources to afford the resources needed for developing machine learning pipelines'.17 In turn, logistical/organisational power enables Big Tech to tie third parties to specific growth models and centralised operational workflows by leveraging scalable and agile computing, thereby ordering economic models and deepening path dependencies.18 Finally, Big Tech's leading role in the design and manufacturing of new products gives the very same large companies the ability to tightly monitor and control the Global Value Chains (GVCs) of the AI industry, from semiconductors to machine learning value chains. By establishing and designing the hardware/software parameters of their devices and data centres, Big Tech's orchestrating role in global trade flows puts enormous pressure on their suppliers, dominates global manufacturing capacity, and facilitates value capture (usually from the Global North) and cost displacement (usually to the Global South), whilst stiffening the range and resilience of programmability in the process.19
As a result, insofar as power is not challenged in its totality, Big Tech will not feel too uncomfortable when called upon to be altruistic, contribute to some sort of data commons, or provide access to its troves of data, as long as it maintains the ability to determine and leverage the allocation of computational resources to whatever ends it considers worth pursuing. For example, OpenAI's ChatGPT has a free version available to anyone subject to available capacity; a subscription-based ($20/month) formula that provides faster and priority access to the model's features; and API-based access for developers that want to build applications on top of it, currently priced at $0.002 per token. More than that, new forms of dependencies may arise as strategies of/for control and monetisation run deeper. For instance, if certain leaks are to be believed, OpenAI will soon launch a new service, the 'Foundry', to rent compute capacity (probably through Microsoft's Azure Cloud) to third parties that wish to build inference models similar to the company's ChatGPT. Prices for this service will range from $264,000 to $1,584,000 per year.20 For this reason, an account of commons that aspires to become a meaningful counterforce to the powerful technology companies that commodify our data and optimise its value for their profit requires holistic considerations of the means and conditions of data production, its value chains, and future trajectories, and whether (and how) people can question the status quo, imagine alternatives, and work together to create them.

Situating the programmable commons
Throughout this article, the term 'programmable commons' is used to refer to the set of arrangements that are and should be in place for the management of the world's infocomputational resources. In turn, the term 'infocomputational resources' is borrowed from the natural sciences, where it is used to encompass 'how increasingly complex structures develop as a result of information processing in nature'.21 Infocomputationalism refers to the unified framework of two complementary concepts, information and computation, that together represent 'structure and process, being and becoming'.22 As such, although a thorough account of the migration of the concept of infocomputationalism to the data, law, and policy realm remains out of the scope of this paper, the term 'infocomputational resources' is used here to describe the material ensembles of information and computation upon which modern digital infrastructures are built and function. For example, a data centre or an operating system can be viewed as an infrastructure built upon infocomputational resources, namely data collected in aggregate and processed by hardware of scarce availability and finite capabilities. By giving a name to what lies underneath the infrastructure, the paper attempts to create the legal-political dynamics for novel normative considerations and intellectual tools. To better explore the particular characteristics of the term and the meaning of programmable commons as a concept, it would first help to situate them within a family of commons.

Intellectual commons and the limits of enclosure
Intellectual commons are different from traditional, earthy commons. Information, contrary to fish or fruits, is a non-rival and non-excludable good and as such 'one person's use of knowledge … [does] not subtract from another person's capacity to use it'.23 At the same time, one's input to the intellectual commons may be someone else's output. As Boyle insightfully notes: 'Every increase in protection raises the cost of, or reduces access to, the raw material from which you [may build] future products.'24 Therefore, contrary to natural resources, intellectual commons are not subtractive but generative.25 Although generative commons normally allow the creation and building of new resources by their users, 'generative' does not necessarily mean value-laden. Rather, as Ostrom and Hess point out: '[A commons'] outcome can be good or bad, sustainable or not, which is why we need understanding and clarity, skilled decision-making abilities, and cooperative management strategies to ensure durable robust systems'.26 This brings us to one of the most fundamental differences between traditional and intellectual commons. Contrary to earthy commons, whose existence is rooted in communal activities and patterns of social coordination, intellectual commons are primarily established by socio-legal constructions that serve explicitly value-laden goals. Cohen eloquently observes that '[t]he process of constructing a public domain begins with an act of imagination that doubles as an assertion of power'.27 In the same spirit, Beniger argues that information, contrary to matter or energy, is epiphenomenal and derivative in that its value derives from its end-directed organisation; an end-directed organisation which is not static but, as Morell notes, a permanent 'work in progress' that pools dispersed information and cognitive capacities in evolving bodies of shared knowledge.28
The role of law in directing these ends is vital. Boyle persuasively argues that the public domain was supposed to be protected rather than harmed by intellectual property law. But legal strategies of 'more is better', fuelled by major advancements in technologies of reproduction, reversed this constitutive destination for the benefit of those who pushed for more protection.29 Similar to the value-laden foundations of the public domain, the organisation of our contemporary digital environments, our biopolitical public domain per Cohen, is end-directed towards enabling 'a particular set of information-based extractive activities'.30 However, despite the inherent characteristic of the public domain as constructed rather than naturally occurring, the vocabulary and conceptual tools that were used to carve out its structure and analyse its governance resembled those of the traditional commons. Terms such as 'enclosure' have dominated the conceptual frameworks and policy debate on the impact of new technologies and intellectual property law on the public domain. This is not surprising. Traditional commons and intellectual commons are both restricted through and by the formal acknowledgment of property rights. Yet, contrary to earthy commons, whose nature of existence is restricted the moment a property right is realised (either physically and/or legally), intellectual commons operate on a more complicated basis.
Useful for our endeavour to understand infocomputational resources is the observation that the 'enclosing capacity' of intellectual property law, trade secrecy law, and contract law is only one of the vehicles capable of eroding the fabric of the public domain and the intellectual commons. As the short history of open software indicates, the open vs closed binary is not synonymous with the proprietary vs non-proprietary. Instead, following a short period of fierce resistance, corporations later embraced open software and incorporated, literally and figuratively, its outputs and culture in their organisations.31 Broumas acknowledges the asymmetry between the commodification powers of capital and the non-monetary values promoted by intellectual commons and concludes that, within a capitalist system, regardless of the power of the 'commoning' activity, 'commons-based peer production is constantly co-opted in multiple ways as a component to the dominant mode of capitalist intellectual production/distribution/consumption'.32 It then becomes simplistic to think of enclosure as the only threat against intellectual commons. Intellectual commons are not only enclosed by property rights and similar legal fences put up by the market and/or state forces; rather, like islands under the threat of sea-level rise, they are gradually inundated due to a complex web of socio-economic forces we cannot directly perceive, scrutinise, and control. As a result, when legal and policy attention is drawn to the praxis of 'enclosure' of the 'biopolitical' public domain, part of the picture is missing.
Echoes of the enclosing narratives can also be found in contemporary accounts of closure, capture, and infiltration of the public sector and sphere by Big Tech and its digital infrastructures. These accounts are invaluable for documenting the power and pervasiveness of Big Tech.33 However, if our legal analysis starts from the point an enclosure or a capture is realised, we are led to policy dialogue and legal constructions that risk legitimising the power asymmetries that generate the enclosing capabilities in the first place.
Recent calls for legitimate frameworks or even quasi-constitutional regimes for carving out what is and is not allowed when Big Tech attempts to undertake the provision of services traditionally belonging under the auspices of states are all examples of such approaches;34 approaches that are based on the, often inescapable, assumption that powerful technology companies are indispensable agents in the quest for change rather than entities that benefit from particular techno-legal constructions and assumptions about the nature of the resources they have come to extract and commodify.
For the problem with digital infrastructures and the functionalities they control and enable is not only the 'overtaking' of power or their 'spillover' to other areas and domains of expertise. Sometimes, Big Tech does not 'take' power; it makes it. Novel forms of power can emerge. As Poon predicted: 'Recent experience shows us that the operational structures that depend upon big data will generate novel mechanisms of value production that were not anticipated by earlier network theorists'.35 The recent deployment of the Apple AirTag, which utilises Ultra-Wideband (UWB) technology to detect proximity among people and objects, exemplifies this form of power.36 What preceded the stalking incidents involving AirTags was the material installation of a UWB chip in the iPhone 11 and subsequently in its Android competitors, the Galaxy Note 20 and Pixel 6. This was not just a neutral commercial project. Instead, the newly added function and potential of this material configuration engendered the implied statement that tracking people's surroundings at a population level is a computational project worth pursuing. As a result, the moment this material configuration happened, novel attributes were added to the (biopolitical) public domain; attributes that remained under the exclusive control of their manufacturers and their programmable APIs, chipset requirements, and development kits.37 Power, therefore, takes power, makes power, and ultimately, changes power. Although algorithmic tools are usually imagined as 'helping' and 'informing' decision-makers, past experience with automated tools indicates that infocomputational capabilities do not merely assist decision-making. Instead, expanding on what can be computed extends the boundaries of what can be decided.38
Commenting on the work of Philip Agre, Hildebrandt observes how computation transforms the computed individual and its environment by allowing 'the parsing and reconfiguration of human behaviour in a way that fits the need for formalization'.39 In the same spirit, McQuillan notes how computational models are viewed as the 'method' for transcending from the imperfect world to a 'neoplatonic' vision of society where knowledge is achieved only when people manage to compute their way towards it, by amplifying the visible and formalised to the detriment of the invisible and the experienced.40 A real-world example of the transformative (and often devastating) ability of computational technologies to change the rules and norms of the environment where they are deployed is offered by Martha Poon's fascinating work on the materiality of risk score systems and their effect on the subprime mortgage lending industry. By tracing the history of these advanced calculators, Poon illustrates how the institutional integration of these systems generated 'calculative possibilities' that shifted the dynamics of an entire sector from controlling creditworthiness by screening to managing creditworthiness as risk.41 As a result of this transformation, for example, the binary 'yes/no' decision on granting a mortgage was abandoned in favour of a scale of creditworthiness that opened up novel calculative possibilities, thinking patterns, and commodification opportunities.
Hence, just as law contributed to the construction and enclosure of informational commons and set new standards for cultural production, private actors with immense capacity for leveraging code and computation are constructing and sustaining their own private systems of social interaction; systems of generative possibilities for everyone to access, but for few to programme and reprogramme. These systems do not merely substitute for existing social and political practices, nor do they only enclose domains constitutively destined to serve the public. Instead, through their innovative tools and sprawling webs of influence, they imperceptibly fertilise a computational worldview where rules and norms for the future development of social practices are based on criteria of computability.42 If it computes, it can be decided; and if it does not, it will.

Infrastructure commons: generative but specialised
Another category that resembles programmable commons is the infrastructure commons. Merriam-Webster defines infrastructure as 'the underlying foundation or basic framework (of a system or organization)'. According to Frischmann, infrastructures are shared means to many ends whose management is important insofar as it facilitates positive externalities and maintains the social value of the infrastructure by precluding premature optimisation.43 By adopting a demand- rather than supply-side perspective, Frischmann warns that ignoring the demand for infrastructure leads to undersupply and underuse of infrastructure and to optimisation for private gain. Bringing such optimisation to an infocomputational context, Gürses explains why optimisation leads to an asymmetrical concentration of resources towards these companies (infrastructures) 'which can collect large scale data and muster the computational power to process these in the pursuit of financial gain'.44 For these reasons, Frischmann notes that 'society benefits tremendously when leveraging non-rivalry to support non-discriminatory access to [non-traditional infrastructure resources] because doing so enables the public participation in a wide range of socially valuable activities'.45 The analytical framework of infrastructure commons has been extensively used to describe and examine the processes of value creation and exchange in domains such as telecommunications, public services, transportation, and the Internet. Internet governance and software studies, in particular, have been 'turning to infrastructure' to frame problems and shape holistic understandings of the nature of the material ensembles that sustain our digital world.46 In this direction, scholars who study Internet infrastructures have been drawing attention to the political character of the internet stack, whilst others have been highlighting the need to provide measures of accountability for the configuration of the various application layers.47
In a similar context, ongoing work at the intersection between infrastructures and platforms, as well as the technologies that create platform-like functions and infrastructure-like dependencies, has been highlighting the transformational potential of software and hardware modalities for the market and society.48 Just like commons, adopting an infrastructural lens allows us to see the previously unseen and to conceptualise complex system organisations and the interplay of power dynamics developed therein. Adopting such a lens can also enrich our legal analysis. Arguing for a legal-anthropology perspective of law and technology, Turner and Wiber note that: '[i]nfrastructure, governance, and power as analytical tools […] help to discern the production of normativity by routinization of social practices within a given infrastructural design that is itself law producing'.49 Commons-based forms of governance for the world's infocomputational resources will inevitably depend on some sort of digital infrastructure. However, infrastructures are established systems of organisation and management destined to enable access to and use of certain resources. But with our goal being the study of the resource itself, a debate on 'infrastructural' terms risks obscuring issues related to how we think about infocomputational resources and how we act for their (democratic) governance and (sustainable) management. Breaking down the ingredients of modern digital infrastructures will thus allow us to discern their value-laden nature and will help us imagine what democratic and sustainable alternatives could look like. As Broumas points out: '[a] more balanced approach [on the impact of information technologies on social antagonisms] should research and identify the specific changes that have taken place in production, distribution and consumption, and the potentials that they open for anti-capitalist alternatives.'50
For this reason, our enquiry into understanding and situating the programmable commons starts from the infrastructure and dives deeper to look into the 'things' that enable their materialisation and how they are produced and function, while also accounting for the economic, political, and social context within which these technologies are built and marketed.
Early discussions on the Internet were inspired and triggered by the unprecedented burst of cooperative activity for creative work that network technologies facilitated on a global scale. Benkler viewed the low cost of computation and communication and the consequent widespread adoption of means of cultural production by the masses as a pivotal moment in the information age.51 Open-Source Software, Wikipedia, and other endeavours of, and for, collective action thus became success stories of a technology that was viewed as the transition from the industrial revolution to the dawning era of information. At the epicentre of this revolution was the Internet, a network of networks, that enabled people from across the world to pool their computational and knowledge resources in the pursuit of a goal they considered as worth pursuing. The Internet and the PCs that connected to it were both generative technologies in that they could be leveraged by their users to run applications and offer services not necessarily presaged by their manufacturers. As Zittrain wrote: 'The generative PC has become intertwined with the generative Internet, and the whole is now greater than the sum of its parts'.52 It is precisely this generativity, meaning the ability of technologies to allow the creation and building of new features and technologies, that encouraged consumers to share information and computing capacity with a myriad of people across the world to create and run applications, thereby generating powerful network externalities.
Today, the material conditions of computation have changed as private actors consolidate power in the software and hardware value chains. Access to state-of-the-art 'general purpose' models of advanced computation takes place through vertically and horizontally consolidated providers that build, distribute, host and control (mostly via APIs) the underlying systems' capabilities.53 In parallel, the abundance of available data has transformed AI research and development from a symbolic, knowledge-based endeavour to a world-scale data-driven experimentation project destined to respond to predetermined narrow tasks.54 Deep learning benefited from and has driven the demand for faster and more energy-efficient computation, which in turn boosted innovation in hardware. Image recognition and natural language processing applications materialised deep learning's computational potentialities, thereby altering the landscape in semiconductor production.55 As Hwang observes, 'hardware actively shapes the landscape of what can be done with the technology of machine learning, and plays a significant role in influencing how it will evolve going forwards'.56 In a passage that could be read as the historical affirmation of the moment of change in the history of computation, the International Technology Roadmap for Semiconductors report of 2015 reads: 'No longer a faster microprocessor triggers the design of a new PC but on the contrary the design of a new smartphone generates the requirements for new [integrated circuits] and other related components'.57
Indicatively, although the hardware requirements of OpenAI's LLMs are kept secret, we know that their training required 'thousands' of Nvidia GPUs running in parallel to and within specialised hardware and software architectures of Microsoft's Azure platform;58 a system architecture that builds on decades of experience and that cannot be replicated even if one has the capital and the supply chain resilience to acquire 'thousands' of Nvidia GPUs.
Hence, although it is still unclear whether fit-for-purpose hardware will challenge the dominance of GPUs in the demand for computation-intensive tasks, infrastructural strategies for tailored optimisation coupled with the shift towards specialised hardware 'may not be better for everyone, but it will be better for some', as generative systems and applications that rely on universal chips are likely to be left behind, with vendors shifting their business models and assembly lines to satisfy the big players' demand for hardware specialisation.59 Hwang argues that the increasing demand for computationally powerful hardware will intensify the dynamics in the geopolitics of AI.60 As the semiconductors' shrinking race has almost reached the limits of known physics, a 'looming scarcity' threatens to destabilise the industry and subsequently the world.61 Quite alarmingly, Thompson and Spanuth predict that by 2026-2032 'leading-edge semiconductor manufacturing will only be able to support a single monopolist manufacturer'.62 Under such circumstances, programmability gradually stiffens and, as a result, the dominant firms are left alone in the race to leverage its generative potential.
Coupled with that, the dynamics of software production have also gravitated towards those firms with the computational power to support it. Deep learning does not only produce applications. Instead, as Luitse and Denkena note, once a deep learning model is trained it becomes a means of production, inviting software developers to step in not only to benefit from energy-efficient computation but primarily to follow a development path of predetermined possibilities.63 In the same context, Gürses and Hoboken explore how changes in software production generate dependencies on a handful of companies that are in control of modular, always-on service architectures, thereby capturing the environment within which actors live, create, and produce.64 In discussing the centrality of the material transformation in hardware and software production, Burrington suggests we start talking and thinking about 'means of production' rather than 'infrastructures'. She writes:65 'What's at stake for both the tech industry and government regulators isn't what is or isn't infrastructure, but what the ownership and profit model for that infrastructure looks like and whom it benefits.'
In such a context, to question the ownership and profit model of the conglomerates that sustain and (re)programme our digital world, we need to think about, discuss, and regulate not only infrastructures (whatever this term might mean for the purposes of law) but also resources: how they are developed and distributed, and towards what ends we envision their programmable generativity to work. Beniger's account of the historical transition from the commercial era to the industrial revolution is once again helpful here, as it acknowledges that the pivotal factors in shifting the dynamics of production towards a new world era were innovations in speed, regularity, and predictability. He writes: Until [the] application of steam power to the material economy, the entire operations of the Second Bank, with twenty-two branch offices and profits fifty times those of the largest mercantile house in the country, could be run by just three people […].66 But what would it mean for the speed, regularity, and predictability of production to be dependent on the logistical and computational supply chains of a few powerful actors? What if the engines of speed, regularity, and predictability in global commerce are not generative and open to all but dependent on infocomputational resources that are run by just four companies? Shall private power be left alone in 'gatekeeping' access to such resources?
A focus on the resource itself, as well as on the degree of generativity of the infrastructure that has been set up to exploit it, enables the development of demand-oriented policies for digital infrastructures that extend the boundaries of the political and regulatory playfield and open up possibilities for transformative institutional actions. Creating the legal-political space for such policies requires not only the often reactionary and ex post analysis of the extractive power of the infrastructural forces, but a parallel positive agenda that would seek to institutionalise the dialogue on the resources themselves, the GVCs that sustain them, their legitimate 'owners', and their value for our collective future(s). Questioning the power of digital infrastructures thus becomes conditioned on understanding what the demand for different infrastructures could look like.

Global commons and the transnational internet
The literature on global commons is vast and encompasses a broad spectrum of domains, including but not limited to public health, food security, the environment, the oceans, the atmosphere, and the markets.67 Contrary to international commons, whose resources are shared by particular nations (ie the Mediterranean), global commons are resources to which all nations have legal access (ie outer space). Inevitably, the magnitude of both international and global commons renders their governance a complex process. This is because regime formation at an international level is contingent on national concerns, the accumulation and use of scientific knowledge for evidence-based decision-making, and the influence of governmental and nongovernmental actors.68 Regardless of their degree of complexity, Young recognises three key elements in institutional regimes: a substantive element (rights and rules), a procedural element (allocation of resources, distributive functions, and resolution mechanisms), and an enforcement element (monitoring compliance).69 Following Ostrom's scholarship, Young identifies three tragedies threatening global sustainability (the tragedy of the commons, the tragedy of private property, and the tragedy of the public domain) and argues that there can be no panaceas or global formulas, only institutional diagnostics aiming at providing governance solutions to particular problems on a case-by-case basis. To this end, Young suggests a mix of top-down regulatory measures and bottom-up social practices with normative characteristics.70 At its core, the governance of global and international commons rests on the assumption and belief that ecological and institutional sustainability are appropriate policy goals for resources shared among peoples and nations.71
The legitimacy of these policy goals usually relies on the normative pillar of intergenerational equity, though, oftentimes, the demand for their prioritisation in the international agenda is fuelled by arguments of natural law (ie the common heritage of mankind).72 These normative anchors are products of deliberative processes taking place within various institutions across national and international settings. Endowed with badges of legitimacy and rule-of-law considerations, these principles guide and shape the legal-political agenda on the governance of global commons (ie environmental governance).
Whatever its axiological content, the formal acknowledgment of the normative foundations upon which global commons are established is a necessary but not sufficient condition for attaining the goal of their sustainable management. Markets and the law can interfere in various ways and establish their hegemonic views on how to achieve political goals. For example, various standard-setting organisations and non-state, market-driven actors capitalised on the political mandate for environmental protection, thereby transforming a political project into a risk-based regime that seeks economic solutions to the economic factors responsible for environmental degradation.73 Institutional analyses of global and international commons offer valuable insights into how deliberative and consensus-driven processes translate normative principles into operational choices for national and international actors.
The global Internet is such an example. The literature here is enormous but, generally, a focal point of these endeavours has been the project of multistakeholderism: a form of institutional design that marked the development of the Internet as a global network of networks. DeNardis and Mueller have been leading scholars in revealing the political character of the negotiations taking place within seemingly apolitical Internet bodies and organisations.74 Moving from a theoretical to an empirical grounding, Radu, ten Oever, Cath-Speth and others explored the enactment of governance not only through its formal mechanisms but also through its routine patterns of interaction.75 Others focused on the retreating yet orchestrating role of the state, as well as that of private power, in instituting forms of transnational Internet governance.76 Marsden argued that the manifest influence of US law on the norms and rules that transcend Internet infrastructure suggests caution when thinking of Internet regulation (or lack thereof) as a nascent field of global governance.77 What this scholarship has persuasively demonstrated is that, although the Internet may indeed seem an ideal field to interpret as a framework of and for transnational governance, a closer look at its institutional design and everyday practice illustrates that 'multistakeholder' does not by itself mean 'democratic', whilst 'technical' cannot by itself preclude the 'political'.
Finally, global legal processes, for Internet governance and beyond, are often reserved for those who can walk past their entry points. As a result, various other actors, such as grassroots organisations, activists, and NGOs, whose role in mobilising resistance against the established legality is critical but remains local, are left out or, at best, have their voices distorted by an arbitrarily designed representation formula centred on consensus.78 To bring this to our case, cultivating a culture or building an institution for the co-production and co-management of the world's infocomputational resources will require much more than an international organisation of a multistakeholder nature or a policy strategy of mandates and priorities. Commons do not happen by declaration. Rather, they represent and embody social processes of both 'pooling resources in common and reproducing the communal relations around these productive processes'.79 In this direction, although the creation of community networks requires colossal investments of time and money due to high fixed costs and the need to provide supplementary services (from raising awareness to technical maintenance), there are fascinating stories for socio-legal researchers to study and voices for policymakers to explore and amplify. People and organisations around the world have been active in building (usually externally funded) local networks, mostly driven by the complete lack of connectivity in certain areas of the world, from southwest Detroit to Ghana or Somalia.80 Rosa's ethnographic work on shared networks in the Tseltal and Zapoteco communities (Mexico) showcases examples of indigenous peoples building and maintaining their communal Internet infrastructure from the 'first mile' upward.81 Other bottom-up collective initiatives include data co-operatives and citizen sensor networks.82
Their value for the dialogue on the governance of infocomputational resources is significant but remains unexplored. Scholars of global governance suggest that we ignore bottom-up movements at our peril.83 Overall, the study of commons can offer valuable lessons and provide the institutional and organisational roadmap for transformative strategies towards the conceptualisation and establishment of programmable commons. Intellectual commons and their history teach us that enclosure is only one of the ways private power absorbs non-traditional commons, thereby prompting us to explore the context within which private power and the commons interplay; infrastructure commons demonstrate that ignoring the public demand for infrastructure design and development leads, inevitably, to undersupply and/or optimisation for private, rather than public, gain; and, finally, global commons exemplify the importance of maintaining an active civil society alongside a robust institutional framework for global governance.

Towards the commoning of infocomputational resources
Today, the available infocomputational resources are largely controlled and managed by a complex interplay between technology companies and states. Microsoft, Amazon, and Google are the dominant vendors of cloud computing services due to their global networks of hardware suppliers, data centres, and high security standards, whilst Google's and Apple's mobile operating systems cover more than 99% of the relevant market worldwide, with approximately 72% and 27% market shares respectively.84 In parallel, Big Tech companies act as leading entities in the GVCs of semiconductors and related equipment, exercising their buying power over their suppliers (chip designers and manufacturers) whilst, simultaneously, pursuing their own projects for computational and infrastructural autonomy.85 Inevitably partnered with these companies, states have been expanding their 'own' information and telecommunications infrastructures. These usually take the form of data centres, national clouds, and telecommunication networks. Fuelled by narratives of cybersecurity, self-determination, autonomy, and surveillance, 'digital sovereignty' has thus entered the digital policy dialogue as a 'third way' of managing cyberspace, as opposed to the traditional techno-libertarian approach of the US, on the one hand, and the Chinese and Russian approaches to internet sovereignty, on the other.86 The ongoing geopoliticisation of the political economy of AI has not decelerated the convergence between national and corporate digital strategies. If anything, it has strengthened it.
Today, states are quite comfortable outsourcing the responsibility for the organisation and management of their own and the world's infocomputational resources to the private sector, often under specialised agreements for data localisation. Examples abound of public-private partnerships in the development of information systems for the public sector, from education to public health and the military.87 Indicative of the degree of convergence between the private and public spheres is a new division in Google's organisation, titled 'Google Public Sector', which aspires to assist US institutions in accelerating their digital transformation.88 Also, Microsoft, which maintains a senior position under the job title 'Rule of Law and Responsible Tech', has recently launched its 'Cloud for Sovereignty' service, which will allow clients to specify the country or region for most service deployments to satisfy industry, national, or global security, privacy, and compliance requirements.89 As a result, computational systems are designed based on specific, value-laden logics and procured through a complex web of deeply integrated trade flows that leave no room for legal imagination, let alone intervention. Doing things differently seems unthinkable.90
It is unrealistic to expect institutions, NGOs, or local communities to build their own mobile operating systems or cloud environments, not least because of the knowledge- and capital-intensive character of the task, issues of interoperability, or the presence of market gatekeepers. They can, and probably should, build their own data infrastructures (and some universities have been moving in this direction), but it is difficult to imagine such initiatives scaling on their own beyond basic storage and processing capabilities. Indicative of this difficulty is the recent collaboration between Hugging Face, an open-source machine learning (ML) community, and AWS for the supply of infrastructure for ML training and inference.91 Equally, worker-backed solutions for the management of infocomputational resources are difficult to envisage and pursue. Although workers' organisation and mobilisation are critical in disrupting the governance status quo and triggering the dynamics for institutional change, the pragmatic characteristics of the means of infocomputational production render traditional methods of claiming ownership over the means of production ineffective. As Burrington argues: [Shifting] ownership of the means of computation is not as straightforward a process as workers taking over a factory or a mine. With Internet infrastructure we're not talking about a discrete piece of property that can be autonomously taken over: it's cables and antennae and spectrum and all sorts of very expensive stuff that requires specialised technical maintenance, not to mention coordination with other interdependent systems.92
As a result, the current status quo is anything but fertile for the cultivation of those social practices and communal relations required for the formulation of sustainable commons-based infocomputational resource systems. Faced with similar difficulties when thinking about the politics of intellectual property, Boyle insightfully drew on the environmental movement and the way it emerged from the ideas of ecology and welfare economics.
For Boyle, it was these ideas that 'helped to provide its agenda, its rhetoric, and the perception of common interest underneath its coalition politics'.93 By tying together issues that would otherwise remain segmented, the environmental movement offered a new normative anchor and, subsequently, new conceptual and analytical tools that ignited the institutional and social dynamics necessary for the formation of diverse political alliances. Inspired by this remarkable socio-political force, Boyle envisioned a similar movement for breathing new life into the concept of the public domain.
This was, and still remains, a powerful idea. Environmentalism and the public domain are both concepts with deep historical roots that orientate people and institutions towards value-laden thinking patterns and courses of action. Just as environmentalism's roots can be traced back to ancient civilisations, the public domain, and the principal assumption on the basis of which law and policy (should) interfere with it, namely that intellectual property rights are the exception rather than the norm, emanate from somewhat natural tendencies and entitlements of human beings, eloquently described by Justice Brandeis:94 [T]he general rule of law is, that the noblest of human productions (knowledge, truths ascertained, conceptions, and ideas) become, after voluntary communication to others, free as the air to common use.
But what about the programmable commons and the management of infocomputational resources? Where do these come from? What's their history, and where does our link with these resources actually lie? What's the baseline argument from which norms and policy emanate?
The rapid development and penetration of smartphones and cloud technologies in large markets did not give policymakers and civil society the time, space, and resources necessary to properly scrutinise, analyse, and mobilise against their impact, power, and affordances. Indicatively, GPT-4's foundational paper (its 'System Card') explicitly acknowledges that at the time of release 'many risks still remain'.95 Inevitably, we were led to regulatory discussions directed at the outputs of digital technologies rather than their end-directed design. In such a context, for example, hardware was not part of the regulatory debate.96 As a result, contrary to the early discussion on the environmental commons, whose normative and intellectual lineage traces back to the ideas of ecology and welfare economics, the discussion on the new commons and the governance of digital infrastructures takes place within a pre-configured universe where perceptions of the transformational potential of people and institutions are determined by what private actors deem possible and acceptable.

There is no formative movement (like that of intellectual commons) that marked the political and legal history of other resources; there is neither a legal playbook nor a rich intellectual history from which to draw normative inspiration or legal precedent. From data protection and human rights to consumer and competition law, lawyers have been puzzled as to how to respond to the multifarious challenges of computational technologies.
The tight bonds and path dependencies that hold the public and private sectors together leave little room for thinking about and building alternatives. More recently, however, alongside the plethora of resources on the need to create and sustain data commons, an emerging branch of scholarly work has been focusing on the 'commoning' and democratisation of computational resources. Verdegem points out that if we are to confront the power concentration of AI capitalism, data commons and computational commons cannot be realised separately.97 In discussing the governance of such resources, Verdegem argues for commons-based solutions to break up the concentration of computing capacity and AI talent in a handful of technology companies.98 Grossman and others' work on data commons for scientific research uses the example of the Open Science Data Cloud to emphasise the need to account for computing capacity (from storage to software stacks for management across PoDs, and analytics methodologies) when thinking about and designing data commons.99 In a similar context, Riedl uses GPT-3 as a case study to warn against the centralisation of decision-making over access to computing capacity for AI system development.100 Linked to that, an oft-cited study by Ahmed and Wahed demonstrates how access to extremely costly computational resources can negatively affect the diversity of both participants and output in AI research, confining the latter within a closed system of elite universities and powerful technology companies.101 During the last five years, more and more technology companies have preached their commitment to making data available to researchers and the public.102 But providing access to data and information without a parallel commitment to providing or building the computational capacity necessary for processing large datasets does nothing to mend the power asymmetries in the political economy of AI.
By overstating the value of open-access initiatives and by battling over access to data, companies, well-meaning data collectives, and the public cement already existing inequalities in the distribution of power over computational resources and reinforce the assumption that, contrary to databases, computational resources are not there for the taking. For this reason, the political economy of AI requires the pursuit of a more transformative agenda.

The public value of programmability
How we view ourselves vis-à-vis the technological environments we 'inhabit' matters. Ostrom's foundational work on commons emanated from the assumption that individuals are not inherently incompetent, irrational, or evil but agents who share the same 'limited capabilities to reason and figure out the structure of complex environments'.103 This led Ostrom to observe that the origin of a 'tragedy of the commons' is not some inherent characteristic of human nature but rather the specific institutional choices over the constitutive and operational rules for the management of a particular resource system. In the same spirit, the assumption made here (which also disciplines the argument) is that people who use a smartphone or a cloud environment are not ignorant, irrational, or apathetic agents, indifferent as to how the systems they interact with are (re)configured, but agents who lack the technical and institutional capacity necessary to pierce the veil of their interfaces and push for techno-legal changes in their environments' underlying systems.
During the last decade, the technology sector has witnessed a plethora of movements, litigation strategies, and campaigns for more transparent, egalitarian, and democratic digital arrangements.104 In the computational sector, activists, engineers, and academics have been raising awareness of the impact that Microsoft's acquisition of GPT-3 can have on innovation.105 Apple faced a huge backlash when it tried to add novel capabilities to iOS 15 with the introduction of an on-device client-side scanning system for the detection of child sexual abuse material (CSAM) in iMessages.106 More recently, commenting on and accurately predicting the trajectories of immunity certification systems, Milan and others observed how the functionality of immunity passports 'risks enacting an open-ended, digital and largely privatised infrastructure for proving things to entities'.107 Finally, during the pandemic, motivated by concerns over the power and ubiquity of infocomputational resources (data and sensors), a group of computer scientists, privacy engineers, epidemiologists, and others came together to build and successfully campaign for the mass adoption of a privacy-preserving protocol for proximity tracing to combat the spread of COVID-19.
Underneath these stories of collective action and mobilisation rests the belief that programming and reprogramming the systems, architectures, and technologies that mediate and sustain our individual and collective lives is a global project that requires not only accountability and transparency but an active commitment to certain values. The programmability of such systems is understood by the institutional actors involved as a political endeavour to be undertaken by and for the people, whilst the diversity of pertinent movements and initiatives is viewed as a form of institutional action aimed at building capacity and modes of democratic governance. In such a context, participants in infocomputational systems are simultaneously users and potential appropriators and/or providers of infocomputational resources.
OpenAI's models were trained on publicly available data (ie books, webpages, and images), whilst Google's Bard was released prematurely, aspiring to get better through users' 'feedback' (a term that, by necessity, means 'free labour').108 Unsurprisingly, neither OpenAI nor Google has disclosed the exact source and nature of the training data, claiming reasons of competition and safety.109 Scholars have rightfully condemned the lack of transparency in the process.110 But regardless of whether OpenAI will abide by administrative mandates of transparency, there are important questions that must not be ignored. What happens when the generative potential of the publicly available informational wealth of, literally, the entire world is left to be appropriated by those companies with the logistical capital and computational capacity to amass it? To whom does this informational wealth belong? Shall companies like OpenAI/Microsoft or Google be left alone in creating monetisation opportunities and business models based on this wealth?
Recognising the public value of programmability changes the normative and political currents in the techno-legal sphere. That is, we no longer aim at regulating private entities solely because of their immense power in controlling and monitoring online behaviour and determining possibilities for public policy, but because they have come to exploit resources that should have been distributed, maintained, and managed according to different values and priorities. Born out of the natural-law-like admission that infocomputational resources shall primarily benefit the needs of the people who (want to) use them, programmable commons emerge as a viable policy option for economic and social action.
There are, at least, three reasons why thinking and (institutionally) acting in a commons-like manner can prove transformative. Firstly, at an operational level, developing a toolkit for understanding and shaping policy towards commons-based organising and governance can unify previously separated political agents and forces. In this direction, the vision for programmable commons and the public value of programmability embrace and accelerate the transformative potential of existing movements by connecting agents and initiatives that would otherwise remain separate. In such a context, end-to-end encryption and net neutrality, journalists' and workers' protections for whistleblowing, the development of technologies for the protection of human rights by design (ie Privacy-Enhancing and Protective-Optimisation Technologies), strategic litigation for securing digital rights or fair digital markets, campaigning against the surveillance of workers, smart-city residents, or migrants, and so many other fronts of active mobilisation and resistance all become parts of one and the same political project, of one and the same movement. As such, the vision for programmable commons promotes forms of social and institutional action that seek positive changes in the ways our digital infrastructures are (re)configured and motivates people to get politically involved to challenge and question the established political order and power asymmetries.
Organisations of various forms and scopes, from international organisations to local collectives and workers' unions, are thus becoming parts of a larger whole whose goal is to establish the constitutive requirements and guiding principles for the future of our digital infrastructures. A new body politic is thereby created; a body politic whose institutional emergence can serve as a viable alternative to the dominant practices and mainstream culture of self- and co-regulation.
Regarding the institutional form of programmable commons, options may vary. Past experience warns against treating multistakeholder schemes as equivalent to participatory and collective forms of digital organisation and Internet governance. Possible alternatives include: an international network of regional/local networks of digital organisations explicitly oriented towards challenging private power concentration in the technology sector; a network of similar digital organisations affiliated with political parties worldwide; community-led, community-driven data and network collectives; a regional or global political coalition; or an international organisation to supervise, monitor, and intervene in the (re)programming of digital infrastructures, among others.
Secondly, at a normative level, programmable commons and the public value of programmability enrich the policy toolkit and legal vocabularies. Like other global commons, the formulation of programmable commons requires not only bottom-up organisation but also a top-down set of initiatives and political projects aimed at fuelling the dynamics of collaborative action in the infocomputational space. State actors, political parties, organisations, and institutions, as well as the international community, are thus faced with novel challenges and questions of how to encourage and create legitimate pathways for the bottom-up realisation of collective forms of governance for the world's infocomputational resources. In this direction, rather than building national clouds, data commons, or baking 'sovereignty' into infrastructures that are co-opted and co-controlled by private actors, policy priority is given, amongst others, to: 1) the development of computational technologies (hardware/software) through economic incentives, access to design tools, and reservation of manufacturing capacity; 2) questioning (in law and policy) intellectual property regimes and the ownership status of the so-called 'generative transformers' and other computational technologies; and 3) the potential for mandating positive reconfigurations of infrastructural strategies and the technologies they design and produce.
Whatever its institutional nature and normative agenda might be, the new body politic will require an array of powers to discuss the appropriation, provision, and development of infocomputational resources. Such powers can range from soft law (ie engaging in multistakeholder and parliamentary discussions) to hard law (ie mandating the implementation of human-rights-preserving protocols and architectures, obtaining 'fast-lane' access to means of compute and machine learning models, or reserving the capacity of a manufacturing fab for building a chip). In this direction, a formal, public-law and quasi-constitutional acknowledgement of the public value of programmability would boost the dynamics of social coordination and would establish firm foundations for normative generation. In other words, whatever form(s) programmable commons may take, acknowledging, in our social practices and in law, the public value of programmability opens up the possibility of institutionalising the discussion on the kinds of computations we want to see materialising in the world. But as Ostrom warns, '"getting institutions right" is a difficult, time-consuming, conflict-invoking process' rather than a straightforward result of mere institutional design conducted by external authorities.111
Therefore, and thirdly, alongside the creation of new agents and institutional channels in the political reality of our digital world, programmable commons and the public value of programmability can also affect our dialectics, meaning the way different institutional actors 'talk' when discussing challenges and problems, risks and solutions. This is because the questions that programmable commons will be instituted to confront are inherently political and invite potentially adversarial politics. At an international level, law often lubricates adversarial politics by serving as a form of infrastructure for the universalisation of the process through which asymmetrical forces metamorphose into a consensus, a 'collective will'.112 But when human rights are at stake, consensus may not always be feasible, and what is 'value' or 'public' cannot always be set in stone. As a result, we are usually led to the least common denominator. The discourse on AI standardisation serves as a good case in point, illustrating that what is often left outside the 'collective will' may be more important than what has been agreed. As Matus and Veale point out, blending sustainability and machine learning governance: 'Restricting the range of governance tools to only what is possible to standardize itself shuts down broader political questions which standard-setting organizations have never faced head-on'.113 It is for this reason that a commons project for the management of the world's infocomputational resources may need to abandon the legacy of consensus-driven decision-making that has guided the history of internet governance. A more pluralistic politics may indeed be needed; a form of generative politics that, as Mouffe writes, would invite a 'vibrant clash of democratic political positions' against a sanctification of consensus that 'lead[s] to apathy and disaffection with political participation'.114
Imagining pluralistic accounts of governance for the world's infocomputational resources is challenging but promises to accommodate and resolve (or at least allow the expression of) different and conflicting visions for our digital future(s); a feature that consensus-driven theories, such as that of digital constitutionalism, lack.115 Reflecting on their efforts, members of the DP-3T project observed: '[t]he DP-3T project is proof that it is possible to build and deploy practical, scalable, and useful privacy-preserving applications without data that could be abused'.120 Yet for the DP-3T protocol to scale, it required infrastructural support; building hardware from scratch (ie a bracelet for epidemiological surveillance) was simply unrealistic. For this reason, the protocol rested on the wilful cooperation of Google and Apple, thereby allowing these companies to define the final computational parameters of/for its integration.121 Such is the level of entanglement in the tech and policy landscape that ideas are transformed into actionable policy on condition that they are filtered by the proprietors of the dominant computational infrastructures.
On the other hand, alongside the sobering realisation of an omnipresent path-dependency on private computational infrastructures, the DP-3T story also represents an empirical example that challenges paternalistic assumptions and predictions about individuals' lack of incentives and capacity to engage in the management of infocomputational resources at a global scale, and it offers us a valuable precedent to build upon. The vision for programmable commons needs stories like these; stories and political narratives which make bottom-up change look feasible and articulate a way of achieving it, even if, at this stage, it requires dependency on private platforms.122 For if, at the current state of affairs and power distribution in the digital economy, such change requires the active cooperation and goodwill of gatekeepers, future institutional actions and legal-political initiatives may render such goodwill largely convenient, but ultimately unnecessary.
At their core, programmable commons reflect a vision for a form of digital governance free from the asphyxiating terms and conditions that have been set by the market, the state, and their hybrids. Understanding the world's infocomputational resources as commons and challenging, in law and in our social practices, the concentration of power in their management are both valuable steps on the path to reclaiming them. Instead of spending our human capital and political energy responding to whatever new feature or software update is thrown at us, acknowledging the public value of programmability allows us to deliberate on the kinds of computations people envision for their future(s) and to shape institutional actions for their materialisation.
But no commons-based project can be pushed forward merely by theorising. Commoning is not just about understanding commons as resources but about the active pooling of common resources, with a deep connection to the history, culture, and ecology of the place where they exist.123 For this reason, organisations and institutions, from the local to the international, should actively explore options to pool financial resources and human capital in order to strategise and mobilise institutional actions for digital transformation towards more democratic ends. Innovative organisational forms at the regional and international level, as well as networks of local communities and political organisations, may indeed provide the dynamics necessary to breathe life into the vision for programmable commons.
Finally, bottom-up projects aiming at establishing human-rights-preserving architectures for our digital ecosystems, such as DP-3T, need to be supplemented and encouraged by positive legal interventions and initiatives. Viewing technology companies as infrastructures will require us to rethink how we regulate them. Helpfully, states and international organisations have been here before. The history of telecommunication and media networks is a history of national and international legal disputes and political battles over the very nature of the democracy people envision sharing. There are lessons to be drawn from this history, but there are also nuances that require further theorisation and policy attention. More work is needed in this direction.124

Conclusion
The 'thing' we call and perceive as the 'Internet' has changed. The generative Internet, whose experience and potential fuelled people with hope for a more open and democratic future for cultural production, has gradually given way to a digital ecosystem comprised of partially interoperable walled gardens, where the conditions of generativity are determined by what their architects perceive as optimal. Capital and geopolitics determine what people can and cannot do with infocomputational resources and technologies.
In tracing the history of the industrial revolution, Beniger illustrates how the commercial revolution provided not only the material (ie ports and ships) but also the non-material (ie channels of information gathering and exchange) infrastructure that established the preconditions for the industrial revolution. Thereafter, Beniger continues, the advanced industrialisation that followed created an ever-mounting crisis of control, which necessitated a corresponding revolution in technological and logistical infrastructures for rationalisation and bureaucracy: the control revolution.
Data and computational technologies are both products of the same forces. But the rapid expansion, widespread adoption, and pervasiveness of smartphones and cloud architectures have paralysed any attempt to think about, let alone develop policy or build a market for, alternatives. Just like the Internet, they too have spread too fast for us to think of taming them.125 In a digital world fashioned for them but without them, people remain alienated and resigned, capable only of 'using' an online environment or 'submitting' pre-configured preferences.
In such a historical trajectory, understanding data and computational infrastructures as another instance of leveraging control allows us to confront critical questions for the future(s) of our democracies. Towards what ends is the ongoing centralisation of power in computational technologies paving the way? Shall people have a say in, intervene in, and/or disrupt technolegal processes that crystallise path-dependencies and are likely to stick around for generations? Is there a societal need or demand for alternative arrangements in the way we organise and manage the available infocomputational resources?
Accepting the ontological possibility of organising and managing infocomputational resources, rather than technological outputs, will inevitably require us to grapple with, and call into question, the ownership status and profit models of the public-private status quo. Novel institutional possibilities will be generated. Acknowledging the public value of programmability as the foundational principle of the programmable commons can provide the intellectual and legal toolkit for challenging the established rationality of the stagnated reformist agenda and for mobilising towards more transformative institutional actions authentically committed to the collective will of their participants.
The path is neither easy nor straightforward. Commons have histories rooted in decades, even eons, of social interaction. What they illustrate, however, is the power that a collective whole can create and maintain when confronted with the challenge of managing resources that can be used or misused, sustained or depleted. Fundamentally, to change the way our digital world is governed, we need to change the way we understand the complex environments we live in and experience. Only then will we manage to see the world behind our screens not as something that we enter merely to submit choices to pre-configured options, but as a programmable space where our voices are heard and our actions matter. There are stories to inspire us and, hopefully, more stories to be told.