Driverless Governance: Designing Narratives Toward Democratic Technology

What if governance were driverless? What would be its measures of equality? Of happiness? We invented a platform, Sparks Laboratory, to speculate on these possibilities. This ongoing project harnesses designed narratives as a mechanism to explore how pervasive technologies shape worldviews, power and dissent, and how governments might use these worldviews to inform their policy responses to socially complex ethical dilemmas. This paper reflects on our first act of design — the creation of Sparks Laboratory — and our aims to leverage it as an interface for cooperative future-casting about technology and governance. Through designed artifacts from Sparks, we aim to provoke design queries — future-focused philosophical questions and real design propositions — that enable a diverse range of constituents to participate in the forecasting process.


Introduction
"People between 6' and 6'3" who follow swing bands are 23% more likely to adopt this conservation practice. Tomorrow's forecast: inhabitants with this motion profile have been allocated 15% more instances of pleasure… Collective society is, and will therefore be, happy."
GLENDA System Efficiency Reading

We founded Sparks Laboratory as a futuring platform for extrapolating the implications of rising algorithmic decision-making. The lab designs provocations from future "driverless governments" as discursive tools and convenes transdisciplinary workshops to propose more critical applications of these technologies.
Equality Analytics: Stories of Driverless Governance, Sparks' first provocation, documents a fictitious test-site governed by GLENDA, an evolving algorithm that uses citizen data to distribute resources exactly equally. Citizens here use an energy-based currency, which was proven in early experiments to be highly effective in predicting every resident's burden on the state. After long debates about the government's role in guaranteeing its citizens' happiness, the site's original human bureaucrats scrapped their welfare offices and installed a network of entertainment interfaces through which participants claim their fair share. Of course, these rations are modified by sophisticated predictive models that account for every person's likely energy cost elsewhere in the system. Equality Analytics extrapolates dangers and opportunities connected to these particular technologies through three different viewpoints: a maintenance worker performing diagnostics on GLENDA; GLENDA itself; and a group of citizens receiving their "fair share" (Alipour-Leili, Chang, & Chao, 2016). Abstract and open-ended, the film is a futuring tool that facilitates collaborative inquiry from experts in many fields.

Design Context
The projects of Sparks Laboratory are part of a growing design practice in which curious narratives act as the research mechanism for addressing complex issues. Inspired by a proliferation of utopian visions of driverless technology, post-work economies, and ubiquitous artificial intelligence, we founded Sparks to explore such questions as: What are the implications of increased data-driven governance? What are the definitions of happiness in advanced techno-capitalism? How do we (or our machines) distinguish between equality and equity in a quantified democratic state? The ways we position this speculative mode of inquiry within traditional design (and non-design) practices have been quite intentional. This paper reflects on our first act of design, the creation of Equality Analytics, and how it facilitates cooperative future-casting about technology and governance.
The scope of design is expanding in an increasingly complex world. As noted by the design strategist and DESIS Network founder Ezio Manzini, existing market and government structures have been unable to resolve profoundly difficult problems now rising to the forefront of global agendas, from reducing social inequalities to combating climate change and addressing resource scarcity. In the face of these so-called "intractable" issues, governments are looking to social innovation for "solutions that break traditional economic models and propose new ones, operating on the basis of a multiplicity of actors' motivations and expectations" (2015, p.12). According to Burns, Cottam, Vanstone & Winhall, there have been dramatic shifts in design practice as a response: first, in where the "design skills are being applied and secondly, in who is actually doing the designing" (2006, p.10). In this respect, the scope of design has broadened to include not only the artifacts that mediate social relationships, but also the strategies, behaviors and systemic frameworks within which they are embedded.
Fields such as futuring and speculative design create catalysts that generate "a multitude of worldviews, ideologies, and possibilities" (Dunne & Raby, 2012, p.162). By creating real objects from imaginary scenarios, speculative designers use material narratives to make strategic value propositions. The process of crafting these invites debate as "heuristic rather than telic... [a] dialogue with, rather than within, the text" (Levitas, 2013, p.112). Thus we consider this design practice more useful as a collaborative process, rather than a composite of ideal visions. As practitioners who embrace transdisciplinary methods, we foreground the importance of multiplicity and different viewpoints in shaping a successful design. The ways we materialize the concept of driverless governance bring together experts from many different backgrounds and constituents affected in different ways, toward a more holistic vision of decision-making technologies in our world.

Figure 1. Sparks Laboratory Speculation-Action Loop. We fuse the British Design Council's Double Diamond with Joseph Voros' Futures Cone to pace a process of speculative design and co-design. Equality Analytics is a design fiction that sits at the heart of this diagram, informed by signals and trends we discuss below.
Equality Analytics is a critical extrapolation of these technologies and the systems they enable, but for us it is the beginning of a process of democratic futuring. We use the technological facts of our speculative world to shape a design fiction about issues of governance. Our goal is to manifest these possibilities in a tangible way, rich with specifics. The narrative then serves as a springboard within collaborative workshops facilitated by Sparks Laboratory, where academics, designers, government employees and economists can begin to co-design around different questions. These include: how might these technologies shape government worldviews and alter modes of power and dissent? How might these governments respond to socially complex, ethical dilemmas?
Working from new prompts, workshop participants recalibrate these future scenarios, backcast their effects, and critically reflect on their own practice. Eventually, these reflections inform new inquiries by Sparks Laboratory and new sets of design criteria, and serve as an important exercise for participant practitioners. In the following section, we unpack some of the questions raised by Equality Analytics, as we consider the types of behavior change, social rituals, and shifting cultural values that might unfold under GLENDA's rule.

Meet GLENDA: Sparks' Automated Public Servant
"New sensors onlined this week have generated enough data about user energy consumption habits to begin future forecasting. Distribution of equivalent energy shares among citizens has become more precise, with nearly 79% falling within a 150-joule margin of error. The training staff is working with GLENDA to improve these allotments. By month's end, we expect the algorithm to have achieved sufficient self-correcting capabilities to completely eliminate the need for any training personnel."
GLENDA Training & Development Report

Through an artificial intelligence tasked with fair governance, we found a rich framework to expand the debates about automation and machine decision-making. In the process, we had to debate the algorithmic ingredients that make up GLENDA, its implementers' visions of fairness, and the future of techno-democracy and digital human rights. By careful design we also set up a fictional context to imagine subsequent emergent social forms. In the following section, we discuss this construction in further detail using specific details from the creation of Equality Analytics.
In designing elements of Equality Analytics' narrative, we thought it necessary to foreground surveillance and self-tracking technologies, predictive analytics as a tool for resource allocation, and a shift to energy as a unit of currency. These design decisions were predicated on the rise of black-box algorithms, their increasing impacts on our everyday lives, their focus (at least in the context of the government) on saving limited resources, and their need to quantify and categorize data-points. The back-end logic of GLENDA, the governing algorithm, thus references and extrapolates real-world debates around the role machines play in our decision-making.

Self-Tracking and Surveillance Technologies Pervade Everyday Life
Governments and industries quickly embrace new formulas for greater efficiency and profit, often with dramatic collateral effect. In an increasingly polarized United States, the facts Facebook extrapolates about its users based on their digital footprints have a growing effect on civil society and political discourse in the country. The facts Amazon estimates about each of its users based on their digital window-shopping have a dramatic effect on where it sends and stocks unsold merchandise. The scores that credit bureaus create based on credit history ignore historical and systemic racism and affect a US citizen's ability to get a loan, build equity, and protect their children's ability to do the same (Brown, 2015).
Informed by these applications, we designed GLENDA to surmise facts about its citizens' energy habits in order to reduce waste and ensure absolute equality. Each person's energy intake is recorded through self-measurement wearables and IoT urban monitoring infrastructure, data that feed GLENDA's machine learning techniques, shape its forecasting capacity, and alter citizens' rations based on individual history and demographic profile.
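The fictional logic described above can be sketched in code. This is purely illustrative: GLENDA has no specification, so every name, the exponential-smoothing forecast, and the equalizing arithmetic below are our assumptions, chosen only to make the narrative's premise (rations adjusted against each citizen's predicted energy cost elsewhere in the system) concrete.

```python
from dataclasses import dataclass

@dataclass
class Citizen:
    name: str
    history: list[float]  # recorded daily energy intake, in joules

def forecast_cost(history: list[float], alpha: float = 0.5) -> float:
    """Exponentially smoothed forecast of a citizen's next-period energy cost."""
    estimate = history[0]
    for observation in history[1:]:
        estimate = alpha * observation + (1 - alpha) * estimate
    return estimate

def allocate_rations(citizens: list[Citizen], budget: float) -> dict[str, float]:
    """Split the site's energy budget so each citizen's total predicted
    burden (ration plus forecast cost elsewhere in the system) is equal."""
    forecasts = {c.name: forecast_cost(c.history) for c in citizens}
    target = (budget + sum(forecasts.values())) / len(citizens)
    # A citizen predicted to cost more elsewhere receives a smaller ration.
    return {name: target - cost for name, cost in forecasts.items()}
```

Even this toy version surfaces the paper's core tension: the "equality" it produces depends entirely on the forecast model, and a citizen penalized by a bad prediction has no obvious recourse.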

With Ubiquitous Surveillance Comes a Cascade of Data-Points
Recorded data, whether in GLENDA's domain or in the places we live, do not in and of themselves constitute information. By collecting more data, we have an unprecedented opportunity to find correlations and draw inferences from specific measurements, but we multiply the risks of non-rigorous quantitative science. Data are generally ignorant of social history, rarely judge causation, and yet are often confidently endowed with a certain moral authority. In some cases, algorithms seem to account for and counteract human bias, such as an attempt by GiveDirectly to calculate poverty levels using machine learning and image processing to fairly distribute resources in cash-strapped villages in Kenya and Uganda. 1 In other cases, we see the injustices of our society baked into an algorithm's DNA, such as the dollar differences in settlement amounts for people of different races killed by police in the US (Soffen, 2016). 2

As a test-site, Sparks is also entangled in these tensions. Its programmers have adopted a pseudocurrency based on energy use as a way to quantify nearly every micro-decision and predict the likely behavior of its citizens. As GLENDA learns how to better anticipate the energy cost of individuals within different groups, new demographics emerge. Society may become materially stratified along lines of metabolism, while race- and gender-based biases diminish. Teams of "trainers" are faced with the impossible task of parsing GLENDA's decisions, debating its moral implications, and attempting to correct problematic logic.
The availability of data and tools to parse it have made predictive algorithms incredibly complex. To complicate this further, algorithms themselves have become protected proprietary secrets. These technologies are developed with specific purpose, by organizations and corporations that have particular assumptions about the world. In replacing familiar problems with layers of complexity, often through a triad of techniques summarized by Pasquale as "technical complexity, real secrecy and trade secret laws" (2015), unjust activities are obscured and discovered too late to prevent their fallout. This reality was made evident during the 2008 financial crisis (Gapper, 2011). Without transparency, or legibility, the decisions of driverless technologies, their programmers, their bosses, and their shareholders are inaccessible, and their conclusions are ever more difficult to challenge.
These realities manifest in Sparks Laboratory as two major design queries: the effects of the growing unintelligibility of information tracking due to increasing system complexity, and the new ways in which dissent and activism may arise in response.

System Complexity Affects Control, Diagnosis and Maintenance
The test-site, driven by ubiquitous sensing, has within its neural network the capacity to measure everything: a citizen's steps, the calories in their breakfast, their metabolism, life expectancies, their favorite color. While GLENDA grows ever more complicated, the site's civil engineers are faced with the daunting task of maintenance. 3 Incapable of diagnosing apparent errors, engineers must work backwards, experimenting through training their AI rather than delving into its programming. As in analog government, transparency in digital programs allows a range of stakeholders to make better sense of machine decision-making, then dispute and correct it (Tetlock & Gardner, 2015). In a driverless government, how might we democratize algorithmic training to prevent a biased set of policies? 4

In Equality Analytics, data manipulation and hacking emerge as forms of criticality, dissent and activism. Restricted access and conflicting interests within a black-box democracy demand new modes of civic action. Citizens begin to link some energy-saving behaviors, such as slow movements and prolonged periods of inactivity, to obtaining certain kinds of preferred resources. "While [algorithms] have become critical economic infrastructure, trade secrecy law permits managers to hide their methodologies, and business practices, deflecting scrutiny" (Pasquale, 2015, p.14-15). In a site governed by algorithms, how else might citizens revolt? If resources were allocated on an individual basis, would potlucks, communal activities, and sharing be received as a form of protest? As Washington University professor Neil Richards notes, "[w]e need the breathing space to [protest] in an age of digital surveillance" (Simonite, 2016). How might this form of governance reduce the barriers that dissuade a community's participation?

Manifested in a Universal Income Program, an Energy Economy Leads to New Debates about Equality
"The Steering Committee voted 6-3 to implement a rewards system based on caloric rarity. Calibrating site entertainment interfaces will be simple, but will require weekly updates. For example: cherries are the least common calorie per capita at the moment but will likely be fourth- or fifth-least by mid-summer. The Sands Foundation for a Happy Tomorrow has been awarded the bid to develop and integrate user psychology estimates."
GLENDA Training & Development Report

Alongside the technologies of GLENDA, we continued to develop aspects of the managers' worldview. Their perspectives, as technocratic public servants, were rooted in a certain vision of the public good and an optimistic attitude about technology's capacity to facilitate it. Driven by projected resource scarcity, soaring automation and subsequent unemployment (Srnicek & Williams, 2015), GLENDA's domain speculates about new arrangements of economic and governmental services, particularly an economy based on energy (Meyer, 2016) and a safety net guaranteeing a universal basic income. These prerogatives, embedded in an AI's programming, mirror current ethical dilemmas regarding subjective and objective notions of wellbeing.
As automation 5 and growing resource scarcity continue to eliminate jobs, Universal Basic Income (UBI) proposals are once again gaining popularity across the political spectrum for their promises of security, greater equality, and an overall improved quality of life. 6 Unlike cash-based UBI pilots, however, GLENDA distributes resources (effectively measured in energy cost) as a responsive form of public welfare. Citizen shares are calculated by the algorithm to keep the process free of human bias. A new legal definition of equality has emerged from this highly measured entitlement system, focused on residents' equal energy costs. Within GLENDA's domain, the principle is called "Atomic Equality."

Current Definitions of Happiness Affect Public Policy
As we sought to explore the terms of "happiness" in a world where driverless governance was tasked to provide it, we turned to international development standards where the term is already being applied. As stated in the 2016 World Happiness Report Update, "increasingly, happiness 7 is considered to be the proper measure of social progress and the goal of public policy" (2016, p.7), which has led to the UN calling on governing bodies to consider other aspects of wellbeing in their pursuit of current economic, social and political agendas.
Despite a context of resource scarcity, GLENDA was designed to deliver more than its citizens' material needs. This manifested first as entertainment devices through which the government allotted citizens' resources, but quickly the state began experimenting with new ways to encourage conservation behavior. Adhering to the tenets of Atomic Equality, the machines began delivering less common resources to citizens with smaller energy footprints; everyone received their precise ration, but the rarest calories became winnable.
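The reward mechanic above (the rarest calories going to the thriftiest citizens, on top of everyone's fixed ration) can be reduced to a few lines. Again this is a sketch of the fiction, not a specification: the function name, the footprint scores, and the one-item-per-citizen rule are all our illustrative assumptions.

```python
def award_rare_calories(footprints: dict[str, float],
                        rare_stock: list[str]) -> dict[str, str]:
    """Award the scarcest calories to the citizens with the smallest
    recorded energy footprints, one item each, until stock runs out."""
    thriftiest = sorted(footprints, key=footprints.get)  # ascending footprint
    return dict(zip(thriftiest, rare_stock))
```

Written out this way, the incentive problem the narrative explores becomes obvious: any behavior that lowers a recorded footprint, including the slow movements and prolonged inactivity citizens adopt, is rewarded, whether or not it reflects genuine conservation.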
Their manifestation through Sparks Laboratory raises two major design queries: how governance and technology can (and should) intervene in relation to wellbeing, and the human consequences of technologically derived definitions of equity and equality. In aspiring to achieve greater happiness or subjective wellbeing, the notion of utopia "generally intends to produce more happiness or human flourishing through changes in social arrangements" (Levitas, 2013, p.178). The site reveals generally unquestioned aspects of predictive governance, some pointing toward a preferable world order, but others increasingly problematic: the impossibility of objectivity, new forms of (machine) responsibility, and our capacity to hold algorithms accountable for filling in gaps. 8

Limitations of Driverless Government Illuminate Meaning-Making as Part of the Human Experience
Yet, as Schroeder notes: "if meaningful things are going to happen, they're going to be done by people. [Sparks] creates a space in which people's physical needs are being provided for by the system, but their emotional, personal and social needs have to be provided for by themselves and each other" (Schroeder, 2016).
In this respect, can the human experience be completely encompassed, measured and rationalized through predictive analytics and machine learning? What aspects of social complexity might be missed in the filtration and correlations of centralized self-tracking practices (Neff & Nafus, 2016, p.160)?

Design Principles
"Design objects are equipment and media that can be understood in terms of their contextual references and consequences as well as the way in which they mediate human action, thinking and existence, and thus in terms of the worlds that they open up." (Franke, 2016, p.7)

Through this project, we aim to push beyond speculative provocation and instead lay the groundwork for greater collaboration in strategic planning and design work. We use speculative design as a tool to prompt decision makers to embrace their pragmatic instinct and to entertain blue-sky thinking, making new technologies, their implications and possibilities more accessible to many more constituents throughout the design process. Through the practice of creating Equality Analytics, we have begun to hone design principles to guide the future work of Sparks Laboratory.

Curious Narratives Act as Critical Reframing Tools to Structure a Collaborative Practice
By forming a concrete fictional space for experimentation, our intent is to use this narrative structure as a form of framing and original thinking that folds designers, technologists, theorists and bureaucrats of all kinds into a project of future thinking. Narratives can communicate across disciplines in a way that technological details cannot, and in a more intimate fashion.
Design fictions then provide a bridge between complicated technical specifics and an audience's available tools-from their lived experience to research frameworks like the Rockefeller Foundation's City Resilience Framework (2014) featured in Figure 5-for discussing the roles of government and assessing its effectiveness. Gestures of driverless governments are the "affordances" of our narratives, whereby participants can access and debate complicated questions of collective decision-making.
We piloted this methodology at the 2017 VergeNYC Conference with thinkers, practitioners and changemakers from across disciplines. In discussing the worldviews that emerged from our sessions, we co-developed the beginnings of design principles for algorithmic governance. Design for friction, for example, suggested that the many moments of data collection and machine decision-making happening all around us should be made less seamless. Does our passivity reflect our consent? How might these frictions be integrated into our daily interfaces?

Together, Proposition and Critique Prevent "Problem Closure"
Embracing the tensions between pragmatism and blue-sky thinking-which are often set up as oppositional-we try to produce evocative future visions that facilitate conversation; it is not an issue of dilution, but of providing just enough to break out of conventional debate. "Diversity can be particularly important when established research is in a state of 'problem closure,' a situation where the participants believe they already know what the loop of facts and action are, and any new facts that do not fit within that loop are ignored" (Neff & Nafus, 2016, p.51). In constructing a collaborative space for both proposition and critique, we create unconventional openings and opportunities for new stakeholders, viewpoints and approaches to enter the conversation.

Broadening Sociotechnical Imaginaries Facilitates Participation
As Manzini notes, the most important contribution that design for social innovation can bring to a practice is its ability to look at places "through the eyes of the people and communities who live there" and the "people who are starting (or have the possibility to start) to put into practice a new idea of wellbeing" (2015, p.201).
Design fictions can create a space for collaboratively produced "sociotechnical imaginaries" 9 (Jasanoff, 2015, p.4). Toward this end, our method leans on two affordances of design fictions: transdisciplinarity (presented holistically, the effects of new technologies transcend specific modes of study) and shifting temporalities, whereby stories have the flexibility to imagine origin stories, explore near futures, and extrapolate many lifetimes ahead (Hassoun, 2016). In this way, the designed artifacts created by Sparks Laboratory are an interface through which constituents can voice different opinions and direct future design. They bring diverse voices to the forecasting process.
As a next step, we plan to take these insights and principles to practitioners in government and activist spaces in order to confirm, challenge and refine them. As part of a cyclical process, the outcomes will be used to define new spaces for inquiry, provocation and action.

Conclusion
At its core, our project is a call for collaboration: a proposition for creating accessible interfaces for reading complex futures. We are not suddenly living in a society that places different governing value on predictions; rather, we have built a new kind of technology that enables predictive decision-making at a whole new scale. In some cases these capacities are being carefully applied, but often they are not; as the Obama White House advised: "Big data techniques have the potential to enhance our ability to detect and prevent discriminatory harm. But, if these technologies are not implemented with care, they can also perpetuate, exacerbate or mask harmful discrimination" (Executive Office of the President, May 2016). In exploring notions of driverless governance through our work within Sparks Laboratory, we create the connective tissue between developers, bureaucrats, sociologists, economists, and stakeholders to enable cooperative design, design applications and policy.

3 As seen in the Washington Post, Facebook's new algorithm is creating its own reality in the absence of human editors. www.washingtonpost.com/news/the-intersect/wp/2016/08/29/a-fakeheadline-about-megyn-kelly-was-trending-on-facebook/.
4 To achieve a bias-free algorithm training process, more attention needs to be paid to "bias mitigation" practices. For more information, see pp. 6-8 in Big Data: A Report on Algorithmic Systems, Opportunity, and Civil Rights.
5 In 2016 Kingma wrote about the high percentage of jobs at risk of elimination due to technological advancement.
6 Canada's 1974-79 Mincome Project (Lum, 2015) acts as a precedent for recent global interest in UBI programs. Proposed as responses to the future of work and poverty amid technological advancement, pilot projects are being implemented in Finland and Ontario, Canada, as of 2017 (Government of Ontario, 2016).
7 Using "happiness" and "subjective well-being" interchangeably, the 2013 Organisation for Economic Co-operation and Development (OECD) Guidelines define subjective well-being as encompassing three different aspects: "cognitive evaluations of one's life, positive emotions (joy, pride), and negative ones (pain, anger, worry)."
8 At the root of the issues surrounding machine learning and the hyper-centralization of tracking information are questions related to scale and accountability in relation to government responsibility. For more information see: fivethirtyeight.com/features/whos-accountable-whenan-algorithm-makes-a-bad-decision/.
9 Sociotechnical imaginaries can be defined as "collectively held, institutionally stabilized, and publicly performed visions of desirable futures, animated by shared understandings of forms of social life and social order attainable through, and supportive of, advances in science and technology" (Jasanoff, 2015, p.4).

About the Authors:
Melika Alipour Leili is a design strategist and industrial/interaction designer who uses new media and connected devices. She studies the impact of technology on society and the revolutionary role of design in the modern era. Her work explores the overlap of social media, connected devices and advocacy.
Winnie Tsai Chang is a design strategist, illustrator, and graphic designer interested in examining the social boundaries that define and separate us. She believes design can transform how people think about, relate to and reflect on the things they personally encounter in their everyday lives.
Corey Chao is a designer and ethnographer whose work focuses on environment, history and community. He leverages the production process as a tool for coalition-building, constituent-directed inquiry, and political empowerment.