Nudge goes to Silicon Valley: designing for the disengaged and the irrational

ABSTRACT An array of software apps, from fitness to finance, enrolls behavioral economics, and economists, in their product designs, value propositions, or sales pitches, to make products more engaging and to afford users new capabilities in their daily lives. Drawing on 30 interviews with product strategists, designers, and user researchers who work on these self-styled 'behavior change apps,' this paper empirically studies the behavioral economic proposition and its operationalization in routine practices of software development and design. Setting aside the behavioral addiction and manipulation frame that critical work on app design typically summons, I approach behavioral economics applications as market work and tease out the different, co-existing logics of attachment between products and their users that emerge from how market actors decide what product to build, which features to include, and how to design the user experience. In doing so, I show that a strict focus on the frequency of repeated interaction is also empirically inadequate. The product is rather strategized, developed, and designed to become something that the user 'cannot do without,' not because it is addictive, but because it is made indispensable to the distributed action universe of the behavioral problem that it addresses.


Introduction
People were very critical of [Trading App] for exploiting people's gambling addiction. When I worked [there] a big problem we had is people would come to the platform wanting to learn how to trade stocks, then they wouldn't know what to do. They would put money in their account and never trade it. (Interview with a US-based user researcher, Online, 16 June 2022)

Most of the work that the User Experience Research Lead with a PhD in behavioral economics did for the retail trading app, 1 was to devise interventions to get users to place a trade: micro lessons that teach them how to navigate the platform, well-timed messages that comfort and encourage, bespoke features, like 'fractional shares [where] you could trade a penny of [a stock],' that reassure and equip with the right tools – all to create for the user, and get them to exercise, new capacities to act as a 'lay-trader' (cf. Roscoe 2015). That the trading app hired a PhD in behavioral economics for the job was not accidental. In fact, there is now an array of software apps that enroll behavioral economics, and economists, in their product designs, value propositions, or sales pitches, to afford users new capabilities in their daily lives (Wendel 2013). Despite their wide variety of markets, origins, and sizes – at once in the category are dieting and meditation apps, insurance providers, and financial robo-advisors – they all face the same challenge: a user that is prone to disengagement, in a market that wants engagement fast.
Engagement is of course a provocative word. 'A Silicon Valley byword for having users constantly coming back for more' (Kuang and Fabricant 2019, 400), the standard business metric is now considered by critics synonymous with technology addiction and manipulation of consumer behaviors and desires (see e.g. Zuboff 2019). Taking a cue from my interviewee, however, I will argue to the contrary: rather than manipulating the user into over-engaging with the product, behavioral economics applications tackle the threat of disengagement always present in market attachment (Callon 2021; McFall, Cochoy, and Deville 2017). 2 The way in which behavioral economics dispels the threat is by rearranging the existing action routines of users or creating new capacities to act so that the product/intervention becomes the locus and the enabler of action. The market proposition here is that carefully designed interventions into the distributed agencies of users and products can provoke the action (of, say, lay trading) that users on their own lack the agential capacity to perform. The proposition therefore marketizes the effect of nudged agency: agency because the purpose is to produce a desirable target action, nudged because said action cannot occur without the intervention. The latter part of this formulation owes a great deal to behavioral economics' debunking of the rational actor model and proposing the 'predictably irrational' (Ariely 2008) and conveniently intervenable one in its stead.
This paper studies empirically how the behavioral economic position is elaborated as a market proposition, and how it is operationalized in software product development and marketing, in routine practices of product design (user experience, features) and product strategy (value propositions, business models). The analysis primarily draws on interviews with practitioners involved in the field (n = 30), supported with material collected during a year-long, Covid-complicated fieldwork based on multiple digital platforms, to which practitioners migrated to continue speaking with their clients and each other. I interviewed designers, marketers, user researchers, product managers, and strategists who would either qualify their more conventional title with the adjective 'behavioral' (e.g. behavioral product strategist) or occupy the bespoke positions of the applied behavioral economist or scientist in their companies. 3 In interviews, they walked me through their practices around the product as it is being strategized, designed, tested, built, pivoted, launched, continuously optimized, or else dropped.
The empirical emphasis was intentional and meant as a corrective to the literature that largely assumes nudges and other behavioral designs to work as advertised and in the restricted registers of 'online manipulation' (Susser, Roessler, and Nissenbaum 2019) or 'behavior modification' (Zuboff 2019). Instead, I approached behavioral economics applications as market work and sought to tease out the different logics and patterns of attachment that emerge from how market actors decide what product to build, which features to have, and how to design the user experience. The material I weave together shows how behavioral product practices are ordered and how they unfold, to varying degrees of disorderliness, in the wild (Law 1993), in contrast to the more prevalent understandings of nudge in software development that propagate the behavioral addiction and manipulation thesis.
Nudge in software development

'Nudge' seems to capture well what they were trying to do, said the editor reviewing their book, originally titled Libertarian Paternalism (Thaler and Ganser 2015). In this 'ingenious bit of accidental rebranding,' 4 nudge became not only the book's title but also the centerpiece in Thaler and Sunstein's political-normative program of libertarian paternalism, figured as the main device for 'alter[ing] people's behavior in a predictable way without forbidding any options or significantly changing their economic incentives' (2021 [2008], 8). This simple formulation would have a strong performativity, arguably because the word captures succinctly and conveys all key features of its object: nudges are 'nimble,' 'unobtrusive,' suggestive interventions, cheap to avoid for the subject, cheap to implement for the designer (Thaler and Sunstein 2021; Yeung 2017). They target 'seemingly irrelevant' elements in a choice situation, such as changing how a choice is framed, much to economists' amusement, as people's preferences should be fixed and uninfluenced by non-economic factors. Behavioral economics, the intellectual home of nudging, has by now identified more than a hundred 'cognitive biases and heuristics' that make actual decision-making deviate from the rational model and that also serve as targets for nudges to reverse engineer. As much as they are playful, nudges' effects are marked by serious intentionality, predictability, and potency, as they target cognitive and affective weaknesses found in every human being (Mills 2022; Yeung 2017).
The internet is awash with nudges, critics contend, or rather the surveillance economy has turned it into one big hypernudging system (Yeung 2017; Zuboff 2019). The designers of websites, apps, and platforms have been 'exploit[ing] the catalog of decision-making biases that psychologists and behavioral economists have been diligently compiling over the last few decades' (Williams 2018, 33) to align users with business models that hinge on user engagement, data extraction, and exploitative mediation of market exchange (Birch, Cochrane, and Ward 2021; Rahman and Thelen 2019; Sadowski 2019). Uber is a well-cited example, not least for briefly having an in-house behavioral economics team. The ride-hailing app is found to nudge its drivers to 'take up more fares' with features like 'forward dispatch' (automatic queuing of the next ride), push notifications, and heat maps that indicate surge prices, activating a combination of 'default bias' and 'people's preoccupation with goals' (Rosenblat and Stark 2016; Scheiber 2017; Susser, Roessler, and Nissenbaum 2019). More broadly, Yeung (2017) proposes that we can consider all 'big data-driven decision guidance techniques' to be working in the mode of the 'deceptively simple design-based mechanism of influence' of the nudge (119). Unlike their offline, static, and blunt counterparts (e.g. the speed hump), however, these 'hypernudges' (e.g. the Google Maps navigation function) dynamically adjust and personalize the informational choice contexts of users, creating self-contained cybernetic regulatory systems that are 'highly potent.' So potent that Zuboff (2019) locates the power of large platform companies in this very mechanism of influence: 'power,' she argues, 'is now identified with ownership of the means of behavior modification' (691) while the old project of behaviorism is reincarnated 'as a global digital market architecture' (598).
In fact, behaviorism is to software app design what psychoanalysis was to advertising in the heyday of The Hidden Persuaders. Much like the cultural coding of advertising as the technique of subliminal desire manipulation (McFall 2004; Slater 2011), critical work univocally finds 'intermittent variable rewards' to be the design intervention that explains the success of social media platforms in 'hooking' users and keeping them 'trapped,' with routine references to Skinner's box and Schüll's (2012) ethnography of slot machines (see Alter 2017; Eyal 2014; Harris 2016; Kuang and Fabricant 2019; Leslie 2016; Martin 2022; Vaidhyanathan 2018; Williams 2018; Zuboff 2019). While readings of Schüll's book often miss the care with which the author eschews technological determinism, the claim here is that, just like slot machines, algorithmically curated newsfeeds are designed to show dopamine-inducing content 'intermittently,' to keep users coming back for more (Aagaard 2021; Helm and Matzner 2023). Variable rewards are complemented with other behavioral nudges that work to increase the frequency and length of interaction: features like the infinite scroll or auto-replay, whereby stopping a user activity is rendered harder than continuing it, extend user sessions; notifications, or 'triggers,' by calling users back to the system, reduce the time between sessions; habit-inducing streaks and badges gamify the experience and lock users in through the 'investments' they create within the system. The terms 'hook,' 'trigger,' and 'investment' are from product psychology evangelist Nir Eyal's Hooked, 'the closest thing to a bible for designers who want to induce habits in their users' (Williams 2018, 34). In the book, Eyal deciphers the design secrets of large platforms to guide startups in their quest to similarly secure 'mind monopolies'; startups, in turn, 'draw on the latest research in behavioral science to punch the right buttons in our brains as effectively and reliably as possible' (idem).
In these accounts of software design, nudges figure as (1) targeting the mind and 'the nonconscious dimensions of cognition' (Dieter et al. 2019, 4), (2) working as predicted and intended, and as powerful agents, (3) consequently creating the effect of 'manipulation,' understood as 'the covert subversion of decision-making power' (Susser, Roessler, and Nissenbaum 2019), or (4) behavioral addictions when deployed to create repeated and increasingly frequent interaction with the system. The latter is indeed a consequence of the changing regimes of marketisation and assetization of software, where repeated interaction measured through 'the engagement metrics that track length and frequency of use' now plays a central role (Seufert 2013, 97; see also Birch, Cochrane, and Ward 2021; Schüll 2012; Cooiman 2022). 5 Yet, to accept that tech companies are 'in the business of … surveilling and changing behavior,' as Birch et al. caution, is not to argue that their 'techcraft' 'actually changes individual behavior' (4). The behavioral addiction and manipulation thesis regularly conflates the two, inadvertently exaggerating companies' and their designs' potency and influence over the users (Doctorow 2020).
It is even more striking that nudgers themselves have advanced the same critique of attention economy business models, and of the resultant operationalization of nudges in their service, while reminding everyone of the libertarian paternalist roots of nudging (see Lembcke et al. 2019; Thaler and Sunstein 2021). Interventions, they have argued, were always supposed to be in the interest of the subjects, that is, meant to make them into self-interested actors who approximate the rational, fixed-preference, cost-benefit-analyzing, utility-maximizing actor of neoclassical economics. In this proposition, the positive-normative distinction plays an important role: as Heukelom (2014) insightfully argues, even though behavioral economics challenges the descriptive accuracy of the rational actor model, it does not reject it as a normative ideal. In fact, we might add that it links more and more actions with the aspiration for rational action, thereby rendering them 'economic.' 6 Incidentally, the normative insistence on the rational actor model sets these applications apart from the Skinner-type behaviorism with which they are typically bundled, and from the 'dark patterns' or 'sludges' that work against nudgees.
In the decade following the publication of Nudge, the project extended into the consumer product space, as the practical application of behavioral economics grew beyond nudging and, through hybridization with 'designerly ways of knowing' (Cross 1982), into a design discipline, alternately called 'behavioral' or 'behavior change design' (Datta and Mullainathan 2014; Schmidt and Reid 2021; cf. Michie et al. 2013). 7 Nudge travelled into Silicon Valley through more books, trainings, consultancies, and 'gurus' like B.J. Fogg, Nir Eyal, and Dan Ariely, who teach software developers how to use behavioral research in designing effective products (Beattie 2022; Martin 2022; Nadler and McGuigan 2018; Seaver 2019; Wendel 2013). Teachings were more faithfully put into practice in the new app propositions spanning health and wellness, finance and insurance, education and productivity, where 'behavior change is the core value of the product for the user' (Wendel 2013). Here, behavioral economics figures more centrally and explicitly in business and product development: even if the idea still is to create sticky apps that users 'become tied to and cannot easily leave' (Sequoia 2018), these apps help users stick to a diet, an investment plan, a learning schedule – in any case, a rational long-term goal. Consequently, innovative, although at times ambivalent, takes on 'engagement design' proliferate (Beattie 2022; Jablonsky, Karppi, and Seaver 2022). An important one for our purposes in this paper is the frame in which in-app engagement and 'behaviors within the product,' like 'logins, clicks, swipes,' are figured in service of a real-world behavioral outcome 'within the users' daily lives,' like meditating, sleeping, or stopping smoking – dubbed 'big E engagement' (Cole-Lewis, Ezeanochie, and Turgiss 2019; Wendel 2013).
The behavior change business intersects with other trends in the cultural economies of products and users, namely, the co-optation of the attention economy critique and technosocial practices of self-optimisation (Jablonsky, Karppi, and Seaver 2022). Silicon Valley and its gurus' move away from 'persuasive technologies' (Fogg 2002) and 'habit forming products' (Eyal 2014) to 'good habits' (Fogg 2019) and products that'll make you 'indistractable' (Eyal 2019) can be read as a response to the prominence of the technology addiction critique. The prototypical example of this trend is the meditation apps that ironically confer 'attentional sovereignty' on their users and encourage them to disconnect and digitally detox (Jablonsky, Karppi, and Seaver 2022). These innovations afford new subject formations, capabilities, and agencies to users, yet for the most part, critics conclude, they discipline users towards socially and 'neoliberally' desirable traits and ends (e.g. Berndt 2015; cf. Schüll 2016) while continuing the routine manipulative practices in software development (Sax 2021). Sax (2021) is a good example of the latter tendency. In reviewing health apps like MyFitnessPal, Fitbit, and Headspace, the author concludes that there is a discrepancy between the stated aims of the products (i.e. helping users lead healthy lives) and what their design appears to in fact optimize for in accordance with the business models (i.e. short-term engagement). The mindfulness and meditation app Headspace, for one, 'entice[s]' users-to-be by evoking 'people's natural desire for health,' 'luring people to the app' with free packs of meditation sessions, yet with the aim of ultimately increasing their engagement with 'revenue-generating features (e.g. premium features) and material (e.g. (native) advertising)' (348-50).
Let us note that these analyses do not study the actual practices of design empirically, keeping their focus strictly on the idealized effects of design choices rather than what drives the design process (Ash et al. 2018). And neither are 'luring in,' 'enticing,' and 'engaging,' as analytical frames, exclusively within the purview of technology ethics scholarship. Refigured as the problem of 'attachment,' that is, '[Why] consumers attach themselves more to some goods than others, to the point of agreeing to pay for them?' (Callon 2017, 180), it was 'claimed for sociology' and studied diligently in the intersecting worlds of economic sociology, cultural economy, and market studies (see McFall, Cochoy, and Deville 2017). This literature has likewise produced original and provocative interventions on the notions of 'agency' and 'action' that the discussion on manipulative software keeps invoking. I will now review this work, which imparts the tools for a rather 'interesting' analysis of behavioral economics in software development, if we take the accounts that stimulate interest to be those that 'constitute an attack on the taken-for-granted world of their audience' (Davis 1971, 311).

Market attachment and distributed agency
The analytic of 'attachment' is at once a conceptual and empirical intervention. Work in this tradition attends to the fragility as much as the resilience of market attachment, to 'the uncertainty, guesswork, sentiment, luck, mystery and failure that is also inherent in attachment' (McFall, Cochoy, and Deville 2017, 10). For example, Ash et al. (2018, 1138) challenge the 'assumptions about the smooth manipulation of user action and experience' in their empirical study of interface design practices behind High-Cost Short-Term Credit products, and instead theorize them as 'an experimental process of managing friction.' In addition to being a process of tests and trials, attachment also operates under various logics and modalities. 'To entrap is not necessarily to manipulate,' Cochoy (2007) remarks: 'to respond favorably to information, an advert, a commercial offer (to be captured) does not necessarily proceed from an error, a mistake in understanding or a cognitive imperfection, as is implicitly assumed by the notion of manipulation' (206-7). There can be willing submission, or 'reciprocal manipulation' launched by user-consumers who have their own plans and programs. This intricate interplay 'between hunter and prey' is emphasized further in the anthropology of traps, which, Seaver (2019, 7) argues, 'drew no essential distinction between mental and physical capture, suggesting that trapping itself may always be both material and mental.' Seaver turns to this literature on traps to make sense of algorithmic recommender systems that try to 'hook' users.
The 'hooking' and the 'attaching' take on a particularly iterative and data-driven form in software products under the paradigm of agile programming, while the mundane materiality of market attachment presents itself as another key focus of the literature. Gurses and Van Hoboken (2017, 19) observe that developers across services 'continuously tweak, remove, or add new features using "build-measure-learn feedback loops."' Contemporary software systems are not only packaged market commodities but marketing devices 'listening in on' their users, which in turn are 'looped back into production in a tighter temporal frame than imagined in most market models' (McFall, Cochoy, and Deville 2017, 5). The increased 'volume, velocity, variety' of behavioral data capture radically transforms the processes of learning about and intervening in user behavior. However, as case studies of behavior-based insurance apps show, companies' ostensible allusions to behavioral tracking and behavior change techniques do more work to singularize the brand and create 'brand attachment' than to accomplish effective behavioral modification per se (Jeanningros and McFall 2020; Tanninen, Lehtonen, and Ruckenstein 2021). And even when apps do modify behavior – indeed, that is how they produce durable attachment, by getting entangled with 'users' routines of action' (Jervis 2020) – they do so in rather banal, unremarkable, everyday ways. To emphasize this, Morris and Elkins (2015) call apps 'mundane software,' 'software that spreads out beyond the computer and into a vast range of everyday routines and activity,' although with significant material consequences, as they 'distill' from these complex patterns 'partitionable processes that can be converted into software solutions' (65-76).
This last point summons the third key insight from the literature, one that will be central to our remaining discussion: the notion of 'distributed agency,' which denotes that action and meaning spring from assemblages of humans and non-humans, technical and textual elements (Callon and Muniesa 2005; MacKenzie, Muniesa, and Siu 2007). Importantly, if differently aligned, these assemblages or 'socio-technical agencements' (Çalışkan and Callon 2010) offer different possibilities for action: a calculative agency is performed differently with the double-entry bookkeeping, the stock ticker, the shelves, packages, and price tags than it is without them. The role of theories and artefacts in producing market action is well elaborated in sociological studies of markets, since Michel Callon's field-defining provocation: 'homo economicus … is formatted, framed, equipped with prostheses, which help him in his calculations, and which are, for the most part, produced by economics' (1998, 51). Callon (2008) has also attended to the literal and not only figurative prostheses, in studying what he calls 'prosthetic' and 'habilitation projects' that seek to 'restore the lacking competencies' of persons with disabilities. These projects, he has demonstrated, create different agential consequences, while constituting a distinct modality of intervention into agency, one that is purposive and strategic.
Contemporary social and behavioral sciences, more generally, exhibit an 'interventionist' approach to socioeconomic relations and agencies, which they treat 'not as something given and unchanging,' as is the assumption in the performativity thesis, 'but as a set of activities, patterns and forms that may shift, expand and are thus transformable' (Marres 2017, 159; Marres, Guggenheim, and Wilkie 2018). This is the line of argument I wish to take up and further here, for behavioral economic science and its corollary program of nudging fit right into this paradigm (Muniesa 2018). Animated by the normative-positive distinction, behavioral economics attempts to sociotechnically engineer the homo economicus into existence, by modelling irrationalities that then serve as the bases for intervention (Heukelom 2014; Muniesa 2018). 'One could frame the interventions in question as sociotechnical medicine that assembles a carefully arranged network of humans and non-humans,' Berndt and Boeckler (2016, 23) observe. 'In this assemblage, agency is purposefully designed as being distributed between heterogeneous elements' (emphasis mine). We might add that not only is agency purposefully designed to be distributed, but also 'the intervention' is strategically constructed as the 'agential peak' (MacKenzie 2008), or the key 'actant,' in the distribution. Self-tracking products are a good example of this: they are designed in a way that the user 'passively delegate[s]' the actions to the devices that micro-nudge them: 'these devices transfer the burden of tracking – and, in some cases, behavior change – from selves to sensors and computational algorithms' (Schüll 2016, 323).
Building off this insight, the rest of this paper will unpack the strategic design processes through which 'the burden' of certain actions is 'delegated' to behavioral economic interventions. The behavioral economic proposition hinges upon the attribution of responsibility for the action to the behavior change product. It is self-conscious and explicit about this fact of affordance: nudged agency is what it sells. On a deeper level, it also depends on the reconfiguration of the user's network so that the action itself, and not only the responsibility for it, is durably delegated to that specific product and not any other device. This, I propose, constitutes a distinct logic of market attachment, and in the next three sections, I will show how it is activated and operationalized in market practices.
The argument is informed by 30 in-depth interviews that I conducted over Zoom, with practitioners broadly involved in the field, and in various localities in North America, Europe, and South(east) Asia. 8 I triangulated the interview findings with documentary and ethnographic evidence that I could 'scavenge' (Seaver 2017) on various platforms like LinkedIn, Slack, and WhatsApp, virtual workshops, trade conferences, and other events hosted by the community, and public or privately shared documents like decks, blogs, reports, manuals, and trade publications. In interviews, we broadly discussed their work practices, focusing on moments and acts of decision-making and evaluation around the product at different stages of its development. The following empirical analysis weaves together fragments from these conversations to show, first, how the behavioral economic value propositions are constructed and what kind of market attachment logic they summon; second, how this logic is actualized in user research, product strategy, and interface design; and third, how it compares, contrasts, and at times hybridizes with predominant modes of production in the attention economy. The three empirical sections are then followed by a discussion where I reassess the suitability of behavioral addiction and manipulation frames for studying software design, given the empirical insights uncovered in the previous sections.

Constructing the behavioral economic proposition
Behavior change product space is large and heterogeneous in the consumer activities, markets, business models, financing arrangements, even the particular traditions of behavioral research that are assembled in the development of different products. Notably, most digital health apps rely on motivational psychology and self-determination theory, and at times explicitly oppose behavioral economics' limited framing of human judgment and decision-making (Bucher 2020; Villalobos-Zúñiga and Cherubini 2020). I will nevertheless argue that it is a uniquely behavioral economic problem that forms the basis of and unites the disparate market and expertise offerings in the field, at once performing the narrative function of business models (Doganova and Eyquem-Renault 2009; Geiger 2020) and helping behavioral practitioners frame the meaning and value of their work. This is the problem of 'hyperbolic discounting,' a cornerstone of behavioral economics science and its applications alike (Heukelom 2014). The managing director of an applied behavioral economics consultancy in San Francisco colloquially explains it as follows:

We tend to favor things that give us immediate benefits and deprioritize and undervalue things that give us a future benefit … I favor sitting on my sofa watching Netflix … I'm less likely to get up from my sofa and go for a run … It's a hedonic immediate pleasure versus a long-term functional benefit … Health is a classic example of that, finance is obviously another classic example: by putting money away for the future, I'm not getting to spend it on something enjoyable today. (Interview with a US-based consultant, Online, 18 August 2022)

'Behavioral science,' she continues, 'is a great fit for anything that requires self-control or delayed gratification,' and that is why 'health, fitness and financial decision making are some of the biggest areas' in which they work and make an impact, despite the consultancy's famed collaborations with large platform companies. Of the people I interviewed, most primarily worked in these domains – and all, at least once – such that one practitioner called theirs the 'fitness and finance' industry. The trade literature of behavioral product design also mainly supplies case studies from these 'most active and exciting areas of application': the wearable space, medication adherence, mobile dieting, investments, and day-to-day money management (Wendel 2016, 98-102; also see Andorsky 2020; Bucher 2020; Wallaert 2019; Wendel 2013). Noom and Betterment are two classic examples, illustrative of the field's co-existing divergences and convergences. Noom is a dieting app; Betterment is a financial robo-advisor. In the app economy where barriers to entry are low, 'psychology is the lynchpin of Noom's business model,' informing the 'habit-based' approach to weight control that differentiates the company from market alternatives that promise quick yet impermanent weight loss, while bolstering its subscription-based revenue model (Thau 2021). The app embeds behavior change techniques like 'goal setting, feedback, self-reward, social support' to endow users with the capacity to 'explore and develop habits.' 9 Betterment's revenue comes from 'management fees,' yet its approach to investment is similar: investment is a lifelong activity, and for just as long it is under the threat of investors' 'irrational choices' (Hayes 2021). Behavioral finance is what sets it apart from other financial advisors: Betterment 'explicitly applies' lessons from the literature 'to help investors avoid common mistakes,' such as adding a 'tooltip' to their design that warns users 'about the dangers of hasty withdrawals' 'during market downturns,' where anxious investors wrongly 'remove their money from the market' when 'their investments are then at their lowest value' (Wendel 2016, 101).
The value proposition for users, then, is the product's superior efficacy and effectiveness in achieving long-term, lasting behavior change or behavioral correction, by using science. The business, on the other hand, purports to create and capture value from long-term sustained behavioral outcomes, and not the short-term engagement or enjoyment that the product inspires in the user-consumer. In other words, the products are economically valuable and profitable insofar as they are 'behaviorally effective'; the product delivers sustained revenue or 'an interested, engaged audience for advertising' insofar as it succeeds in enacting the targeted behavior (Wendel 2013; cf. Sax 2021). That is why for products targeting 'a repeated behavior that people often want to change in their lives,' standard business models like subscriptions, freemium, or advertising 'work well and align user success with business success' (Wendel 2013, 283). 10 Behavioral economics is an ingredient in the value proposition, but behavioral economics is also the backdrop against which this proposition makes sense. Both Noom and Betterment assume people want to act rationally but fail inexorably due to cognitive biases they cannot overcome. This failure is not a disease that can be cured, but a disability that needs to be compensated for with 'prostheses' (Callon 2008): life-long, habit-based devices that constantly and dynamically correct the user and nudge agency towards rational behavior. Homo economicus, now chronically dependent on his prostheses, pays – or gets subsidized – to become and stay rational.
This makes the behavioral economics approach a distinct 'sociology' of users, products, and attachments between them (Cochoy 2007), putting in motion a development process focused on producing and monetizing the action that the product targets, while conjuring up a figure of the user significantly different from the one we commonly encounter in accounts of manipulative and addictive software. We will now unpack what this process entails.

Designing for the disengaged, rearranging the actor-network
Because the end goal is to be the enabler of a particular action, behavioral product strategy 'starts at the end' (Wallaert 2019) by asking 'what is the action that the user is trying to accomplish?' and works backwards. A freelance behavioral product strategy consultant from the US, who specializes in early-stage productivity applications, illustrates this: One of my clients told me at one point that they want to replace a good amount of your time that you spend on Facebook news feed with their app. If you're approaching this with the behavioral lens and you're trying to think 'Okay, what are people doing when they're on their newsfeed?' It's essentially an exploratory search where they don't have any goal and they are low energy. They're looking for a goal and that's what the news feed does for them. Do you have anything that could fit into that gap? … because you're not going to replace that time with a thoughtful activity. (Interview with a US-based product strategist, Online, 25 April 2022) This is 'outcome-driven innovation,' where 'effective customer segmentation relies not on … demographics or location … product features and prices - but rather on a deeper understanding of what the consumer is trying to accomplish' (Thompson 2018). Part literature review of behavioral scientific research, part vernacular theories and practice-based knowledge of the expert, the behavioral approach, by contrast, gets to the bottom of what the consumer is really trying to accomplish ('do people want to lose weight or actually do they want to feel more confident?'), as 'a behavioral scientist favorite tagline is people don't always know what they want.' 11
Acting as 'a technology of revelation' (Schneider and Woolgar 2012), the behavioral approach 'reads between the lines' of what 'your users are telling you' 12 and offers a deeper, truer, and causal understanding of user behavior that neither the unreliable user-centric research methods nor the correlation-based data analytics tools can supply.
Applying a behavioral lens, however, is not only about knowing, learning, and reasoning about user behavior, but also about singularizing, positioning, and ultimately 'marketing' the product. The startup literature more broadly conceives of this co-elaboration of specific user needs and the corresponding product features as 'the essence of product strategy' (Olsen 2015, 7). The following example is illustrative of the behavioral version: a behavioral design and gamification consultant who helps web and mobile-based companies optimize their user experience was hired to build, from the ground up, a fitness app that aimed to 'target specifically people that don't enjoy exercising.' Knowing that 'extrinsic motivation works for people that are not naturally interested in an activity,' they designed a micro-incentive-driven rewards feature that gives users discounts at partner stores every time they hit a prefigured milestone. They did not, by contrast, include a social feature that, e.g., allows users to compete with their friends, like the one market incumbent Fitbit has, because 'Fitbit is not built for motivation, it's for people that already like exercising' and that is why 'they have community [as] a core feature.' 13 We can see here how behavioral categories that help define the product's customer segment (motivated versus unmotivated users) guide selection in the 'modular, ever-expanding feature space' (Gurses and Van Hoboken 2017), 'singularizing' the product through differentiation from and imitation of the alternatives in the market (Callon and Muniesa 2005; Callon, Méadel, and Rabeharisoa 2002). 14
We can also notice the qualities of the imagined user-consumer of the product. The users enrolled are users who are unmotivated, who resist, and who are difficult. The field's overall emphasis is on dealing with the hard, 'wicked' problem of 'behavior change': the offering is to change complex behaviors that need to be studied, modelled, and intervened in carefully (Bucher 2020; Schmidt and Reid 2021). Importantly, this equally applies to consumption and technology usage behavior, in stark contrast with the accounts of the same behaviors typified in the design manual Hooked. An independent behavioral insights and strategy consultant with a 15-year career and a large following on LinkedIn reproaches the book for creating a 'Zombie-like caricature' of users/consumers, in the following excerpt from one of her highly engaged posts: [The] lack of empathetic consideration for users/consumers leads to a kind of Zombie-like caricature - and of course I can see why it's every marketer's dream to be able to increase the business' return on investment indefinitely with consumers so "hooked" (addicted) to their product and watch the money roll in … unfortunately life and behavior aren't as simple as that - not even if you are training dogs instead of trying to "hook" people to using your products. Behavior is complex, contextual and variable - even the best dog trainers in the world cannot achieve the kind of "addiction" Eyal is promising you can with these techniques. So how could it possibly be so simple in humans? (Halonen 2022) The consultant refers not only to the peculiar problem of behavior change and the special status of the unmotivated behavior change subject, but to the broader problem of market attachment and the user/consumer whose 'default state of being' is 'not using your product': 'you're trying to get them to do something different in using your product … Remember, you're competing against doing nothing and against pre-existing habits.'
(Haisfield n.d.) The evidence that supports this view ranges from general consumer behavior explanations, such as 'the psychology of new product adoption' (Gourville 2006), to those particular to the 'app economy,' which emphasize the abundance of apps that enter the market, are downloaded, 'tested/tried' and 'forgotten,' never to be re-engaged with (Morris and Elkins 2015). Computational experience, distributed across the many apps competing for attention, is thoroughly compositional; it cannot be comprehensively designed and thus controlled (Dourish 2021). This is the flip side of excessive engagement: designing for the always almost disengaged user.
Most behavioral design work corresponds to the optimisation of user flows to prevent users from dropping out at different stages due to distractions, negative reactions, or general disinterest (Wendel 2013, 40). The first target is onboarding, which in software app development refers to 'the procedures for establishing or otherwise setting up a new user account' (Dieter and Tkacz 2020, para. 5). Attaching during onboarding is considered important in the industry, for products across the board, as reflected in the wider startup literature: 'A user's first session with a product is a critical determinant of the user's lifetime with the product; it is therefore worthy of the product team's [attention] when trying to optimize the user experience' (Seufert 2013, 98). One deadly sin is to ask open-ended questions during sign-up, as it puts extra cognitive burden on the user (Ariely, Hreha, and Berman 2014). Behavioral designers well versed in gamification pre-emptively design for 'failure states' to bounce back from drop-offs. A common intervention for onboarding optimisation is a 'commitment device,' an example of which was given by a 'serial founder' and entrepreneur who at the time of our interview was the CEO of a digital creative agency with a behavioral emphasis. He recounts how they increased the 'pull-through rates' of the application process for a client that gives personal loans (similar techniques are observed by Ash et al.
2018): We said, 'We know that your financial future is very important. It's important to not only you but also to your family and those that you care about most. Help us understand what the purpose of the loan is,' and we had a bunch of options. One was to provide relief for medical bills, etc. One was to provide vacation. You click it, and whatever you clicked, in the next step of the application it would have that there for you. So, you were constantly reminded that there is this sort of commitment device … we pull that experience that was tied to this emotional personal goal through the application itself. (Interview with a US-based entrepreneur, Online, 18 March 2022) This is a typical nudge, acting, in its typical fashion, as a 'critical actant' in the action of applying for a loan. Yet one needs to be careful with nudges and their agential power. First, all interviewees cautioned that a deeper appreciation of the context is a precondition for a nudge to work as intended. A report by the famed behavioral economics consultancy Irrational Labs confesses that the commonly used tactic of 'loss aversion' 'is a tricky force to wield - users are just as likely to flee from or exit from your product as they are to use it' (Ariely, Hreha, and Berman 2014). Second, the loss of choice and autonomy inherent in nudges can at times be counterproductive, as sometimes what begets engagement is the ability to choose, to 'invest' in the product, to quote Eyal (2014) from earlier. On the other hand, as simple, easily transportable design interventions, nudges are bound to be easily adopted across the industry, slowly losing their effectiveness (Doctorow 2020). Finally, the behavioral research informing some frequently used nudges is increasingly coming under attack for its lack of replicability, if not for claims of fraudulent data. 15
Once the user is successfully onboarded, and all flows optimized, the key problem for behavioral design becomes how to secure repeat, continued, and even habitual usage. The ideal of sustained engagement features centrally in the behavioral design literature, and for attaining it, authors advise tactics like the following: Uniquely become part of the person's environment: One way to remind people to use a product is to ensure it's seen - by placing the Nike+ FuelBand by the side of the bed or by making your application the home page on a browser. … Uniquely become part of the person's expected routine: At a particular time of day (or situation), train the user to uniquely think of the application as a way to do something or relieve boredom … ask the user to plan out a particular time to use the product … Build strong associations with something that is part of the user's environment or daily routine: If you can't get in front of users' eyeballs directly or reserve a slot on their daily calendar, build on what's already there. (Wendel 2013, 279-80) This resembles the idea of 'establishing a mind monopoly' proposed by Eyal (2014), which received critical attention from social scientists (see Balzam and Yuran 2022; Seaver 2019), but where it differs is worth exploring. Building off Eyal's notion of internal cues like 'boredom,' Wendel (2013) extends his prescription to their material foundations, focusing on how to embed and entangle the product temporally, spatially, and physically into the everyday. This is not so much about mind monopolies as it is about rearranging the mundane, about 'entangling' the product in the everyday 'actor-networks and routines of action' that the user is already enmeshed in (Hodder 2012; Jervis 2020).
Ultimately, 'the goal of behavioral product strategy is to turn usage of your product into a default behavior for certain goals,' and even better if the product can 'enable the user to accomplish multiple goals' (Haisfield n.d.). The actions to be delegated to the product, however, need not already exist; product makers can create them from scratch and then incorporate them into users' lives. The example that opened this paper is a case in point. We had a peek into the problem that the behavioral economist turned user experience lead of the trading app was trying to solve: the problem of how to get lay people to engage in trading. In contrast with the media reports that portrayed the trading app as 'exploiting people's gambling addiction,' the interviewee further explains that: A big chunk of our users was nervous about trading. We built a new user flow that walked them through the process, the stages; instead of just saying 'Here you go, you signed up, now you can start trading!' and assuming they knew what to do. We would give them little lessons, walk them through different steps to take and nudge them on like 'Okay, how about placing a trade?' Or even building features like fractional shares, … then you could trade a penny of Apple or Tesla or something … instead of having to buy a share of Google which was like $1,000. 16 (Interview with a US-based user researcher, Online, 16 June 2022)
All design decisions are directed at 'getting people to trade.' By teaching, encouraging with words, and equipping with tools like fractional shares, the app creates, for the lay trader, new capacities to act. My interviewee frames the process as helping users 'do something that [they] came here to do,' and so the manipulation frame fails to resonate with his experience, even if his work is to intervene in the agency of the users. The mode of intervention his account evokes is productive: it creates capacities of action previously lacking for the user as the individualized actor, while attributing the agential power to the individualized intervention. This is not to conclude that increasing the frequency of use does not matter; in fact, the more trades a user places, the more money the app makes. Rather, it is to show that the purpose of design is not always to increase the frequency of interaction, but sometimes to initiate the interaction and prevent it from stopping. 17 In fact, the balance between designing for more frequent interaction ('getting people to use the app more') and designing to prevent the interaction from becoming less frequent ('getting people to use the app, period') arises as a key productive tension in behavioral design, as we shall explore in the next section.

Pragmatic participation in the engagement economy
As we noted, the behavioral economic proposition is predicated on enacting a rational behavioral outcome, and the purpose of design interventions is to produce target actions by affording or constraining the user's agential capacities. This is the case in the commitment device that gets users to complete loan applications, the rewards feature that pushes users to work out regularly, and the tooltip that stops users from withdrawing their money during market downturns. The concern is not to increase the length or frequency of interaction with the technical object, but to design the object into a key actant in the action being performed.
It is important to stress this point, as it forms the basis of the behavioral economic proposition and its market critique of the attention economy and its modes of production. Practitioners summarize the state of the field as: 'Let's move really fast, find something people will download and spend time on,' rather than building 'something that six months later they'll still be using, and they'll still have kept off the weight or increased the exercise or whatever the real-world behavior is.' 18 A behavioral scientist working for an online education platform shows just how fast startups move and, in contrast, how behavioral practitioners 'add some friction to a startup mentality' and 'to their programming efforts': When I wasn't there the MVP [i.e. minimum viable product] took like 22 days to get ready. Now I have come in, I know I'm going to take one quarter at least for the next version of the product to be ready, because I don't want to go ahead without getting any kind of feedback or not seeing the behavior that I want to see. (Interview with an India-based behavioral scientist, Online, 4 February 2022) The MVP, or 'minimum viable product,' is a cornerstone of the Lean Startup Methodology, or LSM (Blank 2013; Eisenmann, Ries, and Dillard 2012; Ries 2011), and a key device for 'finding something people will download and spend time on.' Lean methodology contrasts with 'waterfall' methodologies of software development, which take a product to the market only after planning and executing a full-blown production. Instead, the idea here is starting with 'an early product that is terrible, full of bugs and crash-your-computer-yes-really stability problems' (Ries 2011, 15), because it is 'the smallest thing you can build that will create the value you've promised to your market' (Croll and Yoskovitz 2013, 6) and will allow 'the product to be deployed and tested in the field' (Wendel 2013, 90). Serving as a device to test the market, the MVP collects data on what to keep, what to change,
and where to iterate - briefly, to find out what 'the market actually wants' (Ries 2011; Wendel 2013).
Favored are flexible pivoting and iterative research, as opposed to following a predetermined plan, for startups do not deal with known variables but are 'temporary organization[s] in search of a scalable, repeatable, profitable business model' (Blank and Dorf 2020). The product can end up being quite different from how it was initially conceived - it can even switch to a completely different market than the one it was initially positioned in - as it keeps responding to market signals. The market, on the other hand, speaks through metrics and data (see Lean Analytics). At early stages, the key metric is growth, 'growth in users, engagement, and conversion for consumer-focused startups' (Kenney and Zysman 2019, 43), reinforced by the intersecting forces of financing pressures to grow fast and platforms that render visible the apps that are growing fast (Cooiman 2022; Sax 2021). As an account manager in a behavioral design agency that works with healthcare, education, and charitable organizations on optimizing their digital products explains: 'You can make a lot of money by just hooking people into X amount of time and then constantly flowing through more users' because 'you get rewarded by the promise you put on AppStore … so you put pictures of beautiful people and say in 30 days you'll lose however much weight.' (Interview with a US-based account manager, Online, 21 April 2022) Equally important, though, is continued usage and engagement. 'Engagement is one of the best predictors of success,' argue the authors of Lean Analytics, illustrating with a well-chosen example: 'Facebook's early user counts weren't huge, but the company could get nearly all students in a university to use the product, and to keep coming back, within a few months of launch' (Croll and Yoskovitz 2013, 47). The most comprehensive social scientific definition of engagement as a business metric is offered by Birch, Cochrane, and Ward (2021), who define it as 'a specific measurement of a person's time, activeness,
regularity, and repetitiveness in "using" an ecosystem' (4). They further observe that engagement is made 'legible and measurable' through 'new metrics of political-economic performance' such as '"daily average user" (DAU) and "monthly average user" (MAU),' which operationalize 'specific techno-economic activities' like 'searching, scrolling, viewing' (2-4). Engagement metrics take center stage in 'the develop-release-measure-iterate feedback loop' under LSM, taken as proxies for 'users' satisfaction with the product' (Seufert 2013, 98).
Behavioral practitioners oppose engagement-driven development on different yet related grounds. One dismissively says that 'most companies just sort of build, put it out there and see what sticks,' 19 while another notes that Lean Startup is 'all about' 'literally pivot[ing] to whatever people want … it's very hard to have a vision with that.' 20 LSM is considered a poor fit when the goal is to build a product that is purpose-driven, a product that people should want, and one that is adopted for its use value, its success in getting users to accomplish a goal, not its cunning in 'hooking' them in the short term. 'Easily calculated, usage and retention have become the default standard for assessing a "successful" product,' an applied behavioral scientist complains in a blog post, and the standard privileges designing for frequency and length of interaction over the product's purpose in its user's daily life (Joyce 2022).
Headspace embodies this problem and was brought up by several interviewees who had conflicting views on how its designers were mobilizing engagement. A well-known behavioral science 'evangelist' and former Director of Behavioral Science of a large platform company observes the following about the app: [The goal of] Headspace is to get people to meditate … but that's not actually what they monetize … they monetize content. And really what they need to get you to do is engage with the app, whether or not you meditate is irrelevant to their actual business model … If you look at the senior people at Headspace, they're all content people, ex-content sellers, media people, etc. And ditto at Calm [another meditation app], you know the person who headed Product just left but, Dun Wang … what was her previous company? Zynga, right, games! 21 (Interview with a US-based behavioral scientist, 15 April 2022)
Yet, for people to stick to their habits, and products to stick to their people (to echo McFall et al. 2017), investment in interaction might be necessary. In other words, 'to captate' (Cochoy 2007), the product also must entice, entertain, and engage as a consumption object - and in the app economy, on a daily basis. The account manager quoted earlier, who also happens to be an avid user of Headspace, has a different interpretation of the app's design choices. He recalls that when Headspace first started, they were 'pushing forward courses': 'You want to manage stress, you want to manage sadness? Take these 10 or 20 different classes.' Only later 'they have migrated to a place where now they pushed more stackable content.' He continues: Now I can be cynical and say that's only keeping people in the app and it's not actually helping them be more mindful, but I can also sympathize with Headspace that you have to get people form a daily habit of using the app first, then you can offer them opportunities to dig deeper, make a bigger commitment, work on one area for a longer period of time. I see in that the balance of the business need of if we keep pushing these hard things, people are going to go away, so you have to have both … First, they had to create an experience that was sticky enough that you could form a habit around it. (Interview with a US-based account manager, Online, 21 April 2022) Content stacking is an example of the behavior change technique (BCT) called 'graded tasks,' in which the intervention designer 'set[s] easy-to-perform tasks, making them increasingly difficult, but achievable, until behavior is performed' (Michie et al. 2013). 22
At a design workshop that I observed, the facilitator, a senior interaction design consultant in the health and wellness domain, named seven more BCTs that Headspace uses to achieve their 'primary target behavior' of getting users 'to meditate at least once per day.' 23 In this framing, the app's features are 'specifically designed to influence' health behavior outcomes, which in turn are dependent on the users' exposure to these 'active ingredients' (Cole-Lewis, Ezeanochie, and Turgiss 2019). The implication is to design an engaging experience to encourage users to have 'the appropriate level of interaction' with the product, so that they could have continued exposure to the behavior change interventions (Bucher 2020; Cole-Lewis, Ezeanochie, and Turgiss 2019).
Engagement is therefore pursued, according to this view, not as an end in itself but as a means of securing long-term, real-world behavior change. The app's evident participation in the engagement economy is pragmatic, permitted by the habit-centered epistemology of behavior change theory, which also happens to be a convenient framework for revenue models that depend on habitual usage. The emphasis on 'use value,' however, is not a pretense, if not for the value propositions that promise long-term change, then for the market dynamics that are themselves changing. As the behavioral researcher for the fitness app, quoted earlier, notes, 'companies are realizing … when you just build for usage, to get people to engage with it more,' this no longer guarantees market success. 'You could use every behavioural science principle in the world to get that to be the most engaging, gamified, super fun application in the world,' but: As consumers, I hate to say consumers get smarter because that's not exactly what I mean, … as the market gets more saturated with applications that are intending to provide weight loss, … just as the market matures, people are looking for solutions that actually achieve what they want. (Interview with a US-based user researcher, Online, 25 February 2022)

Discussion
The behavioral economic proposition, as it applies to software development, is to create products that equip users with rational agential capacities that they otherwise lack. The product is strategized, developed, and designed to become something that the user cannot do without, not because it is addictive, but because it is made indispensable to 'the distributed action universe' of the behavioral problem that it addresses, to borrow from Caliskan and Wade (2022). This starts with the assumption that, left to their own devices, people cannot act in their best interest or rationally: they prefer sitting on their couch and watching Netflix to getting up and going for a run. Once inserted into the 'actor-networks' and 'routines of action' of the user, the product can create the capacities for acting rationally. To create an enduring attachment, the actor-network is rearranged to turn usage of the product into a default behavior for certain goals, to the point that user-consumers agree to pay for it (Callon 2017).
On the other hand, the behavioral economic proposition is a productive critique of the attention economy and its predominant modes of production. It argues that lean, iterative, metric-driven frameworks are good for 'finding something people will download' but not for building 'something that six months later they'll still be using.' Products that are designed to hook users 'into X amount of time' do not help users 'actually achieve what they want.' The behavioral approach offers an alternative framework and set of devices oriented towards optimizing the product for its purpose, to ensure long-term retention over frequent engagement. Put differently, the aim is less to increase the frequency of interaction with the system than it is to prevent the interaction from becoming less frequent and ultimately stopping. My choice of the word 'proposition' is intentional: a proposition is a promise, heavy with performativity yet only partially fulfilled in practice. The attachment logic behind nudging agency is an idealized form, and so it is certainly hybridized with existing logics of engagement or growth. Furthermore, we encounter a pragmatic stance towards the engagement economy: while actors emphasize their interest in designing for long-term, sustained behavior change, they do not refrain from using techniques typically understood as manipulative or addictive design interventions. 24 Finally, in application, it is often hard to disentangle what is implemented for behavioral effectiveness and what is there to make the consumption object more engaging. Behavior change techniques and engagement techniques often mix, overlap, and intertwine.
The point, however, is not to offer a different, yet likewise purified, account of attachment as the alternative to the behavioral addiction and manipulation frame. Rather, the point is to show that multiple logics of attachment exist in software design, and their effects and marketisation functions vary: pictures of beautiful people, with promises of quick weight loss, are an attachment device, good for growing the userbase, bad for retaining it; as the interviewee suggested, it is a short-lived, fragile trick that requires a constant flow of more users. This is assuredly a market that tends to privilege high growth at the expense of retention, yet times might be changing. 'As the market gets saturated' with alternatives, neither short-term engagement nor lean, iterative models are suitable. A software product that fulfils a user goal and becomes the default for that goal now appears to be considered more effective.
A dynamic perspective such as this one is lacking in the manipulation thesis, which has a rather atemporal, unchanging view of the so-called surveillance economy's extractive projects. Similarly, the mundane materiality of attachment, made palpable in the attempts to put the software on the home page and the hardware on the bedside table, is not captured in accounts that appear to be obsessed with mind control. Finally, and perhaps most importantly, the passive and powerless, 'Zombie-like caricature' of the user in these accounts is challenged, along with the predictability, intentionality, and potency of nudges and behavioral designs. 25 The disengagement frame that I proposed in its place corrects the overemphasis on the power asymmetry while not ignoring that software designers mobilize all 'technique and sentiment' (McFall 2014) to convince their users, as well as investors and the public, of the success of their inventions, although definitions of success might be more contingent than we take them to be.

Notes
… in the review process. The brief encounter with Franck Cochoy reinforced my belief in the paper's argument; I thank him for that. These individuals, both as interlocutors and as authors, have profoundly shaped my thinking. Yet any mistakes or misreadings remain my own.

Disclosure statement
No potential conflict of interest was reported by the authors.
21. Interview with a US-based behavioral scientist, 15 April 2022.