Identification-imitation-amplification: understanding divisive influence campaigns through cyberspace

ABSTRACT Cyber-enabled influence campaigns sit at the nexus of intelligence-based deception and strategically oriented delivered effects. They offer states an increasing potential to re-configure domestic political dynamics at scale. We offer an analytical construct to better understand the mechanism by which cyber-enabled influence operations work and to better discern the strategic goals behind cyber-enabled influence campaigns.

To begin, we examine the cyber-enabled aspect of influence campaigns as a tool of statecraft and of domestic politics, distinguishing between operations and campaigns. We then take the military concept of the Observation-Orientation-Decision-Action (OODA) loop, which when applied to conflict environments focuses on collapsing the orientation of targets, and re-focus that emphasis on the strategic competition environment, in which the fundamentals of observation can be manipulated at scale. We then argue that the Identification-Imitation-Amplification framework can better guide research and policy prescription. Ultimately, we conclude that cyber-enabled influence operations can filter information, by overweighting, underweighting, and decontextualising it, to impact how targets observe the flow of information. Linked into a cyber-enabled influence campaign, we suggest that this filtering can create sufficient non-cooperative centres across a society - enough societal division - as to undermine effective strategic behaviour in the targeted society.

'Hacking' the human mind: cyber-enabled influence operations and campaigns
Information has been used as a tool of statecraft throughout history, including by countries such as China and Russia.8 Cyber-enabled influence operations (CEIOs) are part of international competition short of armed conflict. Developing from what the US Central Intelligence Agency (CIA) used to refer to as 'political warfare' in the 1950s, information as a tool of political contestation has been used in various forms throughout the 20th century.9 Well-known efforts include the Soviet Union's Operation Infektion/Operation Denver, aimed at spreading the lie that the United States manufactured AIDS.10 Additionally, in 2013 the CIA admitted that 60 years prior, it had used information operations, including propaganda, to destabilise and ultimately help topple the Iranian Prime Minister Mohammad Mossadegh.11 While states have historically engaged in covert competitions, similar efforts were also applied elsewhere in the form of covert operations.12 Varying terminology is currently in use to refer to the use of information to affect foreign audiences. The term Influence Operations is used interchangeably with the term Information Operations, and other terms such as propaganda, Information Warfare (IW); intelligence, surveillance, and reconnaissance (ISR); electronic warfare (EW); and psychological operations (PSYOP), as well as Information/Influence Warfare and Manipulation, are also used.13 Across all of these terms, the underlying goal is a 'deliberate use of information by one party on an adversary to confuse, mislead, and ultimately to influence the choices and decisions that the adversary makes'.14 The US understands such operations as 'the integrated employment, during military operations, of information-related capabilities (IRCs) in concert with other lines of operations to influence, disrupt, corrupt, or usurp the decision-making of adversaries and potential adversaries while protecting our own'.15

The term 'cyber-enabled influence operations' refers to those activities conducted through cyberspace aimed at impacting cognitive and psychological audience perceptions.16 For the purposes of this article, we expand this definition to include the effects of CEIOs on behaviour: CEIOs are those activities conducted through cyberspace aimed at impacting cognitive and psychological audience perceptions with the goal of producing behavioural changes in target audiences, including, but not limited to, audience resonance and mobilisation (to the point of potential disruption of group identities). We argue that when considering state behaviour short of armed conflict, we need to consider CEIOs. Although CEIOs do not necessarily involve direct manipulation of computer code, these operations are conducted in and through cyberspace-based platforms and networks, and may have considerable effects that are overlooked when a narrow definition of cyber operations is applied.17 When used in isolation or in concert with other tools of statecraft, CEIOs can emerge as cyber-enabled influence campaigns that have the potential to undermine social cohesion in a state.
Evidence has shown that cyber-enabled influence operations are part and parcel of how states compete in cyberspace. Thus, to successfully craft responses to cyber threats, states must be able to respond to CEIOs, in addition to protecting their networks and cybersecurity infrastructure.

Strategic campaigning in cyberspace
The use of cyberspace as a tool of statecraft to maintain or enhance international standing has become increasingly accepted and widespread. Cyberspace is seen as a unique strategic environment that can facilitate a shift in relative power between states, and one in which we do not see the type of 'cybergeddon' scenarios that were initially anticipated.18 Over the years, cyberspace has become one of the primary environments of inter-state struggle for power and autonomy, allowing governments to engage in geopolitics, advancing their own interests while hindering those of their foes.19 The United States began to recognise in its 2017 National Security Strategy (NSS) that adversaries can achieve strategic gains by acting through cyberspace below the level of armed conflict.20 The 2022 NSS stated directly that the United States is 'in the midst of a strategic competition to shape the future of the international order'.21 Evidence shows that through operations linked into campaigns, states are using cyberspace to achieve gains that would enhance their relative power without resorting to war-like behaviour.22 This recognition is not limited to scholars and military thinkers in the United States. The United Kingdom's National Cyber Strategy recognises this strategic competition, and its National Cyber Force's Responsible Cyber Power in Practice document explicitly notes that 'combining several operations . . . into a campaign for cumulative effect also supports longer term outcomes'.23 Additionally, while Russia has historically used 'active measures', it is argued that a greater emphasis on new ways of campaigning has entered the Russian understanding, with new technologies blurring the lines between war and peace.24 While there remains more to be done to operationalise the strategic competition framework, states pursue cyber operations below the threshold of armed conflict because they see value in those activities.25

These cyber operations can be linked into campaigns that have significant potential effects. Thus far, the study of such cyber operations has mostly focused on what can be broadly construed as hacking networked systems, while the use of cyberspace to 'hack the minds' of target audiences came into focus only recently.26 Recognising the potential for strategic campaigns does not, however, detail how they work, and it is in understanding those details that one may develop strategies to counter their effects. If there are generalisable patterns that can be discerned, the ability to develop counter-campaigns that disrupt CEICs becomes possible. One can posit that an inability to stop cyber operations does not ipso facto eliminate the possibility of breaking the cumulation sought through campaigns and thereby making the adversaries' efforts inconsequential. If CEIOs are strategically connected across platforms and media to target adversaries systematically and persistently over time, with the goal of using cyber tools at scale to gain advantage in competition, such phenomena should be understood as Cyber-Enabled Influence Campaigns (CEICs). States would engage in such campaigns because their value is potentially strategically significant: they allow gains to accumulate over time that help gain advantage in international competition in and through cyberspace. This is especially true for democratic societies, which are uniquely vulnerable to such operations and campaigns.27 However, authoritarian states also see themselves as targets.28 In the international system, countries are interested in, closely observe, and learn from each other's behaviour. States learn not only 'good behaviours' but also how to exploit technologies, including how to use cyber-enabled influence operations and campaigns as tools of statecraft. For example, China is already learning from Russia's election interference in the 2016 US Presidential Election.29

In other words, the use of CEIOs/CEICs is proliferating. Given that cyberspace is complex but adaptable, CEIOs/CEICs can be used across different national contexts. To be able to respond to these types of cyber activities, it is important to know not simply where, when, and how vulnerabilities manifest, but to understand the mechanisms by which CEIOs/CEICs work to exploit those vulnerabilities.

CEIO mechanics: getting inside a re-thought OODA loop
The story of cyber-enabled influence operations is a story of the interaction of domestic socio-economic politics and adversarial intent, as CEIOs involve attempts to moderate and affect the behaviour of domestic audiences by leveraging and manipulating information flows via digital platforms. For the purposes of this paper, we focus on foreign adversarial action as part of strategic competition, but domestic groups seeking to undermine societies and governments from within can use the Identification-Imitation-Amplification (IIA) mechanism to produce similar effects and outcomes. Regarding our focus, in his seminal piece on the logic of two-level games, Robert Putnam noted that '[d]omestic politics and international relations are often somehow entangled, but our theories have not yet sorted out the puzzling tangle'.30 In the era of CEIOs, we find examples of how domestic politics can be manipulated by foreign adversaries to produce effects on international relations and, in particular, on the distribution of power between states.31 However, the obstacle to examining CEIOs is similar to the one Putnam identified: for a field focused on states as the legitimate units of analysis, it is challenging to study operations that involve multiple levels of analysis - in this case, foreign actors attempting to influence domestic audiences. Foreign actors might pursue CEIOs to hurt and undermine their adversaries from within. The guiding principle for seeking such effects can be summarised as finding a way to weaken the adversary while avoiding war.32 CEIOs are an outgrowth of the historical practice of intelligence, which rests on using covert and overt tools of statecraft to 'understand or influence foreign entities', with the ultimate goal to 'reduce risks, mitigate threats, and to create and use opportunities to win and preserve what they [states] see as their interests'.33

States engage in 'nonviolent operations that produce cumulative, strategic impacts by eroding the US military, economic, and political power without reaching a threshold that triggers an armed response'.34 In the past, it was not uncommon for great powers to interfere in the domestic politics of smaller countries with the goal of affecting electoral outcomes. For example, in the period between 1946 and 2000, one of the superpowers engaged in 117 partisan electoral interventions.35 According to the US Senate Committee on Foreign Relations, in the age of re-emergent strategic competition, powerful states in the system have found in cyber-enabled influence operations a tool that can be used to directly affect the domestic political environment, and that can offer indirect or positional benefits at the systemic level.36 Specifically, CEIOs can be used as a low-cost tool of statecraft that can help states gain strategic advantage without the corresponding risks.37 Cyberspace has been described as a new strategic environment of persistence, where exploitation of cyber vulnerability is the central activity.38 Amongst its other manifestations, pervasive vulnerability is expressed as an increased number of targets that can be accessed in and through cyberspace. While operations such as continuous cyber operations are used to target networked systems, cyber-enabled influence operations are used to target people - both politicians and civilians - and their organisations by leveraging their reliance on the interconnected technical environment of networks of networked computing. Cyberspace has enabled unprecedented access to domestic populations worldwide, which can be targeted through the topmost layer of the environment.
We propose that the concept of the Observe-Orient-Decide-Act (OODA) loop can help clarify the intent behind CEIOs (and their incorporation into CEICs) and how the IIA mechanism works. As an adaptive and fluid tool of statecraft, CEIOs can be used to pursue multiple goals, including undermining a target country's ability to act in a strategic manner informed by a stable consensus on reasonably agreed-upon national goals. If applied successfully, CEIOs can create internal friction in a target country by creating many non-cooperative centres that undermine socio-political cohesion. As such, CEIOs combined into CEICs can be used as a new form of 'divide and conquer' applied to the geo-political condition of competition rather than war.
The OODA loop concept was originally developed to help pilots improve their situational awareness, and it has since been applied across many fields: in military strategy, including maneuver, information, and network-centric warfare, as well as in fields such as business and IT.39 In cybersecurity, it has been used for computer readiness and incident response planning.40 However, Colin Gray aptly concluded that the OODA loop 'may appear too humble to merit categorisation as grand strategy, but that is what it is. It has an elegant simplicity, an extensive domain of applicability, and contains a high quality of insight about strategic essentials, such that its author well merits honourable mention as an outstanding general theorist of strategy'.41 The OODA loop was developed by Colonel John Boyd, and it is a 'model for how we think, and the means by which we both compete and collaborate'.42 The OODA loop consists of four components that interact (see Figure 1).43 Observation refers to perceiving the environment, including 'assessing the environment, one's place in it, and the interaction of the two'.44 Orientation refers to 'mak[ing] sense of the observational data . . . that creates a mental picture of the situational reality' based on a set of inner filters, including previous experience, cultural traditions, education, and new information.45 Decision refers to choosing, based on making sense of what is known, which leads to Action through a decision-making process.
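The leverage point our argument turns on - that whatever is filtered out at the Observe stage never reaches Orientation or Decision at all - can be illustrated with a minimal sketch. This is our own illustrative toy, not Boyd's formulation; all names, weights, and the filtering rule are invented for the example:

```python
# Minimal illustrative model of an OODA cycle whose Observe stage can be
# filtered by an outside actor. All structures here are hypothetical.

def observe(environment, filter_fn=None):
    """Collect raw signals; an adversary-controlled filter can overweight,
    underweight, or drop signals before they are ever seen."""
    signals = list(environment)
    return filter_fn(signals) if filter_fn else signals

def orient(signals, prior_beliefs):
    """Interpret signals through existing inner filters (experience, culture)."""
    return [(s, prior_beliefs.get(s, 0.5)) for s in signals]

def decide(oriented):
    """Pick whichever observed-and-oriented signal is currently most salient."""
    return max(oriented, key=lambda pair: pair[1])[0] if oriented else None

def act(decision):
    return f"act-on:{decision}"

def ooda_cycle(environment, prior_beliefs, filter_fn=None):
    return act(decide(orient(observe(environment, filter_fn), prior_beliefs)))

# An unfiltered cycle versus one whose observation stage is selectively filtered.
env = ["protest", "economic-growth", "local-festival"]
beliefs = {"protest": 0.6, "economic-growth": 0.7, "local-festival": 0.2}
honest = ooda_cycle(env, beliefs)  # sees everything
slanted = ooda_cycle(env, beliefs, lambda s: [x for x in s if x == "protest"])
print(honest, slanted)  # the filtered loop can only decide on what it was shown
```

The point of the sketch is that the slanted cycle's orientation and decision stages run exactly as before; the divergent outcome is produced entirely upstream, at observation.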
In his work, Boyd used the concept of 'mental models' or 'cognitive maps', referring to the 'internal images or mental representations of spatial relationships (or other kinds of knowledge)' which support orientation.47 Mental models can be used as different ways of viewing and understanding the world: they 'form belief systems, give meaning to events, and we interpret our experience in light of them'.48 In A Discourse on Winning and Losing, Boyd argues that victory can come by morally, mentally, and physically affecting adversaries so as to 'fold adversaries back inside themselves', i.e. to collapse their OODA loops and disable the opponent from being able to 'appreciate and keep-up with what's going on'.49 This is done by disorienting the adversary (twisting mental images), disrupting the adversary's ability to deal with the menace, overloading the adversary's capacity (physical or mental) to adapt to the new situation, and ultimately collapsing the adversary's ability to carry on.50 What we propose is that in competition, the focus of targeting shifts toward the observational part of the loop, with the goal of generating as many non-cooperative centres of gravity as possible, so that divisiveness becomes the core strategic objective (the generation of society-wide friction). As Michael Warner and John Childress point out, the age-old adage 'divide and conquer' is a different way to describe this creation of non-cooperative centres.51 Whereas Boyd proposes that infiltration and isolation can be executed, inter alia, by 'exploit[ing] critical differences of opinion, internal contradictions, frictions, obsessions, etc. in order to foment mistrust, sow discord, and shape both adversary's and allies' perception of the world . . . thereby creat[ing] atmosphere of "mental confusion, contradiction of feeling, indecisiveness, panic"' in their mental schema - the orientation component - we propose that it is societal division via influencing observation that CEIOs/CEICs ultimately target.52

What mechanism do CEIOs/CEICs rest upon? Unlike in military conflicts, where the goal is to collapse the adversary's OODA loop, in competition in and through cyberspace the goal is to leverage pre-existing orientations to create wide-scale division on the basics of how information is observed and then understood. We suggest a related three-pronged framework of continuous cyber-based Identification-Imitation-Amplification (IIA) that loops in both feedback and feed-forward dynamics (see Figure 2).53

Identification
The first step in developing a CEIO is identifying the audiences that should be targeted. Here, existing orientation schema - race, socio-economic standing, education, previous experience, and traditions - are not targets for manipulation and collapse (as in traditional military OODA-loop strategy) but rather the foundations upon which the dual tactic of in-group/out-group reinforcement is built.
Specifically, CEIOs necessitate, as a first step, identifying divisive issues as well as polarised groups, and targeting them with specifically crafted messages that reinforce their pre-conceived beliefs and, consequently, promote the sense of in-group belonging at the expense of the sense of belonging to a larger whole. Identification of salient foundations for division has been made significantly easier by the fact that digital social media rests, in essence, on grouping individuals and organisations around 'likes'. The digital space has structurally evolved into self-perpetuating echo chambers (reinforced through algorithmic calculations), so that the identification stage does not require a 'construction' stage to follow - the in-group/out-group foundation already exists. Identification can thus focus more narrowly on which groups, around which issues, have divisive growth potential. All pluralistic societies have disagreement on best practices and policies and on views of history and contemporary issues. CEIOs/CEICs, however, are not about creating disagreement; they are about creating division in the capacity to act strategically as a nation. This is a critical analytical distinction. So while polarisation is a useful starting point for CEIOs, it is whether that polarisation can be imitated and amplified for divisive effect that becomes the salient identification objective.54
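The distinction drawn above - that the salient identification objective is divisive growth potential, not polarisation alone - can be made concrete with a toy scoring sketch. The data, field names, and scoring rule are all illustrative assumptions of ours, not an operational method drawn from any documented campaign:

```python
# Illustrative sketch: rank candidate issues by combining polarisation
# (how split and how committed the audience is) with engagement (reach).
# All numbers and field names below are hypothetical.

def polarisation(stances):
    """Stances lie in [-1, +1]. High average intensity combined with a
    near-zero mean indicates two committed camps rather than consensus."""
    n = len(stances)
    mean = sum(stances) / n
    intensity = sum(abs(s) for s in stances) / n
    return intensity * (1 - abs(mean))  # high when camps are strong AND balanced

def divisive_potential(issue):
    # A polarised issue nobody engages with has little amplification value.
    return polarisation(issue["stances"]) * issue["engagement"]

issues = [
    {"name": "tax-policy",   "stances": [0.2, -0.1, 0.1, -0.2],  "engagement": 0.9},
    {"name": "hot-button-A", "stances": [0.9, -0.8, 0.95, -0.9], "engagement": 0.8},
]
ranked = sorted(issues, key=divisive_potential, reverse=True)
print(ranked[0]["name"])  # the deeply split issue outranks the mildly split one
```

Note how the mildly split issue loses the ranking despite its higher engagement: it is the combination of committed, balanced camps with reach, not disagreement per se, that the identification stage selects for.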

Imitation
Social media can be used to reflect images of reality in which certain topics and issues feature more prominently than others. Social networking sites are seen as open fora where many aspects of modern and political life come to be expressed. These sites have also become primary sources of news in recent years and are places where organic engagement is expected to take place.55 However, it is possible to subvert the original purpose of social networking sites - to connect people - by creating fictional online personae imitating membership in a certain target group, and then generating and promoting content to that target group for malicious purposes as part of a CEIO.
Social media sites provide access to a CEIO's groups of interest. Target audiences can be accessed through microtargeting, which is facilitated by the inference algorithms used by social media platforms.56 Computer algorithms are used to determine categories of user identity based on user behaviour on a given platform.57 Moreover, 'digital traces of user behaviour are translated into probabilistic categories that can be used by advertisers for audience targeting'.58 Facebook is representative of the prevailing mechanism across social media platforms, although each has its own nuanced algorithms.59 According to Michael DeVito, the information presented to Facebook users is based on 'explicit user interests and implicit preferences'.60 As Thorson et al. note, evidenced through a series of patents, Facebook has developed machine learning algorithms to infer users' interests based on their own and their friends' behaviours to facilitate 'inferential ad targeting'.61 By paying Facebook to show targeted advertisements, it becomes possible to reach narrowly defined audiences of interest: 'Facebook offers anyone who pays for promotional messages a menu-style, microtargeting tool for free that includes an array of options for the type of targets based on users' demographics, geographics, media consumption patterns, political profiles, issue interests, hobbies, friends' networks (e.g., number of friends), Facebook engagement (e.g., liked post by NRA), and the like'.62 Targeted groups would subsequently receive tailored messaging.
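The inference step the literature describes - translating digital traces into probabilistic categories that advertisers can then target - can be sketched in miniature. The categories, trace names, weights, and threshold below are invented for illustration and do not reflect any platform's actual algorithm, which operates over vastly richer signals:

```python
# Toy sketch of trace-to-category inference for audience segmentation.
# All weights and labels are hypothetical stand-ins for platform internals.

CATEGORY_WEIGHTS = {
    "veterans-interest": {"military-page-like": 2.0, "memorial-event-rsvp": 1.5},
    "gun-rights":        {"nra-post-like": 2.0, "hunting-group-member": 1.0},
}

def score_categories(user_traces):
    """Turn observed behavioural traces into per-category scores."""
    return {category: sum(weights.get(t, 0.0) for t in user_traces)
            for category, weights in CATEGORY_WEIGHTS.items()}

def audience_for(ad_category, users, threshold=1.5):
    """Select users whose inferred score for a category clears a threshold -
    the essence of menu-style microtargeting from the advertiser's side."""
    return [uid for uid, traces in users.items()
            if score_categories(traces)[ad_category] >= threshold]

users = {
    "u1": ["military-page-like", "memorial-event-rsvp"],
    "u2": ["hunting-group-member"],
}
print(audience_for("veterans-interest", users))  # only u1 clears the bar
```

The CEIO-relevant point is that the operator never needs to know who u1 is: the platform's inference layer converts behaviour into an addressable category, and the tailored message follows.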
Based on user identity, messages can be crafted to influence those users. Those messages may take the form of traditional disinformation, but they may also employ factually correct information. Thomas Rid writes that '[s]ome of the most vicious and effective active measures in the history of covert action were designed to deliver entirely accurate information', recounting that in 1960, Soviet intelligence distributed 'a pamphlet that recounted actual lynchings and other gruesome acts of racial violence against African Americans, . . . [and] then distributed English and French versions of the pamphlet in more than a dozen African countries'.63 Indeed, facts can be used as part of influence operations to affect the adversary by creating a slanted vision of reality, emphasising those facts at the expense of other informational input. For example, injustice and cruelty are phenomena that spark outrage regardless of country or historical era. By emphasising such images, CEIOs can be used to alienate and further victimise marginalised groups in the target state.
On social media, individuals are exposed to multiple information stimuli that may imitate reality but not necessarily be fully reflective of it. In a Russian CEIO on Twitter, for example, negative news was disproportionately highlighted to paint a bleaker picture of reality, 'to show Twitter users a world more dangerous and unrestful than they may otherwise experience'.64 The Facebook newsfeed shows personal user updates, news stories, advertisements, updates from pages that users follow, as well as other content that algorithms decide to display.65 Research indicates that social networking sites can be used for 'agenda-setting', which refers to the idea that news media might not tell their audiences what to think, but do tell them what to think about.66 In this fashion, by emphasising certain information and omitting or neglecting other information, a slanted image of the social, political, and economic environment may be presented. During Congressional testimony, US Cyber Command's General Paul Nakasone discussed concerns over the Chinese-owned platform TikTok, noting, 'Influence operations . . . it is not only the fact that you can influence something, but you also can . . . turn off the message as well when you have such a large population of listeners'.67 Social media also enables the promotion of chosen views under the veil of anonymity. In the words of the iconic 1993 cartoon from The New Yorker by Peter Steiner, 'On the Internet, nobody knows you're a dog'. Indeed, covertness and deception are core features of cyber operations.68 In the CEIO context, by obscuring the true origin of information, actors may imitate the behaviour of authentic users and participate in and influence organic conversations online in ways that would not be possible if their genuine identity were known. In Facebook lingo, this type of activity is labelled 'inauthentic behavior'.69 For instance, a fictional LGBTQ United organisation page may be created and promoted on Facebook by users who remain anonymous. There are several mechanisms that allow for obscuring the identity of those wanting to engage and influence audiences.
The kind of deception at scale described above is enabled in two ways. First, it is possible to create fictional personae on social media, including profiles/pages for entities that do not exist 'in real life' (IRL). For example, elaborate identities can be built on social networking sites and presented across platforms for the purpose of influencing selected communities. A study of the Russian Internet Research Agency (IRA) - the organisation behind the social media campaign targeting the United States - shows how the IRA built thousands of identities on Twitter that impersonated Americans.70 Second, content may be created and disseminated without revealing who the sponsor is. Native advertising, which is part of the financial model of many social networking sites, does not necessitate that the identity of content creators ever be revealed. Native advertising refers to the type of advertising where 'branded content . . . is integrated in or similar to the format or design of the platform, including social engagement features of the platform'.71 In other words, 'paid content is deliberately designed to look like non-paid, user-generated content . . . that resembles news, videos, games, memes, and other non-marketing content embedded among regular posts by social media users'.72 Native advertising has been studied in the context of electioneering, where issue campaigns are promoted without revealing who is behind the campaign.73 Research shows that the non-intrusiveness of native advertising is positively related to the propensity of users to share the content.74

By creating pages and profiles for fictional entities, generating content on those pages, and promoting them via native advertising targeted at specified groups, different actors, including states, can manipulate organic conversations online. Promoted and non-authentic content can then be used to exert influence on targets without the targets being aware that the content is either paid for or created by individuals other than the profiles/pages with which the content is associated. Consider the following fictional example. Imagine a post by a recently enlisted soldier who recounts their lived experience of seeing September 11th on TV and relates that it was this event that motivated them to join the military. Imagine further that this post was shared by the Veterans Across the Nation (VAN) Facebook page. The content of this post may be entirely fabricated, and VAN may not exist 'in real life'. However, there are certainly many individuals in the United States who lived through 9/11 and whose lived experiences would resonate with this scenario, given that 9/11 has cultural and political significance in the country. This experience may have motivated many individuals to join the military and defend the nation.75 In that sense, the 9/11 Facebook post in question imitates reality. However, such a post, while fictional, can have the emotive impact of authenticity, particularly in how it is visually presented on the digital media platform, and can be leveraged for purposes other than promoting a shared sense of patriotism, depending on the content to which it becomes linked and through which it is amplified.

Amplification
With target schema identified and access obtained through credible imitation, the most significant cyber-enabled component of CEIOs can be put into effect with a speed, scale, and scoping that traditional influence operations could never obtain: digital amplification provides a capacity for effect. Here, selected information can be amplified on social media by increasing the reach of CEIOs. This can be accomplished in terms of content, including increasing and diversifying the number of messages targeting a certain group, increasing the number of groups being targeted, and building on previous messages to create concentric loops between different target audiences. Messages placed through native advertising can also be amplified by increasing ad spending at low cost, thus increasing the exposure of targeted audiences. Amplification can also occur by expanding a CEIO from one social networking site to another, for example from Facebook to Instagram, by incorporating an ad on Facebook for a page on Instagram. This cross-platform presence, with slight changes in format (from text, to picture, to video, for example), adds to the authenticity of the information as it is shared by different groups of social media cohorts (friends) or shared-interest cohorts in an increasingly complex, interconnected web of relationships. Thus, the same core fissure point of information can be scaled across systems and scoped across audiences with speed and negligible cost if the right initial content can be tied to the right influencers within core nodal centres of information dissemination. It is a dynamic not only of spread but of depth, in which the convergence of belief around an initial piece of information itself propagates more belief (it must be true because so many people are sharing or liking it). Simple bot-net amplification at the right time can send that message viral - essentially achieving cyberspace launch velocity. One concern raised about TikTok is that the platform operators themselves can push content tied to the interests of sufficiently tailored groups such that viral levels occur not due to user connections but by 'operation intervention'.76 State entity use of that platform, or of future platforms based on similar algorithms, would only exacerbate the amplification opportunity and challenge (from a counter-amplification standpoint).
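The amplification dynamic - organic resharing given an initial inorganic boost - can be caricatured with a simple deterministic branching model. The share rate and seed sizes are illustrative assumptions, not empirical estimates of any real campaign:

```python
# Deterministic branching-model caricature of amplification: each generation,
# every current holder of a message passes it to `share_rate` new accounts.
# A bot-net seed inflates generation zero. All parameters are hypothetical.

def total_reach(seed, share_rate, generations):
    reach, current = seed, seed
    for _ in range(generations):
        current = current * share_rate  # new accounts reached this generation
        reach += current
    return reach

organic = total_reach(seed=10, share_rate=1.5, generations=8)
boosted = total_reach(seed=10_000, share_rate=1.5, generations=8)  # bot-net seed
print(int(organic), int(boosted))  # reach scales linearly with the seed
```

Even this crude model makes the strategic point: with identical sharing behaviour downstream, a bot-net seed a thousand times larger yields a final reach a thousand times larger, at negligible marginal cost - the 'launch velocity' effect described above.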
Through the process of Identification-Imitation-Amplification (IIA), cyber-enabled influence operations can be built to target increasing numbers of individuals. An IIA loop can ultimately be created, wherein information is used to produce effects, and the evidence of those effects is in turn used to generate more content for both widening and deepening effects. This is fundamentally how private-sector social media algorithms technically work off the construct of likes. The state influencer (or domestic actor) need only develop the logic-layer filter to follow the trends it is seeking to push. For instance, information about social injustice can be used to spur individuals to attend protests against social injustice; information and images from the protest can then be used to spread the messages even further to both in- and out-groups. Information about an event in one target group can be used to foment adverse reactions in a different target group by playing groups off each other. For example, a member of the African American community may be targeted on social media with images describing violence against African Americans. 77 Inspired by what they saw online, this individual may feel the need to protest such treatment of African Americans. Imagine that this individual then participates in a protest where a violent altercation with police occurs. Captured on camera, images of that violent encounter could then be used and (re-)contextualised to serve strategically constructed particular narratives. Images showing violent police encounters can be manipulatively presented as evidence that all members of the African American community are violent towards the police. These images could then be shown, via social media microtargeting, to white-nationalist-leaning individuals to reinforce a negative disposition towards African American community members, creating both a deepening of the in-group and an intensifying of the out-group identities of all receiving these easily constructed and disseminated messages. Boyd's modelling becomes manifested in deep information echo-chambers in which individuals are no longer exposed to alternative views but get their beliefs reflected back to them in a continuous, ever-tightening constructed reality, which can then be set against an out-group (e.g., the federal government, an opposing political party, the scientific community, the elite, an opposing social or ethnic group).
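The echo-chamber divergence just described can be illustrated with a deliberately minimal opinion-dynamics sketch. The learning rate, the opposed frames, and the cycle counts are all illustrative assumptions, not a model of any real platform; the sketch shows only that repeated exposure to oppositely framed narratives of the same event drives two groups' beliefs steadily apart.

```python
# Minimal opinion-dynamics sketch of the echo-chamber effect.
# Two groups are repeatedly shown oppositely framed messages about the
# same event; each exposure nudges a group's belief toward the frame it
# sees. learning_rate and cycle counts are illustrative assumptions.

def run_campaign(cycles, learning_rate=0.2):
    belief_a, belief_b = 0.0, 0.0      # both groups start neutral
    frame_a, frame_b = +1.0, -1.0      # opposed narratives of one event
    for _ in range(cycles):
        belief_a += learning_rate * (frame_a - belief_a)
        belief_b += learning_rate * (frame_b - belief_b)
    return belief_a - belief_b         # "non-cooperative distance"

print(run_campaign(1))    # small gap after one message cycle
print(run_campaign(20))   # gap approaches its maximum after many cycles
```

Because each group only ever sees its own frame, the gap never closes on its own; the distance between the two constructed realities grows monotonically toward its maximum, which is the ever-tightening dynamic attributed to Boyd's model above.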
In the manner described above, the process of targeting different social groups through Identification-Imitation-Amplification (IIA) looping can be continuously expanded and connected into a cyber-enabled influence campaign. The targeting of each individual social group through IIA can be understood to constitute an operation, which is then linked to other operations to build a campaign targeting a society writ large. Such campaigns would target multiple groups within a given society simultaneously (see Figure 3). The goal of such targeting would be to hit all three critical aspects of observation so that how information is observed is so tightly tied to divisive schema as to produce a wide scope and scale of non-cooperative centres within the target society. Thus, the IIA mechanism not only provides outside information (new, recycled, re-framed) but also facilitates unfolding divisive circumstances (protests in one city are imitated and amplified into other cities in their local contexts) that will be observed by others. In combination, done effectively, the IIA of CEICs can thus create an unfolding interaction with the information environment itself so that the manipulator has begun to control decision and action in the direction they seek, which in our framework is primarily an effect of divisiveness across society.
Cyberspace is enabling this precision because the behaviour and attitudes of groups are being collected, shaped, and fed through the algorithmic structures of social media platforms now ubiquitous globally. CEICs need not rest on assumed classifications of groups (stereotyping) but rather draw from the data that individuals are generating themselves publicly every day. The 'cyber-enabled' aspect of these campaigns moves beyond generalised assumptions into precision targeting aided by the cyber behaviours of the target audiences themselves. CEICs do not make simple assumptions that African Americans care about race in America, because they can rest on data about how African Americans care about race. Given that each group can be presented with tailored messaging aimed at strengthening previously held beliefs, the power of leveraging precise behavioural-data imitation opens the door to truly manipulated belief amplification at the expense of social cohesion.
In the social injustice example hypothesised above, one narrative would be shown to African American targets, and a different narrative surrounding the same event would be shown, for example, to a Blue Lives Matter group. Through amplification, a multitude of messages promoting similar themes would target a given social group, with the goal of presenting an image of reality that stands in contrast with another strongly held perception of reality. Unable to find any grounds for consensus in the CEIC-established information environment, cyber-enabled IIA interaction will deepen and broaden the non-cooperative centres across society (and within government as well). In a nuance from Boyd, the CEIC seizes the OODA loop not to collapse it but to leverage it, dividing the nation informationally to a degree sufficient to preclude effective, coherent, and consensus-based national action. Cyber-enabled campaigning allows the dysfunction Boyd predicts to happen at the scale, and across the scope, of society without requiring armed conflict. The operational focus on pushing target groups to exist in their own internally cohesive echo chambers leads to the campaign goal of creating so much non-cooperative distance between groups that societal cohesion begins to fray and suffer. This becomes the possible identifiable end goal of influence and interference campaigning through cyberspace.

Conclusion
Individuals need to correctly observe their environments to make decisions that will advance their interests. Collectively, societies need to do the same, through their governmental structures: observe, orient, decide, and act. If the informational environment through which all three (individuals, society, government) observe reality is in fact filtered through a manipulated sense of divisiveness, the imitation and amplification of identified cleavage points can turn manageable disagreement into real and potentially unmanageable division. We have built a digital architecture ripe for CEIO and CEIC use of an IIA strategy to produce divisiveness.
The deepening interconnected nature of cyberspace is creating the potential for cyber-enabled influence campaigns to create and sustain many non-cooperative centres simultaneously. Cooperation in the end requires sustainable trust. 78 By creating numerous non-cooperative centres, a CEIC undermines trust in the target society, which rests on people being able to 'feel confident about the information they are gaining about their situation and any risks it entails for them'. 79 Through the creation of many non-cooperative centres, overall trust in society and its institutions can be undermined. This is the 'divide and conquer' strategy of the 21st century, to which all societies, but particularly open democratic ones, are susceptible.
Scholars and policymakers must address the growing threat that information technology platforms can be used to affect political stability by undermining trust, sowing doubt, and ultimately creating new beliefs about the surrounding social environment. This continuum, from undermining trust to sowing doubt to new belief creation, might represent the ultimate effect of CEICs: varying degrees of manipulation and control that can undermine society-wide cohesion and produce strategic effects heretofore reserved for the prosecution of war. Competition below armed attack in the information arena can become strategic if we do not begin to address the vulnerabilities the technology is introducing to the maintenance of institutional trust.
Empirical analysis in a separate project suggests that information need not be 'fake' to be weaponised by adversaries; thus, as a starting point for policy, we need to reorient toward managing malinformation rather than dis- or mis-information. 80 This research suggests that CEIOs using truthful information are directed toward affecting the observations, rather than the orientation, of individuals scaled across society.
The disruption counter-campaigns called for in the 2023 US National Cybersecurity Strategy, and the related approaches of persistent engagement and defend forward on the military side of US strategy, are a good place to begin focusing on a greater integration of forms of intelligence and forms of military operations seeking strategic effect. The challenge of combating cyber-enabled influence campaigns will only intensify as more states experiment with this potential alternative to war as they seek to advance their goals in 21st-century strategic competition. Getting the analytical definitions and framework correct is a necessary but not sufficient step to meeting this challenge. Understanding how IIA impacts observation can enable new lines of academic research and policy prescription.