by Sean W. Havel


In July 1870, in the spa town of Bad Ems, a polite conversation between the Prussian King and a French diplomat took place over the candidacy of a Hohenzollern prince for the Spanish throne. The two departed on good terms, despite the King refusing the diplomat’s demand that the prince’s candidacy be withdrawn. The King then asked his Chancellor, Otto von Bismarck, to release an account of the events. Bismarck’s press release – which should have created little fanfare on the European stage – became the catalyst for the rise of the German Empire.

At full liberty to interpret events as he wished, Bismarck removed the King’s conciliatory language and reframed a cool-headed conversation as an exchange of mutual insults, sharpening the narrative to suggest the French diplomat had made his demands under the threat of war. Released on July 13, the communiqué sparked stark reactions from the already inflamed French public, confirming to them German disrespect of France’s leadership and position in Europe. Shortly after, France’s leaders succumbed to public opinion, mobilized their armed forces, and declared war – a war that would end disastrously for France, destroying her empire and forging a new German one.

This brief example demonstrates how war and politics are one and the same. To advance his own interests and those of his state, Bismarck seized an opportunity to reflexively control French foreign policy through its civilian population, setting in motion conditions under which military victory could be secured through a defensive war. Just as Bismarck mobilized a narrative to achieve his political objectives, modern states pursue political as well as military objectives in a now far more ubiquitous information environment (IE).

The modern IE is interconnected via physical networks, computer logic, and virtual platforms that enable not only the dissemination of information, but also its amplification through mass participation and social media algorithms. Unlike in the 19th and 20th centuries, there is no escape from being part of the digital IE. Anyone with a phone can amplify what is being injected into that system, and thereby influence you. Thus, every time you observe, orient, decide, and act upon information, you may well be playing a role in international conflict.

For these reasons, the relationship between influence in the IE and the conditions of the battlefield represents a key dimension of warfare. The ongoing Russo-Ukraine War exemplifies this reality, being at once a war for territory and a struggle for the hearts and minds of the online world. How actors frame reality carries serious military implications, potentially justifying aggression, disrupting political processes, degrading a nation’s will to fight, and even exerting reflexive control over that nation on behalf of an adversary.

So how can defence circles research this relationship in a controlled environment? By what means can we prepare as an institution to operate in this domain? These are questions I was asked to help answer through the NATO System Analysis Studies-151 (SAS-151) working group between July and October 2021. Our team’s objective was to tackle this domain by executing a series of wargames to derive insights into how actors conduct activities in the IE, and how effective their methods are. This article describes our methodology and findings, which have important implications for Canada specifically and for democracies globally.


Audience-Based Approach to Wargaming

The working group’s initial attempts to wargame operations in the IE used Matrix Games, which are games with freely described actions whose success is determined through structured argument and discussion. Matrix games proved insufficient, however, for capturing impacts in the IE, primarily due to the subjectivity of the arguments used to anticipate those impacts. If anyone could reliably predict the success of a social media campaign, that would answer a century’s worth of questions about how humans respond to communications – an unlikely prospect.

In truth, the IE is a domain too complex, obscured, and volatile to be successfully rationalized in a single discussion. We therefore required a way to conduct wargames in the IE that could capture its chaotic nature in a controlled setting and reveal what would happen rather than what should happen. To do so, we scaled up two academic underpinnings that matrix games already rely on: crowdsourcing, the theory that crowds cognitively outperform individuals; and the predictive power of role-play, which offers a better basis for predicting outcomes than expert opinion or game theory.

The solution was to introduce an audience of real people who could react to a developing situation, respond to messages created by teams, be misled, create their own stories, and propagate their own narratives. The virtual audience thereby acted as an ‘engine’, providing automatic adjudication of the wargame based upon their interactions with content supplied by players as well as their responses to surveys on their attitudes towards key topics in the scenario.

Yet the audience alone was not enough; creating an authentic IE required two additional pieces. The first was a simulated social media platform, which we achieved through the popular online chat service Discord. Discord was developed for gaming communities: individuals can create servers featuring various text and voice ‘channels’ on which to post content. Those channels can be cordoned off by specific permissions, meaning individuals see only what the designers of the server intended. Discord also offers advanced features, particularly the ability to program bots that automate specific processes on the server. Bots allowed much of the exercise to run automatically, including simulating trending topics based on audience interaction and giving audience members their own ‘feeds’ with targeted social media content. In a short amount of time, a Discord server was created for the exercise, given the structure of a social media platform, and provided with rooms for teams and facilitators to meet, plan, and manage the wargame. Discord’s robust mobile app gave all participants, especially audience members, easy access through their mobile devices at all times. This meant the exercise could be conducted in a purely distributed fashion, opening it to participants from across the Atlantic.
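The actual bot logic used on the exercise server is not published; as a rough illustration of one automated feature, a trending-topics mechanic could be as simple as tallying the hashtags audience members post and surfacing the most frequent ones. The helper names and feed messages below are invented for this sketch:

```python
import re
from collections import Counter

def extract_hashtags(message: str) -> list[str]:
    """Pull hashtags (e.g. #Hypatia) out of a message, case-insensitively."""
    return [tag.lower() for tag in re.findall(r"#\w+", message)]

def trending(messages: list[str], top_n: int = 3) -> list[tuple[str, int]]:
    """Rank hashtags by how often the audience has used them."""
    counts = Counter(tag for msg in messages for tag in extract_hashtags(msg))
    return counts.most_common(top_n)

# Invented sample of audience posts from a notional game turn.
feed = [
    "Rally tonight! #HypatiaFirst",
    "Can't believe the minister said that #Corruption #HypatiaFirst",
    "#Corruption charges again?",
]
print(trending(feed))  # → [('#hypatiafirst', 2), ('#corruption', 2)]
```

A real bot would run such a tally on a timer and post the result to a ‘trending’ channel, letting audience behaviour, not the facilitators, decide what the platform amplifies.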

The second piece was the development of a comprehensive wargaming scenario and narrative for the audience to form their reality around. Instead of the traditional Blue and Red, we created names and backstories for our geopolitical actors: the Organization for Collective Security (OCS), a collective security alliance of liberal-democratic countries, and the Illyrian Federal Republic (IFR), a presidential republic with aspirations to challenge the local security order. Each team was given a detailed breakdown of its own geopolitical actor, its capabilities, and its relationships with bordering states.

The nation the audience members came from, and where the game took place, was the Hypatian Commonwealth (Hypatia): a state newly independent from the IFR, hosting a large Illyrian minority, and politically seeking greater integration with the OCS. Hypatia was heavily detailed, including interest groups carrying specific loyalties and personalities, descriptions of political parties and their positions on key issues, and region-by-region demographics describing the country’s economic, ethnic, and political makeup. Audience members were then given ‘personas’ based upon these demographics, including where they lived, their occupation, their ethnic affiliation, and various other factors, allowing each audience member to ‘build’ a character and own it without a pre-written script to follow.
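The game’s actual demographic tables are far more detailed than described here and are not public; as a minimal sketch, persona assignment from such tables could amount to weighted random sampling by regional population share. All region names, rates, and occupations below are invented for illustration:

```python
import random

# Invented regional demographics; the real game data was far richer.
REGIONS = {
    "Northern Hypatia": {"share": 0.6, "illyrian_minority": 0.1},
    "Border Province": {"share": 0.4, "illyrian_minority": 0.5},
}
OCCUPATIONS = ["farmer", "teacher", "factory worker", "civil servant"]

def draw_persona(rng: random.Random) -> dict:
    """Sample one audience persona, weighted by regional population share."""
    region = rng.choices(
        list(REGIONS), weights=[r["share"] for r in REGIONS.values()]
    )[0]
    return {
        "region": region,
        "ethnicity": "Illyrian"
        if rng.random() < REGIONS[region]["illyrian_minority"]
        else "Hypatian",
        "occupation": rng.choice(OCCUPATIONS),
    }

# Build a 50-person audience, one persona per live participant.
rng = random.Random(42)
audience = [draw_persona(rng) for _ in range(50)]
```

Each audience member would then flesh out the sampled attributes into a character they own, which is what kept behaviour emergent rather than scripted.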


Lessons from an Information Wargame

Overall, the system described above ran for three games in a series, with the events of one game forming the starting scenario for the next. Teams represented the OCS and IFR in each scenario, with the former tasked with defending up to 50 live audience members against IFR disinformation and influence. The results demonstrated the added value of an audience as an innovation in wargaming operations in the IE, while also providing interesting insights into the following topics:

  • Realistic Social Phenomena and Emergent Behaviour: Despite the scenario and exercise not taking place in the ‘real world’, polarization on political topics, marginalization of moderate voices, political fatigue, radicalization, and other phenomena were all observed. Audience members responded to team actions by creating their own original content, in some cases becoming influencers in their own right. They supported narratives and accidentally distributed disinformation to sway others on Discord. They also formed interest groups based upon their personas, which in some cases produced injects into the game state driven by audience behaviour, such as a referendum in one instance. Importantly, team influence had only marginal effects on audience behaviour, just as in the real world, producing only slight nudges in attitudes.
  • Treading the ‘Human Terrain’: Players had to understand, consider, and predict audience behaviour, often dealing with unexpected results. Real audience-centric decision-making drove the wargame, with teams constantly having to figure out what audience members were responsive to, what issues they cared about, and how they might react to larger political decisions.
  • Powerful Capabilities Require Restraint: The use of large-scale cyber capabilities, proxy groups, and overt force in some instances provoked heavy audience reactions that were impossible to control. While removing an influencer’s ability to post online can be effective, news of such an event creates discussion and distrust of the conducting actor. Capabilities used in view of the information domain must be weighed against how they will be perceived and which narratives the action plays into.
  • Unsophisticated Tactics are Powerful: What proved most effective in all three games was not the use of sophisticated capabilities but low-effort, simple techniques. Hyperbole, misattribution, simple bot-nets, and the spreading of disinformation were far more effective than the deployment of any information-related capability. Delivered through memes and narrative-led techniques, they were powerful in their simplicity and popular with crowds.
  • Key Audiences Win Info Wars: The first team to identify the key audiences won the game. Some audiences, based on their personas, were either set in their stances or too removed from contentious issues to be influenced by adversarial actors. In one example, the IFR team focused on specific persona types and slowly introduced its narrative to them. The defending OCS team took a broader approach, messaging to everyone and failing to reach the IFR’s targets. As a result, the IFR-targeted personas became empowered on the social platform, serving as a jumping-off point for further spread of the IFR narrative.
  • Influencers Win/Lose Audiences: Building relationships with influencers was highly important for amplifying messaging, but those same influencers were equally vulnerable to being turned into pariahs used to prove a specific narrative. In one exercise, the Hypatian Defence Minister was systematically made the symbol of corruption and ethnic oppression in the country, meaning that when military action was needed, audience members interpreted the move as self-interested and hostile towards ethnic Illyrian personas. In another exercise, the OCS team learned that an influencer was being targeted, allowing it to defend the individual and even glean insights into the IFR’s plan based on that targeting.
  • Preoccupation with Adversaries is Dangerous: The OCS team lost wargames when it focused too much on countering its adversary instead of advancing its own plan. By the end of some games, the OCS narrative was diluted because the game had become about the IFR’s posts. When countering disinformation, it is important to also ‘play your own game’ and tell your own story to offer an alternative.
  • Cognitive Diversity and Out-of-the-Box Thinking: The teams with the most cognitive diversity, and those that thought outside the box, were the most successful. Clever use of hashtags, entertaining multimedia content, and selective engagement with audiences resulted in success.


The IE is a crucial domain for Canadian defence and will only grow in relevance as future conflicts arise. While in other domains Canada can benefit from its geographic position, it will remain vulnerable in the information domain without a concerted effort to address adversarial campaigns. Deployed abroad, Canadian Army operations further risk losing the information advantage, with loss of local trust and unit disorientation among the potential results of adversarial information operations. Wargaming is one tool to help orient the force in this domain, allowing new techniques and capabilities to be explored before they are tried on the information battlefield.


Sean Havel is a strategic analyst with Defence Research and Development Canada’s Centre for Operational Research and Analysis (DRDC CORA) and a member of the Joint Targeting Section (JTS).