DISARM Countermeasures Workshop Series — How do governments counter disinformation campaigns?

DISARM Foundation
Jun 13, 2023 · 7 min read

Victoria Smith

This year, DISARM has hosted a series of workshops exploring countermeasures to online harms, generously supported by Craig Newmark Philanthropies. The objective of the workshops is to gather feedback on the types of countermeasures used against online disinformation and other harms, and on how to make advice on mitigations and countermeasures accessible and practical to those who need it. Feedback from these sessions will inform DISARM’s work to update and improve the existing ‘Blue Framework’ of countermeasures.

Two workshops focused on how governments approach countermeasures. Participants had experience of working in or directly for governments in North America and Europe.

Introduction

Governments engage in a wide range of activities: acting as donors to support the work of non-governmental organisations, hiring third-party contractors for specific areas of research and development, and taking direct responsibility for assessing threats and delivering responses. While this gives governments a broad spectrum in which to work, it must be balanced against concerns about the extent to which governments can and should collect and analyse data about their own citizens, how much trust the general population places in government messaging, and whether and how to incentivise or legislate to drive change.

Key Takeaways:

What should the DISARM Blue Framework do?

By assigning outcomes for Red TTPs and encouraging users to consider strategic objectives for Blue responses, DISARM can be part of a framework to support policy recommendations. Government analysts said that most of their work was on characterising the threat, and while they would like to start making policy recommendations, they would need a framework and processes in place to support it.

DISARM could reflect the language of the EU Foreign Information Manipulation and Interference (FIMI) Toolbox to ensure it can support the identification of appropriate responses. The EU’s FIMI toolbox is under development, but some aspects, such as resilience building, are already being implemented. The aim of the toolbox is to enable a good understanding of the threat and to unify the methodology and language used to identify and describe it, ensuring a consistent understanding across partners. Other aspects covered by the toolbox include regulatory and diplomatic responses. The toolbox is being developed in collaboration with academic and civil society partners, reflecting the need for a whole-of-society response across the EU and building bridges between EU and non-EU partners.

DISARM should present information in a clear and simple way so that it is easy for users to access. Participants wanted the Blue Framework written in simple, straightforward language that gives individuals clear guidance and education on what to do and how to analyse the content they see online. One participant’s focus was on supporting individuals from marginalised communities, to ensure that everyone has a voice on the internet and that no one can be bullied into silence. Another specified that the language used should be ‘desecuritised’: softer terms that reflect education and creative approaches, and that take local culture into consideration when fostering critical thinking skills.

DISARM should give users tangible, practical and actionable guidance. Platforms often complain that governments ask them to do more without clearly articulating what ‘more’ is. During the COVID-19 pandemic, for example, there was a great deal of scrutiny of the information being shared on social media platforms, but far less guidance on what good public health information looks like.

DISARM should support responders in anticipating vulnerabilities and information voids, to inform pre-bunking, inoculation and other preemptive or preventative activities. Responders want to improve how they prepare for and pre-empt disinformation campaigns using data, geopolitical context or historical precedent. This might involve assessing how long it takes deceptive narratives to emerge and then go viral, and what can be done to anticipate and prevent this, or trying to pre-bunk or clearly communicate factual information before disinformation campaigns take hold. Elections were cited as an event that lends itself to this kind of preparation, with the US Cybersecurity and Infrastructure Security Agency’s Election Security program highlighted as a model of election preparedness.
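As a loose illustration of the kind of measurement this implies, the sketch below computes how many days a narrative took to go from first appearance to a virality threshold. The counts and the threshold are invented for illustration, not DISARM guidance; aggregating such measurements over past campaigns could help responders judge how early pre-bunking needs to begin.

```python
# Minimal sketch: estimate how long a narrative took to 'go viral', given a
# time-ordered series of daily mention counts. The threshold and data are
# illustrative assumptions.
def days_to_viral(daily_counts, viral_threshold=1000):
    """Return days elapsed from the narrative's first appearance to the
    first day its mentions reach `viral_threshold`, or None if it never does."""
    first_seen = next((i for i, n in enumerate(daily_counts) if n > 0), None)
    if first_seen is None:
        return None
    for day, count in enumerate(daily_counts[first_seen:]):
        if count >= viral_threshold:
            return day
    return None

# A narrative that simmers for a few days and then takes off.
print(days_to_viral([0, 0, 3, 8, 20, 45, 150, 600, 1800]))  # -> 6
```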

DISARM should reflect the ways in which users can work with partners to improve information sharing. This might include developing operational communication and information-sharing plans across and between governments in advance of a disinformation campaign, the tactical sharing of information and warnings during a campaign, or recording community-wide lessons learned and after-action reports that analyse all information shared by partners. Collaboration might also include public-private partnerships, allowing information to be shared between sectors and further informing how responses are tailored to specific circumstances.

Sharing information with partners contributes to a better understanding of the problem and tailored responses. The ability to collect information from partners, including other nation states, allows analysts to get more of a global view of the information environment. Analysts can use this information to help identify the actors behind the narratives, the narratives themselves and the ways in which they spread. They can also see similarities and differences between narratives in different geographies or contexts and use this information to develop responses at a national and international level.

Polling and focus groups can help analysts better understand how audiences respond to both disinformation and public information campaigns. This can provide a more proportionate understanding of the impact disinformation campaigns have on different audiences, or help identify ways to encourage the consumption of good-quality information. For example, the US Agency for Global Media has polled and run focus groups with its target audiences to better understand how it can support USAID’s development work. It is looking at how to use broadcast media, together with social media, to produce and promote good-quality information in ways that are accessible to a wide audience. Polling can also examine the extent to which audiences trust information from different sources, which can further inform who is best placed to lead a response and how.

Legislation, such as the EU Digital Services Act (DSA), can increase the accountability and responsibility of specific actors, such as social media platforms, but requires evidence-based research to understand how to drive improvements. Measures such as the EU Code of Practice on Disinformation are still in their early stages, and work remains to be done to enforce or incentivise safety by design in social media platform development.

What should the DISARM Blue Framework consider?

DISARM should support the transition from tactical to strategic response. Analysts felt they were well advanced at responding quickly at a tactical level, but that strategic responses had not matured correspondingly. The situation is improving with the introduction of the Digital Services Act in the EU, but analysts felt they were only beginning to understand strategic objectives for countermeasures.

DISARM should consider how other fields of study, for example risk management, can contribute to countermeasures. For example, when responding to public health and safety-related events, there is a need to draw on evidence-based research to better understand how people perceive risk, and how they might consequently modify their behaviour in ways that are detrimental to themselves and others. Disinformation campaigns can manipulate how people perceive risk, so there is a need to draw on risk communication to improve how (for example) government agencies communicate with the public.

Having a good understanding of the baseline information environment facilitates the identification of spikes and anomalies. Such an understanding can help a responder tune their response so that it is effective and proportional.
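As a loose illustration of what baseline-driven spike detection can look like in practice, the sketch below flags days whose narrative-mention counts deviate sharply from a rolling baseline. It is not part of the DISARM framework; the count series, window size and z-score threshold are all invented for illustration, and in practice the baseline would come from the kind of monitoring data discussed above, with the threshold tuned so that responses stay proportional.

```python
# Minimal sketch: flag spikes in daily narrative-mention counts against a
# rolling baseline, using a simple z-score test. All numbers are illustrative.
import statistics

def flag_spikes(daily_counts, window=14, z_threshold=3.0):
    """Return indices of days whose count deviates sharply from the
    preceding `window`-day baseline."""
    spikes = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline) or 1.0  # avoid division by zero
        if (daily_counts[i] - mean) / stdev >= z_threshold:
            spikes.append(i)
    return spikes

# Synthetic example: steady chatter with one burst on day 20.
counts = [10, 12, 9, 11, 10, 13, 11, 10, 12, 11,
          9, 10, 12, 11, 10, 13, 12, 11, 10, 11, 85]
print(flag_spikes(counts))  # -> [20]
```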

Other areas of discussion:

  • One government analyst reported having autonomy over decisions about what data to collect and research, with some top-down input from policymakers.
  • Work still needs to be done to improve how incidents are described and ensure the focus is on how information is being used to manipulate and influence.
  • At a political level, how to prioritise countermeasures in a whole of society approach remains a challenge.
  • Analysts focus on characterising the threat; attribution is considered a political process.
  • Responses may also need to consider hybrid threats involving cyber-related incidents.
  • Analytical teams are relatively small and need to cover a range of geographic and thematic research topics, such as threats to elections, healthcare and emerging crises. Analysts wanted an increased focus on behaviours and the ability to systematically analyse developments, to move away from a continuous cycle of crisis response. Creating a common framework, for example using STIX objects, is one way of achieving this and ensuring knowledge is not lost (a minimal sketch follows this list).
  • Analysts also research long-term politically motivated disinformation campaigns, for example climate change and climate scepticism, or migration. These campaigns take place over a long period of time, and while there are spikes in activity, generally speaking there is a slow pollution of the information environment over time. Analysts want to find ways to understand these longer-term trends, so they can better understand whether new or repeated narratives are being used, and how concerned governments should be about the impact of these narratives at any particular time.
  • Some governments outsource their analytical needs to external contractors. This makes it difficult for those analysts to make policy recommendations that carry deployment costs, as this can be seen as a conflict of interest.
  • An understanding of local context is extremely important as disinformation campaigns can draw on local grievances. There may also be historical or cultural factors that make a particular country or demographic more susceptible to a particular campaign.
  • Not all communication is tactical; governments can also engage in strategic communications, taking into consideration factors such as building trust, encouraging a culture of healthy debate, or cultural and geopolitical differences.
  • Governments are often limited in the information they can collect on individuals, and particularly on their own citizens. Organisations can have more flexibility over the information they can collect and the tools they can use. Establishing an effective process for collection, analysis and response will therefore likely involve the establishment of public-private partnerships.
  • Teams working tactically don’t necessarily have the capacity to evaluate the effectiveness of short-term responses.
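To make the STIX suggestion above concrete, here is a minimal sketch of how an analyst might record a campaign, a suspected actor and the link between them as STIX 2.1 objects, using the open-source stix2 Python library. Every name and description in it is invented for illustration; real deployments would follow whatever object model partners agree on.

```python
# Minimal sketch of recording an incident as STIX 2.1 objects with the
# open-source `stix2` library (pip install stix2). All names and
# descriptions below are invented for illustration.
import stix2

# The observed campaign, described in shared, structured language.
campaign = stix2.Campaign(
    name="Example election-narrative campaign",
    description="Coordinated accounts amplifying a false voting-procedure claim.",
)

# The suspected actor behind it.
actor = stix2.ThreatActor(
    name="Example influence operator",
    threat_actor_types=["nation-state"],
)

# Link the two so the assessment survives analyst turnover.
attribution = stix2.Relationship(campaign, "attributed-to", actor)

# Bundle the objects for exchange with partners.
bundle = stix2.Bundle(objects=[campaign, actor, attribution])
print(bundle.serialize(pretty=True))
```

Because the bundle serialises to plain JSON, it can be exchanged over existing channels (for example, a TAXII feed) without partners needing to share tooling, which is one way the knowledge-retention goal above could be met.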

