DISARM Countermeasures Workshop Series — What do individuals need to effectively defend themselves and others against online disinformation campaigns?

DISARM Foundation
May 18, 2023

Victoria Smith

This year, DISARM has hosted a series of workshops exploring countermeasures to online harms. The workshop series is generously supported by Craig Newmark Philanthropies. The objective of the workshops is to gather feedback on the countermeasures used against online disinformation and other harms, and on how to make advice on mitigations and countermeasures accessible and practical to those who need it. The feedback from these sessions will feed into DISARM’s work to update and improve the existing ‘Blue Framework’ of countermeasures.

This workshop focused on what individuals can do to protect themselves from the threat of online disinformation campaigns, and on how DISARM can make the ‘Blue’ Countermeasures Framework more accessible. Participants had experience working in non-governmental organisations and journalism.

Introduction

While the DISARM Blue Framework includes tactical interventions that individuals can implement, it was not conceived as a tool to support individuals. It was nevertheless important for us to consider individuals as users in their own right in this workshop series, and DISARM Blue could be enhanced to include advice for individuals who wish to strengthen their online defences.

For individuals who are actively targeted by disinformation campaigns, or who are victims of online harassment, stalking, doxing or other threats to their online or physical security, there are dedicated organisations far better equipped than DISARM to advise and support them. As we refine the Blue Framework, we must consider how best to direct individuals who need more support than we can give to the right places.

However the Blue Framework evolves, it will ultimately be individuals who must interact with it, understand it and use it to shape a strategic response on behalf of a government, platform or other actor. The design of the Framework must therefore support the individuals who need to use it. We must explore ways to do this, such as the ability to search and filter results, since not every response should be considered in every scenario, and simpler language for non-technical users and those for whom English is a foreign language.

Key Takeaways:

What should the DISARM Blue Framework do?

For individuals, the ability to access relevant information quickly is vital, particularly when they are being targeted by a disinformation campaign. As highlighted in the first workshop, the current Blue Framework is difficult to access. Participants appreciated having a structure to organise the content in the framework. However, they emphasised the need for this information to be accessible to individuals in a way that reflects the nature of the behaviour they are trying to counter. ‘Rebutting narratives’ is a different problem from ‘enhancing cybersecurity’, for example, and each of these categories could contain many different actions that an individual could take.

The Framework should incorporate the ability to tag and filter countermeasures, to present users with the most relevant guidance. Participants felt that the ability to narrow the options down to a focused list tailored to a specific situation would be a good way to make the Framework more accessible to individuals. One participant suggested that if this were done effectively, DISARM could create playbooks, or scenario-based guidance, to support a response to specific behaviours. Fields such as information security and crisis response might be well placed to inform DISARM’s approach to this type of solution; a minimal sketch of the idea follows below.
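
To make the idea concrete, here is a minimal sketch of tag-based filtering. The data model, tag names and countermeasure IDs are hypothetical and not part of the published Blue Framework; the point is only that a small set of tags lets a user reduce the full catalogue to a short, relevant list.

```python
# Hypothetical sketch: this data model and these tags are illustrative,
# not the published DISARM Blue Framework schema.
from dataclasses import dataclass, field

@dataclass
class Countermeasure:
    cm_id: str
    summary: str
    tags: set[str] = field(default_factory=set)

CATALOGUE = [
    Countermeasure("CM-001", "Rebut false narratives with verified facts",
                   {"rebuttal", "during-campaign", "individual"}),
    Countermeasure("CM-002", "Enable two-factor authentication on all accounts",
                   {"cybersecurity", "before-campaign", "individual", "low-cost"}),
    Countermeasure("CM-003", "Coordinate takedown requests with platform integrity teams",
                   {"platform", "during-campaign", "organisation"}),
]

def find_countermeasures(required_tags: set[str]) -> list[Countermeasure]:
    """Return only the countermeasures that carry every requested tag."""
    return [cm for cm in CATALOGUE if required_tags <= cm.tags]

# An individual facing an active campaign filters to a short, relevant list:
for cm in find_countermeasures({"individual", "during-campaign"}):
    print(cm.cm_id, "-", cm.summary)
```

Playbooks could then be expressed as saved tag queries, so that a scenario such as ‘freelance journalist being doxed’ maps to a pre-selected, ordered subset of the catalogue.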

DISARM should create partnerships with specialist advice and support organisations; we are more effective as a signpost to this guidance than as providers of it. This workshop highlighted the breadth and depth of the challenges posed by online disinformation campaigns. As a small organisation, DISARM cannot be the provider of legal, mental-health or other specialist advice, and there are others in the field who already offer high-quality guidance. DISARM is better placed to raise awareness of the options available to individuals and could build a community of specialist providers that we can promote and direct people to.

Participants reported feeling overwhelmed by the current structure of the framework. They felt that most of the countermeasures currently listed were not something they could personally do, reinforcing a feeling of powerlessness. They also emphasised that individuals being targeted by disinformation campaigns are likely to be in a heightened state of stress, which may affect their ability to think logically or to take in very complex advice.

While the DISARM Blue Framework was originally conceived as a tool to help governments and platforms, we must also recognise that individuals can be the targets of state-backed disinformation campaigns.

DISARM should ensure individuals are made aware of the options available to them and direct them to those who can provide support. Participants gave the following examples of specialist websites that have developed advice and provide support to individuals:

  • The Coalition Against Online Violence, which consists of around 70 organisations internationally. It supports journalists and human rights activists facing online attacks through an online hub. The hub provides access to 130 resources and guides, and directs visitors to the right place by asking them to identify their job role and the type of problem they are facing. Common scenarios it covers include doxing and online abuse, and it signposts partner resources or emergency support as appropriate
  • PEN America aims to protect writers and human rights activists from online abuse and disinformation. It produces the Field Manual Against Online Harassment, a one-stop site that helps individuals prepare for and respond to online abuse, including the legal avenues available to those who live and work in the US. The Field Manual is also available in Spanish, French (for West Africa), Swahili (for East Africa), and Arabic (for MENA), since the advice differs by region. It includes a section on reporting online harassment to the platforms and another on limiting exposure to abusive content
  • Media Defence provides funding for the legal defence of journalists facing abuse and, in many cases, will connect the journalist with a lawyer. It provides national- and regional-level resources to help journalists facing state-sponsored cyber-attacks. While there are some resources for journalists directly, the main focus is to act as a resource hub for media lawyers, with fact sheets, modules for different regions and more. The hub, which covers a broad set of attacks, not just digital ones, can be searched by lawyer, region, keyword, or theme
  • Right To Be, formerly Hollaback!, walks users through the stages of dealing with an online attack
  • HateAid, a German non-profit offering consultation and litigation services to help counter digital hate.

What should the DISARM Blue Framework include?

The Blue Framework should include guidance on what can be done before and during a disinformation campaign, and on how to recover from one. One participant said that advice often focuses on how to defend oneself before a campaign begins and what to do when it ends, but they felt there was less guidance available on what individuals could do while a campaign was in progress. This guidance should also recognise that individuals may have more or fewer resources at their disposal depending on whether they are freelancers or employees and on the level of institutional support their organisation can offer.

The Blue Framework should reflect any costs or potential barriers that individuals may need to consider before implementing a countermeasure. Costs might include paying for a tool, training to upskill, or access to legal, technical or mental-health advice. Barriers might include an individual’s geographical location, and therefore the legal jurisdiction they fall under, or the availability of local support.

The framework should consider the geographical location of users, as this may restrict the options available to them. One participant said that outside the US, state-sponsored abuse was more common. Rules relating to free speech and legal frameworks also vary considerably, so responses need to be tailored accordingly. While we are not sufficiently resourced to adapt every countermeasure to every legal jurisdiction, we might note whether a particular countermeasure is more or less suited to particular circumstances and build this knowledge up over time.

The language in the Blue Framework should be easy to read and understand, and should avoid jargon. Participants found the language used to describe countermeasures in the Blue Framework difficult to understand. They thought there was too much jargon, which raised the barrier to access, especially for those who do not speak English as their first language.

Other areas of discussion:

  • Participants talked about the importance of providing in-person training, as some people find it more difficult to read and digest written information and prefer being walked through it in an environment where they can engage and ask questions
  • Participants had found it difficult to request action from social media platforms’ content moderation or site integrity teams. They found that if they could directly contact a platform employee in one of these teams, their requests were more likely to succeed than if they made contact through generic support emails. However, people move jobs, so it is difficult to maintain the right contacts over time. This process also fails members of the general public, who do not have direct access to the right people at a platform. One participant cited research from the Center for Countering Digital Hate (CCDH), which found that 84% of antisemitic behaviour reported to platforms was not acted on. Previous CCDH reporting found that a similar percentage of abuse sent via direct messages and reported to the platforms was not removed
  • Journalists can have specific needs, particularly when it comes to legal support. They can be targeted with online abuse, cyber-attacks or spyware, and threats can come from state and non-state actors. They may need help finding or paying for legal defence, but they may also need help investigating the threat. Journalists around the world face these challenges, so support needs to be provided in multiple languages and be relevant across multiple legal jurisdictions
  • One participant said that, in their experience, the shorter the advice, the better. They suggested summarising a concept into a few short points and presenting them visually, perhaps in an infographic, with links to more detailed information for those who want it. The participant gave the example of a CCDH report about trolling that sets out a short list of key principles
  • Germany was given as an example of a legal jurisdiction where the Network Enforcement Act not only empowers people to report content on social media platforms, based on the platforms’ own community guidelines, but also obligates the platforms to delete certain content. A similar obligation will come into effect across the European Union, via the Digital Services Act, in about a year’s time. Such laws may help some individuals but are not a complete solution to the problem
  • One participant reported that taking legal action has had some success. They gave the example of HateAid, which has won a number of legal actions against perpetrators and social media platforms, and has recently sued Twitter.

About DISARM:

The DISARM Foundation was established in 2021 and is home to the DISARM Frameworks. DISARM stands for ‘Disinformation Analysis and Risk Management’.

The DISARM Foundation has developed a ‘Red’ framework, which identifies behaviours associated with online disinformation, influence operations and other harms, and which continues to be refined. The Framework provides key elements of the common language so urgently needed for better coordination and a more effective response to disinformation.

The DISARM Foundation also has a ‘Blue’ countermeasures framework, which needs further work to build it out and to review its overall shape and effectiveness. The aspiration is that, used together, the two frameworks will support rapid and effective coordination in identifying and responding to online disinformation.

The DISARM Foundation defines disinformation as “information that is false, misleading, incomplete, or out of context, and which is deliberately created, altered, manipulated, disseminated, or presented with the intent to deceive.” We recognise that there is no single term that covers all aspects of the DISARM Foundation’s remit. The framework supports the assessment of, and response to, a wide range of online harms, and encompasses concepts such as influence operations, foreign information manipulation and influence, ‘mis/dis/mal-information’, and ‘Information Disorder’.
