M001: resilience
Summary: Increase the resilience of the end subjects, or of other parts of the underlying system, to disinformation
Counters in M001:

disarm_id | name | summary |
--------- | ---- | ------- |
C00009 | Educate high-profile influencers on best practices | Find online influencers. Provide training in the mechanisms of disinformation, how to spot campaigns, and/or how to contribute to responses by counter-messaging, boosting information sites, etc. |
C00011 | Media literacy. Games to identify fake news | Create and use games to show people the mechanics of disinformation, and how to counter them. |
C00021 | Encourage in-person communication | Encourage offline communication |
C00022 | Inoculate. Positive campaign to promote feeling of safety | Used to counter ability-based and fear-based attacks. |
C00024 | Promote healthy narratives | Includes promoting constructive narratives, i.e. not polarising ones (e.g. pro-life, pro-choice, pro-USA). Includes promoting identity-neutral narratives. |
C00027 | Create culture of civility | This is passive. Includes promoting civility as an identity that people will defend. |
C00051 | Counter social engineering training | Includes anti-elicitation training, phishing prevention education. |
C00073 | Inoculate populations through media literacy training | Use training to build the resilience of at-risk populations. Educate them on how to handle information pollution, and push out targeted education on why it is pollution. Build cultural resistance to false content. Include influence-literacy training to inoculate against “cult” recruiting, and media literacy training delivered through librarians and libraries. Inoculate at the level of language. Inoculating a population has strategic value, so include it in strategic planning. To bring media literacy concepts to a mass audience, authorities can launch a public information campaign; such a program will take time to develop and establish impact, and curriculum-based training is recommended. Covers detect, deny, and degrade. |
C00093 | Influencer code of conduct | Establish a tailored code of conduct for individuals with many followers. This can be a platform code of conduct, or a community code. |
C00109 | Dampen Emotional Reaction | Reduce emotional responses to misinformation through calming messages, etc. |
C00111 | Reduce polarisation by connecting and presenting sympathetic renditions of opposite views | |
C00121 | Tool transparency and literacy for channels people follow | Make the algorithms used by platforms explainable and visible to the people using those platforms. |
C00125 | Prebunking | Produce material in advance of misinformation incidents, by anticipating the narratives used in them, and debunking them. |
C00130 | Mentorship: elders, youth, credit. Learn vicariously. | Train local influencers in countering misinformation. |
C00160 | Find and train influencers | Identify key influencers (e.g. via network analysis; see the sketch after this table), then reach out to the identified users and offer support through either training or resources. |
C00188 | Newsroom/Journalist training to counter influence moves | Includes SEO influence. Includes promotion of a “higher standard of journalism”: journalism training would be helpful, especially for the online community. Includes strengthening local media: improve the effectiveness of local media outlets. |
C00190 | Open engagement with civil society | Government openly engages with civil society as an independent check on government action and messaging. Government seeks to coordinate and synchronise narrative themes with allies and partners, while calibrating action in cases where elements in these countries may have been co-opted by competitor nations. Includes “fight in the light”: use leadership in the arts, entertainment, and media to highlight and build on fundamental tenets of democracy. |
C00212 | Build public resilience by making civil society more vibrant | Increase public service experience, and support wider civics and history education. |
C00223 | Strengthen trust in social media platforms | Improve trust in the misinformation responses from social media and other platforms, for example by creating greater transparency about their actions and algorithms. |
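
The network-analysis step mentioned in C00160 (and implied by C00009's "find online influencers") can be prototyped with standard graph tooling. Below is a minimal sketch, assuming an interaction graph is available as an edge list; the sample edges, account names, and the use of the networkx library are illustrative assumptions, not part of the DISARM framework.

```python
# Sketch: rank accounts in an interaction graph to surface candidate influencers
# for outreach, training, or code-of-conduct work (C00009, C00093, C00160).
# The edge list is invented sample data; in practice it would come from
# follow / mention / repost relationships exported from a platform.
import networkx as nx

# A directed edge (a, b) means account "a" follows or amplifies account "b".
edges = [
    ("user_a", "influencer_1"), ("user_b", "influencer_1"),
    ("user_c", "influencer_1"), ("user_a", "influencer_2"),
    ("user_d", "influencer_2"), ("influencer_2", "influencer_1"),
    ("user_e", "user_a"),
]
G = nx.DiGraph(edges)

# Two simple centrality measures; a real analysis would combine more signals
# (reach, engagement, topical relevance) before any outreach decision.
pagerank = nx.pagerank(G)               # influence via incoming attention
in_degree = nx.in_degree_centrality(G)  # share of accounts pointing at each node

def top(scores, k=3):
    """Return the k highest-scoring accounts."""
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:k]

print("Top accounts by PageRank: ", top(pagerank))
print("Top accounts by in-degree:", top(in_degree))
```

Accounts that rank highly on such measures are the natural candidates for the influencer-focused counters above (C00009, C00093, C00130, C00160).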