M003: daylight
Summary: Make disinformation objects, mechanisms, messaging, etc. visible.
Counters in M003:

| disarm_id | name | summary |
| --------- | ---- | ------- |
| C00019 | Reduce effect of division-enablers | Includes "Promote constructive communication by shaming division-enablers" and "Promote playbooks to call out division-enablers". |
| C00048 | Name and Shame Influencers | Think about the different levels: individual vs. state-sponsored accounts. Includes "call them out" and "name and shame". Identifying social media accounts as sources of propaganda ("calling them out") might help prevent the spread of their message to audiences that would otherwise consider it factual. Identify, monitor, and, if necessary, target externally-based non-attributed social media accounts. On the impact of and dealing with trolls: Chatham House has observed that trolls also sometimes function as decoys, as a way of "keeping the infantry busy" that "aims to wear down the other side" (Lough et al., 2014). Another type of troll involves "false accounts posing as authoritative information sources on social media". |
| C00081 | Highlight flooding and noise, and explain motivations | Discredit the campaign by pointing out the "noise" and informing the public that "flooding" is a disinformation technique; point out the intended objective of the noise. |
| C00085 | Mute content | Rate-limit disinformation content (see the rate-limiting sketch after this table). This reduces its effects whilst not running afoul of censorship concerns. Online archives of content (archives of websites, social media profiles, media, and copies of published advertisements; or archives of comments attributed to bad actors, plus anonymized metadata about users who interacted with them and analysis of the effect) are useful for intelligence analysis and public transparency, but will need similar muting or tagging/shaming as is associated with bad actors. |
| C00094 | Force full disclosure on corporate sponsor of research | Accountability move: make sure research is published with its funding sources. |
| C00113 | Debunk and defuse fake experts and credentials | Debunk fake experts and their credentials, and potentially also assess the quality of their audiences. |
| C00115 | Expose actor and intentions | Debunk misinformation creators and posters. |
| C00116 | Provide proof of involvement | Build and post information about the involvement of groups and other actors in misinformation incidents. |
| C00126 | Social media amber alert | Create an alert system around disinformation and misinformation artifacts, narratives, and incidents. |
| C00184 | Media exposure | Highlight misinformation activities and actors in the media. |
| C00189 | Ensure that platforms are taking down flagged accounts | Use ongoing analysis/monitoring of "flagged" profiles (see the monitoring sketch after this table). Confirm whether platforms are actively removing flagged accounts, and raise pressure via e.g. government organizations to encourage removal. |
| C00219 | Add metadata to content that's out of the control of disinformation creators | Steganography: add dates, signatures, etc. to content to stop issues such as photo relabelling (see the signing sketch after this table). |
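
For C00085 ("Mute content"), one common way to rate-limit delivery without deleting anything is a token bucket. The sketch below is a minimal Python illustration, assuming a per-item delivery budget; the class name and the rate/capacity values are hypothetical choices for this example, not part of the DISARM framework.

```python
import time
from collections import defaultdict

class TokenBucket:
    """Token-bucket rate limiter: flagged content's reach is throttled,
    not blocked outright, which sidesteps most censorship concerns."""

    def __init__(self, rate_per_sec: float, capacity: int):
        self.rate = rate_per_sec
        self.capacity = capacity
        self.tokens = defaultdict(lambda: float(capacity))
        self.updated = defaultdict(time.monotonic)

    def allow(self, content_id: str) -> bool:
        now = time.monotonic()
        elapsed = now - self.updated[content_id]
        self.updated[content_id] = now
        # Refill tokens for the elapsed interval, capped at capacity.
        self.tokens[content_id] = min(
            self.capacity, self.tokens[content_id] + elapsed * self.rate
        )
        if self.tokens[content_id] >= 1.0:
            self.tokens[content_id] -= 1.0
            return True
        return False  # Deliveries beyond the budget are muted, not deleted.

# Flagged disinformation content gets a low delivery budget.
limiter = TokenBucket(rate_per_sec=0.1, capacity=5)
delivered = sum(limiter.allow("post-123") for _ in range(100))
print(f"{delivered} of 100 delivery attempts allowed")
```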
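For C00189, ongoing monitoring of flagged profiles can start as simply as periodically checking whether each profile URL still resolves. A minimal sketch, assuming the third-party `requests` library and a hypothetical list of flagged profile URLs; real platforms may serve soft "200 OK" pages for removed accounts, so this is only a first-pass filter before manual review or escalation.

```python
import requests

# Hypothetical list of previously flagged profile URLs.
FLAGGED_PROFILES = [
    "https://example.com/user/flagged_account_1",
    "https://example.com/user/flagged_account_2",
]

def still_live(url: str) -> bool:
    """Return True if the profile page still resolves (account not removed)."""
    try:
        resp = requests.get(url, timeout=10, allow_redirects=True)
        # A 404/410 suggests the platform removed the account; anything
        # else means it may still be live and needs escalation.
        return resp.status_code not in (404, 410)
    except requests.RequestException:
        return False  # Unreachable; recheck later rather than assume removal.

for url in FLAGGED_PROFILES:
    status = "STILL LIVE - escalate" if still_live(url) else "removed/unreachable"
    print(f"{url}: {status}")
```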
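For C00219, the counter names steganography; a simpler way to illustrate the same goal (binding a date and signature to content so that photo relabelling is detectable) is a detached HMAC signature over the image bytes plus its capture date. The key and all names below are assumptions for illustration; in practice a publisher would use an asymmetric key pair so anyone can verify without being able to sign.

```python
import hashlib
import hmac

# Assumption: a publisher-held secret key (hypothetical value).
SIGNING_KEY = b"publisher-secret-key"

def sign_photo(image_bytes: bytes, capture_date: str) -> dict:
    """Bind a capture date to the image content so relabelling is detectable."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    sig = hmac.new(SIGNING_KEY, f"{digest}|{capture_date}".encode(),
                   hashlib.sha256).hexdigest()
    return {"sha256": digest, "capture_date": capture_date, "signature": sig}

def verify_photo(image_bytes: bytes, record: dict) -> bool:
    """Recompute the signature; any change to the image or date breaks it."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    expected = hmac.new(SIGNING_KEY, f"{digest}|{record['capture_date']}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])

photo = b"...raw image bytes..."
record = sign_photo(photo, "2020-03-14")
tampered = {**record, "capture_date": "2022-01-01"}  # relabelled date
print(verify_photo(photo, record))    # True
print(verify_photo(photo, tampered))  # False
```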