Counters

C00016 Censorship

Tactic stage: TA01

Summary: Alter and/or block the publication/dissemination of information controlled by disinformation creators. Not recommended. - Details


C00017 Repair broken social connections

Tactic stage: TA01

Summary: For example, use a media campaign to promote in-group to out-group in-person communication/activities. The technique could force a reality check by talking to people instead of reading about bogeymen. - Details


C00019 Reduce effect of division-enablers

Tactic stage: TA01

Summary: Includes promoting constructive communication by shaming division-enablers, and promoting playbooks to call out division-enablers. - Details


C00021 Encourage in-person communication

Tactic stage: TA01

Summary: Encourage offline communication - Details


C00022 Inoculate. Positive campaign to promote feeling of safety

Tactic stage: TA01

Summary: Used to counter ability-based and fear-based attacks. - Details


C00006 Charge for social media

Tactic stage: TA01

Summary: Include a paid-for privacy option, e.g. pay Facebook for an option of them not collecting your personal information. There are examples of this not working, e.g. most people don't use ProtonMail etc. - Details


C00024 Promote healthy narratives

Tactic stage: TA01

Summary: Includes promoting constructive narratives i.e. not polarising (e.g. pro-life, pro-choice, pro-USA). Includes promoting identity neutral narratives. - Details


C00026 Shore up democracy based messages

Tactic stage: TA01

Summary: Messages about e.g. peace, freedom. And make it sexy. Includes deploying information and narrative-building in service of statecraft: promote a narrative of transparency, truthfulness, liberal values, and democracy; implement a compelling narrative via effective mechanisms of communication; continually reassess messages, mechanisms, and audiences over time; and counteract efforts to manipulate media, undermine free markets, and suppress political freedoms via public diplomacy. - Details


C00027 Create culture of civility

Tactic stage: TA01

Summary: This is passive. Includes promoting civility as an identity that people will defend. - Details


C00153 Take pre-emptive action against actors' infrastructure

Tactic stage: TA01

Summary: Align offensive cyber action with information operations and counter disinformation approaches, where appropriate. - Details


C00096 Strengthen institutions that are always truth tellers

Tactic stage: TA01

Summary: Increase credibility, visibility, and reach of positive influencers in the information space. - Details


C00111 Reduce polarisation by connecting and presenting sympathetic renditions of opposite views

Tactic stage: TA01

Summary: - Details


C00223 Strengthen Trust in social media platforms

Tactic stage: TA01

Summary: Improve trust in the misinformation responses from social media and other platforms. Examples include creating greater transparency about their actions and algorithms. - Details


C00221 Run a disinformation red team, and design mitigation factors

Tactic stage: TA01

Summary: Include PACE plans - Primary, Alternate, Contingency, Emergency - Details


C00220 Develop a monitoring and intelligence plan

Tactic stage: TA01

Summary: Create a plan for misinformation and disinformation response before it's needed. Include connections/contacts needed, expected countermessages, etc. - Details


C00212 Build public resilience by making civil society more vibrant

Tactic stage: TA01

Summary: Increase public service experience, and support wider civics and history education. - Details


C00205 Strong dialogue between the federal government and private sector to encourage better reporting

Tactic stage: TA01

Summary: Increase civic resilience by partnering with business community to combat gray zone threats and ensuring adequate reporting and enforcement mechanisms. - Details


C00190 Open engagement with civil society

Tactic stage: TA01

Summary: Government open engagement with civil society as an independent check on government action and messaging. Government seeks to coordinate and synchronize narrative themes with allies and partners while calibrating action in cases where elements in these countries may have been co-opted by competitor nations. Includes “fight in the light”: Use leadership in the arts, entertainment, and media to highlight and build on fundamental tenets of democracy. - Details


C00176 Improve Coordination amongst stakeholders: public and private

Tactic stage: TA01

Summary: Coordinated disinformation challenges are increasingly multidisciplinary, yet few organizations within the national security structures are equipped with the broad-spectrum capability to effectively counter large-scale conflict-short-of-war tactics in real time. Institutional hurdles currently impede diverse subject matter experts, hailing from outside the traditional national security and foreign policy disciplines (e.g., the physical science, engineering, media, legal, and economics fields), from contributing to the direct development of national security countermeasures to emerging conflict-short-of-war threat vectors. A Cognitive Security Action Group (CSAG), akin to the Counterterrorism Security Group (CSG), could drive interagency alignment across equivalents of DHS, DoS, DoD, the Intelligence Community, and other implementing agencies, in areas including strategic narrative and the nexus of cyber and information operations. - Details


C00174 Create a healthier news environment

Tactic stage: TA01

Summary: Free and fair press: create bipartisan, patriotic commitment to press freedom. Note the difference between news and editorialising. Build alternative news sources: create alternative local-language news sources to counter local-language propaganda outlets. Delegitimise the 24-hour news cycle. Includes providing an alternative to disinformation content by expanding and improving local content: develop content that can displace geopolitically-motivated narratives in the entire media environment, both new and old media alike. - Details


C00170 Elevate information as a critical domain of statecraft

Tactic stage: TA01

Summary: Shift from reactive to proactive response, with priority on sharing relevant information with the public and mobilizing private-sector engagement. Recent advances in data-driven technologies have elevated information as a source of power to influence the political and economic environment, to foster economic growth, to enable a decision-making advantage over competitors, and to communicate securely and quickly. - Details


C00161 Coalition Building with stakeholders and Third-Party Inducements

Tactic stage: TA01

Summary: Advance coalitions across borders and sectors, spanning public and private, as well as foreign and domestic, divides. Improve mechanisms to collaborate, share information, and develop coordinated approaches with the private sector at home and allies and partners abroad. - Details


C00010 Enhanced privacy regulation for social media

Tactic stage: TA01

Summary: Implement stronger privacy standards, to reduce the ability to microtarget community members. - Details


C00073 Inoculate populations through media literacy training

Tactic stage: TA01

Summary: Use training to build the resilience of at-risk populations. Educate on how to handle info pollution, and push out targeted education on why it's pollution. Build cultural resistance to false content, e.g. cultural resistance to bullshit. Influence literacy training can inoculate against “cult” recruiting. Media literacy training: leverage librarians/libraries for media literacy training. Inoculate at the language level. Strategic planning is included, as inoculating a population has strategic value. Introducing concepts of media literacy to a mass audience via a public information campaign will take time to develop and establish impact, so curriculum-based training is recommended. Covers detect, deny, and degrade. - Details


C00012 Platform regulation

Tactic stage: TA01

Summary: Empower existing regulators to govern social media. Also covers Destroy. Includes including the role of social media in the regulatory framework for media: the U.S. approach will need to be carefully crafted to protect First Amendment principles, create needed transparency, ensure liability, and impose costs for noncompliance. Includes creating policy that makes social media police disinformation, and using fraud legislation to clean up social media. - Details


C00013 Rating framework for news

Tactic stage: TA01

Summary: This is "strategic inoculation", raising the standards of what people expect in terms of evidence when consuming news. Examples: journalistic ethics, or a journalistic licensing body. Include full transcripts, link sources, add items. - Details


C00008 Create shared fact-checking database

Tactic stage: TA01

Summary: Share fact-checking resources - tips, responses, countermessages - across response groups. - Details


C00159 Have a disinformation response plan

Tactic stage: TA01

Summary: e.g. Create a campaign plan and toolkit for competition short of armed conflict (this used to be called “the grey zone”). The campaign plan should account for own vulnerabilities and strengths, and not over-rely on any one tool of statecraft or line of effort. It will identify and employ a broad spectrum of national power to deter, compete, and counter (where necessary) other countries’ approaches, and will include understanding of own capabilities, capabilities of disinformation creators, and international standards of conduct to compete in, shrink the size, and ultimately deter use of competition short of armed conflict. - Details


C00207 Run a competing disinformation campaign - not recommended

Tactic stage: TA02

Summary: - Details


C00164 Compatriot policy

Tactic stage: TA02

Summary: Protect the interests of this population and, more importantly, influence the population to support pro-Russia causes and effectively influence the politics of its neighbors. - Details


C00092 Establish a truth teller reputation score for influencers

Tactic stage: TA02

Summary: Includes "Establish a truth teller reputation score for influencers” and “Reputation scores for social media users”. Influencers are individuals or accounts with many followers. - Details
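A minimal sketch of how such a reputation score might be computed, assuming (hypothetically) that each influencer has a history of fact-checked claims recorded as booleans. The Laplace smoothing shown here is an illustrative design choice, not part of the source: it keeps new accounts near a neutral 0.5 rather than swinging to 0 or 1 on a single claim.

```python
def truth_score(checked_claims: list[bool]) -> float:
    """Fraction of an influencer's fact-checked claims rated true,
    smoothed with a Laplace prior so accounts with little history
    start near a neutral score of 0.5."""
    true_count = sum(checked_claims)
    return (true_count + 1) / (len(checked_claims) + 2)
```

For example, an influencer with no checked claims scores 0.5, while one with a long record of false claims trends toward 0.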


C00222 Tabletop simulations

Tactic stage: TA02

Summary: Simulate misinformation and disinformation campaigns, and responses to them, before campaigns happen. - Details


C00070 Block access to disinformation resources

Tactic stage: TA02

Summary: Resources = accounts, channels, etc. Block access to platform. DDOS an attacker. TA02*: DDOS at the critical time, to deny an adversary's time-bound objective. T0008: A quick response to a proto-viral story will affect its ability to spread and raise questions about its legitimacy. Hashtag: Against the platform, by drowning the hashtag. T0046 - Search Engine Optimization: Sub-optimal website performance affects its search engine rank, which I interpret as "blocking access to a platform". - Details


C00169 Develop a creative content hub

Tactic stage: TA02

Summary: International donors will donate to a basket fund that will pay a committee of local experts who will, in turn, manage and distribute the money to Russian-language producers and broadcasters that pitch various projects. - Details


C00060 Legal action against for-profit engagement factories

Tactic stage: TA02

Summary: Take legal action against for-profit "factories" creating misinformation. - Details


C00156 Better tell your country or organization story

Tactic stage: TA02

Summary: Civil engagement activities conducted on the part of EFP forces. NATO should likewise provide support and training, where needed, to local public affairs and other communication personnel. Local government and military public affairs personnel can play their part in creating and disseminating entertaining and sharable content that supports the EFP mission. - Details


C00028 Make information provenance available

Tactic stage: TA02

Summary: Blockchain audit log and validation with collaborative decryption to post comments. Use blockchain technology to require collaborative validation before posts or comments are submitted. This could be used to adjust upvote weight via a trust factor of people and organisations you trust, or other criteria. - Details
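The audit-log idea above can be sketched without a full blockchain: the essential property is an append-only, hash-chained log in which each entry commits to its predecessor, so later tampering is detectable. This is an illustrative sketch under that assumption; the class and field names are hypothetical, and a real deployment would add distributed validation and signatures.

```python
import hashlib
import json

class ProvenanceLog:
    """Append-only, hash-chained audit log: each entry's hash covers the
    previous entry's hash, so altering any past entry breaks the chain."""

    def __init__(self):
        self.entries = []

    def append(self, content: str, author: str) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"content": content, "author": author, "prev": prev_hash}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        entry = {**body, "hash": digest}
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute every hash and chain link; False on any tampering."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in ("content", "author", "prev")}
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if e["prev"] != prev or recomputed != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

A trust factor for upvote weighting, as the summary suggests, could then be layered on top by weighting entries from authors you trust.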


C00144 Buy out troll farm employees / offer them jobs

Tactic stage: TA02

Summary: Degrade the infrastructure. Could e.g. pay them to not act for 30 days. Not recommended. - Details


C00029 Create fake website to issue counter narrative and counter narrative through physical merchandise

Tactic stage: TA02

Summary: Create websites in disinformation voids - spaces where people are looking for known disinformation. - Details


C00030 Develop a compelling counter narrative (truth based)

Tactic stage: TA02

Summary: - Details


C00031 Dilute the core narrative - create multiple permutations, target / amplify

Tactic stage: TA02

Summary: Create competing narratives. Included "Facilitate State Propaganda" as diluting the narrative could have an effect on the pro-state narrative used by volunteers, or lower their involvement. - Details


C00009 Educate high profile influencers on best practices

Tactic stage: TA02

Summary: Find online influencers. Provide training in the mechanisms of disinformation, how to spot campaigns, and/or how to contribute to responses by countermessaging, boosting information sites etc. - Details


C00011 Media literacy. Games to identify fake news

Tactic stage: TA02

Summary: Create and use games to show people the mechanics of disinformation, and how to counter them. - Details


C00065 Reduce political targeting

Tactic stage: TA05

Summary: Includes “ban political micro targeting” and “ban political ads” - Details


C00066 Co-opt a hashtag and drown it out (hijack it back)

Tactic stage: TA05

Summary: Flood a disinformation-related hashtag with other content. - Details


C00178 Fill information voids with non-disinformation content

Tactic stage: TA05

Summary: 1) Pollute the data voids with wholesome content (kittens! Baby Shark!). 2) Fill data voids with relevant information, e.g. increase Russian-language programming in areas subject to Russian disinformation. - Details


C00216 Use advertiser controls to stem flow of funds to bad actors

Tactic stage: TA05

Summary: Prevent ad revenue going to disinformation domains - Details


C00130 Mentorship: elders, youth, credit. Learn vicariously.

Tactic stage: TA05

Summary: Train local influencers in countering misinformation. - Details


C00085 Mute content

Tactic stage: TA06

Summary: Rate-limit disinformation content. This reduces its effects whilst not running afoul of censorship concerns. Online archives of content (archives of websites, social media profiles, media, copies of published advertisements; or archives of comments attributed to bad actors, as well as anonymized metadata about users who interacted with them and analysis of the effect) are useful for intelligence analysis and public transparency, but will need similar muting or tagging/shaming as associated with bad actors. - Details


C00014 Real-time updates to fact-checking database

Tactic stage: TA06

Summary: Update fact-checking databases and resources in real time. Especially important for time-limited events like natural disasters. - Details


C00032 Hijack content and link to truth- based info

Tactic stage: TA06

Summary: Link to platform - Details


C00071 Block source of pollution

Tactic stage: TA06

Summary: Block websites, accounts, groups, etc. connected to misinformation and other information pollution. - Details


C00072 Remove non-relevant content from special interest groups - not recommended

Tactic stage: TA06

Summary: Check special-interest groups (e.g. medical, knitting) for unrelated and misinformation-linked content, and remove it. - Details


C00074 Identify and delete or rate limit identical content

Tactic stage: TA06

Summary: C00000 - Details
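One minimal way to implement this counter is to canonicalise post text and hash it, so trivially mutated copies (case, punctuation, extra whitespace) collapse to the same fingerprint and can be grouped for removal or rate limiting. This is an illustrative sketch; the function names and the `threshold` parameter are assumptions, and production systems would likely use fuzzier similarity (e.g. shingling) rather than exact canonical hashes.

```python
import hashlib
import re
from collections import defaultdict

def fingerprint(text: str) -> str:
    """Canonicalise text (lowercase, strip punctuation, collapse
    whitespace) and hash it, so near-identical copies match."""
    canonical = re.sub(r"[^a-z0-9 ]", "", text.lower())
    canonical = " ".join(canonical.split())
    return hashlib.sha256(canonical.encode()).hexdigest()

def find_identical(posts: list[dict], threshold: int = 3) -> list[list[dict]]:
    """Group posts sharing a fingerprint; groups at or above `threshold`
    copies are candidates for deletion or rate limiting."""
    groups = defaultdict(list)
    for post in posts:
        groups[fingerprint(post["text"])].append(post)
    return [g for g in groups.values() if len(g) >= threshold]
```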


C00075 Normalise language

Tactic stage: TA06

Summary: Normalise the language around disinformation and misinformation; give people the words for artifact and effect types. - Details


C00076 Prohibit images in political discourse channels

Tactic stage: TA06

Summary: Make political discussion channels text-only. - Details


C00078 Change Search Algorithms for Disinformation Content

Tactic stage: TA06

Summary: Includes “change image search algorithms for hate groups and extremists” and “Change search algorithms for hate and extremist queries to show content sympathetic to opposite side” - Details


C00080 Create competing narrative

Tactic stage: TA06

Summary: Create counternarratives, or narratives that compete in the same spaces as misinformation narratives. Could also be degrade - Details


C00081 Highlight flooding and noise, and explain motivations

Tactic stage: TA06

Summary: Discredit by pointing out the "noise" and informing the public that "flooding" is a technique of disinformation campaigns; point out the intended objective of the "noise". - Details


C00082 Ground truthing as automated response to pollution

Tactic stage: TA06

Summary: Also inoculation. - Details


C00084 Modify disinformation narratives, and rebroadcast them

Tactic stage: TA06

Summary: Includes “poison pill recasting of message” and “steal their truths”. Many techniques involve promotion, which could be manipulated: for example, online fundraisers or rallies could be advertised, through compromised or fake channels, as being associated with "far-up/down/left/right" actors. "Long Game" narratives could be subjected to similar treatment with negative connotations. Can also replay technique T0003. - Details


C00086 Distract from noise with addictive content

Tactic stage: TA06

Summary: Example: Interject addictive links or contents into discussions of disinformation materials and measure a "conversion rate" of users who engage with your content and away from the social media channel's "information bubble" around the disinformation item. Use bots to amplify and upvote the addictive content. - Details


C00087 Make more noise than the disinformation

Tactic stage: TA06

Summary: - Details


C00091 Honeypot social community

Tactic stage: TA06

Summary: Set honeypots, e.g. communities, in networks likely to be used for disinformation. - Details


C00094 Force full disclosure on corporate sponsor of research

Tactic stage: TA06

Summary: Accountability move: make sure research is published with its funding sources. - Details


C00106 Click-bait centrist content

Tactic stage: TA06

Summary: Create emotive centrist content that gets more clicks - Details


C00107 Content moderation

Tactic stage: TA06

Summary: Includes social media content take-downs, e.g. Facebook or Twitter content take-downs. - Details


C00142 Platform adds warning label and decision point when sharing content

Tactic stage: TA06

Summary: Includes “this has been disproved: do you want to forward it?”. Includes a “Hey, this story is old” popup when messaging with an old URL - this assumes the technique is based on visits to a URL shortener or a captured news site that can publish a message of our choice. Includes “mark clickbait visually”. - Details


C00165 Ensure integrity of official documents

Tactic stage: TA06

Summary: e.g. for leaked legal documents, use court motions to limit future discovery actions - Details


C00202 Set data 'honeytraps'

Tactic stage: TA06

Summary: Set honeytraps in content likely to be accessed for disinformation. - Details


C00219 Add metadata to content that’s out of the control of disinformation creators

Tactic stage: TA06

Summary: Steganography. Adding date, signatures etc to stop issue of photo relabelling etc. - Details
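One way to realise this counter is to bind a timestamp to the content bytes with a keyed signature, so a relabelled or re-dated copy fails verification. This sketch uses an HMAC for simplicity; the key name and function names are hypothetical, and a real system would use asymmetric signatures (so anyone can verify without the secret) and embed the metadata steganographically or in EXIF fields.

```python
import hmac
import hashlib
from datetime import datetime, timezone

SECRET_KEY = b"publisher-signing-key"  # hypothetical key held by the publisher

def sign_content(content: bytes) -> dict:
    """Attach an issue timestamp and an HMAC over (content + timestamp),
    so altering either the content or the claimed date is detectable."""
    issued = datetime.now(timezone.utc).isoformat()
    tag = hmac.new(SECRET_KEY, content + issued.encode(),
                   hashlib.sha256).hexdigest()
    return {"issued": issued, "signature": tag}

def verify_content(content: bytes, metadata: dict) -> bool:
    """Recompute the HMAC and compare in constant time."""
    expected = hmac.new(SECRET_KEY, content + metadata["issued"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, metadata["signature"])
```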


C00195 Redirect searches away from disinformation or extremist content

Tactic stage: TA07

Summary: Use Google AdWords to identify instances in which people search Google about particular fake-news stories or propaganda themes. Includes Monetize centrist SEO by subsidizing the difference in greater clicks towards extremist content. - Details


C00098 Revocation of allowlisted or "verified" status

Tactic stage: TA07

Summary: Remove blue checkmarks etc. from known misinformation accounts. - Details


C00105 Buy more advertising than misinformation creators

Tactic stage: TA07

Summary: Shift influence and algorithms by posting more adverts into spaces than misinformation creators. - Details


C00103 Create a bot that engages / distract trolls

Tactic stage: TA07

Summary: This is a reactive measure, not an active one (honeypots are active). It's a platform-controlled measure. - Details


C00101 Create friction by rate-limiting engagement

Tactic stage: TA07

Summary: Create participant friction. Includes making repeat voting hard and throttling the number of forwards. - Details
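Throttling of this kind is commonly implemented as a token bucket: each account gets a burst allowance that refills at a fixed rate, and actions beyond it are rejected or delayed. A minimal sketch, with hypothetical class and parameter names:

```python
import time

class TokenBucket:
    """Per-account rate limiter: `capacity` actions may be taken in a
    burst, refilled at `rate` tokens per second thereafter."""

    def __init__(self, capacity: float, rate: float):
        self.capacity = capacity
        self.rate = rate
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Spend one token if available; False means throttle the action."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

For forwards, a platform might use a small capacity (e.g. 5) and a slow refill, which permits normal sharing while blunting flood-style amplification.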


C00097 Require use of verified identities to contribute to poll or comment

Tactic stage: TA07

Summary: Reduce poll flooding by only taking comments or poll entries from verified accounts. - Details


C00099 Strengthen verification methods

Tactic stage: TA07

Summary: Improve content verification methods available to groups, individuals, etc. - Details


C00090 Fake engagement system

Tactic stage: TA07

Summary: Create honeypots for misinformation creators to engage with, and reduce the resources they have available for misinformation campaigns. - Details


C00117 Downgrade / de-amplify so message is seen by fewer people

Tactic stage: TA08

Summary: Label, and promote counters to, disinformation. - Details


C00119 Engage payload and debunk.

Tactic stage: TA08

Summary: Debunk misinformation content. Provide links to facts. - Details


C00120 Open dialogue about design of platforms to produce different outcomes

Tactic stage: TA08

Summary: Redesign platforms and algorithms to reduce the effectiveness of disinformation - Details


C00121 Tool transparency and literacy for channels people follow.

Tactic stage: TA08

Summary: Make algorithms in platforms explainable, and visible to people using those platforms. - Details


C00112 "Prove they are not an op!"

Tactic stage: TA08

Summary: Challenge misinformation creators to prove they're not an information operation. - Details


C00100 Hashtag jacking

Tactic stage: TA08

Summary: Post large volumes of unrelated content on known misinformation hashtags - Details


C00154 Ask media not to report false information

Tactic stage: TA08

Summary: Train media to spot and respond to misinformation, and ask them not to post or transmit misinformation they've found. - Details


C00136 Microtarget most likely targets then send them countermessages

Tactic stage: TA08

Summary: Find communities likely to be targeted by misinformation campaigns, and send them countermessages or pointers to information sources. - Details


C00188 Newsroom/Journalist training to counter influence moves

Tactic stage: TA08

Summary: Includes SEO influence. Includes promotion of a “higher standard of journalism”: journalism training “would be helpful, especially for the online community”. Includes strengthening local media: improve the effectiveness of local media outlets. - Details


C00184 Media exposure

Tactic stage: TA08

Summary: Highlight misinformation activities and actors in the media. - Details


C00113 Debunk and defuse a fake expert / credentials.

Tactic stage: TA08

Summary: Debunk fake experts, their credentials, and potentially also their audience quality - Details


C00114 Don't engage with payloads

Tactic stage: TA08

Summary: Stop passing on misinformation - Details


C00115 Expose actor and intentions

Tactic stage: TA08

Summary: Debunk misinformation creators and posters. - Details


C00116 Provide proof of involvement

Tactic stage: TA08

Summary: Build and post information about the involvement of groups etc. in misinformation incidents. - Details


C00118 Repurpose images with new text

Tactic stage: TA08

Summary: Add countermessage text to images used in misinformation incidents. - Details


C00147 Make amplification of social media posts expire (e.g. can't like/ retweet after n days)

Tactic stage: TA09

Summary: Stop new community activity (likes, comments) on old social media posts. - Details


C00128 Create friction by marking content with ridicule or other "decelerants"

Tactic stage: TA09

Summary: Repost or comment on misinformation artifacts, using ridicule or other content to reduce the likelihood of reposting. - Details


C00129 Use banking to cut off access

Tactic stage: TA09

Summary: Fiscal sanctions; parallel to counterterrorism. - Details


C00182 Redirection / malware detection/ remediation

Tactic stage: TA09

Summary: Detect redirection or malware, then quarantine or delete it. - Details


C00200 Respected figure (influencer) disavows misinfo

Tactic stage: TA09

Summary: Note: standardise the language used for influencer/respected figure. - Details


C00109 Dampen Emotional Reaction

Tactic stage: TA09

Summary: Reduce emotional responses to misinformation through calming messages, etc. - Details


C00211 Use humorous counter-narratives

Tactic stage: TA09

Summary: - Details


C00122 Content moderation

Tactic stage: TA09

Summary: Beware: content moderation misused becomes censorship. - Details


C00123 Remove or rate limit botnets

Tactic stage: TA09

Summary: Reduce the visibility of known botnets online. - Details


C00124 Don't feed the trolls

Tactic stage: TA09

Summary: Don't engage with individuals relaying misinformation. - Details


C00125 Prebunking

Tactic stage: TA09

Summary: Produce material in advance of misinformation incidents, by anticipating the narratives used in them, and debunking them. - Details


C00126 Social media amber alert

Tactic stage: TA09

Summary: Create an alert system around disinformation and misinformation artifacts, narratives, and incidents - Details


C00138 Spam domestic actors with lawsuits

Tactic stage: TA11

Summary: File multiple lawsuits against known misinformation creators and posters, to distract them from disinformation creation. - Details


C00139 Weaponise youtube content matrices

Tactic stage: TA11

Summary: Meaning unclear; retained temporarily in case it can be worked out. - Details


C00131 Seize and analyse botnet servers

Tactic stage: TA11

Summary: Take botnet servers offline by seizing them. - Details


C00143 (botnet) DMCA takedown requests to waste group time

Tactic stage: TA11

Summary: Use copyright infringement claims to remove videos etc. - Details


C00140 "Bomb" link shorteners with lots of calls

Tactic stage: TA12

Summary: Applies to most of the content used by exposure techniques except “T0055 - Use hashtag”. Applies to analytics. - Details


C00148 Add random links to network graphs

Tactic stage: TA12

Summary: If creators are using network analysis to determine how to attack networks, then adding random extra links to those networks might throw that analysis out enough to change attack outcomes. Unsure which DISARM techniques this applies to. - Details


C00149 Poison the monitoring & evaluation data

Tactic stage: TA12

Summary: Includes polluting the A/B testing data feeds. Polluting A/B testing requires knowledge of MOEs and MOPs. A/B testing must be caught early, when there is relatively little data available, so infiltration of TAs and understanding of how content is migrated from testing to larger audiences are fundamental. - Details


C00040 Third-party verification for people

Tactic stage: TA15

Summary: Counters fake experts. - Details


C00059 Verification of project before posting fund requests

Tactic stage: TA15

Summary: Third-party verification of projects posting funding campaigns before those campaigns can be posted. - Details


C00058 Report crowdfunder as violator

Tactic stage: TA15

Summary: Counters crowdfunding. Includes “Expose online funding as fake”. - Details


C00172 Social media source removal

Tactic stage: TA15

Summary: Removing accounts, pages, groups, e.g. Facebook page removal. - Details


C00056 Encourage people to leave social media

Tactic stage: TA15

Summary: Encourage people to leave social media. We don't expect this to work. - Details


C00053 Delete old accounts / Remove unused social media accounts

Tactic stage: TA15

Summary: Remove, or remove access to (e.g. stop the ability to update), old social media accounts, to reduce the pool of accounts available for takeover, botnets, etc. - Details


C00052 Infiltrate platforms

Tactic stage: TA15

Summary: Detect and degrade - Details


C00062 Free open library sources worldwide

Tactic stage: TA15

Summary: Open-source libraries could be created that aid in some way for each technique. Even for Strategic Planning, some open-source frameworks such as DISARM can be created to counter the adversarial efforts. - Details


C00162 Unravel/target the Potemkin villages

Tactic stage: TA15

Summary: The Kremlin’s narrative spin extends through constellations of “civil society” organizations, political parties, churches, and other actors. Moscow leverages think tanks, human rights groups, election observers, Eurasianist integration groups, and orthodox groups. A collection of Russian civil society organizations, such as the Federal Agency for the Commonwealth of Independent States Affairs, Compatriots Living Abroad, and International Humanitarian Cooperation, together receive at least US$100 million per year, in addition to government-organized nongovernmental organizations (NGOs), at least 150 of which are funded by Russian presidential grants totaling US$70 million per year. - Details


C00067 Denigrate the recipient/ project (of online funding)

Tactic stage: TA15

Summary: Reduce the credibility of groups behind misinformation-linked funding campaigns. - Details


C00189 Ensure that platforms are taking down flagged accounts

Tactic stage: TA15

Summary: Use ongoing analysis/monitoring of "flagged" profiles. Confirm whether platforms are actively removing flagged accounts, and raise pressure via e.g. government organizations to encourage removal - Details


C00051 Counter social engineering training

Tactic stage: TA15

Summary: Includes anti-elicitation training, phishing prevention education. - Details


C00160 Find and train influencers

Tactic stage: TA15

Summary: Identify key influencers (e.g. use network analysis), then reach out to identified users and offer support, through either training or resources. - Details


C00197 Remove suspicious accounts

Tactic stage: TA15

Summary: Standard reporting for false profiles (identity issues). Includes detecting hijacked accounts and reallocating them - if possible, back to original owners. - Details


C00077 Active defence: run TA15 "develop people” - not recommended

Tactic stage: TA15

Summary: Develop networks of communities and influencers around counter-misinformation. Match them to misinformation creators - Details


C00036 Infiltrate the in-group to discredit leaders (divide)

Tactic stage: TA15

Summary: All of these would be highly affected by infiltration or false-claims of infiltration. - Details


C00203 Stop offering press credentials to propaganda outlets

Tactic stage: TA15

Summary: Remove access to official press events from known misinformation actors. - Details


C00048 Name and Shame Influencers

Tactic stage: TA15

Summary: Think about the different levels: individual vs state-sponsored account. Includes “call them out” and “name and shame”. Identify social media accounts as sources of propaganda—“calling them out”— might be helpful to prevent the spread of their message to audiences that otherwise would consider them factual. Identify, monitor, and, if necessary, target externally-based nonattributed social media accounts. Impact of and Dealing with Trolls - "Chatham House has observed that trolls also sometimes function as decoys, as a way of “keeping the infantry busy” that “aims to wear down the other side” (Lough et al., 2014). Another type of troll involves “false accounts posing as authoritative information sources on social media”. - Details


C00047 Honeypot with coordinated inauthentics

Tactic stage: TA15

Summary: Flood disinformation spaces with obviously fake content, to dilute core misinformation narratives in them. - Details


C00155 Ban incident actors from funding sites

Tactic stage: TA15

Summary: Ban misinformation creators and posters from funding sites - Details


C00046 Marginalise and discredit extremist groups

Tactic stage: TA15

Summary: Reduce the credibility of extremist groups posting misinformation. - Details


C00093 Influencer code of conduct

Tactic stage: TA15

Summary: Establish tailored code of conduct for individuals with many followers. Can be platform code of conduct; can also be community code. - Details


C00042 Address truth contained in narratives

Tactic stage: TA15

Summary: Focus on and boost truths in misinformation narratives, removing misinformation from them. - Details


C00135 Deplatform message groups and/or message boards

Tactic stage: TA15

Summary: Merged two rows here. - Details


C00133 Deplatform Account*

Tactic stage: TA15

Summary: Note: Similar to Deplatform People but less generic. Perhaps both should be left. - Details


C00044 Keep people from posting to social media immediately

Tactic stage: TA15

Summary: Platforms can introduce friction to slow down activities, force a small delay between posts, or replies to posts. - Details


C00034 Create more friction at account creation

Tactic stage: TA15

Summary: Counters fake account - Details