TA09 "Deliver Content"

Belongs to phase P03 Execute

Summary: Release content to the general public or a larger population.

TA09 Tasks
TK0021 Deamplification (suppression, censoring)
TK0022 Amplification
TK0039 OPSEC for TA09

TA09 Techniques
T0114 Deliver Ads Delivering content via any form of paid media or advertising.
T0114.001 Social Media
T0114.002 Traditional Media Examples include TV, radio, newspapers, and billboards.
T0115 Post Content Delivering content by posting via owned media (assets that the operator controls).
T0115.001 Share Memes Memes are one of the most important single artefact types in all of computational propaganda. In this framework, "meme" denotes the narrow, image-based sense, but that naming is no accident: these items have most of the important properties of Dawkins' original conception of a self-replicating unit of culture. Memes pull together reference and commentary, image and narrative, emotion and message. They are a powerful tool and the heart of modern influence campaigns.
T0115.002 Post Violative Content to Provoke Takedown and Backlash
T0115.003 One-Way Direct Posting Direct posting refers to a method of posting content via a one-way messaging service, where the recipient cannot directly respond to the poster’s messaging. An influence operation may post directly to promote operation narratives to the target audience without allowing opportunities for fact-checking or disagreement, creating a false sense of support for the narrative.
T0116 Comment or Reply on Content Delivering content by replying or commenting via owned media (assets that the operator controls).
T0116.001 Post inauthentic social media comment Use government-paid social media commenters, astroturfers, or chat bots (programmed to reply to specific keywords/hashtags) to influence online conversations, product reviews, and website comment forums.
T0117 Attract Traditional Media Deliver content by attracting the attention of traditional media (earned media).
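As an illustration of the pattern behind T0116.001, the sketch below flags reply comments that both match a trigger keyword and arrive almost instantly, a common signature of keyword-programmed bots. This is a minimal, hypothetical heuristic: the trigger list, latency threshold, and `Comment` shape are assumptions, not part of the framework.

```python
from dataclasses import dataclass

@dataclass
class Comment:
    author: str
    text: str
    seconds_after_post: float  # reply latency relative to the parent post

# Hypothetical trigger list; a real detection would learn these from data.
TRIGGERS = {"#election", "vaccine", "refund"}
FAST_REPLY_SECONDS = 5.0  # replies faster than this are treated as suspicious

def looks_keyword_triggered(c: Comment) -> bool:
    """Flag comments that match a trigger keyword AND reply near-instantly --
    the combined signature of a bot wired to specific keywords/hashtags."""
    words = c.text.lower().split()
    hits_trigger = any(t in words for t in TRIGGERS)
    return hits_trigger and c.seconds_after_post < FAST_REPLY_SECONDS
```

Either signal alone is weak (humans also use hot keywords, and fast replies can be organic); requiring both, across many comments from the same account, is what makes the pattern useful.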

TA09 Counters
C00109 Dampen Emotional Reaction Reduce emotional responses to misinformation through calming messages, etc.
C00122 Content moderation Beware: content moderation misused becomes censorship.
C00123 Remove or rate limit botnets Reduce the visibility of known botnets online.
C00124 Don't feed the trolls Don't engage with individuals relaying misinformation.
C00125 Prebunking Produce material in advance of misinformation incidents by anticipating the narratives they will use, and debunk those narratives pre-emptively.
C00126 Social media amber alert Create an alert system around disinformation and misinformation artifacts, narratives, and incidents.
C00128 Create friction by marking content with ridicule or other "decelerants" Repost or comment on misinformation artifacts, using ridicule or other content to reduce the likelihood of reposting.
C00129 Use banking to cut off access Apply fiscal sanctions that cut off access to banking; a parallel to counter-terrorism financing measures.
C00147 Make amplification of social media posts expire (e.g. can't like/ retweet after n days) Stop new community activity (likes, comments) on old social media posts.
C00182 Redirection / malware detection / remediation Detect redirection or malware, then quarantine or delete it.
C00200 Respected figure (influencer) disavows misinfo FIXIT: standardize language used for influencer/respected figure.
C00211 Use humorous counter-narratives
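Counter C00147 (expiring amplification after n days) reduces to a simple policy check at the platform's interaction layer. A minimal sketch, assuming a hypothetical 30-day window and function name:

```python
from datetime import datetime, timedelta

# Hypothetical policy window; C00147 leaves "n days" up to the platform.
AMPLIFICATION_WINDOW = timedelta(days=30)

def amplification_allowed(posted_at: datetime, now: datetime) -> bool:
    """Return True if likes/retweets are still accepted for this post.

    Posts older than the window become read-only for amplification,
    which blunts campaigns that resurface stale artifacts."""
    return now - posted_at <= AMPLIFICATION_WINDOW
```

The design choice here is that the content itself stays visible; only new community activity (likes, comments, shares) is cut off, matching the counter's description.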

TA09 Detections
F00051 Challenge expertise
F00052 Discover sponsors Discovering the sponsors behind a campaign, narrative, bot, set of accounts, or social media comment is useful.
F00053 Government rumour control office (what can we learn?)
F00054 Restrict people who can @ you on social networks
F00055 Verify credentials
F00056 Verify organisation legitimacy
F00057 Verify personal credentials of experts
F00081 Need way for end user to report operations
F00092 Daylight Warn social media companies about an ongoing campaign (e.g. anti-vax sites). Anyone with datasets or data summaries can help with this.
F00095 Fact checking Process suspicious artifacts, narratives, and incidents.
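F00081 calls for a way for end users to report operations. The mechanics are just an intake queue that issues a ticket per report; the sketch below is a hypothetical in-memory version (class and field names are assumptions, not part of the framework), with persistence and triage left out.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class OperationReport:
    reporter: str  # who filed the report
    url: str       # artifact being reported
    note: str      # free-text description

@dataclass
class ReportQueue:
    """Minimal in-memory intake for end-user reports of suspected operations."""
    reports: List[OperationReport] = field(default_factory=list)

    def submit(self, reporter: str, url: str, note: str) -> int:
        """File a report and return its ticket number (1-based)."""
        self.reports.append(OperationReport(reporter, url, note))
        return len(self.reports)

    def pending(self) -> int:
        """Number of reports awaiting triage."""
        return len(self.reports)
```

In practice the reports would feed the fact-checking process (F00095) and the datasets used for daylighting campaigns (F00092).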