TA15 "Establish Social Assets"
Belongs to phase P02 Prepare
Summary: Establishing information assets generates messaging tools, including social media accounts, operation personnel, and organizations, encompassing both directly and indirectly managed assets. For assets under their direct control, the operation can add, change, or remove these assets at will. Establishing information assets allows an influence operation to promote messaging directly to the target audience without navigating through external entities. Many online influence operations create or compromise social media accounts as a primary vector of information dissemination.

TA15 Tasks

disarm_id | name | summary |
--- | --- | --- |
TK0010 | Create personas | Create personas |
TK0011 | Recruit contractors | Recruit contractors |
TK0012 | Recruit partisans | Recruit partisans |
TK0013 | find influencers | find influencers |
TK0014 | Network building | Network building |
TK0015 | Network infiltration | Network infiltration |
TK0016 | identify targets - susceptible audience members in networks | identify targets - susceptible audience members in networks |
TK0033 | OPSEC for TA15 | OPSEC for TA15 |
TK0034 | OPSEC for TA15 | OPSEC for TA15 |

TA15 Techniques

disarm_id | name | summary |
--- | --- | --- |
T0007 | Create Inauthentic Social Media Pages and Groups | Create key social engineering assets needed to amplify content, manipulate algorithms, and fool the public and/or specific incident/campaign targets. Computational propaganda depends substantially on false perceptions of credibility and acceptance. By creating fake users and groups with a variety of interests and commitments, attackers can ensure that their messages both come from trusted sources and appear more widely adopted than they actually are. |
T0010 | Cultivate ignorant agents | Cultivate propagandists for a cause, the goals of which are not fully comprehended, and who are used cynically by the leaders of the cause. Independent actors use social media and specialised web sites to strategically reinforce and spread messages compatible with their own. Their networks are infiltrated and used by state media disinformation organisations to amplify the state's own disinformation strategies against target populations. Many are traffickers in conspiracy theories or hoaxes, unified by a suspicion of Western governments and mainstream media. Their narratives, which appeal to leftists hostile to globalism and military intervention and to nationalists opposed to immigration, are frequently infiltrated and shaped by state-controlled trolls and altered news items from agencies such as RT and Sputnik. Also known as "useful idiots" or "unwitting agents". |
T0013 | Create inauthentic websites | Create media assets to support inauthentic organizations (e.g. think tank), people (e.g. experts) and/or serve as sites to distribute malware/launch phishing operations. |
T0014 | Prepare fundraising campaigns | Fundraising campaigns refer to an influence operation's systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while generating revenue. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns (see: Develop Information Pathways) to promote operation messaging while raising money to support its activities. |
T0014.001 | Raise funds from malign actors | Raising funds from malign actors may include contributions from foreign agents, cutouts or proxies, shell companies, dark money groups, etc. |
T0014.002 | Raise funds from ignorant agents | Raising funds from ignorant agents may include scams, donations intended for one stated purpose but then used for another, etc. |
T0065 | Prepare Physical Broadcast Capabilities | Create or coopt broadcast capabilities (e.g. TV, radio etc). |
T0090 | Create Inauthentic Accounts | Inauthentic accounts include bot accounts, cyborg accounts, sockpuppet accounts, and anonymous accounts. |
T0090.001 | Create Anonymous Accounts | Anonymous accounts or anonymous users refer to users that access network resources without providing a username or password. An influence operation may use anonymous accounts to spread content without direct attribution to the operation. |
T0090.002 | Create Cyborg Accounts | Cyborg accounts refer to partly manned, partly automated social media accounts. Cyborg accounts primarily act as bots, but a human operator periodically takes control of the account to engage with real social media users by responding to comments and posting original content. Influence operations may use cyborg accounts to reduce the amount of direct human input required to maintain a regular account but increase the apparent legitimacy of the cyborg account by occasionally breaking its bot-like behavior with human interaction. |
T0090.003 | Create Bot Accounts | Bots refer to autonomous internet users that interact with systems or other users while imitating traditional human behavior. Bots use a variety of tools to stay active without direct human operation, including artificial intelligence and big data analytics. For example, an individual may program a Twitter bot to retweet a tweet every time it contains a certain keyword or hashtag. An influence operation may use bots to increase its exposure and artificially promote its content across the internet without dedicating additional time or human resources. Amplifier bots promote operation content through reposts, shares, and likes to increase the content's online popularity. Hacker bots are traditionally covert bots running on computer scripts that rarely engage with users and work primarily as agents of larger cyberattacks, such as Distributed Denial of Service attacks. Spammer bots are programmed to post content on social media or in comment sections, usually as a supplementary tool. Impersonator bots pose as real people by mimicking human behavior, complicating their detection. A defender-side sketch of scoring these behavioral signatures appears after this table. |
T0090.004 | Create Sockpuppet Accounts | Sockpuppet accounts refer to falsified accounts that either promote the influence operation's own material or attack critics of the material online. Individuals who control sockpuppet accounts also operate at least one other user account. Sockpuppet accounts help legitimize operation narratives by providing an appearance of external support for the material and discrediting opponents of the operation. |
T0091 | Recruit malign actors | Operators recruit malign actors by paying, recruiting, or exerting control over individuals, including trolls, partisans, and contractors. |
T0091.001 | Recruit Contractors | Operators recruit paid contractors to support the campaign. |
T0091.002 | Recruit Partisans | Operators recruit partisans (ideologically-aligned individuals) to support the campaign. |
T0091.003 | Enlist Troll Accounts | An influence operation may hire trolls, or human operators of fake accounts that aim to provoke others by posting and amplifying content about controversial issues. Trolls can serve to discredit an influence operation’s opposition or bring attention to the operation’s cause through debate. Classic trolls refer to regular people who troll for personal reasons, such as attention-seeking or boredom. Classic trolls may advance operation narratives by coincidence but are not directly affiliated with any larger operation. Conversely, hybrid trolls act on behalf of another institution, such as a state or financial organization, and post content with a specific ideological goal. Hybrid trolls may be highly advanced and institutionalized or less organized and work for a single individual. |
T0092 | Build Network | Operators build their own network, creating links between accounts -- whether authentic or inauthentic -- in order to amplify and promote narratives and artifacts, and to encourage further growth of their network, as well as ongoing sharing of and engagement with operational content. |
T0092.001 | Create Organizations | Influence operations may establish organizations with legitimate or falsified hierarchies, staff, and content to structure operation assets, provide a sense of legitimacy to the operation, or provide institutional backing to operation activities. |
T0092.002 | Use Follow Trains | A follow train is a group of people who follow each other on a social media platform, often as a way for an individual or campaign to grow its social media following. Follow trains may be a violation of platform Terms of Service. They are also known as follow-for-follow groups. |
T0092.003 | Create Community or Sub-group | When there is not an existing community or sub-group that meets a campaign's goals, an influence operation may seek to create a community or sub-group. |
T0093 | Acquire/Recruit Network | Operators acquire an existing network by paying, recruiting, or exerting control over the leaders of the existing network. |
T0093.001 | Fund Proxies | An influence operation may fund proxies, or external entities that work for the operation. An operation may recruit/train users with existing sympathies towards the operation's narratives and/or goals as proxies. Funding proxies serves various purposes, including diversifying operation locations to complicate attribution and reducing the workload for direct operation assets. |
T0093.002 | Acquire Botnets | A botnet is a group of bots that can function in coordination with each other. |
T0094 | Infiltrate Existing Networks | Operators deceptively insert social assets into existing networks as group members in order to influence the members of the network and the wider information environment that the network impacts. |
T0094.001 | Identify susceptible targets in networks | When seeking to infiltrate an existing network, an influence operation may identify individuals and groups that might be susceptible to being co-opted or influenced. |
T0094.002 | Utilize Butterfly Attacks | Butterfly attacks occur when operators pretend to be members of a certain social group, usually a group that struggles for representation. An influence operation may mimic a group to insert controversial statements into the discourse, encourage the spread of operation content, or promote harassment among group members. Unlike astroturfing, butterfly attacks aim to infiltrate and discredit existing grassroots movements, organizations, and media campaigns. |
T0095 | Develop Owned Media Assets | An owned media asset refers to an agency or organization through which an influence operation may create, develop, and host content and narratives. Owned media assets include websites, blogs, social media pages, forums, and other platforms that facilitate the creation and organization of content. |
T0096 | Leverage Content Farms | Using the services of large-scale content providers for creating and amplifying campaign artifacts at scale. |
T0096.001 | Create Content Farms | An influence operation may create an organization for creating and amplifying campaign artifacts at scale. |
T0096.002 | Outsource Content Creation to External Organizations | An influence operation may outsource content creation to external companies to avoid attribution, increase the rate of content creation, or improve content quality, i.e., by employing an organization that can create content in the target audience’s native language. Employed organizations may include marketing companies for tailored advertisements or external content farms for high volumes of targeted media. |
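
The amplifier and spammer behaviors described under T0090.003 leave measurable traces: near-constant posting intervals and activity dominated by reposts. The following is a minimal defender-side sketch, in plain Python, of how such signatures might be scored; the five-post minimum, the choice of two signals, and their equal weighting are illustrative assumptions and are not part of DISARM.

```python
from statistics import mean, pstdev

def bot_likelihood_score(post_times, repost_flags):
    """Score how bot-like an account's activity looks (0 = human-like, 1 = bot-like).

    post_times   -- POSIX timestamps of the account's posts, oldest first
    repost_flags -- parallel list of booleans, True where the post is a repost
    """
    if len(post_times) < 5:
        return 0.0  # too little activity to judge

    # Signal 1: cadence regularity -- automated accounts often post at
    # near-constant intervals, so the gap standard deviation stays small.
    gaps = [b - a for a, b in zip(post_times, post_times[1:])]
    avg_gap = mean(gaps)
    regularity = 1.0 - min(pstdev(gaps) / avg_gap, 1.0) if avg_gap > 0 else 1.0

    # Signal 2: amplification ratio -- amplifier bots mostly repost rather
    # than write original content.
    repost_ratio = sum(repost_flags) / len(repost_flags)

    return 0.5 * regularity + 0.5 * repost_ratio

# Example with invented data: a post every ~10 minutes, almost all reposts.
times = [1_700_000_000 + 600 * i for i in range(20)]
flags = [True] * 18 + [False] * 2
print(f"bot-likelihood: {bot_likelihood_score(times, flags):.2f}")
```

In practice such a heuristic would be one feature among many (account age, follower ratios, content similarity); it is shown here only to make the behavioral description above concrete.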

TA15 Counters

disarm_id | name | summary |
--- | --- | --- |
C00034 | Create more friction at account creation | Counters fake accounts.
C00036 | Infiltrate the in-group to discredit leaders (divide) | All of these would be highly affected by infiltration or false-claims of infiltration. |
C00040 | third party verification for people | counters fake experts |
C00042 | Address truth contained in narratives | Focus on and boost truths in misinformation narratives, removing misinformation from them. |
C00044 | Keep people from posting to social media immediately | Platforms can introduce friction to slow down activities, forcing a small delay between posts or between a post and replies to it.
C00046 | Marginalise and discredit extremist groups | Reduce the credibility of extremist groups posting misinformation. |
C00047 | Honeypot with coordinated inauthentics | Flood disinformation spaces with obviously fake content, to dilute core misinformation narratives in them. |
C00048 | Name and Shame Influencers | Think about the different levels: individual vs state-sponsored accounts. Includes "call them out" and "name and shame". Identifying social media accounts as sources of propaganda ("calling them out") might help prevent the spread of their messages to audiences that would otherwise consider them factual. Identify, monitor, and, if necessary, target externally-based non-attributed social media accounts. On the impact of and dealing with trolls, Chatham House has observed that trolls also sometimes function as decoys, as a way of "keeping the infantry busy" that "aims to wear down the other side" (Lough et al., 2014). Another type of troll involves "false accounts posing as authoritative information sources on social media".
C00051 | Counter social engineering training | Includes anti-elicitation training, phishing prevention education. |
C00052 | Infiltrate platforms | Detect and degrade |
C00053 | Delete old accounts / Remove unused social media accounts | Remove, or remove access to (e.g. stop the ability to update), old social media accounts, to reduce the pool of accounts available for takeover, botnets, etc.
C00056 | Encourage people to leave social media | Encourage people to leave social media. We don't expect this to work.
C00058 | Report crowdfunder as violator | Counters crowdfunding. Includes "Expose online funding as fake".
C00059 | Verification of project before posting fund requests | third-party verification of projects posting funding campaigns before those campaigns can be posted. |
C00062 | Free open library sources worldwide | Open-source libraries could be created that help counter each technique in some way. Even for Strategic Planning, open-source frameworks such as DISARM can be created to counter adversarial efforts.
C00067 | Denigrate the recipient/ project (of online funding) | Reduce the credibility of groups behind misinformation-linked funding campaigns. |
C00077 | Active defence: run TA15 "develop people" - not recommended | Develop networks of communities and influencers around counter-misinformation. Match them to misinformation creators.
C00093 | Influencer code of conduct | Establish tailored code of conduct for individuals with many followers. Can be platform code of conduct; can also be community code. |
C00133 | Deplatform Account* | Note: Similar to Deplatform People but less generic. Perhaps both should be left. |
C00135 | Deplatform message groups and/or message boards | Merged two rows here. |
C00155 | Ban incident actors from funding sites | Ban misinformation creators and posters from funding sites |
C00160 | find and train influencers | Identify key influencers (e.g. use network analysis), then reach out to identified users and offer support through either training or resources. A minimal network-analysis sketch appears after this table.
C00162 | Unravel/target the Potemkin villages | Kremlin’s narrative spin extends through constellations of “civil society” organizations, political parties, churches, and other actors. Moscow leverages think tanks, human rights groups, election observers, Eurasianist integration groups, and orthodox groups. A collection of Russian civil society organizations, such as the Federal Agency for the Commonwealth of Independent States Affairs, Compatriots Living Abroad, and International Humanitarian Cooperation, together receive at least US$100 million per year, in addition to government-organized nongovernmental organizations (NGOs), at least 150 of which are funded by Russian presidential grants totaling US$70 million per year. |
C00172 | social media source removal | Removing accounts, pages, groups, e.g. facebook page removal |
C00189 | Ensure that platforms are taking down flagged accounts | Use ongoing analysis/monitoring of "flagged" profiles. Confirm whether platforms are actively removing flagged accounts, and raise pressure via e.g. government organizations to encourage removal |
C00197 | remove suspicious accounts | Standard reporting for false profiles (identity issues). Includes detecting hijacked accounts and reallocating them - if possible, back to original owners. |
C00203 | Stop offering press credentials to propaganda outlets | Remove access to official press events from known misinformation actors. |
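
C00160 suggests using network analysis to identify key influencers before offering them training or resources. A minimal sketch follows, assuming the networkx library is available and using an invented follow graph; it ranks accounts by PageRank over directed follow edges as a rough proxy for influence, which is one of several reasonable centrality choices rather than a prescribed DISARM method.

```python
import networkx as nx  # assumed available; not part of the DISARM framework

# Hypothetical directed follow graph: an edge (a, b) means "a follows b".
follows = [
    ("alice", "dana"), ("bob", "dana"), ("carol", "dana"),
    ("dana", "erin"), ("alice", "erin"), ("frank", "alice"),
]
G = nx.DiGraph(follows)

# PageRank rewards accounts that are followed by other well-followed accounts,
# a rough proxy for influence within the network.
scores = nx.pagerank(G, alpha=0.85)

# Highest-scoring accounts are candidates to reach out to with training or
# resources, per C00160.
for account, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:3]:
    print(f"{account}: {score:.3f}")
```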

TA15 Detections

disarm_id | name | summary |
--- | --- | --- |
F00008 | Detect abnormal amplification | |
F00009 | Detect abnormal events | |
F00010 | Detect abnormal groups | |
F00011 | Detect abnormal pages | |
F00012 | Detect abnormal profiles, e.g. prolific pages/ groups/ people | |
F00013 | Identify fake news sites | |
F00014 | Trace connections | e.g. for fake news sites
F00015 | Detect anomalies in membership growth patterns | Fake Experts are included here because they may use funding campaigns such as Patreon to fund their operations, so these should be watched.
F00016 | Identify fence-sitters | Note: In each case, depending on the platform there may be a way to identify a fence-sitter. For example, online polls may have a neutral option or a "somewhat this-or-that" option, and may reveal who voted for that to all visitors. This information could be of use to data analysts. In TA08-11, the engagement level of victims could be identified to detect and respond to increasing engagement. |
F00017 | Measure emotional valence | |
F00018 | Follow the money | track funding sources |
F00019 | Activity resurgence detection (alarm when dormant accounts become activated) | A minimal detection sketch appears after this table.
F00020 | Detect anomalous activity | |
F00021 | AI/ML automated early detection of campaign planning | |
F00022 | Digital authority - regulating body (United States) |
F00023 | Periodic verification (counter to hijack legitimate account) | |
F00024 | Teach civics to kids/ adults/ seniors | |
F00077 | Model for bot account behavior | Bot account: action based, people. Unsure which DISARM techniques. |
F00078 | Monitor account level activity in social networks | All techniques benefit from careful analysis and monitoring of activities on social network. |
F00084 | Track individual bad actors | |
F00089 | target/name/flag "grey zone" website content | "Gray zone" is the second level of content producers and circulators, composed of outlets with uncertain attribution. This category covers conspiracy websites, far-right or far-left websites, news aggregators, and data dump websites.
F00093 | S4d detection and re-allocation approaches | S4D is a way to separate out different speakers in text and audio.
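
F00019 calls for an alarm when dormant accounts suddenly become active. The sketch below is a minimal illustration in plain Python; the 90-day dormancy threshold, the data layout, and the example values are assumptions made for demonstration, not recommendations from the framework.

```python
from datetime import datetime, timedelta

DORMANCY_THRESHOLD = timedelta(days=90)  # assumed cutoff; tune per platform

def resurgence_alerts(last_seen, new_activity):
    """Yield (account, dormancy gap) for accounts waking up after a long silence.

    last_seen    -- dict mapping account id -> datetime of last recorded post
    new_activity -- iterable of (account id, datetime of new post)
    """
    for account, when in new_activity:
        previous = last_seen.get(account)
        if previous is not None and when - previous > DORMANCY_THRESHOLD:
            yield account, when - previous

# Example with invented data: acct_1 resurfaces after roughly eight months.
history = {"acct_1": datetime(2023, 1, 10), "acct_2": datetime(2023, 8, 1)}
fresh = [("acct_1", datetime(2023, 9, 15)), ("acct_2", datetime(2023, 8, 20))]

for account, gap in resurgence_alerts(history, fresh):
    print(f"ALERT: {account} resurfaced after {gap.days} days dormant")
```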