The Conversation: Learn to recognize hallmarks of online disinformation ploys

Information warfare abounds, and everyone online has been drafted whether they know it or not.

Disinformation is deliberately generated misleading content disseminated for selfish or malicious purposes. Unlike misinformation, which may be shared unwittingly or with good intentions, disinformation aims to foment distrust, destabilize institutions, discredit good intentions, defame opponents and delegitimize sources of knowledge such as science and journalism.

Many governments engage in disinformation campaigns. For instance, the Russian government has used images of celebrities to attract attention to anti-Ukraine propaganda. And Meta, parent company of Facebook and Instagram, recently warned that China had stepped up its disinformation operations.

Disinformation is nothing new. And information warfare has been practiced by many countries, including the U.S.

But the internet gives disinformation campaigns unprecedented reach. Foreign governments, internet trolls, domestic and international extremists, opportunistic profiteers and even paid disinformation agencies exploit the internet to spread questionable content. Civil unrest, natural disasters, health crises and wars trigger anxiety, and disinformation agents take full advantage.

Meta has uncovered and blocked sophisticated Chinese disinformation campaigns. But that’s just the tip of the iceberg.

It’s worth watching for the warning signs of misinformation and dangerous speech. It’s even more important, perhaps, to recognize some of the tactics disinformation agents employ:


It’s just a joke

Hahaganda is a tactic in which disinformation agents use memes, political comedy from state-run outlets, or speeches to make light of serious matters, attack others, minimize violence, dehumanize and deflect blame.

This approach provides an easy defense. If challenged, the disinformation agents can say, “Can’t you take a joke?” often followed by accusations of being too politically correct.


Shhh … tell everyone

Rumor-milling is a tactic in which disinformation agents claim to have exclusive access to secrets they allege are being purposefully concealed. They indicate that you will “only hear this here” and imply that others are unwilling to share the alleged truth – for example, “The media won’t report this,” “The government doesn’t want you to know” or “I shouldn’t be telling you this, but …”

But they do not insist that the information be kept secret, and will instead include encouragement to share it – for example, “Make this go viral” or “Most people won’t have the courage to share this.” It’s important to question how an author or speaker could have come by such “secret” information and what their motive is in prompting you to share it.


People are saying

Often disinformation has no real evidence, so instead disinformation agents will find or make up people to support their assertions. This impersonation can take multiple forms. Disinformation agents will use anecdotes as evidence, especially sympathetic stories from vulnerable groups such as women or children.

Similarly, they may disseminate “concerned citizen” perspectives. People in this category present their social identity as an indicator of authority to speak on a matter — for example, “As a mother …,” “As a veteran …,” “As a police officer … .”

Convert communicators, or people who allegedly change from the “wrong” position to the “right” one, can be especially persuasive. For example, take the woman who supposedly got an abortion but regretted it.

These people often don’t actually exist. In other cases, they may have been coerced or paid.


According to experts

If ordinary people don’t suffice, fake experts may be used. One way to guard against “inauthentic user” behavior is to check accounts on X — formerly Twitter — with the Botometer.

Fake experts come in many different varieties:

A faux expert is someone who is used for his or her title, but doesn’t actually have relevant expertise.

A pseudo expert is someone who claims relevant expertise, but has no actual training.

A junk expert is a sellout who may have had expertise once, but now says whatever is profitable. You can often find these people have supported other dubious claims — for example, that smoking doesn’t cause cancer — or work for institutes that regularly produce questionable “scholarship.”

An echo expert is a disinformation source citing another disinformation source to provide credence for spurious claims. China and Russia routinely cite one another’s newspapers, for example.

A stolen expert is someone who exists but was never actually contacted, and whose research has been misconstrued. Disinformation agents also steal credibility from known news sources, sometimes through typosquatting, the practice of setting up a domain name that closely resembles that of a legitimate organization.

You can check whether accounts — anecdotal or scientific — have been verified by other reliable sources.

Google the name to check expertise, status, source validity and interpretation of research. Remember, one story or interpretation is not necessarily representative.


It’s a giant conspiracy

Conspiratorial narratives involve some malevolent force — for example, “the deep state” — engaged in covert actions aimed at harming society. That certain conspiracies, such as MK-Ultra and Watergate, have been confirmed is often offered as evidence for the validity of new, unfounded ones.

Nonetheless, disinformation agents find that constructing a conspiracy is an effective means to remind people of past reasons to distrust governments, scientists or other trustworthy sources.

Extraordinary claims require extraordinary evidence. Remember, the conspiracies that were ultimately unveiled had evidence — often from sources like investigative journalists, scientists and government investigations.

Be particularly wary of conspiracies that try to delegitimize knowledge-producing institutions like universities, research labs, government agencies and news outlets by claiming they are in on a cover-up.

Basic tips for resisting disinformation and misinformation include thinking twice before sharing social media posts that trigger emotional responses like anger and fear and checking the sources of posts that make unusual or extraordinary claims.


Good vs. evil

Disinformation often serves the dual purpose of making the originator look good and opponents look bad. It goes further by painting issues as a battle between good and evil, using accusations of evil to legitimize violence.

Russia is particularly fond of accusing others of being secret Nazis, pedophiles or Satanists, while depicting its own soldiers as helping children and the elderly.

Be especially wary of accusations of atrocities like genocide, particularly under an attention-grabbing “breaking news” headline.

Accusations abound. Verify the facts and how the information was obtained.


Are you with us or against us?

A false dichotomy narrative sets up the reader to believe they have one of two mutually exclusive options: a good one or a bad one, a right one or a wrong one, a red pill or a blue pill. You can accept their version of reality or be an idiot or “sheeple.”

There are always more options than those being presented, and issues are rarely so black and white. This is just one of the tactics in brigading, where disinformation agents seek to silence dissenting viewpoints by casting them as the wrong choice.


Turning the tables

Whataboutism is a classic Russian disinformation technique used to deflect attention from one’s own wrongdoings by alleging wrongdoings on the part of others.

These allegations about the actions of others may be true or false, but either way they are irrelevant to the matter at hand. The potential past wrongs of one group do not mean you should ignore the current wrongs of another.

Disinformation agents also often cast their group as the wronged party. They only engage in disinformation because their “enemy” engages in disinformation against them; they only attack to defend; their reaction was appropriate, while that of others was an overreaction.

This type of competitive victimhood is particularly pervasive when groups have been embedded in a long-lasting conflict.

In all of these cases, the disinformation agent is aware that they are deflecting, misleading, trolling or outright fabricating. If you don’t believe them, they at least want to make you question what, if anything, you can believe.

You often pause to take stock of advertising claims before handing over your money. The same should go for the information you buy into.

From The Conversation, an online repository of lay versions of academic research findings found at https://theconversation.com/us. Used with permission.
