FROM HYBRID WAR TO BUYING WEAPONS, BELARUS IS UNDER EXTERNAL THREAT

There are many ways of looking at disinformation, with all its techniques, narratives, and instruments. One approach that offers a more thorough understanding is the Adversarial Misinformation and Influence Tactics and Techniques (AMITT) framework. It is based on the standards and tooling used in the information security community, modified so that the techniques it lists can be applied to detect and disrupt influence operations, including disinformation.

Disinformation cases collected this week concern, among other countries, Belarus. They include rather uneasy narratives: a hybrid war facing the country; the West planning to tear Belarus away from Russia; and the evil West engaging in information warfare.

In addition, the Belarusian regime is countering the “hostilities and provocations from Ukraine” by buying weapons en masse, despite the fact that Ukraine has no plans to destabilise its neighbour. In other words, the cases above claim that Belarus is under malign external threat.

Distort facts

The AMITT framework lists ten different techniques used to spread disinformation and manipulate audiences, including creating fake research, leaking altered documents (examples published on EUvsDisinfo include a fake letter and a fake ID) and generating information pollution (as in the Skripal case). Other ways of modifying facts to suit the pro-Kremlin worldview include denying involvement (still used regarding Ukraine) and demanding proof (as in the case of the MH17 tragedy).
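Analysts who track such cases often tag each item with the framework techniques it exhibits and count how frequently each technique recurs. A minimal sketch of that bookkeeping in Python; the technique labels mirror the examples above, and the sample cases are illustrative rather than official AMITT data:

```python
from collections import Counter

# Each disinformation case is tagged with the AMITT-style techniques it uses.
# Labels follow the examples in the text; the records here are illustrative.
cases = [
    {"claim": "leaked fake letter", "techniques": ["leak altered documents"]},
    {"claim": "Skripal case noise", "techniques": ["generate information pollution",
                                                   "demand proof"]},
    {"claim": "MH17 counter-narratives", "techniques": ["deny involvement",
                                                        "demand proof"]},
]

# Tally how often each technique appears across the collected cases.
technique_counts = Counter(t for case in cases for t in case["techniques"])

print(technique_counts.most_common(3))
```

Aggregations like this make it possible to say which techniques dominate a given week's collection rather than discussing cases only one by one.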

Disinformation is often knitted around a kernel of truth. This week we found two examples where content from a respected source was adjusted to fit the pro-Kremlin worldview. First, French Sputnik misrepresented an article from the Financial Times by falsely claiming that EU and US sanctions had boosted the Russian economy. The same outlet also distorted a CNN investigation to support the theory that the CIA destabilised relations between Minsk and Moscow; in reality, the CIA did not play a leading role in the operation in question.

Channels

AMITT also lists the many social media outlets that can be used to promote and disseminate distorted messages. One way to see how Kremlin-controlled media use these channels, and to measure the effectiveness of their work, is to check how many likes, shares and comments an article received on social media.

Of the disinforming content mentioned in this text, the only articles that gained real traction on social media were French Sputnik's: 791 engagements for the piece distorting the FT article and 70 for the piece referring to the CNN investigation.
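Engagement figures like these can be tallied per article to compare reach. A small sketch under stated assumptions: the totals are the ones cited above, already collected; the data structure is ours for illustration, not a real social-media API:

```python
# Total engagements (likes + shares + comments) per article, pre-collected.
# Figures are those cited in the text; the structure is illustrative only.
articles = {
    "Sputnik FR on the FT sanctions article": 791,
    "Sputnik FR on the CNN investigation": 70,
}

# Overall reach across the monitored articles, and the widest-spreading piece.
total_engagements = sum(articles.values())
top_article = max(articles, key=articles.get)

print(f"total: {total_engagements}, top: {top_article}")
```

Even this crude tally shows the asymmetry: one distorted article can outperform another by an order of magnitude.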

Amplification

In addition to social media, the framework lists tertiary websites as a handy way to carry the message further. A good example is a false claim that the West allocated $5 billion to support the coup in Ukraine, which appeared on six Arabic-language sites that referred back to Sputnik Arabic.

More broadly, our Disinformation Cases Database contains many Arabic-language cases in which more than ten outlets spread the same message, often originating from RT or Sputnik.
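Amplification of this kind can be measured by counting how many distinct outlets carried each message. A hedged sketch of that grouping, assuming (message, outlet) sightings have already been logged; the outlet names are placeholders, not the actual sites involved:

```python
from collections import defaultdict

# (message, outlet) sightings as they might appear in a cases database.
# Outlet names are placeholders, not the real amplifying sites.
sightings = [
    ("$5bn for Ukraine coup", "Sputnik Arabic"),
    ("$5bn for Ukraine coup", "outlet-a"),
    ("$5bn for Ukraine coup", "outlet-b"),
    ("CIA destabilised Minsk-Moscow ties", "Sputnik FR"),
]

# Group distinct outlets under each message to gauge its spread.
outlets_by_message = defaultdict(set)
for message, outlet in sightings:
    outlets_by_message[message].add(outlet)

# Rank messages by how widely they were amplified.
for message, outlets in sorted(outlets_by_message.items(),
                               key=lambda kv: len(kv[1]), reverse=True):
    print(f"{message}: {len(outlets)} outlets")
```

Using a set per message deduplicates repeat postings by the same outlet, so the count reflects breadth of amplification rather than volume.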

Play the long game

The AMITT framework concludes with techniques describing the persistence of disinforming activities. Looking at our Disinformation Cases Database and projecting the Kremlin's activities into the future, it is clear that disinformation is not evanescent: it will not disappear like the morning fog. It will, however, cloud people's view of the world like haze or smog.