Analysis
Long ago, the press was rallied and weaponized for war. “Hunger Knows No Armistice” laments a poster dating to 1919, depicting the depth of human suffering World War I created in a dark, obscured painting of a woman clinging to her fellows among the dying. The poster’s purpose was to generate appeal and support for the hunger-relief efforts of Near East Relief.
The Road To Hell
Rallying the press to support war efforts and stir public opinion is called propaganda, a term that traces to Pope Gregory XV in the 17th century. All nations, for their part, exert some influence over their publics to stimulate a whole-of-society response to the events of conflict. For the pope, the purpose was to spread the Catholic faith. The use of persuasive propagation began with “pure intentions,” University of Haifa Professor Gabriel Weimann explained in a presentation on the subject’s history.
As the old adage says, the “road to Hell is paved with good intentions,” and the vehicle clerics once used to spread messages of human goodwill became an instrument of death.
The Power of Press in the War Mind
In World War II, posters in the United States urged civilians to do their part by buying war bonds, collecting scrap metal, and conserving rations. One could consider this propaganda a defensive use of the press. By contrast, during the two World Wars, posters were also used for far more nefarious messaging.
In 2009, the Veterans History Project produced a presentation studying the weapon of Nazi propaganda, calling it the “Machinery of Evil” and breaking down the rise and scourge of the Nazi disinformation machine.
In this presentation, Professor Gabriel Weimann noted that Nazi propaganda played on Germany’s emotions, exploiting the trauma of the country’s defeat in World War I and unifying the population under the idea of “one people” under “one leader” to foster loyalty. The pain Germany had experienced was linked to specific demographics, particularly Jews, and used to generate a national unification and response. Nazi propaganda employed every form of media: posters, movies, early television where it was available, newspapers, demonstrations, art, theater, and more. Even architecture and sculpture were devoted to Nazi propaganda, as the political machine spared no expense in its effort to sway the German public into a cultic conformity to the narrative.
All of this was coordinated under the Nazi minister of propaganda, Joseph Goebbels.
All of the examples mentioned so far circulated before the inception of the internet. The internet has since given the modes of communication the Nazi war machine relied upon a vehicle for escalated dissemination.
The Digital Age
In recent years, as world events have revealed disinformation spreading across social media at unprecedented scale, policymakers have weighed solutions.
The Political Studies Association tackled the subject in a review released in 2020, in which researchers drew attention to the earlier Ukraine crisis and the news wars surrounding the 2016 U.S. presidential election cycle. The review weighed propaganda activity on both the Russian and American sides, its impact on public perception, and the resulting decline in political relations between the two states from 2013 to 2019.
In 2016, the RAND Corporation examined Russia’s disinformation efforts, calling them the “firehose of falsehood” and explaining that Russian propaganda underwent an evolution after the incursion into Georgia in 2008. RAND noted that the impact of this metamorphosis was on display when Russia annexed Crimea in 2014, and that Russia continued to refine its propaganda tactics throughout that period, using them to support the conflict within Ukraine and Russian interests in Syria.
Now the efforts RAND traced in 2016 have grown to maturity, complete with a new toolbelt, drawing on decades of Soviet propaganda practice and, further back in memory, the catalyzing propaganda of World War I.
Deep Fakes as Weapons
Disinformation has always been a weaponizable asset of war, but the severity of its impact has scaled with the advent of AI technologies.
Modern conflict has seen disinformation mutate through the use of deep fakes. Examples come from the conflict zone in Ukraine, where, as early as 2022, AI-generated videos showed Ukrainian President Volodymyr Zelensky and Russian President Vladimir Putin alike announcing the end of the war.
By October 2023, analysts declared that “for the first time” deep fakes had been weaponized against the Ukrainian people, spreading via social media. These videos, the analysts noted, had risen to a level of influence that fueled conspiracy theories and paranoia in Ukraine. One video in particular, now infamous, showed Zelensky surrendering to Putin and sparked a frantic response from the Ukrainian government urging the people to remain calm and soldiers to hold their positions.
Researchers at University College Cork in Ireland put together a study questioning whether deep fakes had “undermined” epistemic trust, that is, trust in knowledge and information that can be gathered and verified.
Real and Present Danger
With deep fakes, the researchers noted, an immediate danger is posed to the veracity of newsgathering. John Twomey, one of the University College Cork researchers, noted that with the examples from Ukraine, deep fakes had been used to influence a war “for the first time.”
In this use, deep fake technology transitioned from an interesting novelty to a tool of propaganda, at an inflection point in human history when, like no time before, propaganda has reached all-time scalability.
On June 6, NPR News reported that the scale of Russian deep fake-assisted propaganda had ramped up in an American election year. One deep fake video reportedly showed what appeared to be a U.S. State Department official declaring that a city in Russia was a viable target for Ukraine to attack with U.S. weapons.
U.S. officials told NPR News that Russia remains the single greatest threat to U.S. elections. With elections also taking place in Europe, Russia is expected to continue scaling its propaganda and generating disinformation influence campaigns. In real time, nations are witnessing how tactics used directly in the Russo-Ukrainian conflict have spilled over to influence global relations.
The Art of Reverse Deception
The University College Cork researchers pointed out an anomaly that further compounds the era of deep fake-driven disinformation: verifiable news is sometimes labeled as a “deep fake.” Undermining the veracity of a legitimate newsgathering service while elevating the reach, visibility and apparent “verification” of a deep fake compounds the kind of deception weaponized AI is skilled at.
The War Against Propaganda
The gargantuan growth and transformation of propaganda through deep fakes, voice clones, and generative AI adds a sense of urgency to the fight against it. The world recognizes both the need for veracity and the sudden surge of anti-information, but the solution is not always clear-cut to the average news consumer.
Basic Defenses Against Propaganda
The average news consumer is called upon to learn some of the basics of journalism and to apply them for their own purposes. Frontsight Media has analyzed the tools and resources available to the general public that can help guard against disinformation campaigns:
General News
When consuming news, ask yourself: is this story corroborated by multiple sources? Does this narrative try to generate an emotional response from the viewer? Does it play to fear or hatred? Does it generate anger? Is there another motivation or opinion that I am asked to agree with and act upon?
Discern Deep Fakes With Character in Mind
When dealing with deep fakes, the art of deception can be far more nuanced. The Government Accountability Office has suggested the use of technologies to expose images that have been manipulated by artificial intelligence or other synthetic methods.
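What might such a technological check look like in practice? As a minimal sketch only, and not a tool drawn from the Government Accountability Office’s report, the following applies error level analysis, a common open-source image-forensics heuristic, to highlight regions of a picture that recompress differently from their surroundings, one possible cue that something was altered or synthetically inserted. The file names are hypothetical, and a bright, uneven result is a reason for closer scrutiny, not proof of manipulation.

```python
# Minimal sketch of an automated image-forensics check: error level analysis (ELA).
# This illustrates the kind of tool the text refers to; it is not the GAO's method.
# Requires the Pillow library (pip install Pillow). File names are hypothetical.
import io

from PIL import Image, ImageChops, ImageEnhance


def error_level_analysis(image_path: str, quality: int = 90) -> Image.Image:
    """Return an image highlighting regions that recompress differently.

    Pasted-in or synthetically altered areas often show a different error level
    than the rest of the picture. The output is a screening aid, not a verdict.
    """
    original = Image.open(image_path).convert("RGB")

    # Re-save the image as JPEG at a known quality, then reload that copy.
    buffer = io.BytesIO()
    original.save(buffer, format="JPEG", quality=quality)
    buffer.seek(0)
    recompressed = Image.open(buffer)

    # Per-pixel difference between the original and the recompressed copy.
    diff = ImageChops.difference(original, recompressed)

    # Scale the usually faint differences up so they are visible to the eye.
    max_diff = max(channel_max for _, channel_max in diff.getextrema()) or 1
    return ImageEnhance.Brightness(diff).enhance(255.0 / max_diff)


if __name__ == "__main__":
    # Hypothetical frame pulled from a suspect video, saved as a still image.
    result = error_level_analysis("suspect_frame.jpg")
    result.save("suspect_frame_ela.png")
    print("Saved error-level map to suspect_frame_ela.png")
```

Dedicated deep fake detectors go considerably further, examining facial movement, audio-video synchronization and traces left by generative models; the point of the sketch is simply that such checks exist and can be automated.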
For the average person, who may not have ready access to such technological methods of discerning deep fakes, analysis offers some recourse.
In this instance, when deep fakes are generated around public officials, a suggested rule of thumb, Frontsight’s analysis finds, is to compare the video to previous videos of the individual in question. Does the message the person delivers match that person’s public record? Would Zelensky, judging by his character in the previous public record, be likely to surrender to the Russians on such short notice? If the answer is “no,” then it is reasonable to question the video.
This approach will not always work, as deep fakes and their ability to imitate everything down to the likeness of a plausible narrative will continue to develop. However, as a first line of defense, developing a character-assessing view of public officials can be a start in avoiding disinformation’s influence.
Insights For Weighing Public Character
The League of Women Voters gives examples of assessing the nature of candidates in political elections. Among the steps it suggests for choosing a candidate: gathering materials on candidates, consulting Vote411 for information on a candidate’s political stances and views, collecting recordings of speeches and prior media coverage of the candidate, reading online debates about the candidates’ positions, and so forth. George Mason University likewise offered insights into how Americans can judge the character of presidents, providing a rubric for assessing public officials that, for the purposes of this analysis, can be transferred to weighing and questioning public conduct in video content that has the air of a deep fake about it.