Analysis
Over the centuries, rumors and lies have brought world theaters to the brink of war and ruin. Never before in human history, however, has the power to distort a narrative been machine-assisted to the degree it is today. In this era of machine learning and deepfakes, the twisting of words now passes through the crucible of voice cloning.
Human Parity Unlocked
Speculation has centered on Microsoft's VALL-E 2 text-to-speech technology, which debuted earlier this month. Because of the believability of its human voice likeness, the technology's makers have warned that it may present real hazards to world security. Microsoft said that, in making VALL-E 2, it had achieved "human parity" for the "first time," defining the term as the point at which an AI can translate or communicate as capably as a human.
How Voice Cloning is Used in Scams
CBS News explained that scammers can extract a person's voice from social media clips and use those sound bites to create a voice clone. Once the clone is made, the faked voice of a real individual is used to trick others, such as that person's loved ones, into agreeing to some scheme.
While online scams are the primary examples of voice cloning crimes, the motivations for putting the technology to nefarious use vary widely. From the Hollywood hills, where Scarlett Johansson confronted the reality of voice theft, all the way to the average TikTok poster, AI voice fraud has joined the list of identity theft risks to emerge from generative AI.
Criminal Motivations For Voice Cloning
The general public commonly associates voice cloning with financial schemes.
The Federal Trade Commission explains that the risks of voice clone technology present themselves most urgently in the form of financial fraud.
The FTC has also listed the benefits of text-to-speech AI, which uses a synthetic voice to read complex text aloud, as an advance in accessibility: people who have lost their voice to injury or illness can regain the capacity to speak through voice cloning technology. The risks of the technology, however, present themselves in the form of extortion schemes.
Voice cloning threats vary in kind and intensity. A fundamental problem, researchers with CBS News found earlier this year, is that voice cloning can be very cheap, with some services charging as little as $5 to generate a voice clone.
Weaponized Degrees of Threat
In October 2023, the use of voice cloning in warfare came to international attention.
Sudan has been locked repeatedly in brutal civil wars for decades. In 2023, a campaign used voice cloning technology to impersonate Omar al-Bashir, the country's former leader, and transmit a message in his voice that drew hundreds of thousands of engagements on TikTok. Bashir is an accused war criminal whose whereabouts are unknown; at the time of his voice clone, he was believed to be seriously ill somewhere, Tech Round recalled in its review of the risks posed by the generated message. As of earlier this year, Sudanese remain "baffled" by the former president's whereabouts. Analysts of the message anticipated significant damage to a Sudanese society already fractured by recent conflict, as the faux messages included a medley of clippings from former coup attempts and speeches, passed off across various social media accounts as a grainy telephone line interview.
Trade union leaders in Sudan have called for an end to the conflict, while observers have remarked on how military rule led to the decimation of the nation. Analysts of the al-Bashir message worry that hearing the deposed dictator's voice adds to the noise in Sudanese society, opening new cracks around the fractures left by the most recent episode of conflict. These analysts dwelt less on the subject matter of the faux message than on the fact that the mere utterance of a vanished man added to the confusion. Sudanese leaders, called "belligerent" by critics, have reportedly held talks in Geneva this week as hopes rise that the current surge of war could end.
Analysts who spoke with the BBC engaged with the voice clone of al-Bashir, explaining the significance of democratized technology in allowing threat actors to distort reality. These analysts made note of the fact that, in wartime, various threat actors have engaged in distorting information to a significant degree of damage. With democratized emerging technology, analysts believe that the damage of disinformation in combat theaters will increase.
History holds lessons on disinformation's impact on morale at the frontline and on the homefront from as recently as World War II.
War Along the Grapevine
While wars since antiquity have engaged the grapevine as a weapon of psyops, World War II saw significant perils generated by rumor campaigns. Rumors presented challenges steep enough to inspire special clinics formed to combat wild gossip, the Smithsonian Magazine recalled. The clinics opened to fight rumors at home, where morale was weakened by seeds of discord sowing distrust of the United States and its mission against the Axis scheme of domination.
At that time, rumors originated from Axis propagandists and American citizens alike, as fear bred contempt and became the seedbed of rampant disinformation.
A Tragedy in Detroit
The Detroit Historical Society has since recounted that some of these homegrown rumors proved lethal, as in the Race Riot of 1943, an incident fueled by the abysmal living conditions of wartime rationing, particularly for African Americans in Detroit, who, treated as second-class citizens, lived under greater duress than their neighbors.
During this time, the white Americans who lived in the area were known to resort to violence to keep their neighborhoods segregated, exacerbating the living conditions of Detroit's African American population. Social violence escalated after a story spread of the rape of a white woman in the area, followed by mobbing and stonings. Riots broke out when the African American population heard a rumor that a Black woman and her baby had been hurled from the Belle Isle Bridge, leading to rioting and the looting of white-owned businesses. The fighting between the two groups pushed the then-mayor of Detroit to call on the then-governor of Michigan to send 3,500 troops to quell the violence.
The Smithsonian recalled that, as conditions grew dire at home, the Office of War Information ran an effort to curb the risks of gossip. This led 40 local newspapers nationwide to try to quell the grapevine's wrath through rumor clinics, which acted as wartime fact-checkers fighting disinformation for the sake of national morale, an effort carried out by "morale wardens."
American society faces similarly morale-depleting circumstances in today's civil state of the Union. What adds exponentially to the risks, compared with the rumors of the World War II era, is the introduction of deepfake and voice clone technology, which, as the fraud cases highlighted by the FTC have already shown, can spike criminal capacity.
Comparing the Impact
In World War II, rumors moved at an impressive speed given the technology of the day. The journal Social Forces featured a review by Theodore Caplow, printed in March 1947, that recalls the speed at which rumors traveled from generator to writer:
“Most rumors were transmitted to the writer within a few hours of being heard and set down in writing immediately, both for official purposes and in the interest of this study,” Caplow wrote, explaining the routine transcription duties of researchers of the regimental S-2 section who provided monthly intelligence reports that included a section devoted to rumors.
With the advent of social media, rumors can be written and disseminated by the generator and a secondary source within minutes. Fact-gatherers now face the added challenge of tracking dissemination at the speed it happens, which shaves whole hours off a process that once unfolded over hours and days. They must also brace for the impact of a distorted sense of veracity created by technological fakes.
Lessons From the Past, Efforts of the Future
As voice clones come for celebrities, politicians, and ordinary people alike, The Washington Post has described the impact as "reshaping reality." Yet while reality is bent into a newly contorted posture, history reveals that rumors played a significant role in global conflicts long before voice clones entered the conversation. What takes shape is an evolution rather than a revolution: as the cocoon of 20th-century propaganda tactics wraps around the caterpillar of modern AI craft, something reborn, and not entirely new, emerges. The next metamorphosis of disinformation will take place within it. To safeguard the future, our analysis draws from the rumor clinics of the past, noting that countering disinformation is becoming a full-time occupation and a clinical service to a world challenged on the social platform and the battlefield alike by fully democratized rumor-shaping capabilities.