Russia is the most prolific foreign influence actor using artificial intelligence to generate content targeting the 2024 presidential election, U.S. intelligence officials said on Monday.
The cutting-edge technology is making it easier for Russia, as well as Iran, to quickly and more convincingly tailor often-polarizing content aimed at swaying American voters, an official from the Office of the Director of National Intelligence, who spoke on condition of anonymity, told reporters at a briefing.
“The [intelligence community] considers AI a malign influence accelerant, not yet a revolutionary influence tool,” the official said. “In other words, information operations are the threat, and AI is an enabler.”
Intelligence officials have previously said they saw AI used in elections abroad. “Our update today makes clear that this is now happening here,” the ODNI official said.
Russian influence operations have spread synthetic images, video, audio, and text online, officials said. That includes AI-generated content “of and about prominent U.S. figures” and material seeking to emphasize divisive issues such as immigration. Officials said that is consistent with the Kremlin’s broader goal of boosting former President Donald Trump and denigrating Vice President Kamala Harris.
But Russia is also using lower-tech methods. The ODNI official said Russian influence actors staged a video in which a woman claimed to be a victim of a hit-and-run by Harris in 2011. There’s no evidence that ever happened. Last week, Microsoft also said Russia was behind the video, which was spread by a website claiming to be a nonexistent local San Francisco TV station.
Russia is also behind manipulated videos of Harris’s speeches, the ODNI official said. They may have been altered using editing tools or with AI, and were disseminated on social media and by other methods.
“One of the efforts we see Russian influence actors do is, when they create this media, try to encourage its spread,” the ODNI official said.
The official said the videos of Harris were altered in a range of ways, to “paint her in a bad light both personally but also in comparison to her opponent” and to focus on issues Russia believes are divisive.
Iran has also tapped AI to generate social media posts and write fake stories for websites posing as legitimate news outlets, officials said. The intelligence community has said Iran is seeking to undercut Trump in the 2024 election.
Iran has used AI to create such content in both English and Spanish, and is targeting Americans “across the political spectrum on polarizing issues,” including the war in Gaza and the presidential candidates, officials said.
China, the third main foreign threat to U.S. elections, is using AI in its broader influence operations that aim to shape global views of China and amplify divisive topics in the U.S. such as drug use, immigration, and abortion, officials said.
However, officials said they had not identified any AI-powered operations targeting the outcome of voting in the U.S. The intelligence community has said Beijing’s influence operations are more focused on down-ballot races in the U.S. than on the presidential contest.
U.S. officials, lawmakers, tech companies, and researchers have been concerned about the potential for AI-powered manipulation to upend this year’s election campaign, such as deepfake videos or audio depicting candidates doing or saying something they didn’t, or misleading voters about the voting process.
While these threats may still materialize as Election Day draws closer, so far AI has been used more frequently in different ways: by foreign adversaries to improve productivity and boost volume, and by political partisans to generate memes and jokes.
On Monday, the ODNI official said foreign actors have been slow to overcome three main obstacles to AI-generated content becoming a greater risk to American elections: first, beating the guardrails built into many AI tools without being detected; second, developing their own sophisticated models; and third, strategically targeting and distributing AI content.
As Election Day nears, the intelligence community will be monitoring for foreign efforts to introduce deceptive or AI-generated content in a variety of ways, including “laundering material through prominent figures,” using fake social media accounts or websites posing as news outlets, or “releasing supposed ‘leaks’ of AI-generated content that appear sensitive or controversial,” the ODNI report said.
Earlier this month, the Justice Department accused Russian state broadcaster RT, which the U.S. government says operates as an arm of Russian intelligence services, of funneling nearly $10 million to pro-Trump American influencers who posted videos critical of Harris and Ukraine. The influencers say they did not know the money came from Russia.