US Senate Warns Big Tech to Act Fast Against Election Meddling

Top officials from Google, Microsoft, and Meta testified Wednesday before the United States Senate Intelligence Committee about their companies’ ongoing efforts to identify and disrupt foreign influence campaigns ahead of the country’s November elections.

The hearing, chaired by senator Mark Warner of Virginia, served largely to impress upon the companies the need for more extensive safeguards against disinformation campaigns funded by foreign entities intent on influencing US politics.

“This is really our effort to try to urge you guys to do more. To alert the public that this has not gone away,” Warner said.

The chairman, a proponent of expanding cooperation between the government and Silicon Valley to root out campaigns by Russia, Iran, and China, among other designated foreign adversaries, described the recent efforts by Russia as both “effective and cheap.”

The Treasury Department’s Office of Foreign Assets Control placed sanctions this month on 10 Russian citizens, several of them employees of the state-funded news outlet RT, formerly Russia Today. US secretary of state Antony Blinken on Friday accused the Russian outlet of working hand-in-hand with the country’s intelligence services, conducting influence and cyber operations meant to covertly spread Kremlin propaganda on more than three continents. And earlier this month, US authorities accused RT employees of bankrolling the right-wing influencer network Tenet Media.

Warner noted—almost as an aside—that Elon Musk’s X had refused to send a representative to testify Wednesday. A spokesperson for Warner told WIRED that X’s former chief of global affairs, Nick Pickles, had previously agreed to appear before the committee; however, he resigned from the company roughly two weeks later. X then declined to provide a replacement. (Pickles could not be immediately reached for comment.)

Warner received the companies that did appear amicably, praising the “positive role” they’ve played during the government’s recent actions: Meta’s decision, for example, to ban RT and the state media outlet Sputnik from its platforms. Warner also highlighted recent decisions at Google and Microsoft to publicly reveal information about foreign election threats, keeping the public and government better informed.

In addition to the Tenet Media indictment, the Department of Justice revealed this month in an FBI affidavit that it had seized 32 internet domains allegedly tied to the Kremlin and related entities. The websites, with names like “fox-news[.]top,” were created to imitate popular media and news brands, including CNN, spreading content favorable to Russia. One fake Fox News story, for instance, declared that Ukraine has “no particular value to the US,” and that squaring off with Russia is “too great” a risk.

The operation, dubbed “Doppelganger,” allegedly relied on influencers and paid social media advertisements, as well as fake accounts that mimicked US citizens, in some cases with the help of artificial intelligence. In private documents obtained by the FBI, the operation’s principal director, a little-known Russian political strategist named Ilya Gambashidze, is alleged to have stated plainly: “They are expecting fake news from us every day.”

Marco Rubio, the committee’s Republican vice chair, argued on behalf of Americans who, he said, should not be punished for holding views that align with the Kremlin’s. “The question becomes is that disinformation, or is that misinformation, is that an influence operation, because that pre-existing view is being amplified?” Decisions by companies to remove such amplified content are “problematic and complicated,” he said, adding that he believes they risk “stigmatiz[ing]” Americans holding those views.

Andy Carvin, the managing editor and research director of the Digital Forensic Research Lab (DFRLab), tells WIRED that his organization, which conducts a vast amount of research into disinformation and other online harms, has been tracking Doppelganger for more than two years. The scope of the operation should surprise few, he says, given that the fake news sites follow an obvious template and that populating them with AI-generated text is simple.

“Russian operations like Doppelganger are like throwing spaghetti at a wall,” he says. “They toss out as much as they can and see what sticks.”

Meta, in a written statement on Tuesday, said it had banned RT’s parent company, Rossiya Segodnya, and “other related entities” globally across Instagram, Facebook, and Threads for engaging in what it called “foreign interference activity.” (“Meta is discrediting itself,” the Kremlin replied Tuesday, claiming the ban has endangered the company’s “prospects” for “normalizing” relations with Russia.)

Testifying on Wednesday, Meta president of global affairs Nick Clegg stressed the industry-wide nature of the problem facing voters online. “People trying to interfere with elections rarely target a single platform,” he said, adding that Meta is, nevertheless, “confident” in its ability to protect the integrity of “not only this year’s elections in the United States, but elections everywhere.”

Warner appeared less than fully convinced, noting the use of paid advertisements in recent malign influence campaigns. “I would have thought,” he said, “eight years later, we would be better at at least screening the advertisers.”

He added that, seven months ago, over two dozen tech companies had signed the AI Elections Accord in Munich—an agreement to invest in research and the development of countermeasures against harmful AI. While some of the firms have been responsive, he said, others have ignored repeated inquiries by US lawmakers, many eager to hear how those investments played out.

While talking up Google’s efforts to “identify problematic accounts, particularly around election ads,” Alphabet’s chief legal officer, Kent Walker, was cut off mid-sentence. Citing conversations with the Treasury Department, Warner said he had confirmed as recently as February that both Google and Meta have “repeatedly allowed Russian influence actors, including sanctioned entities, to use your ad tools.”

The Virginia senator stressed that Congress needed to know specifically “how much content” relevant bad actors had paid to promote to US audiences this year. “And we’re going to need that [information] extraordinarily fast,” he added, referring as well to details of exactly how many Americans had seen the content. Walker replied that Google had taken down “something like 11,000 efforts by Russian-associated entities to post content on YouTube and the like.”

Warner additionally urged the officials against viewing Election Day as if it were an end zone. Just as important, he stressed, is the integrity of the news that reaches voters in the days and weeks that follow.
