Encrypted messages. Two-factor authentication. Real-time monitoring of social media for malicious internet bot activity.
This is the new reality for candidates running in 2018, scared of email hacks and elaborate misinformation schemes like the ones Russia used to disrupt the 2016 campaign.
And many candidates say they’re concerned they can’t rely on Congress or the White House for advice, or protection.
“Since many in Washington continue to bury their head in the sand over the dangers our Democracy faces, our campaign has taken deliberate steps to guard against cyberattacks by mandating extensive security measures,” said Gareth Rhodes, a Democrat running for an upstate New York House seat. He said he’s put his campaign staff through training on how to identify phishing and hacking attempts.
The horror of 2016's hacked emails is still fresh for most operatives. Democratic lawmakers saw their cellphone numbers splashed online. Democratic National Committee chairwoman Debbie Wasserman Schultz resigned before the convention. The hacks even prompted a North Carolina man to storm a Washington pizzeria with an assault rifle, based on an internet conspiracy theory that started with Clinton campaign chairman John Podesta’s emails.
Since then, the Democratic Senatorial Campaign Committee has been hosting cybersecurity briefings for its candidates and staff, pushing campaigns to use encrypted messaging and two-factor authentication. The National Republican Congressional Committee, or NRCC, has hired multiple cybersecurity staffers to work with its candidates and promises to do more.
“We’re starting to advise campaigns, but we’re not ready to roll the whole thing out. We’re working on it,” NRCC Chairman Steve Stivers said this week. “We’re working on the technology-based stuff to try and make sure that we know what’s out there — which is hard, too — and then we try to defend against it the best we can.”
Leaders with the Democratic Congressional Campaign Committee and the NRCC negotiated last year on a coordinated defense against hacks and cyberattacks, but the talks crumbled last summer amid accusations from both sides of grandstanding on the issues, according to Democratic and Republican officials familiar with the effort. The officials spoke on condition of anonymity to discuss private negotiations.
Jason Rosenbaum, the former head of digital advertising for Hillary Clinton’s presidential campaign, likened the average congressional campaign to Rocky Balboa in the ’80s blockbuster “Rocky IV” — running a bare-bones training regimen in an isolated cabin in the frozen tundra, clearly outgunned by Russian prizefighter Ivan Drago.
“Drago had unlimited state resources, and House campaigns are like Rocky, pushing tree logs in the snow,” said Rosenbaum, who also worked previously in Google’s elections and issues department.
Special counsel Robert Mueller only heightened these concerns when he revealed an intricate misinformation campaign run out of Russia, which used fake identities, set up rallies in America and rushed protesters into the streets on both sides of the divide.
The deeper problem, say cybersecurity experts advising campaigns, is that while hacks and phishing attempts can be blocked, misinformation is more amorphous and harder to curtail.
Supporters of Virginia Democratic Gov. Ralph Northam may offer the best example of what can, and cannot, be done.
In the homestretch of the Virginia governor’s race last year, a Democratic group aired an explosive ad showing a white man in a pickup truck with a waving Confederate flag chasing four black, Hispanic and Muslim kids through a leafy suburban neighborhood.
It sparked an outcry among conservatives who said it unfairly painted supporters of Republican candidate Ed Gillespie as unrepentant racists. The spot was taken down after two days, and Democrats thought they might have avoided any nasty consequences from the politically insensitive ad. But then a small group of Twitter bots and accounts closely associated with Russia’s Internet Research Agency, a Kremlin-connected troll farm, latched on and kept the ad alive through the final week of the race.
In a matter of hours, an easily missed TV ad punched through the din of the national news and was enshrined as one caustic part of the 2017 governor’s race. Now, with the 2018 vote looming for hundreds of candidates for governor and the House and Senate, it’s a cautionary tale about the perils of a new political landscape filled with bots, trolls and even “cyborgs” — real people blasting from dozens of social media accounts at a time.
“You’re not going to be able to battle them in the digital sphere, there’s just too many. It’s calling them out for what they are. They’re not voters, they’re not constituents — they’re just machines,” said David Turner, who worked as Northam’s spokesman during the governor’s race.
A social media report commissioned by Virginia’s teachers union pinned much of the blame on 15 Twitter accounts. The report did not specifically state that the accounts were operated by Russia’s troll farm, but the accounts were heavily retweeted and promoted by Russian accounts, according to a database of tweets purged by Twitter that was compiled by NBC.
U.S. intelligence officials have warned that Russian operatives didn’t stop on Election Day 2016. While they offered few details, officials said they expect attacks to continue through the current election season.
The social media giants, too, have struggled to come up with answers on their own.
Through the end of the 2016 election campaign, the Tennessee Republican Party pressed Twitter to take down an impostor account that was tweeting wild accusations — like claims that then President Barack Obama wanted to convert children to Islam. But Twitter didn’t do anything for 11 months, until it discovered the account was linked to Russian meddling in the election.
Mueller later tagged the account “@TEN_GOP” as one of the most active run by the Internet Research Agency in St. Petersburg, Russia.
But when Twitter recently purged thousands of accounts it discovered were fake or automated, it spurred a backlash among conservative pundits online who lost thousands of followers. The hashtag “#TwitterLockout” quickly began trending last week in response to the purge.
Later the same day, the chairman of the House Intelligence Committee, Republican Devin Nunes, took to Twitter to mock Democrats worried about Russian meddling: “Catch up on mainstream media Russian conspiracy theories in this piece by @FDRLST PS-If you are a Russian Bot please make this go viral PSS-If you’re not a Russian Bot you will become one if you retweet.”
Mueller’s indictment of the Russian nationals and companies two weeks ago outlined an effort that was mostly aimed at helping Trump and hurting Clinton. But their targets weren’t all Democrats — the indictment said the Russians also tried to spread misinformation about some of Trump’s GOP primary opponents, including Republican Sen. Marco Rubio.
Terry Sullivan, Rubio’s campaign manager in 2016, said the campaign noticed misinformation online but didn’t suspect it was from Russians. He’s not managing any campaigns this year, but he advises anyone slammed by negative content online to respond by creating more positive content of their own.
“What I learned early on is you can only focus on the things you can control and don’t worry about the rest,” Sullivan said. “And to a large extent this is beyond any campaign manager’s control.”
The other problem, noted Stivers from the NRCC, is that misinformation is a quintessential part of campaign politics.
“It’s been part of American politics since the presidential campaigns of the 1800s,” he said.