Social media giants Twitter, Facebook and its subsidiary Instagram announced they were suspending US President Donald Trump’s accounts on Wednesday in an unprecedented move.
The temporary bans came as Trump supporters stormed the US Capitol building in Washington, D.C. The riot interrupted a joint session of Congress to certify President-elect Joe Biden’s election victory.
What led to the ban?
Trump posted a video on Twitter and Facebook more than two hours after protesters entered the Capitol and as authorities struggled to take control of the situation.
Trump opened his video saying, “I know your pain. I know your hurt. But you have to go home now.”
He repeated claims of voter fraud in the video. He also appealed to his supporters, saying “We don’t want anybody hurt,” adding: “We can’t play into the hands of these people.”
Directly addressing his supporters in the video, he said: “We love you, you’re very special.”
Republican lawmakers and former administration officials had reportedly begged Trump to call on his supporters to quell the violence.
How did Twitter respond?
Following the video upload, Twitter locked Trump out of his account for 12 hours. The Twitter Safety account cited “repeated and severe violations of our Civic Integrity policy.”
The company required the removal of three of Trump’s tweets, including the video, and warned: “If the Tweets are not removed, the account will remain locked.”
The posts have since been deleted from Trump’s account, Twitter said.
It said further violations of Twitter rules would result in permanent suspension.
“Our public interest policy — which has guided our enforcement action in this area for years — ends where we believe the risk of harm is higher and/or more severe,” added Twitter.
Twitter initially left the controversial video up but blocked people from being able to retweet it or comment on it. Only later in the day did the platform delete the video entirely.
How did Facebook respond?
Facebook later followed suit, announcing that Trump would not be able to post for 24 hours following two violations of its policies.
Facebook earlier removed Trump’s video. Guy Rosen, Facebook’s vice president of integrity, said on Twitter Wednesday that the video was removed because it “contributes to rather than diminishes the risk of ongoing violence.”
“This is an emergency situation and we are taking appropriate emergency measures, including removing President Trump’s video,” Rosen said.
Facebook also removed a text post from Trump, which sought to justify the attack, telling supporters to “remember this day forever!”
How have the platforms dealt with Trump previously?
Twitter tightened its content policies throughout 2020, a change that also affected Trump’s tweets.
In May 2020, Twitter for the first time added a label to one of the president’s tweets, instructing readers to fact-check its content. Then in June, it labelled one of Trump’s tweets as containing “manipulated media,” another first.
Since losing the November 3 presidential election to Democrat Joe Biden, Trump has increasingly used the platform to make unsubstantiated claims of electoral fraud to his 88.7 million followers.
Many of his subsequent tweets have carried the blue label: “This claim about election fraud is disputed.”
Since Tuesday morning, 38 percent of Trump’s tweets and retweets have carried that label.
Does Trump’s social media presence matter?
German lawmaker Gyde Jensen told DW she was certain that Donald Trump “knows that his words matter.”
“So he has very much responsibility for what is happening. He could have stopped it [the looting and rioting inside the Capitol buildings] even earlier because his past tweets basically said ‘stay peaceful.’ But he could have said to get out of the Capitol and remove [yourself] from this violent riot.”
By Deutsche Welle