Update, January 8:
Social-media platforms have taken differing positions against outgoing US president Donald Trump. After initial bans of between 12 and 24 hours, Twitter and Facebook have each opted to deal with the flow and promotion of disinformation by Trump differently.
While Facebook has hardened its stance and expelled him until the end of his term, Twitter has reinstated him, with a stark warning that he will be booted out should his actions recur. "After the Tweets were removed and the subsequent 12-hour period expired, access to @realDonaldTrump was restored ... Any future violations of the Twitter Rules, including our Civic Integrity or Violent Threats policies, will result in permanent suspension of the @realDonaldTrump account," Twitter's statement read.
Facebook took a far dimmer view of Trump's actions, suspending his access for at least a fortnight. "His decision to use his platform to condone rather than condemn the actions of his supporters at the Capitol building has rightly disturbed people in the US and around the world," Mark Zuckerberg, Facebook's founder, said in a post. "We believe the risks of allowing the President to continue to use our service during this period are simply too great. Therefore, we are extending the block we have placed on his Facebook and Instagram accounts indefinitely and for at least the next two weeks until the peaceful transition of power is complete."
Original article published January 7:
Amid an armed invasion of the US Capitol building as Congress was meeting to affirm Joe Biden's election, social-media platforms including Facebook, Twitter and YouTube have been compelled overnight to act against posts from outgoing US president Donald Trump.
The posts in question seemingly sought to stoke actions from Trump-aligned protestors and continued to claim the presidential election had been "stolen". While Facebook banned Trump from posting for 24 hours, Twitter locked his account for 12 hours, on the condition that he remove certain Tweets.
Facebook declared an "emergency situation" and removed one of Trump's videos, as did YouTube.
Over the last few years, these platforms have had to keep pace with an explosion of misinformation from many sources, including Trump and his supporters, and have reacted in varying ways to try to preserve their neutrality. While Twitter has added disclaimers to some Tweets, Trump's far-right supporters have also seen their content taken down and/or had their accounts banned in the run-up to an acrimonious US presidential election. Twitter has, throughout Trump's term, chosen not to suspend or revoke his personal account, even though it regularly violates the company's policies.
These new moves mark some of the most stringent responses from the platforms, which have previously been accused of allowing the president too much leeway.
"We are appalled by the violence at the Capitol today ... our Elections Operations Center has already been active in anticipation of the Georgia elections and the vote by Congress to certify the election, and we are monitoring activity on our platform in real time," Guy Rosen, VP of integrity, and Monika Bickert, VP of global policy management for Facebook said in a blog post. "As a part of this, we removed from Facebook and Instagram the recent video of President Trump speaking about the protests and his subsequent post about the election results. We made the decision that on balance these posts contribute to, rather than diminish, the risk of ongoing violence."
"We will start removing any piece of content uploaded today (or anytime after) that misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 US Presidential election," YouTube also announced.
Facebook's strong statement didn't stop prominent voices, including Alex Stamos, its former security chief, from laying into the platform for being complicit in letting the radical flames spread in the first place.