Why Facebook Banned Donald Trump Until 2023 Over the US Capitol Hill Violence

Last Updated on: 5th June 2021, 02:46 pm

Former President Donald Trump’s suspension from Facebook will last at least two years, the company said on Friday as it revised its rules for moderating political speech.

In a blog post, Facebook introduced new rules for public figures who use the company’s platforms, saying the company will now lock accounts that violate its policies during times of turmoil and violence.

Facebook said the former president’s actions violated company rules and warranted a two-year suspension, effective retroactively from the original January 7 date.

Investors shrugged off the news, with Facebook stock rising 1.4% in Friday afternoon trading. Additional content moderation has historically cut into Facebook’s profits as the company works to police the billions of posts on its platforms.

“Given the gravity of the circumstances that led to Mr. Trump’s suspension, we believe his actions constituted a grave violation of our rules that deserves the highest penalty available under the new enforcement protocols,” said Nick Clegg, Facebook’s Vice President of Global Affairs, in a blog post.


Clegg added that Facebook would evaluate Trump’s behavior when the suspension ends and could extend the ban if necessary.

“When the ban is finally lifted, there will be a rigorous set of rapidly escalating sanctions that will be triggered if Mr. Trump commits further violations in the future, up to and including the permanent removal of his pages and accounts,” said Clegg.

Beyond Trump’s two-year suspension, the company plans to treat politicians’ posts the same as other users’ posts, undoing a policy that carved out an exception for political discourse.

“Ultimately, when we evaluate content for newsworthiness, we will not treat content posted by politicians any differently from content posted by anyone else,” Facebook wrote in the blog post.

“Instead, we will simply apply our newsworthiness balancing test to all content equally, measuring whether the public interest value of the content outweighs the potential risk of harm in leaving it up.”

Content moderation decisions have increased the cost of doing business for Facebook over time.

In 2017, Facebook executives warned investors that its margins would be impacted as it planned to add around 20,000 employees to its security teams.

Facebook’s net income rose 39% in 2018 but declined 16% in 2019. Last year, the company’s profit rose 58% to $29.15 billion amid Covid-19-related lockdowns.


Facebook’s profit margin fell from 39% in 2017 to an expected 33% this year. Friday’s announcement made no mention of any potential business impact.
