- Google has suspended President Trump's YouTube account and formally warned the White House about its use of the world's largest video platform.
- The company has a three-strike rule and the first strike results in a temporary account suspension.
- The company said it issued the White House a warning for a violation, adding that it could face the same suspension.
Google has suspended President Donald Trump's YouTube account and formally warned the White House about its use of the world's largest video platform.
The company said Tuesday night that Trump uploaded content that violated its policies, earning the channel an automatic first strike, which carries a minimum seven-day suspension from uploading new content. It said it is also disabling the channel's comments section.
The company did not specify which videos violated its policies but said the content included comments Trump made Tuesday morning. YouTube said the material violated its policies prohibiting content that incites violence.
Donald J. Trump's YouTube account has 2.77 million subscribers and typically posts several videos a day featuring him and clips from right-wing media outlets.
Under YouTube's three-strike system, a channel is suspended for one week after the first strike, two weeks after the second, and is terminated after a third strike within 90 days. The temporary suspension means Trump's account and existing videos will remain accessible, but he won't be able to upload new content.
"After review, and in light of concerns about the ongoing potential for violence, we removed new content uploaded to Donald J. Trump's channel for violating our policies," the company said in a statement on social media Tuesday evening. "Given the ongoing concerns about violence, we will also be indefinitely disabling comments on President Trump's channel, as we've done to other channels where there are safety concerns found in the comments section."
A company spokesperson said it also removed videos from the White House's YouTube account but that, instead of a first strike, the company issued a warning because "there has not been the same history of violative content on this channel, but should further violative content get uploaded to the channel, it would receive a strike, per our policies."
Google-owned YouTube announced Thursday that it would issue a first strike, with the accompanying suspension, to "any channels" posting new videos containing false claims in violation of its policies, rather than giving them a warning first. Later Thursday, Alphabet employees called on YouTube executives to take further action against the president, criticizing them for not suspending his account and arguing that it would be used to incite further misinformation and violence.
YouTube's suspension of Trump's account comes after the violence at the U.S. Capitol by some Trump supporters last Wednesday, which left five people dead. Politicians and the public have called for social media and technology companies to moderate their platforms more closely, warning that they risk inciting further violence.
Twitter and Facebook announced they were suspending Trump's account on their respective platforms — Twitter, permanently. However, Trump found a workaround and began tweeting from the government-owned @POTUS account late Friday before that account was suspended.
Google also removed Parler, a social media app popular with Trump supporters, from the Google Play Store Friday, making it much harder for Android users to download and access the app.
YouTube differs from other social networks in that its videos can be embedded and shared on other platforms, giving it wide-ranging reach.
Clarification: YouTube's announcement on Twitter was Tuesday night. An earlier version had the incorrect day in one reference.