YouTube announced on September 29 that it would ban all content that spreads misinformation about COVID-19 vaccines and remove the channels of several prominent anti-vaccine activists.
In a blog post, the video platform said it would remove videos claiming that approved vaccines are dangerous, cause chronic health effects, or do not reduce transmission or contraction of disease, as well as any other content that promotes vaccine hesitancy. (Read: Common COVID-19 Vaccine Fake News And Myths–Debunked!)
“We’re expanding our medical misinformation policies on YouTube with new guidelines on currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO,” the post said.
With more people getting into video blogs, or vlogs, some content creators may intentionally or unintentionally spread misinformation about COVID-19 vaccines. Find out if the video you’re watching or uploading violates YouTube’s updated COVID-19 policy. Here are the types of content and claims that are not allowed.
Vaccines can cause other diseases
Content that falsely says that approved vaccines cause autism, cancer, infertility, or that substances in vaccines can track those who receive them will be removed.
“Our policies not only cover specific routine immunizations like for measles or Hepatitis B, but also apply to general statements about vaccines,” YouTube underscored.
The World Health Organization (WHO) has stated that there is no evidence of any link between vaccines and autism or autistic disorders, a finding demonstrated in many studies conducted across very large populations. (Read: What’s the Difference Between CoronaVac, AstraZeneca, Moderna, etc?)
Vaccines are dangerous or ineffective
Some YouTube videos disseminate fake news and unsubstantiated theories and studies about the COVID-19 vaccines, which may discourage people from getting their COVID-19 jabs. The WHO has repeatedly told the public that the approved vaccine brands are safe and effective, and several studies have shown that WHO-approved vaccines protect individuals against severe cases of COVID-19.
“Vaccination is safe and side effects from a vaccine are usually minor and temporary, such as a sore arm or mild fever. More serious side effects are possible, but extremely rare,” WHO explained.
Anti-vaccine content creators
Anti-vaccine activists and content creators are not exempt from YouTube’s updated COVID-19 policy. To combat misinformation about COVID-19 vaccines at its source, YouTube has already removed the accounts of prominent anti-vaccine activists such as Erin Elizabeth, Sherri Tenpenny, Joseph Mercola, and Robert F. Kennedy Jr.
Content that denies the existence of COVID-19
The world has been battling the COVID-19 pandemic for two years now. Unfortunately, there are still people who do not believe COVID-19 exists. YouTube will also ban videos claiming that people have not died or gotten sick from COVID-19, that the virus no longer exists, or that the pandemic is over, along with false claims about the symptoms, death rates, or contagiousness of COVID-19.
Denying the existence of the virus also denies the purpose of the vaccines, and it may discourage some people from getting inoculated. (Read: 3 Things to Do After Getting Your COVID-19 Vaccine)
YouTube says the policy update is an “important step” toward addressing vaccine and health misinformation on its platform. As a viewer or content creator, always check the sources of the content you’re watching or uploading.