YouTube is moving to block and remove all content that spreads misinformation about vaccines against COVID-19 and other illnesses, such as measles, hepatitis and chickenpox.
The Google-owned online video company said in a blog post on Wednesday that any content that “falsely alleges that approved vaccines are dangerous and cause chronic health effects” will be removed.
“This would include content that falsely says that approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them.”
Google says that since 2020 it has taken down 130,000 videos for violating the company’s COVID-19 vaccine policies, and that it is now stepping up those efforts.
“We’re expanding our medical misinformation policies on YouTube with new guidelines on currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO,” the company said.
The company will remove individual videos from some users and, as first reported by The Washington Post, will go as far as taking down the accounts of serial misinformation spreaders entirely. These include Joseph Mercola, an American doctor who had more than half a million followers, and Robert F. Kennedy Jr., son of the late senator and presidential candidate Robert F. Kennedy, who has been a vocal critic of vaccines and other aspects of modern medicine.
The move comes as YouTube and other tech giants such as Facebook and Twitter have been criticized for not doing enough to stop the spread of false health information on their sites.