YouTube pulls 30k videos with COVID vax misinformation

IANS

In a bid to curb false claims about COVID-19, video-streaming platform YouTube has removed more than 30,000 videos containing misinformation about COVID-19 vaccines over the past six months.

According to a report by Axios, the video-streaming platform has taken down more than 800,000 videos containing COVID-19 misinformation since February 2020. The videos are first flagged by either the company's AI systems or human reviewers, then receive another level of review.

Under YouTube's rules, videos violate the vaccine policy if they contradict expert consensus on the vaccines from health authorities or the World Health Organization (WHO), the report said.

Other platforms, including Facebook and Twitter, have also rolled out policies to reduce the spread and reach of such content.

Recently, the micro-blogging platform Twitter introduced a strike system against misleading tweets about COVID-19 vaccination, under which five or more strikes result in permanent suspension of the account.

Since introducing the COVID-19 guidance, Twitter said it has removed more than 8,400 tweets and challenged 11.5 million accounts worldwide.

While one strike will cause no account-level action, two strikes will lead to a 12-hour account lock; three strikes, another 12-hour account lock; four strikes, a seven-day account lock; and five or more strikes, permanent suspension of the account.

Labels will initially be applied by Twitter team members when they determine that content violates the platform's policy.
