YouTube Bans Anti-Vaccine Misinformation

But creating a new set of rules and enforcement policies took months, the person said, because it is difficult to rein in content across many languages and because of the complicated debate over where to draw the line on what users can post. For example, YouTube will not remove a video of a parent talking about a child’s negative reaction to a vaccine, but it will remove a channel dedicated to parents providing such testimonials.

Misinformation researchers have for years pointed to the proliferation of anti-vaccine content on social networks as a factor in vaccine hesitancy, including slowing rates of Covid-19 vaccine adoption in more conservative states. Reporting has shown that YouTube videos often act as the source of content that subsequently goes viral on platforms like Facebook and Twitter, sometimes racking up tens of millions of views.

“One platform’s policies affect enforcement across all the others because of the way networks work across services,” said Evelyn Douek, a lecturer at Harvard Law School who focuses on online speech and misinformation. “YouTube is one of the most highly linked domains on Facebook, for example.”

She added: “It’s not possible to think of these issues platform by platform. That’s not how anti-vaccination groups think of them. We have to think of the internet ecosystem as a whole.”

Prominent anti-vaccine activists have long been able to build huge audiences online, helped along by the algorithmic powers of social networks that prioritize videos and posts that are particularly successful at capturing people’s attention. A nonprofit, the Center for Countering Digital Hate, published research this year showing that a group of 12 people were responsible for sharing 65 percent of all anti-vaccine messaging on social media, calling the group the “Disinformation Dozen.” In July, the White House cited the research as it criticized tech companies for allowing misinformation about the coronavirus and vaccines to spread widely, sparking a tense back-and-forth between the administration and Facebook.

Several people listed in the Disinformation Dozen no longer have channels on YouTube, including Dr. Joseph Mercola, an osteopathic physician who took the top spot on the list. His following on Facebook and Instagram totals more than three million, while his YouTube account, before it was taken down, had nearly half a million subscribers. Dr. Mercola’s Twitter account, which is still live, has over 320,000 followers.

YouTube said that in the past year it had removed over 130,000 videos for violating its Covid-19 vaccine policies. But that figure did not include what the platform called “borderline videos” discussing vaccine skepticism. Previously, the company simply removed such videos from search results and recommendations, while promoting videos from experts and public health institutions.

Daisuke Wakabayashi contributed reporting. Ben Decker contributed research.
