Abby Lee is a sophomore at MU studying journalism and women’s and gender studies. She is an opinion columnist who writes about social issues.
YouTube is the latest social media platform to pledge to work against the plague of anti-vaccination content on its website. At least, it's promising to do so on paper. The platform follows similar moves by other social media sites, including Facebook, Instagram, Twitter and Snapchat. The similarities don't end with the policies' contents, however; they extend to the rule offenders and to the inefficacy of the policies across platforms.
To put it bluntly, YouTube’s latest anti-vaccination misinformation policies don’t do nearly enough — in fact, they hardly do anything at all.
The policies serve as a guise of concern for the public, but these companies' true concern lies in their profits. By publishing paper-thin guidelines, YouTube can guarantee steady performance by appeasing both the crowds calling for COVID-19 awareness and the pandemic deniers.
The platform has taken its stance, which puts it in the good graces of the former crowd. Unfortunately (but not surprisingly), these words are often hollow and are not followed by substantial, lasting action.
Researchers from the Center for Countering Digital Hate released a study called “The Disinformation Dozen,” which chronicles the “twelve anti-vaxxers who play leading roles in spreading digital misinformation about [COVID] vaccines.” The study finds the Dozen is responsible for 65% of anti-vaccine content. Much of their material and messaging ends up reposted by other people, and major platforms (the CCDH focused on Facebook, Instagram and Twitter) “fail to act on 95 percent of the [COVID] and vaccine misinformation reported to them.”
Theoretically, if major social media platforms banned these creators and took down the anti-vaccination content they posted, featured in or endorsed, it would tackle a large part of the COVID-19 misinformation problem.
Interestingly enough, YouTube has jumped on the Dozen’s accounts for the most part; 11 out of 12 accounts appear to have been deleted.
It was when I tried and easily succeeded in finding videos featuring the Dozen that I realized banning their accounts was a cover-up: Band-Aids over 12 bullet holes.
It remains stupidly easy to access videos featuring members of the Dozen and their inaccurate vaccine information. While YouTube got off to a promising (albeit late) start in tackling vaccine misinformation, it has ended up following other media platforms' examples. The website has only succeeded in addressing surface-level issues, just like its forerunners.
The fact that YouTube executives paint themselves as progressively and proactively fighting misinformation is laughable. Matt Halprin, YouTube's vice president of global trust and safety, made the following statement: “We felt like we need to address both [general vaccine hesitancy and COVID-19 vaccine misinformation].”
Is this what YouTube calls addressing these issues? If so, they are failing miserably. I should have at least struggled to find videos featuring vaccine misinformation. Instead, I easily found an episode of Mike Tyson’s podcast that features honorary Disinformation Dozen member Robert F. Kennedy, Jr.
In the episode, which has amassed over 300,000 views, Kennedy expresses his hesitancy to get the vaccine and spreads misinformation, including the claim that “obesity is linked to vaccines.” Kennedy also claims that the chance of being diagnosed with a chronic disease has gone from 12% to 54% since 1989, when the vaccine schedule was changed.
In actuality, FactCheck.org’s Jessica McDonald said that “those percentages are cherry-picked and can’t be compared.”
Mike Tyson has over 2 million subscribers. Why is it this easy to find dangerous misinformation? Why are people this prominent getting away with acting so irresponsibly?
In its mission statement, YouTube claims one of its core values is “freedom of information.” However, it leaves deadly misinformation readily accessible to any and all users.
According to the Prager University v. Google ruling, YouTube is “not a public forum.” It is not required to offer its users total free speech, and it shouldn’t have to. It should act as a forum where people are safe to express themselves and their beliefs as long as they don’t infringe on someone else’s safety, physical or otherwise.
COVID-19 deaths worldwide have surpassed 5 million. The only way people can actively protect themselves from this virus is by getting vaccinated. Content that discourages taking advantage of the single active defense against the virus is harmful and dangerous, and it should not be tolerated on any platform, much less one as widely used as YouTube.
The Maneater encourages readers to help make vaccines and healthcare accessible globally. One way to do this is by donating to the International Rescue Committee. The IRC provides vaccines and PPE and ensures “clinics and hospitals have access to essential supplies like medical oxygen, testing kits for COVID-19, and access to clean water and sanitation.” https://help.rescue.org/donate/covid-vaccine?ms=gs_ppc_fy21_covid19_dmusa_may&initialms=gs_ppc_fy21_covid19_dmusa_may&gclid=Cj0KCQjw_fiLBhDOARIsAF4khR2NM-m9GTRi9hAmoMXN-B09c2LiZ373WxvnCEQwOxCH5y9x3kNaxa8aAmaLEALw_wcB
Edited by Sarah Rubinstein | srubinstein@themaneater.com