Wednesday, October 27, 2021

Fighting wave of misinformation, YouTube bans false vaccine claims

By Amanda Seitz

YouTube on Wednesday announced a sweeping crackdown on vaccine misinformation, booting popular anti-vaccine influencers from its site and deleting false claims made about a range of vaccines.

The video-sharing platform said it will no longer allow users to claim that approved vaccines, such as those given to prevent the flu or measles, are dangerous or cause disease.

YouTube’s latest effort to stem the tide of vaccine misinformation comes as countries around the world struggle to persuade a somewhat hesitant public to accept the free vaccines that scientists say will end the COVID-19 pandemic, which began 20 months ago. The tech platform, which is owned by Google, had already banned COVID-19 vaccine misinformation last year, at the height of the pandemic.

“We have consistently seen false claims about coronavirus vaccines spill over into misinformation about vaccines in general, and we are now at a point where it is more important than ever to expand the work we started with COVID-19 to other vaccines,” YouTube said in a blog post.

Until Wednesday, anti-vaccine influencers with thousands of subscribers had used YouTube to stir up fears about vaccines that health experts say have been administered safely for decades. The YouTube channel of an organization run by environmental activist Robert F. Kennedy Jr. was among several popular anti-vaccine accounts that were no longer available as of Wednesday morning.

In a statement emailed to The Associated Press, Kennedy criticized the ban: “There is no instance in history when censorship and secrecy have advanced either democracy or public health.”

YouTube declined to provide details about how many accounts were removed in the action.

Under its new policy, YouTube says it will remove misinformation about any vaccine that has been approved by health authorities, such as the World Health Organization, and is currently being administered. False claims that those vaccines are dangerous or cause health problems such as cancer, infertility or autism will also be removed; such theories have been discredited by scientists for decades but have persisted on the internet.

“The concept that vaccines harm rather than help is the foundation of a lot of misinformation,” said Jeanine Guidry, a media and public health professor at Virginia Commonwealth University School of Medicine.


If implemented properly, she said, the new rules could keep bad information from reaching, for example, a new parent using the internet to research whether to vaccinate their child.

But, as is often the case when tech platforms announce stricter rules, YouTube will remain vulnerable to the spread of some anti-vaccine misinformation.

Claims about vaccines that are still being tested will be allowed. Personal stories about reactions to vaccines will also be allowed, as long as they do not come from an account with a history of promoting vaccine misinformation.

Although tech companies have announced a string of new rules around COVID-19 and vaccine misinformation during the pandemic, the lies have still found a large audience on their platforms.

In March, Twitter began labeling content that made misleading claims about COVID-19 vaccines and said it would ban accounts that repeatedly shared such posts. Facebook, which also owns Instagram, had already banned posts claiming COVID-19 vaccines contain tracking microchips, and announced in February that it would also remove claims that vaccines are toxic or can cause health problems such as autism.

Yet popular anti-vaccine influencers remain active on Facebook, Instagram and Twitter, where they use the platforms to sell books or videos. On Facebook and Instagram alone, a handful of anti-vaccine influencers still have 6.4 million followers, according to the social media watchdog group Center for Countering Digital Hate. And COVID-19 vaccine misinformation is so widespread on Facebook that in July President Joe Biden accused influencers on the platform of “killing people” with lies about the COVID-19 vaccine.

Other platforms have taken a harder line. For example, Pinterest blocked any form of vaccine misinformation even before the pandemic began. Now, if users search for content about vaccines on the site, they are directed to official websites run by the Centers for Disease Control and Prevention and the WHO.

Associated Press writers David Klepper in Providence, Rhode Island, and Barbara Ortutay in Oakland, Calif., contributed to this report.
