One News Page

Facebook's Fake News Fight May Have Helped Vaccine Misinfo

Video Credit: Newsy Studio - Duration: 02:33 - Published

Reports show that some social media algorithms designed to fight fake news are pointing people to vaccine misinformation ahead of factual information.



Social media giants have struggled lately to stop misinformation campaigns on their platforms, and now they're coming under fire again for their possible role in outbreaks of measles around the world.

According to a recent investigation, Facebook's and YouTube's recommendation tools routinely promote unscientific vaccine misinformation instead of verified, factual sources.

This has drawn the attention of lawmakers who want to know how these companies plan to solve the problem. The big social media companies do take extra steps to knock down misinformation that can cause real-world harm, but those policies are meant to combat politically charged or viral misinformation. Facebook recently told the Washington Post that anti-vaccination content doesn't fit into this frame.

In early February, YouTube echoed Facebook, saying vaccination videos weren't a primary target of its policies.

Now, unregulated misinformation is easy to find.

A Guardian report found that when users with no friends or likes entered neutral terms like "vaccine" into Facebook's or YouTube's search engines, the top results skewed toward anti-vaccination information.

Media researchers say Facebook seems to be responsible for most of the misinformation, in part because of the steps it took to fight fake news after the 2016 U.S. presidential election. Facebook said it would refocus its Groups feature to direct people to join more groups tailored to their interests, and to make content from those groups appear on their timelines more often.

In just two years, Facebook group membership  jumped 40%  to 1.4 billion monthly users.

Now, experts say these groups are harder to regulate or monitor than before — and because Facebook's business model is built on getting people to join more groups, misinformation can spread to hundreds of thousands of people at a time.

According to a Credibility Coalition and Health Feedback study, Facebook accounted for 96% of shares of the 100 most-engaged health stories on social media, and less than half of those 100 stories were considered "highly credible."

U.S. Rep. Adam Schiff has sent letters to Facebook and Google to ask how they currently address this problem, and what additional steps they plan to take.

Specifically, Schiff asked how these companies planned to "distinguish quality information from misinformation or misleading information" in their algorithms. So far, the companies have responded with basic steps. YouTube said it would stop recommending what it calls "borderline content" in its autoplay algorithm.

Facebook said it would explore "additional measures to best combat the problem," like reducing the amount of anti-vaccine content that shows up in its searches.


© 2019 One News Page Ltd. All Rights Reserved.