Facebook Will Reportedly Limit Anti-Vax Information From Spreading By Enacting These Subtle Yet Powerful Changes


Following the recent measles outbreaks and the rise in the number of unvaccinated children in the United States, Facebook will reportedly limit the spread of anti-vax information by making it "less prominent" on the site, as CNN reported on Monday. Although the reported changes might seem small at first glance, they have the potential to make a big impact in the Facebook community and beyond.

Facebook is generally a great place for people to share important information, but it's also a place where misinformation can be easily spread. One way misinformation is disseminated is through closed Facebook groups because only pre-approved members can join, which means people on the other side of the debate aren't able to share their views. Without challengers, these closed groups allow mistruths to be stated as fact.

Unfortunately, anti-vax groups have a big presence on Facebook. Stop Mandatory Vaccination, for instance, has a whopping 156,000 members as of Tuesday.

Dr. Wendy Sue Swanson, spokeswoman for the American Academy of Pediatrics, spoke to The Guardian about why it's important for Facebook to minimize the impact and spread of these anti-vax groups. "Facebook should prioritize dealing with the threat to human health when falsehoods and misinformation are shared. This isn't just self-harm, it's community harm," she told The Guardian.

She added, "Parents deserve the truth. If they are being served up something that is not true it will likely increase their levels of anxiety and fear and potentially change their uptake of vaccines, which is dangerous."

After facing public pressure to act, Facebook will reportedly start rolling out changes in the near future. A Facebook representative told CNN that the company is currently "working with health experts to decide what changes to make and considering a combination of approaches to handle vaccine misinformation."

One of the reported ideas is to have anti-vax posts appear farther down on a person's news feed, while another proposal would prevent anti-vax groups from appearing in the "list of groups that Facebook recommends users join," the rep told CNN.

Additionally, Facebook reportedly plans to put "results with vaccine misinformation farther down when people search for certain terms." This reported change is a big deal because articles with anti-vax information are often the first to pop up when you type "vaccines" into Facebook's search bar, according to The Guardian.

Facebook did not immediately respond to Romper’s request for comment regarding these reported changes.

Another area where Facebook could improve in the fight against anti-vax groups is through advertising. A recent investigation conducted by The Guardian found that Facebook accepted advertising revenue from "Vax Truther, Anti-Vaxxer, Vaccines Revealed and Michigan for Vaccine Choice."

On Thursday, Feb. 14, California Rep. Adam Schiff, a Democrat and chair of the House intelligence committee, cited these advertising concerns in letters to Facebook CEO Mark Zuckerberg and Google CEO Sundar Pichai, according to The Hill.

"The algorithms which power these services are not designed to distinguish quality information from misinformation or misleading information, and the consequences of that are particularly troubling for public health issues," he wrote, according to The Hill. "I am concerned by the report that Facebook accepts paid advertising that contains deliberate misinformation about vaccines."

In light of these concerns, Facebook is "considering making changes in its advertising policy," the company's representative reportedly told CNN.

Considering the health and safety of children are at stake here, it's important that Facebook and other social media platforms take steps to stop the spread of misinformation.