Facebook Conducts Study Addressing The Tricky Problem Of Vaccine Mistrust

By now, most of us can identify that one relative who often gets drawn in by conspiracy theories. And over the past year, you've likely either blocked them on social media or been treated to a barrage of bizarre claims about the COVID-19 pandemic and the vaccines developed in recent months to combat it.

Yet it's easy to forget just how many of those relatives exist and how powerful they can be when they come together. As an article in the British Medical Journal noted, social media campaigns spreading vaccine misinformation have been linked to increased vaccine hesitancy within offline communities and lower vaccination rates.

Such research, along with broader concerns about vaccine hesitancy's effect on efforts to curb the pandemic, has previously compelled platforms like Facebook to crack down on anti-vaccine posts and groups.

However, an ongoing study within Facebook is already revealing that combating vaccine hesitancy on the platform won't be as simple as that.

When a Facebook post has the potential to make people hesitant about receiving vaccines, the company's algorithms mark it as "VH."

And as internal documents obtained by The Washington Post revealed, Facebook is in the process of a widespread study of these posts that intends to determine what communities are distrustful of vaccines and how their ideas continue to spread on the platform.

To do this, Facebook divided all of its American users and groups into 638 population segments. Although the documents don't explicitly define these segments or explain how they were grouped, they suggest that each one could encompass at least 3 million people.

Although Facebook's study is still in its early stages, it has already discovered that 50% of the platform's vaccine hesitancy content comes from just 10 of those 638 segments.

And according to The Washington Post, when researchers dug deeper into the segment with the most prevalent mistrust of vaccines, about half of its potentially problematic content was produced by just 111 people.

Considering that NBC News reported that about 30% of Americans have stated they do not intend to get the vaccine, it seems imperative to learn how wide of a net that minority can cast.

Although Facebook has identified that a small number of influencers have acted as "superspreaders" of vaccine misinformation, they've also noticed a significant overlap between groups with high levels of vaccine mistrust and those linked to QAnon.

Although Facebook has been active in banning accounts associated with the conspiracy group, those who remain are using general distrust in the U.S. government and elite institutions as an engine for vaccine hesitancy.

As the document reportedly states, "It’s possible QAnon is causally connected to vaccine hesitancy."

However, QAnon's brand of vaccine hesitancy posting doesn't encompass all of the related content on Facebook, and it's difficult to determine what can be done about the rest of it.

According to The Washington Post, the problem is that a great deal of arguably vaccine hesitant content doesn't actually violate any of the platform's rules and is too nuanced to be considered 100% harmful.

For instance, if someone discusses how the side effects of the vaccine they received were more severe than anticipated, that could be of value to both others seeking a full picture of what to expect and health authorities who weren't aware of their experiences.

However, this post could also have the potential effect of discouraging people from receiving the vaccine at all, making it harmful without necessarily violating any rules or arguing in bad faith.

This presents a problem for Facebook as it's more difficult to balance the encouragement of free speech vs. the reduction of harm when there isn't a clear bad actor to ban.

As Facebook's head of health Kang-Xing Jin wrote, "Vaccine conversations are nuanced, so content can’t always be clearly divided into helpful and harmful. It’s hard to draw the line on posts that contain people’s personal experiences with vaccines."

As such, addressing this content will require a similarly nuanced solution. Since the study isn't complete yet, Facebook hasn't necessarily landed on one. Still, some possibilities include adjusting its policies on problematic content or directing information from the 60 health experts it partners with toward groups showing high vaccine hesitancy.

In any case, one can expect the results of this study to heavily inform how the company tackles the issue from here on out.

h/t: The Washington Post