For the past three years, Amy Haney has administered a private Facebook group for parents in Georgia who seek more information about vaccines.
Haney, a metro Atlanta resident and mother of four, founded the group in February 2016 after a policy change required one of her children to turn in a standardized shot record from the health department in order to enroll in school. Previously, only a parent-written affidavit was needed.
She says her group was created as a forum to provide information to parents from reputable sources, such as the CDC and published research articles. The goal is to offer resources to help parents with decision-making. “Some of the information that is out there is harder to find, and we make that a little easier by sharing it with each other,” said Haney.
The state of Georgia requires children to be vaccinated for diseases such as diphtheria, polio, hepatitis and measles. (There are religious and medical exemptions.) But health officials worry that misinformation about the safety of vaccines has circulated online and interfered with progress against infectious diseases.
The effort in some quarters to discourage people from giving measles and mumps vaccines to their children is a highly contentious issue. Vaccination rates have become a widely discussed topic as the number of measles cases has soared — rising from 86 reported cases in 2016 to more than 550 reported cases so far this year, including three in Georgia.
Haney’s group, which grew mostly through word-of-mouth, has nearly 900 members with varying beliefs. They do not identify as “anti-vaccine” but as “pro-consent,” aiming to inform parents of all aspects of childhood vaccination.
“Parents should be fully informed to what they’re agreeing to do,” said Haney.
But social media companies such as Facebook, Pinterest and YouTube are beginning to crack down on content about vaccines on their platforms. In March, Facebook became the latest social media giant to announce plans to make it harder for people to find information from anti-vaccine communities.
More platforms are adopting such measures, and public health experts view the removal of false content as a positive step. But others have mixed feelings about limiting or silencing posts by people opposed to vaccines.
“When they start taking out certain ideas, you wonder how long it is before these corporations control what you do,” said Jared Schroeder, a professor at Southern Methodist University who focuses on freedom of speech online.
A corporate trend
In an announcement titled “Combatting Vaccine Misinformation” posted March 7, Facebook said that it would take a series of steps to reduce the ranking of certain content and stop recommending content that does not present accurate information about vaccines. A lower ranking makes content appear farther down a user’s feed and less visible in general.
Drawing on vaccine hoaxes publicly identified by the World Health Organization and the CDC, Facebook said it would lower the ranking of pages and groups that the company deems to be sharing vaccine misinformation. Additionally, ads that contain similar messaging about vaccines will be removed or disabled. These measures extend across both the Facebook and Instagram platforms.
Unlike the government, social media companies have the ability to restrict online content and speech because they are private entities, Schroeder said.
“It’s like we hold our discourses in Walmarts or Targets now. And of course, Target is happy to have us,” Schroeder explained. “But as soon as we start causing a disturbance, they’re going to ask us to leave.” That’s what several social media and tech companies have started doing.
Pinterest implemented a change in late 2018 blocking all searches for vaccine-related content, whether the information is accurate or inaccurate, according to a story published in The Wall Street Journal.
YouTube said in February that it will remove advertising from videos and channels that it views as spreading inaccuracies about vaccines. In March, Amazon removed two books from its online marketplace that promote unvalidated autism cures and make false claims about vaccines. The companies did not respond to requests for comment.
The crowd-funding site GoFundMe, which announced in March that it will ban users who disseminate false assertions about vaccines, confirmed the banning policy in an email statement.
Public health vs. freedom of speech
The Georgia Department of Public Health declined to comment on Facebook’s policy change, but said in an email response that vaccination is a major public health achievement that has eliminated or controlled the spread of diseases.
It’s a long tradition in America that unconventional ideas must be tolerated, but the major exception is misinformation that endangers people. That’s why food containers have to be properly labeled, and selling fake medicines is illegal. And the scientific consensus in favor of vaccination is solid. Experts note that vaccines have saved countless lives around the world.
Karen M. Hilyard, a public health communication strategist in Atlanta, sees the policy changes by social media as a positive step. “When it comes to vaccinations and misinformation, we’re talking about actual threats to people’s lives, and so the misinformation is very concerning,” she said.
The actual impact of the inaccurate information may not be that great, however. Hilyard, a former professor of health promotion and behavior at the University of Georgia’s College of Public Health, said researchers are not sure how much effect social media misinformation has on people when they’re deciding about vaccinations.
So far, she said, the only thing found to correlate directly with vaccination rates is government policy in the state where a person lives. The harder a state makes it for someone to opt out of vaccinations, the more likely it is that people will adhere to vaccination requirements, she said.
Could the new policies spell trouble for online communities?
Schroeder is not alone in his concern. Alarm about the power of information-related companies to shape public opinion, whether their intentions are good or bad, is strong in many countries. The European Parliament has passed policy changes affecting such giants as Facebook, Google and YouTube. But of course, government intervention raises its own issues.
Haney says she understands that Facebook is a corporation and has the ability to restrict content on its platform. The good news for her is that her group does not promote or advertise its content, so the change in Facebook policy will not affect how the group operates.
Members hold a wide range of beliefs, and are respectful of the varying terms used to describe an individual’s views on vaccination, she explained. For instance, “vaccine hesitant” is used for people who are undecided about vaccines, and “ex-vaxxer” is the preferred term for those commonly referred to as anti-vaccine, according to Haney.
As Schroeder noted, Facebook and other media companies don’t violate the First Amendment by restricting content. In the United States at least, Facebook can censor all it wants because it is a commercial enterprise, not a government entity. But he pointed out that silencing certain speech, such as vaccine misinformation, could lead to further kinds of censorship.
Haney declined to discuss her children’s vaccine status.
It’s unknown whether social media companies will further restrict or censor certain kinds of speech. Individuals like Haney, who appreciate the social media sites as forums to discuss information and opinions, believe there will always be a way to continue such discussions. It just might have to happen on other platforms.
“I would imagine if you’re going to censor speech on a grander scale, then there will be other areas that people will turn to,” said Haney. “I don’t necessarily know what that will look like, but if you look around history, people would meet in homes or they would have gatherings or they would find other ways.”