In a year when the World Health Organization listed “vaccine hesitancy” as one of the top 10 global health threats, researchers at the University of Maryland have investigated a new source that could perpetuate the spread of vaccine misinformation: Facebook advertisements.

Using more than 300 advertisements posted between December 2018 and February 2019 in Facebook’s Ad Library, a tool the company introduced last year that makes an archive of the platform’s advertisements publicly available, the researchers found that several anti-vaccination campaigns had successfully used Facebook’s tools to target their ads to specific populations.

Sandra Crouse Quinn, the chair of this university’s family science department, co-led the research; Amelia Jamison, a research assistant at this university, contributed to the writing of the findings, which were published in the academic journal Vaccine in November.

The researchers found that 54 percent of the advertisements opposing vaccination came from just two organizations: the World Mercury Project and Stop Mandatory Vaccination.

“When you see that kind of consolidation, and you know, major organizations that are anti-vaccine providing these ads, it misrepresents,” Quinn said. “It can be confusing to parents who won’t necessarily be able to discern, you know, ‘are these real people?’”

Meanwhile, pro-vaccination advertisements came from a wider variety of sources, including public health departments. Quinn said that many of the pro-vaccine groups do not have enough financial resources or staff to devote to social media, and they are “at a little bit of a disadvantage” compared to the larger, consolidated anti-vaccination advertisers.

“People are reaching out looking for information about vaccines, and they’re not necessarily knowing how to judge the credibility, the scientific merit of the information they’re seeing,” Quinn said. “They could be making decisions that are not in the best interest of their children or the broader community.”

The research is part of a five-year grant the team received from the National Institute of General Medical Sciences to research the relationship between vaccinations and social media. This is the first time the team has looked at Facebook advertisements; last year, it discovered that Russian trolls had infiltrated the vaccine discussion on Twitter, contributing to both sides.

The research comes amid a global outbreak of measles, a disease preventable by vaccination, that began this year. In response to the outbreak, Facebook instituted new guidelines in March intended to limit the spread of misinformation in vaccination advertisements.

Jamison noted that the team’s research was conducted before this new policy, so it could serve as a “useful baseline to be able to assess the impact of those policy changes.”

Henry Boyd, a clinical professor in the marketing department at the university’s business school, called the spread of misinformation “unforgivable.” He added that since “negative information travels quickly,” it is up to reputable organizations to dominate the advertising platform.

“We may have to give people a history lesson to say, ‘you don’t want to know what the pandemic looked like back in 1918,’” Boyd said. “Why would we relive history when we don’t have to do this anymore?”

Quinn added that, moving forward, it is important for consumers to be aware that misinformation can spread across all social media platforms.

“Really thinking about becoming a more critical consumer while one reads on Reddit or Facebook, I think is something that all of us need to be working toward,” Quinn said.