TikTok uses interest in politics to target anti-vaccine content

Ms April Thompson, a second year Doctor of Medicine and Surgery student, has been chosen to present her research and represent the Australian National University (ANU) at the upcoming Medical Deans Australia and New Zealand - Research Educator Network event.

Ms Thompson’s research uncovered that the TikTok algorithm – which picks up on a person’s interests and shows content it believes the user is interested in – can expose users to anti-vaccine content, and identified the themes present in that content.

“We found that the algorithm sees politics as one of the pathways to anti-vaccine content. We know from research on the Human Papillomavirus (HPV) vaccine that negative social media posts can influence a person’s risk–benefit analysis, so prevalent themes in anti-vaccine videos, such as safety and Adverse Events Following Immunisation (AEFI), may have implications for public health.”

“Research done on other social platforms reveals that anti-vaccine content online casts doubt over the safety of vaccines. Considering this, it is possible that routinely being exposed to videos about catastrophic side effects is further skewing how users perceive the COVID vaccine,” Ms Thompson explained.

Previous research into COVID vaccine content on TikTok sampled popular videos under commonly used COVID hashtags, but found little misinformation.

Ms Thompson’s research sampled content on TikTok chosen by the algorithm and shown on a user’s ‘For You’ page – something not done before. ‘Dummy’ accounts were created to engage with either health or political content until the account was successfully coded by the TikTok algorithm.

“The dummy accounts that were successfully coded as health-interested users saw no anti-vaccine content; however, four of the five accounts coded as political users did see anti-vaccine content,” Ms Thompson said.

“When we sampled the explicitly anti-vaccine videos, we found that over half discussed the safety of the vaccine, and 39% mentioned an adverse vaccine reaction. Almost all the reactions mentioned were catastrophic, rare or not recognised side effects of the COVID vaccine. For example, 29% of the reactions mentioned were death.”

“After we found an algorithmic link between politics and anti-vaccine content, we tried to investigate whether there was a difference between conservative and progressive political users. We found that the political content associated with anti-vaccine beliefs wasn’t clear cut. Themes that couldn’t be coded as conservative or progressive were common, such as anti-governance, anti-WHO or anti-globalism,” Ms Thompson explained.

Given these insights, when asked about social media regulation, Ms Thompson said, “Government regulation was beyond the scope of this study; however, the findings provide insights into how users may adapt or react to such measures.”

“For example, previously conducted research found that only 11% of the anti-vaccine content on TikTok used popular hashtags, such as Coronavirus, Vaccine, Covid-19 and WearAMask. This might be a way for anti-vaccine content creators to avoid suppression, and it is possible that they will adapt around further regulations.”

“In addition, TikTok and Twitter both flagged videos relevant to the pandemic with ‘educational banners’ so users knew what was considered public health information. Previous research has found that many relevant videos aren’t automatically tagged with these banners, and that the banners can make sceptical users even more sceptical. This suggests that government regulations may be ineffective at reducing anti-vaccine beliefs.”

Ms Thompson’s presentation was selected through an ANU medical school event where she competed against fellow second-year students, presenting her findings in three minutes and answering one question (a format known as the 3+1Q). The format was inspired by the Three Minute Thesis competition established by the University of Queensland.