We are fortunate to live in a time when technology has made obtaining information on just about every topic under the sun easy and free. Sadly, the information isn't always accurate; we've all been taught to take Wikipedia entries with a grain of salt, for instance. Because anybody with internet access and a few rudimentary skills can post information online, there is a constant risk of inaccurate content being made available. Incorrect medical information isn't just a nuisance; it can create real problems for people who take what they have read to heart.
There has been a lot of press lately about the anti-vaxxer movement. Despite Facebook CEO Mark Zuckerberg's pronouncement in 2015 that science had validated the effectiveness of vaccinations, people searching for vaccination information on Facebook and other social media platforms are often directed toward anti-vaccination propaganda that has no basis in science. Such content relies instead on scare tactics to make parents question the effectiveness of vaccines in preventing common childhood diseases. The trend is serious enough that the World Health Organization (WHO) named the anti-vaxxer movement one of its top 10 global health threats for 2019.
Some notable online giants are taking steps to clamp down on misinformation that can cause real-world harm. Both Facebook and YouTube are scrutinizing posts in this category more carefully and taking steps to delete anything designed to provoke violence or physical harm. While they don't specifically mention the anti-vaccination movement, it is clear that topic is at the forefront of their concerns. They are right to be proactive: Washington state recently declared a public health emergency due to a measles outbreak that spread as a result of vaccine misinformation on various social media channels. The problem isn't confined to the Pacific Northwest, either; the WHO reports a 30 percent increase in measles cases worldwide.
Policing the flow of information is a good first step for companies like Facebook and YouTube, but more needs to be done. Algorithms must be modified to direct people toward fact-based information rather than propaganda, and more weight should be given to articles that have been reviewed by third-party fact checkers and that provide more context.
As always, the best rule of thumb is not to believe everything you read online. Stick to well-trusted sources and don't fall for propaganda and hype.