Fake news is a growing phenomenon, and its impact on people, politics and public health seems greater than ever before. With the advent of social media, misinformation about COVID-19, vaccines or global warming, for example, can reach huge audiences and circulate very quickly.
So, what kind of social networks can lead to learning the most reliable information? It’s a question explored by Vancouver School of Economics Professor Wei Li in a new paper, recently published in Theoretical Economics.
Prof. Li’s research focuses on the interaction of information and incentives in various economic and political environments. She talks about the new paper’s findings in this Q&A.
Your paper argues that an effective social network is like a social quilt. What is the difference between a social quilt and an echo chamber?
A social quilt is a tree with many branches, where each node is a group of individuals who are all connected to one another. Any information observed by one member is then shared with all other members of the group. It is like a close-knit family that meets to discuss a particular topic out in the open. Each node learns correctly, and there are no circles outside the nodes, so information does not travel back and forth. In contrast, an echo chamber is a sequence of circles, where information travels back repeatedly.
Think of people watching a single news source. When you talk separately to two people who are passing on some news, you may not realize their information comes from the same source. You count those transmissions twice, even though they are one distorted piece of information on a loop. And the more circles there are in a network, the more likely you are to hear the same news again. You may come to believe it completely even if it originates from one piece of false information.
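The structural difference between the two networks comes down to whether the connections between groups contain loops. As an illustration only (this is my sketch, not code from the paper), a union-find pass over the links between groups can tell a quilt-like tree, where no information can circle back, from an echo chamber, where at least one link closes a loop:

```python
def has_cycle(edges):
    """Detect a cycle in an undirected graph via union-find.
    A cycle means a piece of news can loop back to someone
    who has already heard it, echo-chamber style."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return True  # this edge closes a loop
        parent[ru] = rv
    return False

# Hypothetical groups A-D. A quilt-like tree of groups: no loops.
quilt = [("A", "B"), ("A", "C"), ("B", "D")]
# Adding one link from D back to A closes a circle.
chamber = quilt + [("D", "A")]

print(has_cycle(quilt))    # False
print(has_cycle(chamber))  # True
```

In the quilt, every group hears each piece of news at most once; adding the single D-to-A link is enough to let the same report circulate indefinitely.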
What makes echo chambers so harmful?
Propaganda and disinformation thrive in these networks regardless of how much truthful information is available. Such entrenched, incorrect beliefs can contribute to political polarization, poverty and disease outbreaks. Individuals cannot correct these errors on their own, because each of us knows only our local network (the people we talk to), while false news may travel back through large circles beyond it.
For example, measles vaccination rates among Somali-American children in Minnesota dipped to 43 per cent in 2013, down from 92 per cent in 2004. The persistent myth that the measles vaccine is linked to autism caused another severe measles outbreak among Somali-American children in 2017. Minnesota's public health department has had difficulty fighting off this misinformation, despite public announcements and awareness campaigns. Governments and public health officials have to come up with different ways to disseminate information in these networks. Our research shows that governments have to change the way they do public outreach and communications to properly meet the times we live in.
What are the public policy implications from this research?
It’s not enough for governments to tell people what is correct or false, backed up by research and data. Nor can they afford to dismiss the power of misinformation early on. Our paper really stresses that governments have to do more than repeat facts to curb misinformation, and their response must be immediate. Policymakers should look to disseminate information by injecting it through key people in a network, or by reshaping the network itself.
In developing countries, there has been success in having the village elder transmit information to the rest of the community, instead of allowing it to travel freely in small circles. This method of targeting central nodes in the network can give governments the opportunity to transmit correct information more effectively. On an individual level, one useful rule of thumb is to discount the same information if it keeps reaching you, because as time goes on, the probability that it comes from the same source and carries no new content becomes higher and higher.
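One way to make that rule of thumb concrete (a minimal sketch of my own, not the paper's formal model, with an assumed decay factor of 0.5) is to let each repetition of the same claim count for progressively less evidence than the one before:

```python
from collections import Counter

def evidence_weight(reports, decay=0.5):
    """Credit the k-th repetition of a claim with only decay**k
    as much weight as the first report, reflecting the growing
    chance that a repeat is an echo of one original source."""
    seen = Counter()    # how many times each claim has arrived
    weight = Counter()  # discounted evidence credited so far
    for claim in reports:
        weight[claim] += decay ** seen[claim]
        seen[claim] += 1
    return dict(weight)

# Five echoes of one rumour vs. two independent confirmations.
print(evidence_weight(["rumour"] * 5))   # {'rumour': 1.9375}
print(evidence_weight(["fact", "fact"])) # {'fact': 1.5}
```

Under this discounting, hearing the same rumour five times never adds up to much more than hearing it twice, whereas naive counting would treat five echoes as five separate confirmations.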