Strategies To Combat Disinformation

Education is the most powerful weapon which you can use to change the world.

Nelson Mandela

Disinformation has become a key weapon in dividing societies. As part of our effort to provide resources to #FightBack, here are some strategies, drawn from evidence-based research and testimonials, to fight back and restore harmony in our families, friendships and communities.

Education, communication, community, and trust are key factors in combatting disinformation. These are the tools we will use to fight back, both online and in-person.

Personal Relationships And Community-Based Approaches

The most effective way to combat disinformation is through dialogue with family, friends and neighbours in our communities. Building trust through empathy, education and gentle conversation is essential to encouraging critical thinking, creating doubt and changing minds. This is a slow process that requires multiple interactions, patience and compassion.

Steps:

  1. Identify sources of relatability: Create trust and comfort through familiarity. Search for common ground, both politically and personally, e.g. shared hopes, dreams, fears, struggles, etc.
  2. Do not attack: Antagonistic approaches are unsuccessful; civil and engaging conversations are necessary. You need not agree, accept or defend; try asking questions instead. For example, questions like “Where did you see/hear/read about that?” are engaging and encourage consideration of sources of information.
  3. Ask permission to introduce the possibility of reconciliation with family and friends: Finding and feeling acceptance is extremely important. Reconciliation is necessary for finding harmony with those we disagree with, especially those we have close relationships with. Try to encourage them to think about their lives and relationships before they came to believe the disinformation.
  4. Don’t debate facts and policy: Debating is unsuccessful. Try to engage in gentle, respectful conversations. Showing empathy may help in enabling communications that are more open-minded and less dehumanizing. For example, saying, “I understand some of the reasons why you wanted another Trump presidency. Do you understand some of the reasons why others didn’t want another Trump presidency?” This helps to get past the “us vs them” instinct of categorizing and dehumanizing people with whom we may disagree. This approach is likely unfamiliar to them and may enable you to gently probe their beliefs, create seeds of doubt, and inspire more critical thinking. Connecting, gently communicating and opening doors to community may rebuild relationships that you thought were lost.
  5. Build supportive communities and connections: Once you have strengthened your relationship and increased trust, it’s important to continue to offer support or direction to sources of support. Evidence points to the power of former group members, or “formers,” in helping people leave extremist groups. Formers provide social support and can reflect on the challenges and fears associated with leaving. One organization currently offering support to Americans is Leaving MAGA. People are also more likely to get out of toxic groups if they know someone else who has left. This is why communities, and the support, encouragement, companionship and other social benefits they offer, are so important in building healthy societies.

Education is key for younger generations. Research on preventing radicalization in the first place has focused on inoculative messaging: warning people about extremist messaging before they are exposed to either left- or right-wing propaganda increases their ability to think critically and to dismiss disinformation later.

Fighting Disinformation Online

Social media has become an effective tool for the spread of disinformation. While traditional news outlets can be readily assessed by their ownership and the content they generate, social media platforms have several features that enable disinformation to spread quickly.

Guidelines:

  1. Ignore all sources of disinformation, including posts, comments and users: Engaging with content that you disagree with is not an effective use of your time and energy. Engaging only gives that content higher visibility, spreading the message further and helping the creators of the disinformation. It is often time-consuming, frustrating, emotionally difficult, and completely ineffective. Influencing other social media users is not possible without multiple conversations that build trusting relationships, as outlined above. Increasingly, other social media users are not humans at all; they are machine-generated bots.
    • Online arguments do not benefit anyone except the owner of the platform. The two largest sources of disinformation on social media are X and Facebook. The more time you spend on these platforms and the more you engage with others, even with simple emoji responses, the more you add to their profits. You do not need to click on ads to help these platforms make money; your presence and interactions with others create profits for them through network effects. Do not comment, and do not react with angry emojis.
    • Increasingly, accounts posting and sharing disinformation are bots, cloned profiles or trolls. With Meta’s plans to move towards AI-generated profiles, distinguishing between real people and AI-generated profiles will be very difficult. These fake profiles are generated to sway opinions with inflated “likes”, comments and posts.
  2. Comment, like and share content that combats disinformation: To some degree, this helps increase the reach of the content, enabling the positive messages to be seen by more people. This improves the potential for facts to have influence or become widely accepted. However, reach is highly controlled and limited by the platform, as businesses must pay to have their posts seen by users who have already liked their page or joined their groups.
  3. If possible, try to minimize or stop using platforms that spread disinformation: It is simply not possible to combat disinformation on platforms that do not actively try to stop its spread. Even with fact checking, disinformation spreads more quickly and more widely than facts. Because the owners of these platforms benefit from the engagement generated by outraged users reacting emotionally to posts or comments they disagree with, there is little incentive to stop manufactured outrage. In fact, the owners of X and Facebook are complicit in the spread of disinformation and in the effects that these campaigns have had on democracy.
    • Some users believe that they need to fight against platform owners, or that they can work from within to combat the platform’s agenda. Unfortunately, this is not possible. Platforms function as they are programmed, with ultimate control in the hands of the owner. Algorithmic manipulation, fake profiles, removing visibility from advocacy groups, hiding content (posts, comments, groups, pages, users) that contains certain keywords, enabling wide reach for disinformation while stifling the reach of facts, promoting profile clones, intentionally manipulating users, stealing copyrighted IP, and creating AI-generated users and content based on stolen material and their own user base are just some of the technological devices being used to spread disinformation and manipulate societies. The only way to fight back against social media platforms that intentionally spread disinformation is to stop using them and to discourage others from using them as well.

One of the main reasons people disengage from extremist groups is the same reason many people leave jobs or other organizations: because they dislike their boss. With the chaos, inflation and job losses created by the current administration, we may see a growing shift in perspectives, particularly from people who did not vote for fascism and instead voted against the status quo. In fact, eighty-three percent of Americans disapproved of the decision to pardon Jan. 6 rioters, and a University of Massachusetts Amherst poll taken before the election last fall showed that Project 2025 was incredibly unpopular among Americans. The majority of Americans also disapprove of Musk’s role in government, aggression toward allies, the deportation of those brought to the US as children, and attempts to end USAID. As each day passes, discontent and outrage grow.

If you have family members, friends or neighbours who have been influenced by disinformation, they may be starting to question those beliefs now. Any effort to help them embrace their doubts could be quite impactful.

Critical thinking and recognition of propaganda pose a threat to dictatorships. With human rights under constant attack, community-based educators are our best weapons in our fight for freedom and democracy.


Additional resources can be found in our Human Rights knowledge base.

References:

Zara Abrams, American Psychological Association

Annemarie van de Weert, EU Home Affairs

Ann-Sophie Hemmingsen, Danish Institute for International Studies

University of Massachusetts Amherst

Jared Sharp, University of Massachusetts Amherst

The Journal for Deradicalization

Julio-Cesar Chavez, Andrew Goudsward, Jason Lange and Nathan Layne, Reuters

Additional References:
How to Stand Up to a Dictator: The Fight for Our Future
How We Learn to Be Brave: Decisive Moments in Life and Faith
The Demon-Haunted World: Science as a Candle in the Dark
On Tyranny
Democracy

*Note: This article may be updated as more information and research is published.
**Note: We’ve discussed social media and Facebook to some extent here on BCF (#TechnologyIssues) and although there are growing calls to move away from FB, it’s clear that the platform is wildly popular. It seems reasonable to take advantage of its popularity to advocate for human rights. We still encourage decreased use for purely social interactions and note that Bluesky and Signal are becoming more popular ways to connect with friends, family, public figures and communities. We have no affiliation with either of these two platforms.

