Social media has the potential to bring friends and families closer together and to help businesses create deeper connections with their customers. Yet it can also sow division and serve as a conduit for misinformation, intentionally amplified by bad actors seeking to create dissension and discord.
At a recent Houston Social Media Breakfast and Impact HUB Houston livestream event, Zoetica Media founder Kami Huyse invited Karen Naumann, APR, PMP to talk about misinformation, disinformation, and malinformation, and how people can avoid spreading them by accident.
Naumann covered a wide range of topics in her presentation, among them the following.
Intent is the differentiator.
One of the important differences between misinformation, disinformation, and malinformation is the intent on the part of the person or entity stating or sharing the information.
Misinformation
— based on false content and includes unintentionally erroneous information or mistakes, such as inaccurate photo captions, dates, statistics, or translations, or even instances where satire is taken seriously.
Disinformation
— based on an intent to harm and includes information that is fabricated or deliberately manipulated, such as audio or visual content, or intentionally created conspiracy theories or rumors.
Malinformation
— based on genuine information shared with an intent to harm and includes the deliberate publishing of private information for personal or corporate rather than public interest, such as nonconsensual pornography (“revenge porn”), or the deliberate change of the context, date, or time of the original content.
Knowing who the bad actors are.
In an effort to minimize the harm caused by disinformation, it’s important to know who is behind its spread. This can range from countries to organizations, and from groups to individuals.
Naumann notes that the countries best known for spreading misinformation are Russia, Iran, North Korea, and China. “Russia has this firehose of falsehoods… and they just try to throw everything out there… and see what sticks, and we [the United States] are their target.” Countries that engage in such behavior hope to sow discord, undermine civil discourse, and fracture democracy as an institution, she points out.
Speed of dissemination makes falsehoods even more dangerous.
On average, a false story reaches 1,500 people six times more quickly than a factual story does, according to Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age. Meanwhile, stories about politics are the most likely to go viral.
That’s why, in today’s always-on world, it’s more important than ever to be cautious about sharing information unless it's from a credible source. Taking a few minutes to verify before sharing can help minimize the spread of falsehoods.
Humans’ psychological proclivity towards confirmation bias plays into the dissemination of falsehoods.
Many people live in online echo chambers, choosing to expose themselves mainly, and in some cases exclusively, to information sources that already support their worldview and political opinions rather than challenge them to reflect.
Artificial intelligence bots capitalize on this by feeding people information that either reinforces their already strongly held views or upsets them to the point where they increase their engagement just to vent or share their outrage with others.
Measuring the impact of disinformation.
While all disinformation is harmful, its negative impact is amplified in part by how widely it is disseminated through influence operations: operations that covertly influence public or political debate and decision-making processes.
According to The Breakout Scale: Measuring the impact of influence operations, an operation’s impact falls into one of six categories based on whether it remains on one platform or in one community, or spreads across multiple platforms and communities. Category one reflects an operation that exists on a single platform and doesn’t spread beyond one community. Category six reflects an operation that exists on multiple platforms, has spread to multiple communities, and perhaps even carries a credible risk of violence, consequently requiring a policy response or some other form of concrete action.
Credit: The Breakout Scale by Ben Nimmo
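The scale’s two endpoints, as summarized above, can be sketched as a toy classifier. Note that only categories one and six are described in this article; the intermediate thresholds below are invented placeholders for illustration, not Ben Nimmo’s actual category definitions.

```python
def breakout_category(platforms: int, communities: int,
                      risk_of_violence: bool = False) -> int:
    """Rough, illustrative category estimate based on an operation's spread.

    Only the endpoints follow the article: category 1 is a single platform
    and a single community; category 6 is multi-platform, multi-community
    spread with a credible risk of violence. Everything in between is a
    placeholder heuristic, not part of the published scale.
    """
    if platforms <= 1 and communities <= 1:
        return 1  # stays on one platform, within one community
    if platforms > 1 and communities > 1 and risk_of_violence:
        return 6  # widest spread plus a credible risk of violence
    # Placeholder for categories 2-5: scale crudely with breadth of spread.
    spread = min(platforms, 3) + min(communities, 3)
    return min(5, max(2, spread - 1))
```

For example, `breakout_category(1, 1)` returns 1, while `breakout_category(4, 5, risk_of_violence=True)` returns 6; anything in between lands in the hypothetical middle bands.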
Watch Naumann’s entire livestream HERE.