Although false information is often discussed in the context of today's online information age, false information driven by a deceptive agenda has existed for many decades in military warfare.12,13 Therefore, strategies for countering malicious false information in the online environment, whether spread by coordinated humans or by bots, can be informed by the offline environment as well.14,15
Problem
One of the biggest challenges public safety agencies and organizations face is how to reduce or eliminate the spread of false information, especially as public demand for a response from these authorities increases. Social media can distribute news faster and to a wider audience than traditional news sources. However, that also means the potential for misinformation, false information and rumors to spread and go viral is high.16,17
A factor that may impede first responders' ability to mitigate and minimize the spread of misinformation, rumors and false information is the decreasing public trust in government, media and nongovernmental organizations (NGOs). While 2017 was a low point in terms of media credibility, the 2018 Edelman Trust Barometer showed that trust in journalism jumped five points and trust in social media platforms dipped two points. In addition, the credibility of "a person like yourself," often a source of news and information on social media, dipped to an all-time low in the study's history. While this paper is focused on social media, responder agencies should be aware that many people still get their news from television, which serves as an additional resource to counter false information.18
12 Whaley, B. "Toward a General Theory of Deception." The Journal of Strategic Studies, 1982, 5(1), 178-192.
13 Holt, T. The Deceivers: Allied Military Deception in the Second World War. Simon and Schuster, 2010.
14 A computer program that performs automatic repetitive tasks. <https://www.merriam-webster.com/dictionary/bot>.
15 For further reading on this section, see Manheim, Jarol. Strategy in Information and Influence Campaigns: How Policy Advocates, Social Movements, Insurgent Groups, Corporations, Governments and Others Get What They Want. Routledge, 2010.
16 Incorrect or misleading information. <https://www.merriam-webster.com/dictionary/misinformation>.
17 Mukerjee, M. "How Fake News Goes Viral – Here's the Math." Scientific American, July 14, 2017. <https://www.scientificamerican.com/article/how-fake-news-goes-viral-mdash-heres-the-math/>.
18 Pew Research Center. "Pathways to News." July 7, 2016. <http://www.journalism.org/2016/07/07/pathways-to-news/>.
By Catherine Graham, Humanity Road
After the April 2015 earthquake in Nepal, a Facebook post reported that 300 houses in Dhading needed aid. The post was shared over 1,000 times, reaching over 350,000 people within 48 hours. The originator of this message was trying to find help for Ward #4's villagers via social media. Facebook statistics show that the average user has 350 contacts, so this one message, shared 1,000 times, was viewed by approximately 350,000 Facebook users. A week before the viral post, however, this need had already been shared on quakemap.org, a crisis-mapping database built by online volunteers and managed by Kathmandu Living Labs. On May 7, Helping Hands (a humanitarian group) was notified, and by May 11, Ward #4 received much-needed food and shelter. While the late Facebook post was meant to be helpful, the need had already been met. This short example demonstrates that sharing outdated information can waste resources