
Misinformation in the World Today

Faculty Spotlight: Professor Emeritus Thomas J. Froehlich, Ph.D., Warns About Misinformation

Thomas J. Froehlich, Ph.D., is Professor Emeritus in the School of Information at Kent State University. He has published extensively about ethical considerations in the information professions. Dr. Froehlich also teaches and publishes extensively on misinformation in the public sphere, including a chapter in a recent collection. He spoke with us about disinformation in the twenty-first century. This is part one of a two-part interview.

How would you define "disinformation"?

Disinformation is lies or false information supplied with the deliberate intent to mislead, most often in a political context. In theory, this should be easy to identify; in practice, it is not always so clear.

What motivates someone to engage in disinformation?

There is no single motive for the creation, acceptance or spread of disinformation. At the top of the disinformation chain, the motive is to retain political and economic advantages. At the middle or lower levels, it is to nurture grievances, whether founded or not, to maintain privilege, or to engage in negative polarization (where voters side with one candidate not out of faith in him or her but out of fury with the other side).

There are predispositions to accept disinformation: cognitive bias, gullibility, willful ignorance, self-deception, avoiding discordant information, etc. These are heavily influenced by cognitive authorities, such as news sources, peers, religious leaders, social media channels and political associations.  

Is there a meaningful difference between willful "disinformation" and perhaps inadvertent "misinformation," where wrong information is being given out without ulterior motivation?

There is not a simple answer to this question. For example, a public figure spreads misinformation about the coronavirus, and it is then repeated on certain news channels. On one level this may be seen as mere misinformation, but it is disinformation disguised as misinformation, because the ultimate objective is control and manipulation.

Now, if I hear something on social media, misinterpret it, and tell a friend, such an action might be done without ulterior motivation. This is inadvertent misinformation. But these days, even such "innocent" acts may reflect cognitive bias: I misunderstand the message in a way that proves my point or the "rightness" of my bias.


What is the typical transmission route of disinformation? Does it matter if the participants are willful or neutral transmitters?

There are some very willful transmitters at the top who make money and retain power from it, but we also have a disinformation ecology consisting of messages and messengers. Like-minded (and like-propagating) cognitive authorities, news channels, social media, and the like all reinforce the same messages, making it difficult to question the reliability of a given message, and they control which communication channels are the "right" ones to pay attention to.

Any agency in the chain can start a disinforming message and have it echoed back to most, if not all, elements in the chain. Repetition is a major factor in cementing the disinformation: how could anyone reject some bit of misinformation coming from so many different sources saying the same thing? To question any message is to question the whole edifice and one's stake in this ecology. That is why it is hard to change anyone caught in what is called a propaganda feedback loop or filter bubble. The filter bubble is self-propagating and self-reinforcing, explicitly refuting any challenging sources.

How has social media transformed our disinformation consumption? 

Social media has considerably aggravated both disinformation and misinformation. Studies have shown that false information spreads more quickly and broadly than true information on the internet, and that false information is virtually impossible to retract: it stays in people's minds whether it is retracted or not. Prior to the internet and social media, information, misinformation and disinformation ran through clearance circles or reliability checks (e.g., newspapers and news broadcasts checking their sources, especially for reliability and trustworthiness).

Now a single person with any theory, grudge or false belief can broadcast it on social media and attract millions of followers, with no filters to check the assertions. Not only that, but one or a few voices can be amplified into what seems like millions by bots and software algorithms. Our constraints on lies, false information and attacks on genuine expertise have been replaced by self-righteous, unjustified opinions.
 

Part Two of this interview can be read here.

UPDATED: Friday, December 09, 2022 11:40 AM