Is there any evidence about whether a 'trapped prior' could be related to fearfulness?
good question!
In the narrow sense, I don't know of any strict experimental tests of trapped priors and fear (indeed, I don't even know if the idea has been validated beyond being theoretically plausible).
In the wider sense, any domain where what you currently believe stops you exploring your options can generate analogous traps (e.g. you think Republicans aren't worth arguing with, so you never argue with any, so you never learn differently). We wrote about this in this paper:
Eiser, J. R., Fazio, R. H., Stafford, T. and Prescott, T. J. (2003), Connectionist simulation of attitude learning: Asymmetries in the acquisition of positive and negative evaluations. Personality and Social Psychology Bulletin, 29, 1221–1235.
https://drive.google.com/file/d/1bULXLeeUMv3wgq7g47JgLAZYfD_0ailg/view
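To make the asymmetry concrete, here's a minimal sketch (not the paper's connectionist model, just a toy illustration with made-up numbers): an agent that only re-tries options it currently believes are good will correct mistakenly positive beliefs but never mistakenly negative ones.

```python
import random

# Toy illustration of the sampling asymmetry: the agent approaches only
# options it currently believes are good, so false positives get corrected
# by experience while false negatives are never re-tested.

random.seed(1)

N_OPTIONS = 20
TRIALS = 2000

# True payoffs: half the options are good (+1), half are bad (-1).
true_value = [1 if i < N_OPTIONS // 2 else -1 for i in range(N_OPTIONS)]

# Noisy first impressions: 30% chance the initial belief has the wrong sign.
belief = [v if random.random() > 0.3 else -v for v in true_value]

for _ in range(TRIALS):
    i = random.randrange(N_OPTIONS)
    if belief[i] > 0:               # only approach options believed to be good
        belief[i] = true_value[i]   # approaching reveals the true payoff
    # options believed to be bad are avoided, so their belief never updates

wrong_positive = sum(1 for i in range(N_OPTIONS) if true_value[i] < 0 < belief[i])
wrong_negative = sum(1 for i in range(N_OPTIONS) if belief[i] < 0 < true_value[i])
print(f"false positives remaining: {wrong_positive}")   # tends to 0
print(f"false negatives remaining: {wrong_negative}")   # tends to persist
```

The paper's model is a connectionist network rather than this lookup-table agent, but the same approach/avoid asymmetry is what produces the asymmetric acquisition of positive and negative evaluations.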
Any fact check or misinformation essay which fails to highlight that H. Biden’s laptop was his, not disinfo, becomes part of the untrustworthy media. This lie, used to censor the truth, means the 2020 election was unfair. Unfair, rigged, stolen. The loser in a close but unfair election, or game, can honestly claim it was stolen.
The desire to avoid discussing this elite lie is similar to the avoidance of the Obama spy lie as part of the Collusion hoax, with the FBI acting on false info provided by Hillary Clinton.
Some truth can be used to obfuscate the inconvenient truth the elites want to avoid, allowing the elites to rationalize their own lies while complaining about untruths others believe.
There’s always a market for those elite rationalizations.
This might be a controversial view (from a onetime minor name in the disinfo field), but I suspect that fact-checking usually misses the mark. QAnon is an example: because those 'thick' stories are about wider intuitions, the facts don't matter. The same applies to vaccine hysteria and stolen elections. But what troubles me about the continued attachment to fact-checking is how it speaks of controlling perceptions, and how that gives rise to often justified fears of narrative control.
I hear you. There's something in the Full Fact and co manifesto about backfire effects, I believe. Can you give an example of fact checkers speaking of controlling perceptions?
I have a Full Fact mug! I still consult the site for specific material claims.
Currently stumped on a specific example of perception and narrative control. The issue, it seems to me, is more subtle. An example is the now-defunct Disinformation Governance Board in the US being so easily dubbed the 'ministry of truth'. Some of the people involved with it had quite dubious track records, having judged certain Covid-related claims to be 'misinformation' when they eventually proved legitimate.
I'm trying to steer a course between dismissing those concerned with stemming the flow of errant nonsense as 'thought police' and describing them as activists for a particular worldview, such as 'trust institutions or we will have anarchy'.
I'm working on my own piece about this, so my thoughts are still a bit nebulous. Sorry about that.