Really fascinating discussion on Mind Hacks about cognitive dissonance (actually, it's about reductionism, but the cognitive dissonance is what interests me). This is the idea that we rationalise our choices based on external events.
“Cognitive Dissonance is a term which describes an uncomfortable feeling we experience when our actions and beliefs are contradictory…[and]…we are motivated to reduce dissonance by adjusting our beliefs to be in line with our actions.”
The post lists two experiments:
1. Subjects were asked to perform a boring task, and then paid to lie to another subject (actually the experimenter) to persuade her to do it as well. Those paid less reported that the task was more interesting. (Because “what kind of person would lie to an innocent for only $1? So, the theory goes, they would experience dissonance between their actions and their beliefs and reduce this by adjusting their beliefs: they would come to believe that they actually did enjoy the boring task”)
2. Children or monkeys were offered three different things (eg sweets) they valued equally. They were then offered only two (A and B). Say they chose A. Next, they were offered a choice between C and the one they'd rejected, ie B. They were much more likely to choose C, presumably because they had already rejected B once and this downgraded it in their estimation.
(Mind Hacks then goes on to question both these experiments – could reductionists find simpler explanations?)
It's an interesting parallel with Bayesian probability. Prior knowledge influences present choices, partly for psychological and partly for mathematical reasons.
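The Bayesian parallel can be made concrete with a toy update. The hypotheses and numbers below are invented purely for illustration; the point is only that whatever you already believe (the prior) mathematically shapes how new evidence is weighed:

```python
# Toy Bayesian update: a prior belief is revised by evidence, so earlier
# commitments shape how later observations are interpreted.
# Hypotheses and probabilities here are illustrative, not from any experiment.

def bayes_update(prior, likelihoods):
    """Return posterior probabilities given a prior and a likelihood
    of the observed evidence under each hypothesis."""
    unnormalised = [p * l for p, l in zip(prior, likelihoods)]
    total = sum(unnormalised)
    return [u / total for u in unnormalised]

# Two hypotheses: "the task was interesting" vs "the task was boring".
prior = [0.5, 0.5]
# Evidence: "I told someone it was fun" — assumed more likely if it
# really was interesting (0.8) than if it was boring (0.2).
posterior = bayes_update(prior, [0.8, 0.2])
print(posterior)  # prints [0.8, 0.2] — belief shifts toward "interesting"
```

Having acted (told the lie), the act itself becomes evidence that tilts subsequent belief, which is one way of reading the $1/$20 experiment.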
Given that many types of simulation involve manufacturing and undergoing an event, there is undoubtedly a cognitive dissonance effect on users – that's the whole principle behind training, in one sense. Except if, instead of making you adjust your behaviour to do it better, the process leads you to rationalise your failure in some other way, eg that it wasn't worth doing.
I see this sort of thing professionally sometimes, when we run an emergency response exercise. A few clients, who follow their own response procedures very badly, tend to conclude that it was a badly written unrealistic exercise, and that there is no point in doing these things.
You can also see the effects of major exercises in raising consciousness of an issue – eg after a high-profile avian flu exercise, people are probably more likely to take measures that they might not otherwise have approved. How far is this just straight common sense, and how far is it a subtler form of cognitive dissonance skewing responses too far the other way?
We once inadvertently got an entire chemical storage site closed down, simply by holding an exercise which showed how serious the effects of an incident there might be. (The exercise said nothing about the probability of an incident!)
The other thing is that our longer-term attitudes are altered. This probably explains why some people become addicted to computer games: once you have made a commitment to something, you are motivated to justify it. Another really good example of this is Tanya Luhrmann's book, “Persuasions of the Witch's Craft: ritual magic in modern culture”, which studies a modern witchcraft group and the way in which its members adjust reality to fit their own beliefs.