Not sure if Scott Adams manages to convey the notion of cognitive dissonance in his post. I think I get what he’s trying to say: training in economics (among many other disciplines) protects you from experiencing cognitive dissonance.
In other words, economists have no problem keeping two or more completely contradictory ideas in their head at the same time. Supposedly.
But I don’t think the phrase “cognitive dissonance” actually encompasses the disconnect that the panelists on “Politically Correct” experienced.
In other words, I think the phenomenon he is describing is a concept that needs a name.
Sure, there is the concept of binary thinking: if you’re not with us, you must be against us. To the binary thinker, failing to believe the orthodox thinking about global warming/climate change must mean you don’t believe that humans are increasing the CO2 concentration in the atmosphere. I think psychologists/psychiatrists use the term catastrophizing in a similar context. Binary thinkers are apt to catastrophize. Clearly, this is not the ordinary connotation of catastrophe. It is rather more akin to the mathematical connotation of catastrophe: a point in the domain where the function is undefined, which nearby values diverge from or converge toward but never actually reach. A synonym would be singularity.
But you can see how catastrophizing a situation would result in a dichotomy. There is no way to stay in the middle. You are either on one side or the other of the mathematical catastrophe.
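A toy illustration of that idea (my example, not from the original post): take f(x) = 1/x, which has a singularity at x = 0. Approach from the right and the values blow up positively; approach from the left and they blow up negatively. There is literally no value at the middle point, and the two sides never meet.

```python
# A mathematical "catastrophe"/singularity: f(x) = 1/x is undefined
# at x = 0. Values on either side of the singularity diverge in
# opposite directions -- there is no way to stay in the middle.

def f(x):
    return 1 / x

# Approaching the singularity from the right: values grow without bound.
right = [f(10 ** -n) for n in range(1, 5)]

# Approaching from the left: values plunge without bound.
left = [f(-(10 ** -n)) for n in range(1, 5)]

print(right)
print(left)
```

You end up on one side or the other; the dichotomy falls out of the mathematics.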
I don’t think it necessarily has anything to do with economics per se. I think it has much more to do with the scientific method. Or at least with the ideal form of the scientific method.
Because even scientists are human beings, they tend to catastrophize too, and start believing that the evidence either supports their pet theory or it doesn’t. Sometimes they manage to remember there is a third category: evidence that has no bearing on the theory whatsoever. But even geniuses blink.
But in an ideal world, a practitioner of the scientific method would be able to accept conflicting models simultaneously, up until the point where one model completely fails to describe reality.
You can see how it should work in how we teach and deal with mechanics. Even though Einstein’s relativity superseded Newton’s theory of gravity, unless you’re an actual physicist, most of us pretty much get by with deploying Newtonian concepts (although frequently with the caveat of ignoring friction). No one really reaches for the General Theory of Relativity when dealing with everyday speeds. Newtonian mechanics gets it right to as many decimal places as humanly matters, so why bother? But when you start having to deal with particle accelerators, or light streaming in from several billion light-years away, you have to use relativity. There is no cognitive dissonance here. The people to whom it matters can comfortably shift frames from one model of the universe to the other, never mind the fact that the two are not, in the utmost details, compatible at all.
I think an even more dramatic illustration of warding off cognitive dissonance is the ability of physicists to accept both General Relativity and Quantum Mechanics simultaneously. There is (as yet) no actual way to get the two theories to agree. Sure, there’s String Theory/M-Theory, but really, M-Theory is more a description of what an actual theory would be. A meta-theory, if you will. In the way that Deep Thought was but the designer of the Earth, M-Theory is but a blueprint for what the actual Theory of Everything will look like.
But in theory every scientist has to think this way. On one hand, you have to understand the current orthodox understanding of things, for example Alan Guth’s inflationary scenario. On the other hand, you have to also believe in the experimental theory that you are pursuing, for example the cyclic brane-collision theory. So far there is no piece of data that proves one right and the other wrong, and both theories apparently make the same testable predictions thus far.
I suppose cognitive dissonance comes down hardest when your brain has decided to espouse a philosophy and reality chooses to try to negate that philosophy. Like, for example, a Republican who finds out that not all poor people are lazy and stupid. Or the fundamentalist Christian who finds out that their best friend in the whole wide world is gay. Cognitive dissonance is known to drive people insane.
But I may be wrong about the whole connotation thing about cognitive dissonance. I’ve been known to create private meanings of certain words and phrases. What a word or phrase means frequently may not be what I think it means.
For example, I’ve always thought of social engineering as the act of becoming “civilized,” that is, of growing accustomed to the customs and mores of the society you are in. You become engineered to act a certain way in certain contexts.
So I’ve always thought of social engineering scams as scams that take advantage of these reflexes.
Like how we’ve all been socialized to obey authority, for example. So when some assertive dude with one of those weird cop mustaches pulls out a little flashy thing and claims it’s their badge, we may end up dropping our pants like they tell us to, even though we haven’t actually positively ID’ed them as law enforcement.
But apparently I’ve got the wrong end of the stick. It’s the scam itself that is labeled social engineering. Still, it feels weird to call the perps who pull these things off social engineers. It doesn’t seem right for some reason.
Ah well. The science of linguistics is about being descriptive, not prescriptive anyway, and even if the usage is wrong, eventually it’ll be right.