While not explicitly stated, the following writing by George Soros summarizes what I believe to be the need for, and the nature of, mental models: “The complexity of the world in which we live exceeds our capacity to comprehend it. Confronted by a reality of extreme complexity, we are obliged to resort to various methods of simplification: generalizations, dichotomies, metaphors, decision rules, and moral precepts, just to mention a few.” (1) Put simply, the complexity of reality exceeds our capacity to comprehend it, so we use mental models to simplify reality. By definition, then, a mental model is something that allows us to simplify reality. To be useful, however, a mental model must not only simplify reality but also map closely to it…but how closely?
A mental model that says the world is flat meets the above criterion of a mental model, but, as we now know, it doesn’t map to reality. Of course, saying the Earth is round also doesn’t perfectly map to reality; it is simply more accurate. Achieving 100% accuracy is impossible, and achieving 99.999% accuracy is, as of yet, only possible in the physical sciences (more on this later). This raises an interesting question: what degree of accuracy is required for a mental model to be useful? In determining what level of accuracy I need, I think through two things: the downside of my mental model being wrong and the upside of my mental model being right. For the merchant who has aspirations only in his small town, both the upside and the downside of believing the world is flat are small. For the scientist who has staked his career on having an opinion on the matter, the upside and downside are much greater, so it makes sense for him to seek a more accurate representation.
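The merchant/scientist comparison can be made concrete with a toy expected-value calculation. This is my own hypothetical framing, not something from the original text: treat acting on a model as a bet that pays `upside` when the model is right (with probability equal to its accuracy) and costs `downside` when it is wrong. The value of refining a model then scales with the stakes, which is one way to see why the merchant and the scientist need different levels of accuracy.

```python
# Hypothetical illustration (my own framing, not the author's): acting on a
# mental model as a bet whose payoff depends on the model's accuracy.

def expected_value(accuracy: float, upside: float, downside: float) -> float:
    """Expected payoff of acting on a model that is right with
    probability `accuracy`, paying `upside` when right and
    costing `downside` when wrong."""
    return accuracy * upside - (1 - accuracy) * downside

def value_of_refinement(old_accuracy: float, new_accuracy: float,
                        upside: float, downside: float) -> float:
    """How much expected payoff improves by refining the model
    from old_accuracy to new_accuracy. Scales with total stakes."""
    return (new_accuracy - old_accuracy) * (upside + downside)

# The merchant: tiny stakes, so refining a flat-Earth model from 50% to
# 99.9% accuracy is worth almost nothing.
merchant_gain = value_of_refinement(0.5, 0.999, upside=1, downside=1)

# The scientist: career-sized stakes, so the same refinement is worth
# a thousand times more.
scientist_gain = value_of_refinement(0.5, 0.999, upside=1000, downside=1000)

print(merchant_gain, scientist_gain)
```

The point of the sketch is only that the gain from extra accuracy is proportional to the stakes, so the "right" level of accuracy is individual, just as the text argues.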
Let me pause and summarize: The world is too complex to fully comprehend. Mental models allow us to simplify reality and, therefore, interact with it. No mental model is 100.000% accurate. The accuracy of a mental model falls on a probability scale from 0.001% to 99.999% (throw in an infinite number of zeros and an infinite number of nines there, respectively). The level of accuracy required for a mental model to be useful depends on the downside and upside of using that mental model. That upside and downside will be unique to every individual, so the level of accuracy needed will be unique to every individual.
If the necessary accuracy of every mental model depends on each individual’s personal upside and downside, is explaining any mental model useful? Yes, I believe it is still helpful, with one caveat: the explanation must point out where the fog remains. Given that mumbo jumbo (I had to read it a few times to make sure I even followed the logic…certainly point out if/where it is wrong), my goal will be to explain my mental models and, to the extent I know my knowledge gaps (you don’t know what you don’t know), point out where my fogginess remains. It is then up to you to determine whether incorporating the mental model as-is is sufficient for your upside/downside potential, or whether it requires further digging.
A couple of caveats: You don’t know how accurate your model is. You don’t perfectly know your potential upside and downside. You don’t know what you don’t know. Protect against this by being intellectually flexible (it’s OK to be wrong – reward yourself when you admit it) and intellectually curious.