Models Are Not Reality — And That’s the Problem

Models are dangerous.

Not because they are useless. Quite the opposite. They can be extraordinarily useful. They allow us to test assumptions, explore scenarios, simulate complex systems that would otherwise be impossible to grasp. In engineering, finance, logistics — models are indispensable tools.

But they are also a potent drug.

Anyone who deploys a model in public discourse should first be required to sit through an hour of disclaimers. A loud, repetitive reminder that no matter how sophisticated the output looks, no matter how many decimal places it carries, no matter how elegant the visualization — it is not data.

It is not observation.

It is not measurement.

It is an approximation of reality. A structured guess. A formalized opinion. An imaginative reconstruction of how the world might behave under a set of assumptions that we chose.

And yet we do not treat models that way.

Society at large has drifted into accepting model outputs as if they were empirical facts. A projection becomes a datapoint. A scenario becomes an outcome. A curve on a screen acquires the authority of a thermometer reading.

That shift is subtle but profound.

No matter how refined a model becomes, it remains a simplification. It is built on assumptions. It requires parameters. It depends on boundary conditions. It is calibrated against selected data. It necessarily excludes variables deemed irrelevant or inconvenient. It translates messy reality into a system that can be computed.

That act alone introduces interpretation.

And interpretation is not reality.

Even a model that tracks the real world with impressive accuracy is still a construct. A figment of structured imagination. It can approximate. It can illuminate. But it cannot become the thing it seeks to represent.

And like all human constructs, it can fail.

If failure were the worst-case scenario, we could live with that. Models fail all the time. Assumptions prove wrong. Parameters shift. Systems behave in unexpected ways. When that happens, at least reality reasserts itself. The mismatch becomes visible.

Failure is honest.

The real danger lies elsewhere.

Because models are human-made constructs, they are extraordinarily flexible. Adjust a parameter here. Weight a variable differently there. Select a different baseline. Extend or shorten a time horizon. Choose optimistic inputs. Choose pessimistic ones. Exclude inconvenient dynamics. Emphasize preferred mechanisms.

With enough technical complexity layered on top, the output still looks authoritative.

And once model outputs are socially accepted as “data,” the temptation becomes obvious. If the model is treated as reality, then shaping the model becomes a way of shaping perceived reality. You do not have to falsify measurements. You simply refine the assumptions until the curve aligns with the narrative.
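The flexibility described above can be made concrete with a toy sketch. Below, the same five observations are projected forward under two different assumptions — constant absolute growth versus constant relative growth. Both "models" fit the past almost equally well; over a longer horizon they diverge wildly. All numbers and function names here are hypothetical, invented purely to illustrate the point.

```python
# Toy illustration: one dataset, two assumption sets, two very different futures.
# The observations are made up; nothing here models any real system.
observations = [(0, 10.0), (1, 11.0), (2, 12.1), (3, 13.3), (4, 14.6)]

def linear_projection(data, horizon):
    # Assumption: growth is constant in absolute terms.
    # Estimate the average step between consecutive observations.
    steps = [y2 - y1 for (_, y1), (_, y2) in zip(data, data[1:])]
    slope = sum(steps) / len(steps)
    _, last_y = data[-1]
    return last_y + slope * horizon

def exponential_projection(data, horizon):
    # Assumption: growth is constant in relative terms.
    # Estimate the average ratio between consecutive observations.
    ratios = [y2 / y1 for (_, y1), (_, y2) in zip(data, data[1:])]
    rate = sum(ratios) / len(ratios)
    _, last_y = data[-1]
    return last_y * rate ** horizon

# One step ahead, the two projections are nearly indistinguishable.
# Thirty steps ahead, the exponential curve dwarfs the linear one —
# same data, different assumption, radically different "result".
print(linear_projection(observations, 1), exponential_projection(observations, 1))
print(linear_projection(observations, 30), exponential_projection(observations, 30))
```

Neither output is a measurement. The choice between them — and hence the headline number — is made by whoever picks the assumption.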

From there, the incentives line up neatly.

Enormous funding streams. Prestigious careers. Institutional status. Policy influence. Media visibility. All reinforced by outputs that appear scientific, quantified, rigorous.

And the tool that generates those outputs sits in human hands.

What could possibly go wrong?

https://wattsupwiththat.com/2026/02/16/models-gone-wild-the-ionosphere-triggers-earthquakes/