Speaking of how the model of reality which a technology proposes can end up influencing and changing reality itself, here’s George Dyson:
Their models are no longer models. The search engine is no longer a model of human knowledge, it /is/ human knowledge. What began as a mapping of human meaning now defines human meaning, and has begun to control, rather than simply catalog or index, human thought. No one is at the controls. If enough drivers subscribe to a real-time map, traffic is controlled, with no central model except the traffic itself. The successful social network is no longer a model of the social graph, it is the social graph. This is why it is a winner-take-all game. Governments, with an allegiance to antiquated models and control systems, are being left behind.
Maps make for a good example here. We’re all aware that the Mercator projection is an inaccurate model of reality, one which distorts the true size of countries and is skewed in favour of Europe. Nonetheless, that’s what we use to describe the world; it’s what we think of when we think of a map — it’s the default, almost natural, choice.
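The distortion isn’t vague, either — it follows from a simple formula: Mercator’s point scale factor at latitude φ is sec φ, so areas are inflated by sec² φ. A minimal sketch of what that means in practice (the latitudes below are illustrative round numbers, not precise country centroids):

```python
import math

def mercator_area_scale(lat_deg: float) -> float:
    """Area inflation factor of the Mercator projection at a given latitude.

    The point scale factor is sec(lat), so areas scale by sec(lat) squared.
    """
    return 1.0 / math.cos(math.radians(lat_deg)) ** 2

# Illustrative latitudes, not exact centroids.
for place, lat in [("Equator", 0), ("Central Europe", 50), ("Greenland", 70)]:
    print(f"{place} ({lat}°N): drawn ~{mercator_area_scale(lat):.1f}× its true area")
```

At 70°N the map inflates area by a factor of roughly 8.5 — which is why Greenland looks comparable to Africa despite being about a fourteenth of its size.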
The risk is that a model becomes so ingrained that we forget the other options we had — or forget that what we’re using is, in fact, just a model.