This is kind of meta. I am just wondering if anybody else here shares my definitions for these terms, and finds the prior discussion regarding CIQ simulation a bit off-putting. I wonder if this is more of a culture/dialect issue between different regions, generations, or subfields, or...?
To me, the words "simulator" and "emulator" do not mean distinct levels of accuracy. They mean different purposes. Accuracy is a separate axis of choices for either one.
A "simulator" models or predicts the function of a system or subsystem. The simulator's purpose is to forecast. It predicts output responses and other quantifiable characteristics in response to inputs and other simulation parameters. These can be extremely accurate, e.g. down to simulating charges and RF characteristics of nanoscale semiconductors. Or they can be abstract simulations of digital processes, e.g. how a computer will respond to certain network traffic.
An "emulator" also replicates input/output for a system or subsystem. But, conversely, the emulator's purpose is to reproduce/replay. You don't use an emulator to predict the behavior of a new design. It is constructed to reproduce desired/known behaviors, usually to embed in a larger system that is being tested.
There is a subcategory of "in-circuit emulators" meant to replicate electrical characteristics well enough to be embedded in a test circuit. There is also a category of "real-time simulators" meant to interact with partner systems: for example, a flight simulator can predict how an aircraft will behave under the live control of a real human pilot who feels like they're flying it.
Where it gets confusing is that these things can also be combined in practice. It becomes a judgement call, in the eye of the beholder, whether you are simulating or emulating. I'd say a basic flight simulator, e.g. at a training or recreational center, might emulate an airspeed sensor connected to physical cockpit instruments: it sends signals based on simulated aircraft dynamics and a textbook definition of how the airspeed should indicate in that scenario. A more first-principles flight simulator, e.g. at a design center, might model the aerodynamic flow around an actual pitot-tube design to predict the airspeed sensor's response to extreme conditions and maneuvers. Such simulation results might then be used to build a more accurate emulator.
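To make the "basic flight simulator" case a bit more concrete, here is a toy Python sketch of what I mean by emulating an airspeed sensor from simulated dynamics. It only uses the textbook relation between dynamic pressure and equivalent airspeed; the gauge voltage scaling is made up, and a real trainer would of course be driving actual instrument hardware.

    import math

    RHO0 = 1.225  # sea-level air density, kg/m^3 (standard atmosphere)

    def emulated_airspeed_signal(true_airspeed_ms, air_density):
        """Turn the simulated aircraft state into the signal a cockpit
        airspeed indicator would expect, using the textbook relation
        between dynamic pressure and equivalent airspeed (ignoring
        compressibility and instrument error)."""
        q = 0.5 * air_density * true_airspeed_ms ** 2  # dynamic pressure from the dynamics model
        indicated = math.sqrt(2.0 * q / RHO0)          # what the gauge should read, m/s
        volts = 0.05 * indicated                       # made-up gauge scaling: 0.05 V per m/s
        return indicated, volts

    # e.g. 100 m/s true airspeed at ~3000 m altitude (density ~0.909 kg/m^3)
    print(emulated_airspeed_signal(100.0, 0.909))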
This is similarly vague when it comes to software execution. The software is really just data, i.e. part of the inputs/parameters. Is a modern paravirtualized machine on a general-purpose computer a simulator or an emulator? It is faithfully reproducing the known execution system like a good emulator. But, due to the huge input/parameter space, it may also be predicting the response to a novel input like a good simulator!
My personal bias is that an emulator would strive to run programs according to the published instruction-set rules, while a simulator might reveal bugs in an actual processor design where it deviates from those rules for a given instruction sequence.
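To illustrate that bias with a toy sketch (entirely hypothetical, not any real instruction set or processor): the "emulator" function below implements the published rule for an 8-bit ADD with carry-out, while the "simulator" function models an imaginary chip whose carry flag deviates from the rule in one corner case, the kind of errata a design-level simulation might expose.

    MASK = 0xFF  # pretend 8-bit registers

    def emulate_add(a, b):
        """Follow the published instruction-set rule for ADD exactly."""
        result = (a + b) & MASK
        carry = (a + b) > MASK
        return result, carry

    def simulate_buggy_chip_add(a, b):
        """Model a specific (imaginary) silicon implementation whose
        erratum drops the carry flag when both operands are 0x80."""
        result = (a + b) & MASK
        carry = (a + b) > MASK
        if a == 0x80 and b == 0x80:  # the deviation a design simulation might reveal
            carry = False
        return result, carry

    for a, b in [(0x01, 0x02), (0xFF, 0x01), (0x80, 0x80)]:
        print(hex(a), hex(b), "per the ISA:", emulate_add(a, b),
              "per the (imaginary) chip:", simulate_buggy_chip_add(a, b))

The two agree everywhere except the buggy corner case, which is exactly the kind of deviation I'd expect a simulator of the design, rather than an emulator of the ISA, to surface.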
Edit to add: I do recognize that some may have other biases. E.g., they could expect an emulator to reproduce the errata of a given processor, while their simulator is for the abstract instruction-set rules. This difference is the core of what I'm wondering: is it taught differently in different regions of the world, or different generations of teachers, or something else?