This paper reframes world models as a latent-state design problem under sufficiency constraints, organizing methods by what the state is meant to preserve and support. That lens should help robotics and physical-AI researchers compare representation choices across prediction, control, planning, and memory.
arXiv:2605.01694v1 Announce Type: new
Abstract: A world model matters to an agent only through the state it constructs. That state must preserve some information, discard other information, and support some future function: prediction, control, planning, memory, grounding, or counterfactual reasoning. This paper treats world-model research as latent state design under sufficiency constraints. We propose a functional taxonomy that groups methods by what their latent state is for, rather than by…
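The "latent state design" framing can be made concrete with a minimal sketch: an agent never consumes raw observations directly, only a constructed state `z`, and downstream functions such as prediction or planning operate on `z` alone. The interface and names below are illustrative assumptions, not the paper's actual formulation.

```python
from dataclasses import dataclass
from typing import Callable, List
import numpy as np

# Hypothetical interface: the world model is reduced to two maps,
# an encoder (observation -> latent state z) and a latent dynamics
# model ((z, action) -> next z). Everything downstream sees only z.
@dataclass
class LatentWorldModel:
    encode: Callable[[np.ndarray], np.ndarray]
    predict: Callable[[np.ndarray, float], np.ndarray]

def rollout(model: LatentWorldModel, obs: np.ndarray,
            actions: List[float]) -> List[np.ndarray]:
    """Plan by unrolling the latent dynamics, never revisiting raw observations."""
    z = model.encode(obs)
    traj = [z]
    for a in actions:
        z = model.predict(z, a)
        traj.append(z)
    return traj

# Toy instance: identity encoder, damped linear latent dynamics.
model = LatentWorldModel(
    encode=lambda o: o.astype(float),
    predict=lambda z, a: 0.9 * z + a,
)
traj = rollout(model, np.array([1.0]), actions=[0.0, 1.0, 0.0])
```

The design question the paper's taxonomy organizes is what `encode` must preserve for `rollout` (or control, memory, counterfactual queries) to remain accurate: a state sufficient for one function may discard information another function needs.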