Entropy appears across scientific disciplines, with each field defining it differently while pointing to a similar underlying idea. These definitions often seem inconsistent because each field formalizes entropy according to its own assumptions and modeling choices. As a result, the shared structure beneath them remains obscured.
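Shannon's information entropy and the Gibbs entropy of statistical mechanics illustrate the point (a standard textbook comparison, offered here for concreteness rather than as part of the framework itself). For a probability distribution $p$ over states, both are the same functional of $p$, differing only in units:

$$
H(p) = -\sum_i p_i \log_2 p_i \quad \text{(bits)}, \qquad S(p) = -k_B \sum_i p_i \ln p_i \quad \text{(J/K)},
$$

so $S(p) = (k_B \ln 2)\, H(p)$ when both are evaluated on the same distribution. What differs across fields is not the mathematics but the interpretation of $p$: probabilities of messages in one case, of microstates in the other.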
This Quanta Magazine article provides helpful background for this work. It traces how entropy has evolved across disciplines and highlights the ambiguity that arises when it is treated sometimes as a physical property and sometimes as a measure of uncertainty, depending on the framework.
In this work, I introduce a conceptual framework that unifies entropy and probability as phenomena. While the focus is on information theory and statistical mechanics, the framework extends to other forms of entropy. Rather than treating these interpretations as merely analogous, it reveals them as expressions of the same underlying structure, grounded in the relationship between priors and possibilities.
Entropy and probability exist as phenomena prior to measurement: they are relationships that hold in reality independently of our attempts to quantify them.