## More on the Provisional Entropy.

In contrast to the traditional entropies, the algorithmic entropy must exactly specify a particular string that defines a microstate of the system. In practice, however, allowing for units, the provisional entropy of a typical microstate in a thermodynamic macrostate is virtually the same as the Shannon entropy. Allowing for units, and with an appropriate zero of entropy that takes the structure of the system as given, the value is the same as the Boltzmann entropy. While a thermodynamic macrostate contains all microstates with the same characteristics, and the algorithmic entropy must exactly specify a particular microstate, the exact state may not be known. In this case the provisional entropy becomes the best guess of the algorithmic entropy of the exact state: it is the shortest description of the microstate given the known information.

For example, taking the physical laws and the physical structure of the system as given, the algorithm that specifies a particular microstate in an equilibrium macrostate consists of two parts. The first part of the algorithm specifies the characteristics of the set of equilibrium states, while the second identifies the actual state within this set.

The second contribution to the algorithmic entropy, given the characteristics of the macrostate, is virtually the Shannon entropy.
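This two-part structure can be sketched in a few lines of Python. The function name and the 30-bit set description below are hypothetical illustrations, not figures from the text: the provisional entropy is the bits needed to define the set plus log₂ of the set size to pick out one member.

```python
import math

def provisional_bits(set_description_bits, n_members):
    """Two-part code length: bits to define the set of states that
    share the known characteristics, plus log2(set size) to identify
    one member -- the Shannon term for equiprobable members."""
    return set_description_bits + math.log2(n_members)

# A hypothetical equilibrium set of 2**50 states whose definition
# takes 30 bits: the Shannon term (50 bits) dominates.
total = provisional_bits(30, 2 ** 50)
```

For a typical equilibrium set the second term dominates, which is why the provisional entropy is in practice close to the Shannon entropy.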

As an example, consider the set of equilibrium configurations in a mole of gas containing some 10^23 ≈ 2^69 particles. The algorithm that specifies one of the allowable states must specify the momentum and position of each particle to an agreed degree of precision. Assuming 10 bits are needed to specify the position coordinates and another 10 bits for the momentum coordinates of each particle, the overall algorithm must repeat this sequence of 20 bits for each of the 2^69 particles. The length of the string that specifies a microstate is 20 × 2^69 bits. From a Shannon point of view, the number of available states is 2^20 raised to the power of 2^69. The Shannon entropy is also 20 × 2^69 bits. (The extra loglog contribution to cover self-delimiting coding is negligible in comparison with this.) In other words, for large N, the provisional entropy is close to:

H_prov = H_set + H_Shannon
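The arithmetic of the gas example can be checked directly. This Python sketch uses the illustrative 10-bit precisions quoted in the text, not physically derived values:

```python
import math

n_particles = 2 ** 69            # about 10**23 particles in a mole
bits_per_particle = 10 + 10      # 10 bits position + 10 bits momentum

# Length of the string specifying one microstate:
microstate_bits = bits_per_particle * n_particles    # 20 x 2**69 bits

# The set holds (2**20)**(2**69) equiprobable states; its Shannon
# entropy is log2 of that count, i.e. the same 20 x 2**69 bits:
shannon_bits = n_particles * math.log2(2 ** bits_per_particle)

# The self-delimiting (loglog) overhead grows only like
# log2(microstate_bits), about 73 bits here -- negligible against
# the roughly 1.2e22 bits of the main term.
overhead_estimate = math.log2(microstate_bits)
```

The near-equality of the microstate description length and the Shannon entropy is the point of the example: for a typical equilibrium state the two measures coincide.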

Furthermore, for a typical string in a set of equilibrium states or a similar macrostate, the first of the above terms, which defines the set, will be small. Hence in practice the provisional entropy of an equilibrium state is indistinguishable from the Shannon entropy of the set of all possible states. However, the situation changes for the very small number of highly ordered states that belong to the equilibrium set. A simple example illustrates the point. The toss of a coin 100 times could produce a random string of heads (h) and tails (t) of the form hhttht…thhth. If the definition of the set can be ignored, the provisional entropy is a little more than 100 bits, which is the Shannon entropy of the set of 2^100 possible states. However, if one has reason to believe that the string of interest is ordered, the provisional algorithmic entropy will drop.

For example, if every second character is an h (with each alternate character undetermined), the number of undetermined characters is halved and the provisional entropy would be a little more than N/2. But if it later becomes clear that the string of interest is still more highly ordered, and is in fact hhhh…hhh (one hundred heads in a row), the provisional entropy drops further. In this case the set has only two members, namely 100 heads or 100 tails. As the Shannon uncertainty term is now 1 bit, the Shannon term is small and the definition of the set becomes the major contribution to the provisional entropy. However, in this case the string can be specified directly by the algorithm "PRINT h 100 times" rather than through the process of defining the smallest set it belongs to. The algorithmic entropy now becomes slightly greater than log₂ 100 + |h|. Here the two vertical lines indicate the length of the code needed to specify the character(s) between them.
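The three stages of the coin example can be verified with a few lines of Python. This is a sketch: the bits needed to define each set are left out, since the text treats them as small except in the final, highly ordered case.

```python
import math

N = 100  # coin tosses

# Nothing known: the set is all 2**N strings of h and t,
# so the Shannon term is log2(2**N) = N bits.
h_random = math.log2(2 ** N)              # 100 bits

# Every second character known to be h: only N//2 characters
# remain free, so the set shrinks to 2**(N//2) members.
h_alternating = math.log2(2 ** (N // 2))  # 50 bits

# All heads: a direct description ("PRINT h 100 times") needs
# roughly the bits for the repeat count, plus |h|.
h_ordered = math.ceil(math.log2(N))       # 7 bits, plus |h|
```

Each new piece of information about the string's order shrinks the set that best describes it, and the provisional entropy drops accordingly.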
This example shows that:

• Where little information is known, the string of interest belongs to a set where the provisional entropy of a member is close to N, the logarithm of the number of members in the set.
• If, with further information, the string of interest is identified as non-typical, a smaller set can be used to define the provisional entropy and the Shannon contribution is smaller. For example, if every second character is an h, the provisional entropy is N/2 plus the bits needed to define the set.
• Finally, when it is clear the particular string is highly ordered, corresponding, for example, to 100 heads in a row, the provisional entropy approach may no longer provide the shortest description. Rather, the true algorithmic entropy becomes a little more than the 7 bits needed to specify the number 100.
While the provisional entropy is an entropy measure for a particular state (with the physical situation also needing to be specified), the traditional entropies only apply to a macrostate, which is usually an equilibrium state. Nevertheless, where a microstate is a typical member of a macrostate, the provisional entropy returns virtually the same value as a traditional entropy, allowing for units. As a consequence, the provisional entropy can be applied to real situations in the knowledge that it is a genuine entropy.