The second law of thermodynamics

The algorithmic entropy of a microstate in a thermodynamic macrostate
The algorithmic entropy of the string specifying a microstate, within the set of all strings with the same macroscopic parameters, can be specified by a two-part algorithm.  The first part is the algorithm that defines the set of all microstates in terms of the macrostate’s parameters, while the second specifies the particular microstate, given the macrostate requirements.  The total number of bits in this two-part description is known as the provisional entropy and corresponds to Kolmogorov’s Algorithmic Minimum Sufficient Statistic.

The specification of microstate s_i is given by:
H_prov(s_i) = H(description of the macrostate) + H(specification of the microstate, given the set of strings in the macrostate).
As only differences in algorithmic entropy count in the natural world, the O(1) term specific to a particular UTM is unnecessary.  The second term in the above equation is log2Ω: where there are Ω equally probable microstates in the macrostate, the number of bits in the algorithm that generates the particular string equals the Shannon entropy, the number of bits needed to identify one string in the set.
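As a minimal sketch (not part of the original text), the two-part code length can be illustrated in Python for the toy macrostate of n-bit strings containing exactly k ones; the function name and the 16-bit macrostate description used in the example are assumptions for illustration only:

```python
import math

def provisional_entropy_bits(n, k, h_macro):
    """Two-part code length, in bits, for an n-bit microstate with exactly
    k ones.  The first term describes the macrostate (here, the values of
    n and k); the second identifies the microstate among the Omega = C(n, k)
    equally likely members of the macrostate, i.e. log2(Omega) bits, the
    Shannon entropy of the set."""
    omega = math.comb(n, k)          # number of microstates in the macrostate
    return h_macro + math.log2(omega)

# Toy macrostate: 100-bit strings with exactly 50 ones, and an assumed
# 16 bits to describe the macrostate itself.  Both numbers are illustrative.
print(provisional_entropy_bits(100, 50, 16))
```

The second term dominates for large systems, matching the observation that the Shannon entropy term carries most of the bits in a typical macrostate.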

While the approach has been defined for the thermodynamic situation, it is widely applicable.  For example, the algorithm that represents the instantaneous noisy image on a screen first specifies the image, and then adds the Shannon entropy term that identifies the noisy variation of that image at a specific instant.

Real-world computations are reversible
In a natural system, if all bits are tracked, the algorithmic entropy of the instantaneous microstate is the number of bits in the algorithm that reaches the microstate by a reversible trajectory.  As there is only one forward path to such a microstate, no shorter computational path is possible.  This implies that the algorithmic entropy is a function of state: the number of bits in a system, found by tracking the bits already in the system together with the bit flows into or out of it, is identical to the number of bits specified by a halting algorithm.

A computational simulation of a real-world process must also be structured to maintain reversibility, either by using a reversible simulation, or by tracking all bits.  In an isolated system bits are conserved as discussed below.
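The requirement to track all bits can be sketched with a toy Python example (an illustration invented here, not the formalism of the original text): an otherwise irreversible right shift is made reversible by recording each discarded bit as a history bit, so the computation can be run backwards exactly and no bits are lost:

```python
def forward(state, history):
    """One step of an (otherwise irreversible) right shift, made reversible
    by recording the discarded low bit as a history/programme bit."""
    history = history + [state & 1]   # track the bit that would be erased
    return state >> 1, history

def backward(state, history):
    """Invert one forward step using the most recently tracked bit."""
    bit = history[-1]
    return (state << 1) | bit, history[:-1]

s, h = 0b101101, []
for _ in range(4):
    s, h = forward(s, h)
# Running backwards with the tracked bits recovers the initial state exactly.
for _ in range(4):
    s, h = backward(s, h)
print(s == 0b101101, h == [])   # True True
```

Discarding the history list is the analogue of bits passing to the environment: the backward step then has no way to reconstruct the earlier microstate.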

The second law
What is commonly called equilibrium is the set of most probable microstates that characterizes the system for overwhelmingly most of the time.  From Landauer’s principle, the bits in the momentum degrees of freedom of a microstate in the most probable set of states align with the thermodynamic entropy at kB ln2 per bit.  When an ordered fluctuation away from the most probable states occurs in an isolated system, bits are transferred both to the stored energy states and to the instruction bits that implement the natural laws in the structures, ensuring bits are conserved.  The bits transferred to the stored energy states become potential entropy, as only momentum bits contribute to the thermodynamic entropy.
The converse is where an isolated system, initially in a highly ordered state, trends towards the most probable set under processes consistent with the second law of thermodynamics.  In this case, the number of bits specifying the momentum states increases as programme or instruction bits are added to the configurational bits.  Nevertheless, if all bits (both potential and realised) are tracked, the algorithmic entropy of each microstate in the macrostate remains constant, as bits do not leave the system.
The argument is that the reversible computation on the UTM U, which starts from a highly ordered state s_0 to generate a less ordered state, is of the form U(p_t, s_0) = s_t, g_t, where p_t is the programme shifting the system to the state s_t at time t, and g_t are the history or programme bits tracked to ensure reversibility.  I.e. U(g_t, s_t) = p_t, s_0 and H(p_t, s_0) = H(g_t, s_t) = log2Ω, where Ω is the total number of available states in the trajectory that moves through all microstates.  In the most probable set of states only a minimal number of instruction bits, H(g_t), exist to ensure reversibility.  Driving the system backwards in time regenerates the original structural entities embodying the instructions.  However, if the reversible path is lost when bits pass to the environment, the exact specification of the microstate is lost, and the entropy increases.  This is consistent with the fact that the von Neumann entropy of an isolated system, described by a time-dependent Hamiltonian, is invariant over time.
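The bit-conservation claim can be illustrated with a toy sketch (an illustration invented here, under the assumption of a finite state space, not the original argument): any reversible dynamics on a finite set of states is a permutation, so a uniform ensemble of Ω microstates evolves into a uniform ensemble of the same Ω microstates, and the log2Ω bits needed to identify a microstate are conserved:

```python
import math

def step(x, n_bits):
    """Toy reversible update: a left bit-rotation, which is a permutation
    of the 2**n_bits possible states (every state has exactly one image
    and one preimage, so no bits are created or destroyed)."""
    mask = (1 << n_bits) - 1
    return ((x << 1) | (x >> (n_bits - 1))) & mask

n_bits = 8
states = set(range(1 << n_bits))              # Omega = 256 microstates
evolved = {step(x, n_bits) for x in states}   # one step of the dynamics
print(len(evolved), math.log2(len(evolved)))  # 256 8.0 -- bits conserved
```

An irreversible update (one that maps two states to the same image) would shrink the evolved set, which is exactly the loss of specification described above when bits pass to the environment.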

The degree of order measures the distance from equilibrium, i.e. the number of instruction bits that enter the system to create an equilibrium configuration.
I.e.
D_ord = log2Ω - H(s_t), or equivalently H(g_t, s_t) - H(s_t).
D_ord is close to zero for a microstate in the most probable set.
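As a hypothetical numerical illustration (the function and the entropy values below are invented for the example, not taken from the text), the degree of order is just the gap between the equilibrium bit count and the microstate's algorithmic entropy:

```python
def degree_of_order(log2_omega, h_state):
    """D_ord = log2(Omega) - H(s_t): the number of instruction bits
    separating the microstate from the most probable (equilibrium) set.
    Both arguments are entropies in bits."""
    return log2_omega - h_state

# For an n-bit system Omega = 2**n, so log2(Omega) = n.  A highly ordered
# microstate (e.g. the all-zeros string) has a short algorithmic description,
# so D_ord is large; a typical, algorithmically random microstate has H(s_t)
# close to n, so D_ord is close to zero.
n = 100
print(degree_of_order(n, 10))   # ordered microstate: 90 bits from equilibrium
print(degree_of_order(n, 99))   # typical microstate: 1, close to zero
```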