**Landauer's Principle**

In a real-world computational system, bits are physical entities associated with packets of energy. Landauer's principle identifies the minimum energy cost of removing or transferring a bit in a reversible computation as k_{B}T ln2 Joules, where T is the temperature of the real-world, or laboratory, computational device. Erasure occurs when bits are passed to the environment; if reversibility is lost, the algorithmic entropy of the environment increases by more than k_{B}ln2, as the instruction bits needed to ensure reversibility dissipate as heat. From Landauer's principle, the net change in the thermodynamic entropy of a system, allowing for units, is identical to the net bit flow through the system. Landauer argues that both logical and thermodynamic irreversibility require the loss of information as bits. In the case of a laboratory UTM, the input and the output are the initial and final configurations of bits stored within the computer at the start and finish of the computation. In a typical laboratory computation reversibility is usually lost, because energy, as heat, is passed to degrees of freedom not considered part of the output. In effect, the difference in algorithmic entropy between two states concerns the transfer of bits of information, whereas the thermodynamic entropy concerns the transfer of heat energy from a system at a given temperature.
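The energy scale involved is tiny. As a quick numerical illustration (the function name `landauer_limit` is ours, not from the text), the bound k_{B}T ln2 can be evaluated at room temperature:

```python
import math

# Boltzmann constant in joules per kelvin (CODATA value)
K_B = 1.380649e-23

def landauer_limit(temperature_k: float) -> float:
    """Minimum energy (J) dissipated when one bit is erased at temperature T."""
    return K_B * temperature_k * math.log(2)

# At room temperature (300 K) the bound is of order 1e-21 J per bit.
print(f"{landauer_limit(300.0):.3e} J per bit")  # ≈ 2.871e-21 J
```

For comparison, present-day logic gates dissipate many orders of magnitude more energy per switching event than this bound.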

It can be argued from Shannon's information theory that, for a bit to be recognisable as a signal above the thermal noise at the operating temperature T, the bit must carry k_{B}T ln2 Joules. Usually, Landauer's principle is justified by arguments relying on the equipartition principle. However, Devine (Entropy, 2021) has shown that Landauer's principle is more fundamental, arising directly from Stirling's approximation and the definition of temperature in terms of the derivative of entropy. In other words, the principle holds generally, except when the system approaches absolute zero. As a consequence, the algorithmic entropy of the momentum states can be unequivocally and consistently identified with the thermodynamic entropy, except near absolute zero.
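The derivation in Devine (Entropy, 2021) rests on Stirling's approximation, whose accuracy improves rapidly with system size. A minimal numerical check (the helper `stirling_ln_factorial` is ours, for illustration only) shows why the leading-order form ln(n!) ≈ n ln n − n is safe for thermodynamic particle numbers:

```python
import math

def stirling_ln_factorial(n: int) -> float:
    """Leading-order Stirling approximation: ln(n!) ≈ n ln n − n."""
    return n * math.log(n) - n

for n in (10, 100, 10**4, 10**6):
    exact = math.lgamma(n + 1)            # ln(n!) computed exactly
    approx = stirling_ln_factorial(n)
    rel_err = abs(exact - approx) / exact
    print(f"n={n:>8}  relative error = {rel_err:.2e}")
```

The relative error falls below one part in a million already at n = 10^6, far below thermodynamic particle numbers of order 10^23.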

**The Landauer principle at the quantum limit.** While Landauer's principle seems to be widely applicable, until recently it was not clear how effective it is in the quantum regime. Yan et al. (2018), by employing an ultra-cold trapped ion, showed that the quantum version of Landauer's principle is consistent with the theoretical approach of Reeb and Wolf (2014). The quantum version gives the thermodynamic cost of reducing the von Neumann entropy by one bit, leading to the following equality.

∆Q/k_{B}T = ∆S + I(S′:R′) + D(ρ′_{R}||ρ_{R}).

H(R) is the Hamiltonian specifying the states of the reservoir R, while ∆Q is the average increase of the thermal energy in the reservoir. ∆S is the decrease of the von Neumann entropy of the trapped ion when a bit is erased; in this context ρ is the von Neumann density operator. As discussed, when all bits are tracked, Landauer's principle becomes an equality. But the above equation has the term I(S′:R′), the mutual information in bits between the system S′ and the reservoir R′. It corresponds to H(g_{t}) bits, the bits in the residual program or history that increase the entropy when reversibility is lost (Landauer). The second term, D(ρ′_{R}||ρ_{R}), the relative entropy before and after the erasure given ρ′ and ρ, will be zero with an appropriate definition of heat, as argued by Bera et al. (2017). From the perspective here, these bits are potential entropy only, and do not align with the realized entropy in the momentum degrees of freedom.
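The behaviour of the two correction terms can be illustrated in the simplest case, a qubit reservoir whose state commutes with the system (so that the quantum relative entropy reduces to its classical form). In this toy sketch (our own, with illustrative helper names `h2` and `rel_entropy`), erasing one bit gives ∆S = 1, while D(ρ′_{R}||ρ_{R}) is second order in the disturbance of the reservoir and vanishes in the ideal limit:

```python
import math

def h2(p: float) -> float:
    """Von Neumann entropy (bits) of a qubit state diag(p, 1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rel_entropy(q: float, p: float) -> float:
    """D(ρ'||ρ) in bits for commuting qubit states diag(q,1-q), diag(p,1-p)."""
    return q * math.log2(q / p) + (1 - q) * math.log2((1 - q) / (1 - p))

# Erasure: a maximally mixed qubit is reset to a pure state.
dS = h2(0.5) - h2(0.0)        # one bit of von Neumann entropy removed
# The reservoir state barely changes (q ≈ p), so D is second order in (q - p).
p, q = 0.5, 0.501
print(f"ΔS = {dS:.3f} bits, D(ρ'_R||ρ_R) = {rel_entropy(q, p):.2e} bits")
```

This makes concrete why, with an appropriate definition of heat, the D term can be neglected while the mutual-information term still carries the history bits.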

That is, provided the momentum states are thermalised, the algorithmic approach is consistent with the quantum mechanical approach.

**Uncertainty due to phase-space graining.** The instantaneous microstate of a natural system can be represented by a point in the multidimensional state space of the system that defines the position, momentum, binding states, and electronic states of all the species in the system. The key question is how to specify the position and momentum coordinates of each species along the x, y, z axes, i.e. in the phase space of the system (see Entropy, 2021).

At the fundamental level, as the computational bits are discrete, discreteness is built into the algorithmic universe and translational symmetry is restricted to increments of the minimum-sized cell in position space. Let the fundamental unit of length δx = a along the x direction be fine enough that position coordinates can be captured by a binary string. In that case, allowable translations in the x direction that shift one configuration to another must be integral multiples Na of a, where N is any integer. The Fourier transform of the position vector is the wave number k_{x} in reciprocal space. In this space, the fundamental unit of length is δk_{x} = 1/a, and the minimum area of a cell in position–momentum space satisfies δxδk_{x} = 1. This is in effect an uncertainty principle, a fundamental property of the algorithmic universe, as increasing the resolution in x space decreases the allowable resolution in reciprocal space. As the momentum in the x direction, p_{x}, is also invariant under translational symmetry, both p_{x} and k_{x} are constants of the motion, implying that p_{x} = hk_{x} and δxδp_{x} = h. It is plausible to take h as Planck's constant, as this determines the minimum resolution of position–momentum space.
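The trade-off between the position and reciprocal-space grainings can be made explicit with a short sketch (our own illustration, using the text's convention k = 1/λ with no 2π factor, and a hypothetical helper `grainings`): for a one-dimensional grid of N cells of width a, the products δx·K and L·δk are fixed at 1 no matter how a is chosen.

```python
# Reciprocal-space graining for a discrete position grid of n_cells cells
# of width a, with wavenumber k = 1/λ (no 2π convention), matching δxδk_x = 1.
def grainings(a: float, n_cells: int):
    dx = a                       # position resolution: one lattice cell
    extent_x = n_cells * a       # total length L of the position grid
    dk = 1.0 / extent_x          # finest resolvable wavenumber step
    extent_k = 1.0 / a           # full reciprocal-space range K
    return dx, extent_x, dk, extent_k

# Halving the cell size doubles the reciprocal range: finer resolution in
# position space coarsens the resolution available in reciprocal space.
for a in (1.0, 0.5, 0.25):
    dx, Lx, dk, Kx = grainings(a, 64)
    print(f"a={a:<5} δx·K = {dx * Kx}, L·δk = {Lx * dk}")
```

Multiplying the reciprocal-space axis by h then reproduces the cell area δxδp_{x} = h of the previous paragraph.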

In other words, the
uncertainty implicit in the algorithmic computations is consistent with
quantum mechanics.