# Entropy notion and definition

The concept of entropy was first approached from the macroscopic point of view, and then from the statistical one.

Indeed, Boltzmann's macrostate formula and his statistical-entropy perspective were followed by Gibbs's approach based on statistical ensembles of distributions over microstates.

Entropy then became a measure characterizing a statistical distribution, namely the probability of events under a specific distribution: this is Shannon's information perspective.

After this statistical approach to entropy, we arrive at the quantum perspective: we dive into quantum physics with the Planck-Einstein relation. With Einstein's mass-energy equivalence, we stand at the confluence of the principles of mechanics, the principle of relativity, and electromagnetic theory for this quantum description of the concept of information.

But we can go further in exploring the notion of information by introducing the mass-energy-information equivalence principle.

Based on the "it from bit" perspective, we can say that the new Entropic Information approach is founded on the bit of information: the number of bits of the system, that is, the number of bits necessary to specify the actual microscopic configuration among the total number of allowed microstates, and thus to characterize the macroscopic state of the system under consideration.

Here the entropy of a thermodynamic system in equilibrium measures the uncertainty as to which of all its internal configurations compatible with its macroscopic thermodynamic parameters (temperature, pressure, etc.) is actually realized.

To apprehend this new concept of entropy, we start from the hidden thermodynamics of Louis de Broglie:

$$\frac{\text{action}}{h} = -\frac{\text{entropy}}{k}$$

where action = energy × time,

$$E = mc^2$$

$$S = k \ln(W)$$

After substitution and simplification by k, we obtain:

$$\ln(W) = \frac{\text{action}}{h} = \frac{m c^2 t}{h}$$
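As a rough numerical illustration of this relation (a minimal sketch with SI values of h and c hard-coded; the mass of 1 kg and duration of 1 s are arbitrary choices, not values from the text):

```python
# CODATA 2018 values (SI units)
h = 6.62607015e-34   # Planck constant, J*s
c = 299792458.0      # speed of light, m/s

m = 1.0  # illustrative mass, kg (arbitrary choice)
t = 1.0  # illustrative time, s (arbitrary choice)

# ln(W) = action / h = m c^2 t / h
ln_W = m * c**2 * t / h
print(f"ln(W) = {ln_W:.4e}")  # ≈ 1.356e50
```

The point of the sketch is only the order of magnitude: the action of even a modest mass-energy over one second is an enormous multiple of the quantum of action h.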

We now use the mass-energy-information equivalence principle, which gives the mass of a bit of information:

$$m_{\mathrm{bit}} = \frac{k T \ln(2)}{c^2}$$
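As an illustration, the mass equivalent of one bit at room temperature can be evaluated directly from this formula (a minimal sketch; T = 300 K is an arbitrary choice, and the constants are CODATA 2018 values):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
c = 299792458.0    # speed of light, m/s

T = 300.0  # illustrative temperature, K (arbitrary choice)

# m_bit = k T ln(2) / c^2
m_bit = k * T * math.log(2) / c**2
print(f"m_bit = {m_bit:.3e} kg")  # ≈ 3.194e-38 kg
```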

Replacing the mass by the mass of a bit of information, and simplifying by c², we finally obtain:

$$\ln(W) = \frac{\text{action}}{h} = \frac{m c^2 t}{h} = \frac{k T \ln(2)\, t}{h}$$
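A quick consistency check that substituting m_bit into mc²t/h does yield kT ln(2) t/h (a sketch; the values of T and t are arbitrary choices):

```python
import math

h = 6.62607015e-34   # Planck constant, J*s
c = 299792458.0      # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

T = 300.0  # illustrative temperature, K
t = 1.0    # illustrative time, s

m_bit = k * T * math.log(2) / c**2

lhs = m_bit * c**2 * t / h          # m c^2 t / h with m = m_bit
rhs = k * T * math.log(2) * t / h   # k T ln(2) t / h

assert math.isclose(lhs, rhs)  # the c^2 factors cancel exactly
print(f"ln(W) = {lhs:.4e}")
```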

If we consider that the degrees of freedom of a system can be viewed as the minimum number of coordinates required to specify a configuration, then yes: W reflects the degrees of freedom of the system.

We generalize this equation to the concept of entropy:

$$\ln(W) = \frac{k T \ln(2)\, t}{h}$$

$$S = k \ln(W)$$

We obtain the new Entropic Information formulation of the entropy concept:

$$S = \frac{k^2 T \ln(2)\, t}{h}$$
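Numerically, the formula is easy to evaluate (a sketch; T = 300 K and t = 1 s are arbitrary illustrative choices):

```python
import math

h = 6.62607015e-34   # Planck constant, J*s
k = 1.380649e-23     # Boltzmann constant, J/K

T = 300.0  # illustrative temperature, K
t = 1.0    # illustrative time, s

# S = k^2 T ln(2) t / h
S = k**2 * T * math.log(2) * t / h
print(f"S = {S:.3e} J/K")  # ≈ 5.98e-11 J/K
```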

The new Entropic Information concept of entropy, in four different formulations:

$$S = \frac{k^2 T \ln(2)\, t}{h} = \frac{m c^2 k \ln(2)\, t}{h} = k \ln(2)\, t f = \frac{2 R_\infty R\, k T \ln(2)\, t}{M_u A_r(\mathrm{e})\, c\, \alpha^2}$$

where f = mc²/h is the frequency given by the Planck-Einstein relation, R∞ is the Rydberg constant, R the molar gas constant, M_u the molar mass constant, A_r(e) the relative atomic mass of the electron, and α the fine-structure constant.