Of those, the one with maximal information entropy is the proper distribution, according to this principle.
In that case, the information entropy would be equal to zero.
This is the case of the unbiased bit, the most common unit of information entropy.
The information entropy of the distribution has the following formula:
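The formula does not appear in the text at this point; a sketch of the standard Shannon form, assuming a discrete distribution with probabilities p_1, …, p_n, is:

```latex
H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i
```

With the base-2 logarithm, the result is measured in bits.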
The logarithm can also be taken to the natural base, in which case the information entropy is measured in nats rather than bits.
In comparison, information entropy of any macroscopic event is so small as to be completely irrelevant.
The concept of information entropy was created by the mathematician Claude Shannon in 1948.
In some sense, this can be seen as the information entropy contained within the particles' labels.
The nat is the natural unit for information entropy.
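The points above can be illustrated with a short sketch: a hypothetical `entropy` helper that computes Shannon entropy of a discrete distribution, with the logarithm base controlling the unit (base 2 for bits, base e for nats).

```python
import math

def entropy(probs, base=2.0):
    """Shannon entropy of a discrete distribution.

    Assumes probs is a sequence of probabilities summing to 1.
    base=2 gives the result in bits; base=math.e gives nats.
    Terms with p = 0 are skipped, since lim p->0 of p*log(p) is 0.
    """
    return sum(-p * math.log(p, base) for p in probs if p > 0)

# A certain outcome carries no information: the entropy is zero.
print(entropy([1.0]))

# An unbiased bit (a fair coin flip) has an entropy of exactly one bit.
print(entropy([0.5, 0.5]))

# The same distribution measured in nats: ln(2), about 0.693 nats per bit.
print(entropy([0.5, 0.5], base=math.e))
```

The same distribution thus has different numeric entropy values depending on the unit; converting between them is just a change of logarithm base.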
This formulation is analogous to that of Shannon's information entropy.