Abstract
The relationship between certain probability distributions and their invariance properties is discussed. A measure I of uncertainty (informational entropy) of a probability distribution p(x) is defined in a variational way, which makes it possible to derive three entropy forms directly from distribution laws widely observed in nature, without postulating a priori entropy properties such as additivity and extensivity.
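The abstract does not state the specific functional used, but the reversed variational logic it describes can be illustrated with the standard maximum-entropy formalism (the constraint terms and multipliers below are illustrative assumptions, not taken from the paper):

```latex
% Standard maximum-entropy inference fixes the entropy functional $I[p]$ and
% derives the distribution from the stationarity condition
\[
\delta\!\left( I[p] \;-\; \alpha \sum_i p_i \;-\; \beta \sum_i p_i x_i \right) = 0 .
\]
% With the Shannon form $I[p] = -\sum_i p_i \ln p_i$, this yields the
% exponential (Boltzmann--Gibbs) law
\[
p_i = e^{-1-\alpha-\beta x_i} .
\]
```

Read in reverse, the observed distribution law is the input and the functional I is solved for: an observed exponential law singles out the Shannon form, while power-law (Zipf-Pareto-type) distributions are known to single out nonadditive, Tsallis-type entropy forms.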