Article

Information Mechanics

(April 2008)

Abstract

Advances in science are brought forth by hypothesizing that the action of a system is a direct measure of the amount of information in that system. We begin to interpret this governing hypothesis by examining its implications for current research. From this investigation, we find four primary conclusions. 1) To properly and completely quantify the amount of information contained within a particle (or system), one must add the self-information of both the wavefunction and its Fourier transform pair. 2) Information in nature is found in packets quantized to an integer number of the natural units. 3) Over a period of time, the energy of a system acts like an information rate, and thus the information needed to describe that system for that period of time is equal to the product of the energy and the time divided by the minimum uncertainty. 4) At a given instant in time, the angular momentum, J, of a system is in direct proportion to the amount of information that is contained within or can be transmitted by that system. Empirical evidence affirming our governing hypothesis is given through twelve examples of systems, ranging from a black hole to an electric circuit to an electron. Thus, from the very large down to the limits of the Heisenberg uncertainty principle, the conclusions are shown to form a self-consistent theory, accurately quantifying the amount of information in each given system.
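Conclusion 3 can be illustrated numerically. The sketch below assumes that "minimum uncertainty" refers to the Heisenberg bound ħ/2 and that the resulting quantity is counted in natural information units; both are interpretive assumptions, as the abstract does not fix the units explicitly. The 1 eV / 1 fs example system is purely illustrative and is not one of the paper's twelve examples.

```python
# Illustrative sketch of conclusion 3: information ~ E * t / (hbar / 2).
# ASSUMPTIONS: "minimum uncertainty" = hbar/2 (Heisenberg bound), and the
# ratio is counted in natural information units. Example values are invented.
HBAR = 1.054_571_817e-34  # reduced Planck constant, J*s
EV = 1.602_176_634e-19    # joules per electronvolt

def information_units(energy_joules: float, time_seconds: float) -> float:
    """Action E*t divided by the assumed minimum uncertainty hbar/2."""
    return energy_joules * time_seconds / (HBAR / 2)

# Hypothetical example: a 1 eV system observed for 1 femtosecond.
n = information_units(1.0 * EV, 1e-15)
print(n)  # a few natural units of information for this toy system
```

On this reading, doubling either the energy or the observation time doubles the information required to describe the system, consistent with energy acting as an information rate.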
