Abstract
We present numerical evidence of a trade-off between the amount of information contained in a time series produced by a natural system and the entropy the system generates in producing that information. In particular, we analyze the relation between the spectral correlation exponent and the entropy, and observe that the variability of the signal decreases as the time correlations increase. Finally, we discuss a hypothesis on how this trade-off could reflect an energy optimization performed by the system that, at the same time, increases the robustness of its information generation.
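To make the two quantities concrete, the following is a minimal sketch of how a spectral correlation exponent and an entropy could be estimated for a time series. It is not the authors' analysis pipeline: the estimators (a log-log fit to the periodogram for the exponent, a histogram-based Shannon entropy as a proxy for signal variability) and all function names are illustrative assumptions.

```python
import numpy as np

def spectral_exponent(x, fs=1.0):
    """Estimate the spectral correlation exponent beta, assuming a
    power-law spectrum S(f) ~ 1/f**beta, via a log-log linear fit
    to the periodogram (illustrative estimator, not the paper's)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)[1:]   # drop f = 0
    power = np.abs(np.fft.rfft(x))[1:] ** 2
    slope, _ = np.polyfit(np.log(freqs), np.log(power), 1)
    return -slope  # beta

def shannon_entropy(x, bins=64):
    """Shannon entropy (in bits) of the signal's amplitude histogram,
    one simple proxy for the variability of the series."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

# Example: white noise (beta ~ 0) vs. a random walk (beta ~ 2);
# stronger time correlations go with a larger exponent.
rng = np.random.default_rng(0)
white = rng.normal(size=2**14)
walk = np.cumsum(white)
for name, sig in [("white noise", white), ("random walk", walk)]:
    print(name, spectral_exponent(sig), shannon_entropy(sig))
```

Under these assumptions, comparing the two printed pairs across many signals would give one way to probe the trade-off the abstract describes.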