
Using machine learning to compress the matter transfer function $T(k)$

(2022). arXiv:2211.06393. Comment: 11 pages, 5 figures, 2 tables.

Abstract

The linear matter power spectrum $P(k,z)$ connects theory with large-scale structure observations in cosmology. Its scale dependence is entirely encoded in the matter transfer function $T(k)$, which can be computed numerically by Boltzmann solvers, and can also be computed semi-analytically using fitting functions such as the well-known Bardeen-Bond-Kaiser-Szalay (BBKS) and Eisenstein-Hu (EH) formulae. However, both the BBKS and EH formulae have significant drawbacks. On the one hand, although BBKS is a simple expression, it is only accurate to about $10\%$, well above the $1\%$ precision goal of forthcoming surveys. On the other hand, while EH is as accurate as required by upcoming experiments, it is a rather long and complicated expression. Here, we use Genetic Algorithms (GAs), a particular machine learning technique, to derive simple and accurate fitting formulae for the transfer function $T(k)$. When the effects of massive neutrinos are also considered, our expression slightly improves over the EH formula, while being notably shorter in comparison.
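For context, a minimal sketch of what a semi-analytic transfer-function fit looks like is given below, using the classic BBKS formula mentioned in the abstract (not the GA-derived expression of the paper, which is not reproduced here). It assumes $k$ in units of $h\,\mathrm{Mpc}^{-1}$ and the simple shape parameter $\Gamma = \Omega_m h$ without a baryon correction; conventions for $q$ vary in the literature.

```python
import numpy as np

def transfer_bbks(k, omega_m=0.3, h=0.7):
    """Classic BBKS fitting formula for the matter transfer function T(k).

    Assumptions of this sketch: k is in h/Mpc and the shape parameter is
    Gamma = Omega_m * h (no baryon/Sugiyama correction), so the result is
    only accurate to ~10%, as noted in the abstract.
    """
    gamma = omega_m * h                      # shape parameter (no baryon correction)
    q = np.asarray(k, dtype=float) / gamma   # dimensionless wavenumber
    poly = 1 + 3.89 * q + (16.1 * q)**2 + (5.46 * q)**3 + (6.71 * q)**4
    return np.log(1 + 2.34 * q) / (2.34 * q) * poly**(-0.25)

def linear_power_shape(k, ns=0.96, omega_m=0.3, h=0.7):
    """Shape of the linear matter power spectrum, P(k) ∝ k^ns T(k)^2
    (normalisation and growth factor D(z) omitted)."""
    return np.asarray(k, dtype=float)**ns * transfer_bbks(k, omega_m, h)**2

# Example usage: evaluate the BBKS transfer function on a log-spaced k grid.
k_grid = np.logspace(-3, 1, 50)              # k in h/Mpc
T_grid = transfer_bbks(k_grid)
```

The EH formula, and the GA-derived fits discussed in the paper, replace `transfer_bbks` with more accurate expressions while keeping the same role in $P(k,z) \propto k^{n_s}\,T^2(k)\,D^2(z)$.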
