Abstract
The Fisher information matrix is a quantity of fundamental importance for
information geometry and asymptotic statistics. In practice, it is widely used
to quickly estimate the expected information available in a data set and guide
experimental design choices. In many modern applications, it is intractable to
analytically compute the Fisher information and Monte Carlo methods are used
instead. The standard Monte Carlo method produces estimates of the Fisher
information that can be biased when the Monte Carlo noise is non-negligible.
Noise in the derivatives is the most problematic, since it leads to an
overestimate of the available constraining power, given by the inverse Fisher
information.
In this work we find another simple estimator that is oppositely biased and
produces an underestimate of the constraining power. This estimator can either
be used on its own to give approximate bounds on the parameter constraints, or
it can be combined with the standard estimator to give improved, approximately
unbiased estimates. Both the alternative and the combined estimators are
asymptotically unbiased, so they can also be used as a convergence check on the
standard approach.
We discuss potential limitations of these estimators and provide methods to
assess their reliability. These methods accelerate the convergence of Fisher
forecasts, as unbiased estimates can be achieved with fewer Monte Carlo
samples, and so can be used to reduce the simulated data set size by several
orders of magnitude.
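The standard Monte Carlo estimator mentioned above can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: it assumes a one-parameter Gaussian model with known analytic score, and estimates the Fisher information as the Monte Carlo average of the squared score. The model, sample size, and variable names are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (assumption, not from the paper): x ~ N(theta, sigma^2),
# so the score is d/dtheta log p(x|theta) = (x - theta) / sigma^2.
theta_true, sigma = 1.0, 2.0
n_mc = 100_000

# Draw Monte Carlo samples from the model at the fiducial parameter value.
x = rng.normal(theta_true, sigma, size=n_mc)
scores = (x - theta_true) / sigma**2

# Standard Monte Carlo Fisher estimate: average of the squared score.
# With finite n_mc this estimate carries Monte Carlo noise, which is the
# source of the bias discussed in the abstract.
F_hat = np.mean(scores**2)

# For this toy model the Fisher information is known analytically,
# so the estimator can be checked directly.
F_analytic = 1.0 / sigma**2
```

In realistic applications the score must itself be estimated from simulations (e.g. by finite differences), and it is the noise in those derivative estimates that biases the inverse Fisher information toward overly tight constraints; the alternative and combined estimators proposed in the paper address exactly this regime.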