
The underlying connections between identifiability, active subspaces, and parameter space dimension reduction

(2018). arXiv:1802.05641.

Abstract

The interactions between parameters, model structure, and outputs can determine what inferences, predictions, and control strategies are possible for a given system. Parameter space reduction and parameter estimation---and, more generally, understanding the shape of the information contained in models with observational structure---are thus essential for many questions in mathematical modeling and uncertainty quantification. As such, different disciplines have developed methods in parallel for approaching the questions in their field. Many of these approaches, including identifiability, sloppiness, and active subspaces, use related ideas to address questions of parameter dimension reduction, parameter estimation, and robustness of inferences and quantities of interest. In this paper, we show that active subspace methods have intrinsic connections to methods from sensitivity analysis and identifiability, and indeed that it is possible to frame each approach in a unified framework. A particular form of the Fisher information matrix (FIM), which we denote the sensitivity FIM, is fundamental to all three approaches---active subspaces, identifiability, and sloppiness. Through a series of examples and case studies, we illustrate the properties of the sensitivity FIM in several contexts. These initial examples show that the interplay between local and global perspectives, and between linear and nonlinear structure, strongly impacts the insights each approach can generate. These observations underline that one's approach to parameter dimension reduction should be driven by the scientific question, and they also open the door to using tools from the other approaches to generate useful insights.
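To make the central object concrete, below is a minimal sketch of a sensitivity-based FIM computation for a toy exponential-decay model. The model, parameter values, and unit-noise assumption are illustrative choices, not taken from the paper; the point is only that the eigenstructure of J^T J (with J the output sensitivities) is the shared quantity behind active subspaces, practical identifiability, and sloppiness analyses.

```python
import numpy as np

# Hypothetical toy model y(theta, t) = theta_1 * exp(-theta_2 * t),
# observed at a handful of time points (all values assumed for illustration).
t = np.linspace(0.0, 5.0, 20)
theta = np.array([2.0, 0.7])  # nominal parameter values (assumed)

def model(theta, t):
    return theta[0] * np.exp(-theta[1] * t)

def jacobian(theta, t, eps=1e-6):
    """Finite-difference sensitivities dy/dtheta at each observation time."""
    J = np.zeros((t.size, theta.size))
    y0 = model(theta, t)
    for j in range(theta.size):
        tp = theta.copy()
        tp[j] += eps
        J[:, j] = (model(tp, t) - y0) / eps
    return J

J = jacobian(theta, t)

# Sensitivity-based FIM under an assumed unit observation noise: F = J^T J.
F = J.T @ J

# Eigendecomposition: large eigenvalues mark well-informed ("active")
# parameter directions; near-zero eigenvalues flag practically
# unidentifiable or sloppy parameter combinations.
eigvals, eigvecs = np.linalg.eigh(F)
print("eigenvalues (descending):", eigvals[::-1])
print("dominant direction:", eigvecs[:, -1])
```

Running this at the nominal parameters gives a local picture only; as the abstract notes, how such local, linear information is aggregated (or not) across parameter space is exactly where the three approaches diverge.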
