Polynomial Regression As an Alternative to Neural Nets

Xi Cheng, Bohdan Khomtchouk, Norman Matloff, and Pete Mohanty. (2018). arXiv:1806.06850. Comment: 23 pages, 1 figure, 13 tables.

Abstract

Despite the success of neural networks (NNs), there is still a concern among many over their "black box" nature. Why do they work? Here we present a simple analytic argument that NNs are in fact essentially polynomial regression models. This view will have various implications for NNs, e.g. providing an explanation for why convergence problems arise in NNs, and it gives rough guidance on avoiding overfitting. In addition, we use this phenomenon to predict and confirm a multicollinearity property of NNs not previously reported in the literature. Most importantly, given this loose correspondence, one may choose to routinely use polynomial models instead of NNs, thus avoiding some major problems of the latter, such as having to set many tuning parameters and dealing with convergence issues. We present a number of empirical results; in each case, the accuracy of the polynomial approach matches or exceeds that of NN approaches. A many-featured, open-source software package, polyreg, is available.
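The paper's accompanying polyreg package is written in R. As a rough illustration of the idea in the abstract, the following Python sketch (using scikit-learn, which is an assumption of this example and not something the authors use) fits a degree-2 polynomial regression model and a small neural network on the same synthetic regression task, so the two can be compared with far fewer tuning parameters on the polynomial side.

```python
# Illustrative sketch only: not the authors' polyreg package (which is in R).
# Shows polynomial regression used in place of a small neural network.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(1000, 3))                            # three numeric features
y = X[:, 0] * X[:, 1] + X[:, 2] ** 2 + rng.normal(0, 0.1, 1000)   # nonlinear target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Degree-2 polynomial regression: adds interaction and squared terms, then a linear fit.
poly = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
poly.fit(X_tr, y_tr)

# A small neural network for comparison; note the extra tuning parameters
# (hidden layer sizes, iteration budget) that the polynomial model avoids.
nn = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
nn.fit(X_tr, y_tr)

print("polynomial MSE:", mean_squared_error(y_te, poly.predict(X_te)))
print("neural net MSE:", mean_squared_error(y_te, nn.predict(X_te)))
```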

Description

[1806.06850] Polynomial Regression As an Alternative to Neural Nets


Tags

community

  • @achakraborty
  • @dblp
  • @jpvaldes