Article

Generalization in Deep Learning

Kenji Kawaguchi, Leslie Pack Kaelbling, and Yoshua Bengio.
(2017). arXiv:1710.05468. Comment: To appear in Mathematics of Deep Learning, Cambridge University Press. All previous results remain unchanged.

Abstract

This paper provides non-vacuous and numerically-tight generalization guarantees for deep learning, as well as theoretical insights into why and how deep learning can generalize well, despite its large capacity, complexity, possible algorithmic instability, nonrobustness, and sharp minima, responding to an open question in the literature. We also propose new open problems and discuss the limitations of our results.
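
As background for the abstract above: a generalization guarantee is an upper bound on the generalization gap, the difference between expected risk and empirical risk. The LaTeX sketch below states this standard definition in generic notation; the symbols (hypothesis f, loss ℓ, sample size m, confidence δ) are placeholders and are not taken from the paper itself.

```latex
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}
% Standard definitions in generic notation (not the paper's own symbols):
% expected risk R(f), empirical risk \hat{R}_S(f) on an i.i.d. sample
% S = {(x_i, y_i)}_{i=1}^m, and the generalization gap that a
% "generalization guarantee" upper-bounds with high probability.
\[
  R(f) = \mathbb{E}_{(x,y)\sim\mathcal{D}}\bigl[\ell(f(x), y)\bigr],
  \qquad
  \hat{R}_S(f) = \frac{1}{m}\sum_{i=1}^{m} \ell\bigl(f(x_i), y_i\bigr)
\]
\[
  R(f) - \hat{R}_S(f) \;\le\; \epsilon(f, S, \delta)
  \quad\text{with probability at least } 1-\delta \text{ over the draw of } S.
\]
\end{document}
```

A bound of this form is "non-vacuous" when ε is small enough to be informative (e.g. below the trivial upper bound on the loss), which is the sense in which the abstract describes its guarantees as non-vacuous and numerically tight.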
