Existence, uniqueness, and convergence rates for gradient flows in the training of artificial neural networks with ReLU activation.

CoRR (2021)


Other publications of authors with the same name

Strong overall error analysis for the training of artificial neural networks via random initializations. CoRR (2020)
Non-convergence to global minimizers for Adam and stochastic gradient descent optimization and constructions of local minimizers in the training of artificial neural networks. CoRR (2024)
A proof of convergence for gradient descent in the training of artificial neural networks for constant target functions. CoRR (2021)
A proof of convergence for stochastic gradient descent in the training of artificial neural networks with ReLU activation for constant target functions. CoRR (2021)
On the existence of global minima and convergence analyses for gradient descent methods in the training of deep neural networks. CoRR (2021)
Convergence analysis for gradient flows in the training of artificial neural networks with ReLU activation. CoRR (2021)
Convergence proof for stochastic gradient descent in the training of deep neural networks with ReLU activation for constant target functions. CoRR (2021)
Deep neural network approximation of composite functions without the curse of dimensionality. CoRR (2023)
A proof of convergence for the gradient descent optimization method with random initializations in the training of neural networks with ReLU activation for piecewise linear target functions. CoRR (2021)
A proof of convergence for gradient descent in the training of artificial neural networks for constant target functions. J. Complex. (2022)