Abstract
In this paper, we study Fenchel-Young losses, a generic way to construct
convex loss functions from a convex regularizer. We provide an in-depth study
of their properties in a broad setting and show that they unify many well-known
loss functions. When Fenchel-Young losses are constructed from a generalized
entropy, a family that includes the well-known Shannon and Tsallis entropies,
we show that they induce a predictive probability distribution, and we develop
an efficient algorithm to compute that distribution for separable entropies. We
derive conditions under which a generalized entropy yields probability
distributions with sparse support and losses with a separation margin. Finally,
we present both primal and dual algorithms for learning predictive models with
generic Fenchel-Young losses.
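
For concreteness, here is a minimal sketch of the construction the abstract refers to; the notation $L_\Omega$, $\theta$, and $y$ is introduced here for illustration, and the formal definitions appear in the body of the paper. Given a convex regularizer $\Omega$, the Fenchel-Young loss between a score vector $\theta$ and a target $y$ is

$$
L_\Omega(\theta; y) \;=\; \Omega^*(\theta) + \Omega(y) - \langle \theta, y \rangle,
\qquad
\Omega^*(\theta) \;=\; \sup_{\mu \in \operatorname{dom}(\Omega)} \; \langle \theta, \mu \rangle - \Omega(\mu),
$$

where $\Omega^*$ denotes the convex conjugate of $\Omega$. Non-negativity of $L_\Omega$ is exactly the Fenchel-Young inequality, which motivates the name; for instance, choosing $\Omega$ as the negative Shannon entropy over the probability simplex recovers the logistic (softmax cross-entropy) loss.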