Article

Learning Multivariate Log-concave Distributions

(2016). arXiv:1605.08188. Comment: To appear in COLT 2017.

Abstract

We study the problem of estimating multivariate log-concave probability density functions. We prove the first sample complexity upper bound for learning log-concave densities on $\mathbb{R}^d$, for all $d \geq 1$. Prior to our work, no upper bound on the sample complexity of this learning problem was known for the case of $d > 3$. In more detail, we give an estimator that, for any $d \geq 1$ and $\epsilon > 0$, draws $O_d\left((1/\epsilon)^{(d+5)/2}\right)$ samples from an unknown target log-concave density on $\mathbb{R}^d$, and outputs a hypothesis that (with high probability) is $\epsilon$-close to the target, in total variation distance. Our upper bound on the sample complexity comes close to the known lower bound of $\Omega_d\left((1/\epsilon)^{(d+1)/2}\right)$ for this problem.
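As a minimal sketch (not from the paper), the two rates in the abstract can be evaluated numerically, up to the unspecified dimension-dependent constants hidden in $O_d$ and $\Omega_d$; the function names here are illustrative only:

```python
def upper_bound_samples(d, eps):
    # Upper bound from the abstract, up to a constant depending on d:
    # (1/eps)^((d+5)/2) samples suffice to learn an eps-close hypothesis.
    return (1.0 / eps) ** ((d + 5) / 2)

def lower_bound_samples(d, eps):
    # Known lower bound, again up to a d-dependent constant:
    # (1/eps)^((d+1)/2) samples are necessary.
    return (1.0 / eps) ** ((d + 1) / 2)

# The gap between the two rates is a factor of (1/eps)^2,
# independent of the dimension d.
for d in (1, 2, 4):
    ratio = upper_bound_samples(d, 0.1) / lower_bound_samples(d, 0.1)
    print(d, ratio)  # ratio is (1/0.1)^2 = 100.0 for every d
```

This makes the "comes close to" claim concrete: the exponents differ by exactly 2, regardless of dimension.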
