Well-written, clear paper showing the similarities between EM, Gibbs sampling, and VB. The paper contains one strange mistake: the MAP estimate of multinomials with sparse Dirichlet priors does not yield "negative probabilities" (did they read Feynman?). At the stationary point the second derivative is positive, so it is a minimum, not a maximum. Instead, the maximum lies on the boundary {0, 1}, which is easy to derive; this is well known for the Arcsine distribution (a special case of the Beta/Dirichlet). Still, the paper is excellent, and the mistake does not affect its message: with zero probabilities the perplexity should be really bad anyway :)
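A quick numerical sketch of the point (my own illustration, not from the paper), using the symmetric Beta(α, α) density with α < 1, i.e. the Arcsine case at α = 0.5: the stationary point x = 1/2 is a minimum of the density, and the density increases toward the boundary {0, 1}.

```python
import math

def unnorm_log_beta(x, alpha):
    """Unnormalized log density of Beta(alpha, alpha) at x in (0, 1)."""
    return (alpha - 1) * (math.log(x) + math.log(1 - x))

# For alpha < 1 (Arcsine distribution at alpha = 0.5), the stationary
# point x = 1/2 has a positive second derivative of the log density:
#   d^2/dx^2 = (alpha - 1) * (-1/x^2 - 1/(1-x)^2) > 0  when alpha < 1,
# so x = 1/2 is a minimum; the mode sits on the boundary {0, 1}.
alpha = 0.5
assert unnorm_log_beta(0.25, alpha) > unnorm_log_beta(0.5, alpha)
assert unnorm_log_beta(0.01, alpha) > unnorm_log_beta(0.25, alpha)
```

This is why the naive MAP formula, which would place the mode at a point where "probabilities" go negative, picks out the wrong stationary point when the prior is sparse.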
By the way, this mistake made it into the Spark ML library.
@inproceedings{conf/uai/AsuncionWST09,
  author    = {Asuncion, Arthur U. and Welling, Max and Smyth, Padhraic and Teh, Yee Whye},
  title     = {On Smoothing and Inference for Topic Models},
  booktitle = {UAI},
  editor    = {Bilmes, Jeff A. and Ng, Andrew Y.},
  publisher = {AUAI Press},
  pages     = {27--34},
  year      = {2009},
  url       = {http://dblp.uni-trier.de/db/conf/uai/uai2009.html#AsuncionWST09}
}