Choosing starting values for the EM algorithm for getting the highest likelihood in multivariate Gaussian mixture models.

Comput. Stat. Data Anal., 41 (3-4): 561-575 (2003)


Other publications by persons with the same name

Relaxations de la régression logistique : modèles pour l'apprentissage sur une sous-population et la prédiction sur une autre. DMAS, volume A-1 of RNTI, pp. 200-212. Cépaduès-Éditions, (2005)
A predictive deviance criterion for selecting a generative model in semi-supervised classification. Comput. Stat. Data Anal., (2013)
Interpretable domain adaptation using unsupervised feature selection on pre-trained source models. Neurocomputing, (2022)
Gaussian-Based Visualization of Gaussian and Non-Gaussian-Based Clustering. J. Classif., 38 (1): 129-157 (2021)
Relaxing the Identically Distributed Assumption in Gaussian Co-Clustering for High Dimensional Data. CoRR, (2018)
Choosing starting values for the EM algorithm for getting the highest likelihood in multivariate Gaussian mixture models. Comput. Stat. Data Anal., 41 (3-4): 561-575 (2003)
Parameter Setting for Evolutionary Latent Class Clustering. ISICA, volume 4683 of Lecture Notes in Computer Science, pp. 472-484. Springer, (2007)
Initializing EM using the properties of its trajectories in Gaussian mixtures. Stat. Comput., 14 (3): 267-279 (2004)
Pourquoi les modèles de mélange pour la classification ? Monde des Util. Anal. Données, (2009)
Model-based clustering of multivariate ordinal data relying on a stochastic binary search algorithm. Stat. Comput., 26 (5): 929-943 (2016)