Abstract

We propose a unified and systematic framework for performing online nonnegative matrix factorization in the presence of outliers that is particularly suited to large datasets. Within this framework, we propose two solvers based on proximal gradient descent and the alternating direction method of multipliers. We prove that the objective function converges almost surely by appealing to the quasi-martingale convergence theorem. We also show that the learned basis matrix converges to the set of local minimizers of the objective function almost surely. In addition, we extend our basic problem formulation to various settings with different constraints and regularizers, and adapt the solvers and analyses to each setting. We perform extensive experiments on both synthetic and image datasets. These experiments demonstrate the efficiency and efficacy of our algorithm on tasks such as basis learning, image denoising and shadow removal.
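
To make the setting concrete, below is a minimal Python sketch of one possible online NMF-with-outliers loop of the kind the abstract describes: a squared data-fitting loss, an l1-penalized outlier vector per sample, and proximal (projected) gradient updates driven by aggregated sufficient statistics. The function names, the penalty weight lambda_, the step sizes, and the column-norm rescaling are illustrative assumptions, not the paper's exact formulation or its ADMM variant.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def fit_sample(v, W, lambda_, n_steps=50):
    """Given basis W, estimate coefficients h >= 0 and an outlier vector r for one sample v."""
    k = W.shape[1]
    h = np.zeros(k)
    r = np.zeros_like(v)
    step = 1.0 / (np.linalg.norm(W, 2) ** 2 + 1.0)  # Lipschitz bound for the joint (h, r) gradient
    for _ in range(n_steps):
        resid = W @ h + r - v
        h = np.maximum(h - step * (W.T @ resid), 0.0)          # projected gradient step (h >= 0)
        r = soft_threshold(r - step * resid, step * lambda_)   # prox step for the l1 penalty on r
    return h, r

def online_nmf_outliers(samples, k, lambda_=0.1, inner_steps=20):
    """Stream over samples, accumulating sufficient statistics and updating the basis W."""
    d = samples.shape[1]
    rng = np.random.default_rng(0)
    W = np.abs(rng.standard_normal((d, k)))
    A = np.zeros((k, k))   # running sum of h h^T
    B = np.zeros((d, k))   # running sum of (v - r) h^T
    for v in samples:
        h, r = fit_sample(v, W, lambda_)
        A += np.outer(h, h)
        B += np.outer(v - r, h)
        # Projected gradient steps on W for the aggregated surrogate objective.
        L_W = np.linalg.norm(A, 2) + 1e-8
        for _ in range(inner_steps):
            W = np.maximum(W - (W @ A - B) / L_W, 0.0)
        # Keep columns of W inside the unit ball (an assumed normalization).
        W /= np.maximum(np.linalg.norm(W, axis=0), 1.0)
    return W

# Usage: learn a basis from synthetic nonnegative data with a few corrupted entries.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_W = np.abs(rng.standard_normal((50, 5)))
    H = np.abs(rng.standard_normal((200, 5)))
    V = H @ true_W.T
    V[rng.random(V.shape) < 0.02] += 5.0   # sparse outliers
    W = online_nmf_outliers(V, k=5)
    print("learned basis shape:", W.shape)
```

Each sample is processed once: its coefficients and outliers are fit against the current basis, the sufficient statistics are updated, and the basis takes a few projected gradient steps, which is what makes the scheme suitable for streaming or very large datasets.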

Description

1604.02634v1.pdf
