This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Space-alternating generalized expectation-maximization algorithm
Citations: 1,048
Authors: 2
Year: 1994
Abstract
The expectation-maximization (EM) method can facilitate maximizing likelihood functions that arise in statistical estimation problems. In the classical EM paradigm, one iteratively maximizes the conditional log-likelihood of a single unobservable complete data space, rather than maximizing the intractable likelihood function for the measured or incomplete data. EM algorithms update all parameters simultaneously, which has two drawbacks: 1) slow convergence, and 2) difficult maximization steps due to coupling when smoothness penalties are used. The paper describes the space-alternating generalized EM (SAGE) method, which updates the parameters sequentially by alternating between several small hidden-data spaces defined by the algorithm designer. The authors prove that the sequence of estimates monotonically increases the penalized-likelihood objective, derive asymptotic convergence rates, and provide sufficient conditions for monotone convergence in norm. Two signal processing applications illustrate the method: estimation of superimposed signals in Gaussian noise, and image reconstruction from Poisson measurements. In both applications, the SAGE algorithms easily accommodate smoothness penalties and converge faster than the EM algorithms.
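The contrast the abstract draws between simultaneous EM updates and sequential SAGE updates can be made concrete with a toy version of the first application (superimposed signals in Gaussian noise). The sketch below is illustrative only and is not code from the paper: it assumes a linear model y = F a + noise with known waveforms F and unknown amplitudes a, uses an equal noise split for the classical EM variant, and updates one amplitude at a time against the full residual for the SAGE-style sweep; all names (em_step, sage_step, and the toy signals) are hypothetical.

import numpy as np

rng = np.random.default_rng(0)

# Toy problem: y = a1*f1 + a2*f2 + noise; estimate the amplitudes a = (a1, a2).
n, K = 200, 2
t = np.arange(n)
F = np.column_stack([np.cos(0.2 * t), np.cos(0.25 * t)])   # known, partially correlated waveforms
a_true = np.array([1.0, -0.5])
y = F @ a_true + 0.1 * rng.standard_normal(n)

def neg_log_lik(a):
    """Up to constants, the Gaussian negative log-likelihood is the residual energy."""
    r = y - F @ a
    return 0.5 * r @ r

def em_step(a, beta):
    """Classical EM flavor: split the noise among components with weights beta
    (summing to 1) and update all amplitudes simultaneously from the shared residual."""
    r = y - F @ a
    return a + beta * (F.T @ r) / np.sum(F**2, axis=0)

def sage_step(a):
    """SAGE-style sweep: update one amplitude at a time, assigning the full noise
    to that component's hidden-data space and refreshing the residual after each
    update (exact coordinate-wise maximization in this quadratic case)."""
    a = a.copy()
    for k in range(K):
        r = y - F @ a
        a[k] += F[:, k] @ r / (F[:, k] @ F[:, k])
    return a

a_em = np.zeros(K)
a_sage = np.zeros(K)
beta = np.full(K, 1.0 / K)          # equal noise split for the EM variant
for it in range(20):
    a_em = em_step(a_em, beta)
    a_sage = sage_step(a_sage)
    print(f"iter {it+1:2d}   EM obj {neg_log_lik(a_em):10.4f}   SAGE obj {neg_log_lik(a_sage):10.4f}")

In this quadratic toy problem the SAGE sweep amounts to Gauss-Seidel-style exact coordinate maximization, while the noise-splitting EM takes damped simultaneous (Jacobi-like) steps, which is one way to see why the sequential updates described in the abstract converge faster.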
Related works
Compressed sensing
2006 · 22,872 citations
Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
1984 · 17,888 citations
Compressed sensing
2004 · 17,132 citations
Regularization Paths for Generalized Linear Models via Coordinate Descent
2010 · 16,668 citations
Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
2006 · 15,635 citations