OpenAlex · Updated hourly · Last updated: 06.04.2026, 04:36

This is an overview page with metadata for this scientific work. The full article is available from the publisher.

Space-alternating generalized expectation-maximization algorithm

1994 · 1,048 citations · IEEE Transactions on Signal Processing · Open Access
Open full text at publisher

1,048 citations · 2 authors · 1994

Abstract

The expectation-maximization (EM) method can facilitate maximizing likelihood functions that arise in statistical estimation problems. In the classical EM paradigm, one iteratively maximizes the conditional log-likelihood of a single unobservable complete data space, rather than maximizing the intractable likelihood function for the measured or incomplete data. EM algorithms update all parameters simultaneously, which has two drawbacks: 1) slow convergence, and 2) difficult maximization steps due to coupling when smoothness penalties are used. The paper describes the space-alternating generalized EM (SAGE) method, which updates the parameters sequentially by alternating between several small hidden-data spaces defined by the algorithm designer. The authors prove that the sequence of estimates monotonically increases the penalized-likelihood objective, derive asymptotic convergence rates, and provide sufficient conditions for monotone convergence in norm. Two signal processing applications illustrate the method: estimation of superimposed signals in Gaussian noise, and image reconstruction from Poisson measurements. In both applications, the SAGE algorithms easily accommodate smoothness penalties and converge faster than the EM algorithms.
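The space-alternating idea above — updating one parameter group at a time, each with its own small hidden-data space — can be sketched on a toy version of the paper's first application, superimposed signals in Gaussian noise. This is a minimal illustration, not the authors' implementation: the signals `s1`, `s2`, amplitudes, and noise level are invented, and for this linear-Gaussian case each SAGE-style update reduces to a closed-form coordinate update against the residual.

```python
import numpy as np

# Hypothetical toy model: y = a1*s1 + a2*s2 + Gaussian noise.
# All signal shapes and parameter values below are illustrative assumptions.
rng = np.random.default_rng(0)
n = 200
t = np.arange(n)
s1 = np.cos(2 * np.pi * 0.05 * t)
s2 = np.cos(2 * np.pi * 0.12 * t)
a_true = np.array([2.0, -1.5])
y = a_true[0] * s1 + a_true[1] * s2 + 0.1 * rng.standard_normal(n)

S = [s1, s2]
a = np.zeros(2)
for _ in range(50):
    # Space-alternating sweep: update one amplitude at a time,
    # holding the others fixed (a small hidden-data space per parameter).
    for k in range(2):
        residual = y - sum(a[j] * S[j] for j in range(2) if j != k)
        a[k] = residual @ S[k] / (S[k] @ S[k])  # closed-form per-coordinate M-step

print(a)  # sequential updates drive a toward a_true
```

Because each sweep maximizes the (penalized) likelihood over one coordinate exactly, the objective is nondecreasing at every step, which mirrors the monotonicity property the paper proves for SAGE in general.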


Topics

Sparse and Compressive Sensing Techniques · Target Tracking and Data Fusion in Sensor Networks · Medical Imaging Techniques and Applications