Sequential Sample Average Majorization–Minimization

Research output: Contribution to journal › Article › peer-review

Abstract

Many statistical inference and machine learning methods rely on the ability to optimize an expectation functional whose explicit form is intractable. The typical method for conducting such optimization is to approximate the expected value problem by a size-N sample average, often referred to as Sample Average Approximation (SAA) or M-estimation. When the solution to the SAA problem cannot be obtained in closed form, the Majorization–Minimization (MM) algorithm framework constitutes a broad class of incremental optimization solutions, relying on the iterative construction of surrogates, known as majorizers, of the original problem. The ability to solve an SAA problem depends on the availability of all N observations, contemporaneously, which is difficult when N is large or data are observed as a stream. We propose a stochastic MM algorithm that solves the expected value problem via iterative SAA majorizer constructions using sequential subsets of data, which we call Sequential Sample Average Majorization–Minimization (SAM2). Compared to previous stochastic MM algorithm variants, our method permits an extended definition of majorizers, and does not rely on convexity assumptions, smoothness assumptions, or restrictions on functional classes for objectives and majorizers. We develop a theory of stochastic convergence for SAM2, made possible via the presentation of a novel double array uniform strong law of large numbers. Examples of SAM2 algorithms are given, along with a numerical demonstration of SAM2 applied to quantile regression problems in the regular and sparse parameter settings, including both convex and non-convex objective functions.
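To make the abstract's idea concrete, the following is a minimal illustrative sketch, not the paper's exact SAM2 recipe: quantile regression fitted by MM surrogates built on growing prefixes of a data stream. The check loss ρ_τ(r) = r(τ − 1[r < 0]) is majorized at the current residuals by the well-known quadratic surrogate of Hunter and Lange, so each step reduces to a weighted least-squares solve on the current subsample. The function name, the prefix-doubling schedule, and the smoothing constant `eps` are all assumptions made for the example.

```python
import numpy as np

def sam2_quantile(X, y, tau=0.5, eps=1e-4, n0=32, sweeps=12):
    """Illustrative sketch (not the paper's exact algorithm):
    quantile regression via MM majorizers built on growing
    prefixes of the data, mimicking sequential sample averages.
    The check loss rho_tau(r) = r*(tau - 1[r < 0]) is majorized
    by a quadratic surrogate, so each step is weighted least squares."""
    n, p = X.shape
    beta = np.zeros(p)
    m = n0
    for _ in range(sweeps):
        Xm, ym = X[:m], y[:m]            # sample average over first m points
        r = ym - Xm @ beta               # residuals at current iterate
        d = 1.0 / (eps + np.abs(r))      # surrogate (IRLS-style) weights
        # Normal equations of the quadratic surrogate:
        #   X' D X beta = X' D y + (2*tau - 1) * X' 1
        A = Xm.T @ (d[:, None] * Xm)
        b = Xm.T @ (d * ym) + (2.0 * tau - 1.0) * Xm.sum(axis=0)
        beta = np.linalg.solve(A, b)
        m = min(n, 2 * m)                # enlarge the sequential subsample
    return beta
```

With `tau=0.5` the surrogate's linear term vanishes and the iteration reduces to iteratively reweighted least squares for median (LAD) regression; growing the subsample lets early, cheap steps use few observations while later steps average over more data.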

Original language: English
Article number: 31
Journal: Statistics and Computing
Volume: 36
Issue number: 1
DOIs
Publication status: Published - 1 Feb 2026
Externally published: Yes

Keywords

  • Majorization–minimization algorithms
  • Non-convex optimization
  • Quantile regression
  • Sample average approximation
  • Stochastic optimization

