Core-elements for large-scale least squares estimation

Mengyu Li, Jun Yu, Tao Li, Cheng Meng*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The coresets approach, also called subsampling or subset selection, aims to select a subsample as a surrogate for the observed sample and has found extensive application in large-scale data analysis. Existing coresets methods construct the subsample using a subset of rows from the predictor matrix. Such methods can be significantly inefficient when the predictor matrix is sparse or numerically sparse. To overcome this limitation, we develop a novel element-wise subset selection approach, called core-elements, for large-scale least squares estimation. We provide a deterministic algorithm to construct the core-elements estimator, requiring only O(nnz(X) + rp²) computational cost, where X is an n×p predictor matrix, r is the number of elements selected from each column of X, and nnz(·) denotes the number of non-zero elements. Theoretically, we show that the proposed estimator is unbiased and approximately minimizes an upper bound of the estimation variance. We also provide an approximation guarantee by deriving a coresets-like finite sample bound for the proposed estimator. To handle potential outliers in the data, we further combine core-elements with the median-of-means procedure, resulting in an efficient and robust estimator with theoretical consistency guarantees. Numerical studies on various synthetic and real-world datasets demonstrate the proposed method’s superior performance compared to mainstream competitors.
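The abstract describes the estimator only at a high level, so the following is a minimal sketch of one plausible reading of it, not the algorithm derived in the paper: it assumes the selection rule keeps the r largest-magnitude entries of each column, that the surrogate estimator simply solves least squares with the sparsified matrix, and that the median-of-means variant takes a coordinate-wise median of block-wise estimates. The function names core_elements_sketch and mom_core_elements_sketch are hypothetical.

# Illustrative sketch only: the selection rule, the estimator, and the
# median-of-means combination below are assumptions made for illustration,
# not the paper's algorithm.
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import lsqr


def core_elements_sketch(X, y, r):
    """Element-wise subset selection: keep the r largest |entries| per column."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    X_tilde = np.zeros_like(X)
    for j in range(p):
        col = X[:, j]
        keep = np.argpartition(np.abs(col), n - r)[n - r:]  # indices of top-r magnitudes
        X_tilde[keep, j] = col[keep]
    # solve least squares with the sparse surrogate in place of X
    return lsqr(sparse.csr_matrix(X_tilde), y)[0]


def mom_core_elements_sketch(X, y, r, n_blocks=5, rng=None):
    """Median-of-means wrapper: estimate on disjoint row blocks,
    then take the coordinate-wise median of the block estimates."""
    rng = np.random.default_rng(rng)
    blocks = np.array_split(rng.permutation(X.shape[0]), n_blocks)
    estimates = [core_elements_sketch(X[idx], y[idx], min(r, len(idx)))
                 for idx in blocks]
    return np.median(np.vstack(estimates), axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p, r = 2000, 5, 200
    X = rng.standard_normal((n, p)) * (rng.random((n, p)) < 0.1)  # numerically sparse design
    beta_true = rng.standard_normal(p)
    y = X @ beta_true + 0.1 * rng.standard_normal(n)
    print(core_elements_sketch(X, y, r))
    print(mom_core_elements_sketch(X, y, r, n_blocks=4))

The toy example uses a numerically sparse design, the setting the abstract highlights as problematic for row-wise coresets methods.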

Original language: English
Article number: 190
Journal: Statistics and Computing
Volume: 34
Issue number: 6
DOIs
Publication status: Published - Dec 2024

Keywords

  • Coresets
  • Linear model
  • Sparse matrix
  • Subset selection
