SlimML: Removing Non-Critical Input Data in Large-Scale Iterative Machine Learning

Rui Han, Chi Harold Liu*, Shilin Li, Lydia Y. Chen, Guoren Wang, Jian Tang, Jieping Ye

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

10 Citations (Scopus)

Abstract

The core of many large-scale machine learning (ML) applications, such as neural networks (NN), support vector machines (SVM), and convolutional neural networks (CNN), is the training algorithm that iteratively updates model parameters by processing massive datasets. Across the plethora of studies aiming at accelerating ML, such as data parallelization and parameter servers, the prevalent assumption is that all data points are equally relevant to model parameter updating. In this article, we challenge this assumption by proposing a criterion to measure a data point's effect on model parameter updating, and experimentally demonstrate that the majority of data points are non-critical in the training process. We develop a slim learning framework, termed SlimML, which trains the ML models only on the critical data and thus significantly improves training performance. To this end, SlimML efficiently leverages a small number of aggregated data points per iteration to approximate the criticalness of the original input data instances. The proposed approach can be adopted by changing a few lines of code in a standard stochastic gradient descent (SGD) procedure, and we demonstrate experimentally, on NN regression, SVM classification, and CNN training, that for large datasets it accelerates the model training process by an average of 3.61 times while incurring an accuracy loss of only 0.37 percent.
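As a rough illustration of the idea described in the abstract (not the authors' implementation), the sketch below modifies a standard mini-batch SGD step so that parameters are updated only on points judged "critical". The criticalness proxy used here (per-sample gradient norm for a linear regression loss), the keep_ratio value, and the toy linear-regression setting are assumptions made for this sketch; the paper's actual criterion and aggregation scheme differ.

```python
import numpy as np

# Hypothetical sketch: at each SGD iteration, score candidate points by an
# approximate "criticalness" measure and update parameters only on the
# highest-scoring subset. The proxy below (per-sample gradient norm for a
# squared-loss linear model) and keep_ratio are illustrative assumptions,
# not the paper's exact SlimML criterion.

def slim_sgd_step(w, X_batch, y_batch, lr=0.01, keep_ratio=0.25):
    """One SGD step on a linear model y ~ X @ w, restricted to 'critical' points."""
    residuals = X_batch @ w - y_batch                 # per-sample prediction error
    # Per-sample gradient of the squared loss is residual * x; its magnitude
    # serves as a cheap score (larger gradient -> larger effect on the update).
    scores = np.abs(residuals) * np.linalg.norm(X_batch, axis=1)
    k = max(1, int(keep_ratio * len(y_batch)))
    critical = np.argsort(scores)[-k:]                # keep the top-k most critical points
    grad = X_batch[critical].T @ residuals[critical] / k
    return w - lr * grad

# Toy usage: fit w on synthetic data, processing only critical points per step.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
true_w = rng.normal(size=5)
y = X @ true_w + 0.01 * rng.normal(size=1000)
w = np.zeros(5)
for _ in range(200):
    idx = rng.choice(len(y), size=64, replace=False)  # draw a mini-batch
    w = slim_sgd_step(w, X[idx], y[idx])
```

The only change relative to a plain SGD loop is the scoring and filtering step before the gradient is computed, which matches the abstract's claim that the approach requires altering only a few lines of a standard SGD procedure.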

Original language: English
Article number: 8890886
Pages (from-to): 2223-2236
Number of pages: 14
Journal: IEEE Transactions on Knowledge and Data Engineering
Volume: 33
Issue number: 5
DOI: 10.1109/TKDE.2019.2951388
Publication status: Published - 1 May 2021

Cite this

Han, R., Liu, C. H., Li, S., Chen, L. Y., Wang, G., Tang, J., & Ye, J. (2021). SlimML: Removing Non-Critical Input Data in Large-Scale Iterative Machine Learning. IEEE Transactions on Knowledge and Data Engineering, 33(5), 2223-2236. Article 8890886. https://doi.org/10.1109/TKDE.2019.2951388