TY - JOUR
T1 - Local probabilistic model for Bayesian classification
T2 - A generalized local classification model
AU - Mao, Chengsheng
AU - Lu, Lijuan
AU - Hu, Bin
N1 - Publisher Copyright:
© 2020 Elsevier B.V.
PY - 2020/8
Y1 - 2020/8
N2 - In Bayesian classification, it is important to establish a probability distribution model, e.g., a Gaussian distribution, for each class for probability estimation. Most previous methods model the probability distribution in the whole sample space. However, real-world problems are usually too complex to model in the whole sample space; some fundamental assumptions are required to simplify the global model, for example, the class conditional independence assumption in naive Bayesian classification. In this paper, with the insight that the distribution in a local sample space should be simpler than that in the whole sample space, a local probabilistic model established for a local region is expected to be much simpler and can relax fundamental assumptions that may not hold in the whole sample space. Based on these advantages, we propose establishing local probabilistic models for probability estimation in Bayesian classification. In addition, a Bayesian classifier adopting a local probabilistic model can be viewed as a generalized local classification model; by tuning the size of the local region and the corresponding local model assumption, a suitable model can be established for a particular classification problem. Experimental results on several real-world datasets demonstrate the effectiveness of local probabilistic models for Bayesian classification.
AB - In Bayesian classification, it is important to establish a probability distribution model, e.g., a Gaussian distribution, for each class for probability estimation. Most previous methods model the probability distribution in the whole sample space. However, real-world problems are usually too complex to model in the whole sample space; some fundamental assumptions are required to simplify the global model, for example, the class conditional independence assumption in naive Bayesian classification. In this paper, with the insight that the distribution in a local sample space should be simpler than that in the whole sample space, a local probabilistic model established for a local region is expected to be much simpler and can relax fundamental assumptions that may not hold in the whole sample space. Based on these advantages, we propose establishing local probabilistic models for probability estimation in Bayesian classification. In addition, a Bayesian classifier adopting a local probabilistic model can be viewed as a generalized local classification model; by tuning the size of the local region and the corresponding local model assumption, a suitable model can be established for a particular classification problem. Experimental results on several real-world datasets demonstrate the effectiveness of local probabilistic models for Bayesian classification.
KW - Bayesian decision
KW - Classification
KW - Local learning
KW - Probabilistic model
KW - Probability estimation
UR - http://www.scopus.com/inward/record.url?scp=85084763358&partnerID=8YFLogxK
U2 - 10.1016/j.asoc.2020.106379
DO - 10.1016/j.asoc.2020.106379
M3 - Article
AN - SCOPUS:85084763358
SN - 1568-4946
VL - 93
JO - Applied Soft Computing
JF - Applied Soft Computing
M1 - 106379
ER -