Abstract
This brief provides theoretical guarantees and practical guidance for constructing a class of graphs from input data via a distance-preserving criterion. Unlike graphs constructed by other methods, the targeted graphs arise implicitly from estimating a density function of latent variables such that the pairwise distances in both the input space and the latent space are retained, and they have been successfully applied to various learning scenarios. However, previous work heuristically treated the multipliers of the dual problem as the graph weights, so a theoretical interpretation of this graph is still missing. In this brief, we fill this gap by presenting a detailed interpretation based on optimality conditions and their connections to neighborhood graphs. We further provide a systematic way to set the hyperparameters so as to avoid trivial graphs and to achieve varied levels of sparsity. Three extensions are explored: leveraging different measure functions, refining/reweighting an initial graph, and reducing the computation cost for medium-sized graphs. Extensive experiments on both synthetic and real datasets verify our theoretical findings, and a showcase of the studied graph in semisupervised learning yields results competitive with those of the compared methods using their best graphs.
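To make the downstream use case in the abstract concrete, the sketch below shows a generic sparse neighborhood graph fed into graph-based semisupervised label propagation. This is not the paper's distance-preserving construction: the kNN/Gaussian weighting, the bandwidth `sigma`, the neighborhood size `k`, and the propagation parameter `alpha` are illustrative assumptions only.

```python
# Minimal sketch (assumed setup, not the paper's method): build a sparse
# neighborhood graph and run graph-based semisupervised label propagation on it.
import numpy as np

def knn_graph(X, k=10, sigma=1.0):
    """Symmetric sparse affinity matrix from k-nearest neighbors with Gaussian weights."""
    n = X.shape[0]
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]            # skip the point itself
        W[i, idx] = np.exp(-d2[i, idx] / (2 * sigma ** 2))
    return np.maximum(W, W.T)                        # symmetrize

def label_propagation(W, y, alpha=0.99, n_iter=200):
    """Label spreading: iterate F <- alpha * S F + (1 - alpha) * Y on the normalized graph."""
    n = W.shape[0]
    classes = np.unique(y[y >= 0])
    Y = np.zeros((n, classes.size))
    for c, cls in enumerate(classes):
        Y[y == cls, c] = 1.0                         # one-hot rows for labeled points
    d = np.maximum(W.sum(axis=1), 1e-12)
    S = W / np.sqrt(d[:, None] * d[None, :])         # D^{-1/2} W D^{-1/2}
    F = Y.copy()
    for _ in range(n_iter):
        F = alpha * S @ F + (1 - alpha) * Y
    return classes[F.argmax(axis=1)]

# Usage: y holds class indices for labeled points and -1 for unlabeled ones.
X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 4.0])
y = -np.ones(100, dtype=int); y[0], y[50] = 0, 1
pred = label_propagation(knn_graph(X, k=8, sigma=1.5), y)
```

Any graph construction, including the distance-preserving one studied in this brief, can be plugged in by replacing `knn_graph` with the affinity matrix it produces.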
Original language | English |
---|---|
Pages (from-to) | 6642-6649 |
Number of pages | 8 |
Journal | IEEE Transactions on Neural Networks and Learning Systems |
Volume | 34 |
Issue number | 9 |
DOIs | |
Publication status | Published - 1 Sept 2023 |
Externally published | Yes |
Keywords
- Density estimation
- distance preservation
- graph construction and learning
- sparse graph