Abstract
To compare the similarity of probability distributions, information-theoretically motivated metrics such as the Kullback-Leibler divergence (KL) and the Jensen-Shannon divergence (JSD) are often more appropriate than vector metrics such as Euclidean or angular distance. However, existing locality-sensitive hashing (LSH) algorithms do not support these information-theoretically motivated metrics for probability distributions. In this paper, we first introduce a new approximation formula for the S2JSD-distance, and then propose a novel LSH scheme adapted to the S2JSD-distance for approximate nearest neighbor search in high-dimensional probability distributions. We define the corresponding hash functions and prove their locality-sensitivity. Furthermore, extensive empirical evaluations illustrate the effectiveness of the proposed hashing scheme on six public image datasets and two text datasets, in terms of mean Average Precision, Precision@N, and the Precision-Recall curve.
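For readers unfamiliar with the distance the paper targets, the following is a minimal Python sketch of the quantities involved, assuming (as is standard in the literature, not stated in this abstract) that S2JSD denotes the square root of twice the Jensen-Shannon divergence, which makes it a proper metric; it does not reproduce the paper's approximation formula or hash functions.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def jsd(p, q):
    """Jensen-Shannon divergence: average KL of p and q against their midpoint."""
    p = np.asarray(p, dtype=float) / np.sum(p)
    q = np.asarray(q, dtype=float) / np.sum(q)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

def s2jsd(p, q):
    """S2JSD: square root of twice the JSD (assumed metric form, see lead-in)."""
    return float(np.sqrt(2.0 * jsd(p, q)))

# Example: two 4-bin probability distributions (e.g., topic proportions)
p = [0.4, 0.3, 0.2, 0.1]
q = [0.1, 0.2, 0.3, 0.4]
print(s2jsd(p, q))
```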
| Original language | English |
| --- | --- |
| Pages | 3244-3251 |
| Number of pages | 8 |
| Publication status | Published - 2017 |
| Event | 31st AAAI Conference on Artificial Intelligence, AAAI 2017 - San Francisco, United States. Duration: 4 Feb 2017 → 10 Feb 2017 |
Conference
| Conference | 31st AAAI Conference on Artificial Intelligence, AAAI 2017 |
| --- | --- |
| Country/Territory | United States |
| City | San Francisco |
| Period | 4/02/17 → 10/02/17 |