TY - JOUR
T1 - Similarity Transitivity Broken-Aware Multi-Modal Hashing
AU - Tu, Rong Cheng
AU - Mao, Xian Ling
AU - Liu, Jinyu
AU - Ji, Yatai
AU - Wei, Wei
AU - Huang, Heyan
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Due to their low storage cost and fast retrieval speed, multi-modal hashing methods, which map instances with different modal data views into hash codes, have attracted increasing research attention. Most existing supervised multi-modal hashing methods exploit label information to define the similarities between instance pairs and use them to supervise the training process. However, such methods ignore that the transitivity of their defined similarity is broken in multi-label scenarios, i.e., instance x is similar to instance y, and instance z is also similar to instance y, but x may not be similar to z, which leads to fluctuations in the model optimization process and damages retrieval performance. For example, in the first batch with instances x and y but without z, the model is optimized to make the hash codes of x and y similar to each other; in the second batch with instances z and y but without x, the model is optimized to make the hash codes of z and y similar to each other; in the third batch with instances x and z but without y, the model is optimized to make the hash codes of x and z dissimilar to each other, and in this process the hash codes of x and z may also become dissimilar to that of y, which damages the optimization results of the first two batches. Therefore, we propose a novel Similarity Transitivity Broken-aware Multi-modal Hashing method, called STBMH, which solves this problem by adding a novel regularization loss to the original pair-wise loss. For each instance x in a training batch, the regularization loss takes all instances in the training set into account. Extensive experiments on four widely used datasets show that the proposed method achieves better performance than the state-of-the-art baselines on the multi-modal retrieval task.
AB - Due to their low storage cost and fast retrieval speed, multi-modal hashing methods, which map instances with different modal data views into hash codes, have attracted increasing research attention. Most existing supervised multi-modal hashing methods exploit label information to define the similarities between instance pairs and use them to supervise the training process. However, such methods ignore that the transitivity of their defined similarity is broken in multi-label scenarios, i.e., instance x is similar to instance y, and instance z is also similar to instance y, but x may not be similar to z, which leads to fluctuations in the model optimization process and damages retrieval performance. For example, in the first batch with instances x and y but without z, the model is optimized to make the hash codes of x and y similar to each other; in the second batch with instances z and y but without x, the model is optimized to make the hash codes of z and y similar to each other; in the third batch with instances x and z but without y, the model is optimized to make the hash codes of x and z dissimilar to each other, and in this process the hash codes of x and z may also become dissimilar to that of y, which damages the optimization results of the first two batches. Therefore, we propose a novel Similarity Transitivity Broken-aware Multi-modal Hashing method, called STBMH, which solves this problem by adding a novel regularization loss to the original pair-wise loss. For each instance x in a training batch, the regularization loss takes all instances in the training set into account. Extensive experiments on four widely used datasets show that the proposed method achieves better performance than the state-of-the-art baselines on the multi-modal retrieval task.
KW - hashing
KW - Multi-modal retrieval
KW - semantic similarity
UR - http://www.scopus.com/inward/record.url?scp=85195415237&partnerID=8YFLogxK
U2 - 10.1109/TKDE.2024.3396492
DO - 10.1109/TKDE.2024.3396492
M3 - Article
AN - SCOPUS:85195415237
SN - 1041-4347
VL - 36
SP - 7003
EP - 7014
JO - IEEE Transactions on Knowledge and Data Engineering
JF - IEEE Transactions on Knowledge and Data Engineering
IS - 11
ER -