TY - JOUR
T1 - Pre-training for Legal Case Retrieval Based on Inter-Case Distinctions
AU - Su, Weihang
AU - Ai, Qingyao
AU - Wu, Yueyue
AU - Xie, Anzhe
AU - Wang, Changyue
AU - Ma, Yixiao
AU - Li, Haitao
AU - Wu, Zhijing
AU - Liu, Yiqun
AU - Zhang, Min
N1 - Publisher Copyright:
© 2025 Copyright held by the owner/author(s).
PY - 2025/7/23
Y1 - 2025/7/23
N2 - Legal case retrieval aims to help legal workers find cases relevant to the case at hand, which is important for guaranteeing fairness and justice in legal judgments. While recent advances in neural retrieval methods have significantly improved the performance of open-domain retrieval tasks (e.g., Web search), their advantages have not been observed in legal case retrieval due to their heavy reliance on annotated data. As annotating large-scale training data in legal domains is prohibitive due to the need for domain expertise, traditional search techniques based on lexical matching, such as TF-IDF, BM25, and Query Likelihood, are still prevalent in legal case retrieval systems. While previous studies have designed several pre-training methods for IR models in open-domain tasks, these methods are usually suboptimal in legal case retrieval because they cannot understand and capture the key knowledge and data structures in the legal corpus. To this end, we propose a novel pre-training framework named Caseformer that enables pre-trained models to learn legal knowledge and domain-specific relevance-matching patterns in legal case retrieval without any human-labeled data. This framework is designed to support both dense retrieval models and neural re-ranking models. Through three unsupervised learning tasks, Caseformer is able to capture the special language, document structure, and relevance-matching patterns of legal case documents, making it a strong backbone for downstream legal case retrieval tasks. Experimental results show that our model achieves state-of-the-art performance in both zero-shot and fine-tuning settings. Moreover, experiments on both Chinese and English legal datasets demonstrate that the effectiveness of Caseformer in legal case retrieval is language-independent.
AB - Legal case retrieval aims to help legal workers find cases relevant to the case at hand, which is important for guaranteeing fairness and justice in legal judgments. While recent advances in neural retrieval methods have significantly improved the performance of open-domain retrieval tasks (e.g., Web search), their advantages have not been observed in legal case retrieval due to their heavy reliance on annotated data. As annotating large-scale training data in legal domains is prohibitive due to the need for domain expertise, traditional search techniques based on lexical matching, such as TF-IDF, BM25, and Query Likelihood, are still prevalent in legal case retrieval systems. While previous studies have designed several pre-training methods for IR models in open-domain tasks, these methods are usually suboptimal in legal case retrieval because they cannot understand and capture the key knowledge and data structures in the legal corpus. To this end, we propose a novel pre-training framework named Caseformer that enables pre-trained models to learn legal knowledge and domain-specific relevance-matching patterns in legal case retrieval without any human-labeled data. This framework is designed to support both dense retrieval models and neural re-ranking models. Through three unsupervised learning tasks, Caseformer is able to capture the special language, document structure, and relevance-matching patterns of legal case documents, making it a strong backbone for downstream legal case retrieval tasks. Experimental results show that our model achieves state-of-the-art performance in both zero-shot and fine-tuning settings. Moreover, experiments on both Chinese and English legal datasets demonstrate that the effectiveness of Caseformer in legal case retrieval is language-independent.
KW - Contrastive Learning
KW - Legal Case Retrieval
KW - Pre-training Methods
UR - https://www.scopus.com/pages/publications/105018462569
U2 - 10.1145/3735127
DO - 10.1145/3735127
M3 - Article
AN - SCOPUS:105018462569
SN - 1046-8188
VL - 43
JO - ACM Transactions on Information Systems
JF - ACM Transactions on Information Systems
IS - 5
M1 - 129
ER -