DADER: Hands-Off Entity Resolution with Domain Adaptation

Jianhong Tu, Xiaoyue Han, Ju Fan* (fanj@ruc.edu.cn), Nan Tang, Chengliang Chai, Guoliang Li, Xiaoyong Du

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

5 Citations (Scopus)

Abstract

Entity resolution (ER) is a core data integration problem that identifies pairs of data instances referring to the same real-world entity, and the state-of-the-art ER results are achieved by deep learning (DL) based approaches. However, DL-based approaches typically require a large amount of labeled training data (i.e., matching and non-matching pairs), which incurs substantial manual labeling effort. In this paper, we introduce DADER, a hands-off deep ER system through domain adaptation. DADER utilizes multiple well-labeled source ER datasets to train a DL-based ER model for a new target ER dataset that has no labels or only a few labels. To address the key challenge of domain shift, DADER judiciously selects labeled entity pairs from the source and then aligns the distributions of the source and the target using six popular domain adaptation strategies. DADER can also involve users to gather a few labels for further improvement. We have built DADER as an open-source Python library with intuitive APIs and demonstrate its utility in supporting hands-off ER in real-world scenarios.
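The core idea described in the abstract is to train an ER matcher on labeled source pairs while aligning the feature distributions of the source and the (unlabeled) target domain. The following is a minimal PyTorch sketch of that idea using a simple linear-kernel MMD alignment term; it is not DADER's actual API, and all names here (ERMatcher, mmd_loss, train_step, alpha) are hypothetical illustrations of one of the distribution-alignment strategies the paper discusses.

```python
# Illustrative sketch only: supervised loss on labeled source pairs plus a
# distribution-alignment (MMD) loss against unlabeled target pairs.
# Not the DADER library's API; names and hyperparameters are assumptions.
import torch
import torch.nn as nn


class ERMatcher(nn.Module):
    """Encodes a featurized entity pair and classifies it as match / non-match."""

    def __init__(self, input_dim=768, hidden_dim=256):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())
        self.classifier = nn.Linear(hidden_dim, 2)  # match vs. non-match

    def forward(self, x):
        feat = self.encoder(x)
        return feat, self.classifier(feat)


def mmd_loss(source_feat, target_feat):
    """Linear-kernel MMD: squared distance between the domains' mean features."""
    return (source_feat.mean(dim=0) - target_feat.mean(dim=0)).pow(2).sum()


def train_step(model, optimizer, src_x, src_y, tgt_x, alpha=0.1):
    """One update: cross-entropy on labeled source pairs + alignment to the target."""
    model.train()
    src_feat, src_logits = model(src_x)
    tgt_feat, _ = model(tgt_x)
    loss = nn.functional.cross_entropy(src_logits, src_y) + alpha * mmd_loss(src_feat, tgt_feat)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch the alignment weight alpha trades off fitting the labeled source pairs against pulling the two domains' feature distributions together; DADER's other adaptation strategies (e.g., adversarial alignment) would replace the MMD term with a different alignment objective.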

Original language: English
Pages (from-to): 3666-3669
Number of pages: 4
Journal: Proceedings of the VLDB Endowment
Volume: 15
Issue number: 12
DOIs: 10.14778/3554821.3554870
Publication status: Published - 2022
Externally published: Yes
Event: 48th International Conference on Very Large Data Bases, VLDB 2022 - Sydney, Australia
Duration: 5 Sept 2022 - 9 Sept 2022


Cite this

Tu, J., Han, X., Fan, J., Tang, N., Chai, C., Li, G., & Du, X. (2022). DADER: Hands-Off Entity Resolution with Domain Adaptation. Proceedings of the VLDB Endowment, 15(12), 3666-3669. https://doi.org/10.14778/3554821.3554870