LELA: an LLM-based Entity Linking Approach with Zero-Shot Domain Adaptation
arXiv:2601.05192v1
Authors
Samy Haffoudhi, Fabian M. Suchanek, Nils Holzenberger
Abstract
Entity linking (mapping ambiguous mentions in text to entities in a knowledge base) is a foundational step in tasks such as knowledge graph construction, question answering, and information extraction. Our method, LELA, is a modular coarse-to-fine approach that leverages the capabilities of large language models (LLMs) and works across different target domains, knowledge bases, and LLMs, without any fine-tuning phase. Our experiments across various entity linking settings show that LELA is highly competitive with fine-tuned approaches and substantially outperforms non-fine-tuned ones.
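The abstract describes a coarse-to-fine pipeline: a cheap coarse stage narrows the knowledge base down to a few candidate entities, and a fine stage uses an LLM to disambiguate among them. The sketch below illustrates that general pattern, not the paper's actual implementation: the toy knowledge base, the string-similarity retriever, and the word-overlap fallback (used so the sketch runs without a model) are all illustrative assumptions.

```python
from difflib import SequenceMatcher

# Toy knowledge base: entity id -> (label, description).
# Illustrative data, not from the paper.
KB = {
    "Q312": ("Apple Inc.", "American technology company"),
    "Q89": ("apple", "fruit of the apple tree"),
    "Q4817": ("Apple Records", "record label founded by the Beatles"),
}

def coarse_candidates(mention, kb, k=2):
    """Coarse stage: retrieve the top-k candidates by label similarity."""
    scored = sorted(
        kb.items(),
        key=lambda item: SequenceMatcher(
            None, mention.lower(), item[1][0].lower()
        ).ratio(),
        reverse=True,
    )
    return scored[:k]

def fine_disambiguate(mention, context, candidates, llm=None):
    """Fine stage: ask an LLM to pick one candidate given the context.

    `llm` is a stand-in callable taking a prompt string; a real system
    would prompt an actual model here.
    """
    prompt = f"Mention: {mention}\nContext: {context}\nCandidates:\n" + "\n".join(
        f"- {eid}: {label} ({desc})" for eid, (label, desc) in candidates
    )
    if llm is not None:
        return llm(prompt)
    # Fallback heuristic so the sketch runs without a model: pick the
    # candidate whose description shares the most words with the context.
    ctx = set(context.lower().split())
    best = max(candidates, key=lambda c: len(ctx & set(c[1][1].lower().split())))
    return best[0]

def link(mention, context, kb):
    """Full pipeline: coarse retrieval, then fine disambiguation."""
    return fine_disambiguate(mention, context, coarse_candidates(mention, kb))
```

Because both stages are plain functions, either can be swapped out independently, e.g. a dense retriever for the coarse stage or a different LLM for the fine stage, which mirrors the modularity the abstract claims.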
Paper Information
- arXiv ID: 2601.05192v1
- Published:
- Categories: cs.CL