Guillaume Buthmann, Tomoya Sakai, et al.
ICASSP 2025
Wikification of large corpora is beneficial for various NLP applications. Existing methods prioritize output quality over run-time and are therefore infeasible for large data. Here, we introduce RedW, a run-time-oriented Wikification solution based on Wikipedia redirects that can Wikify massive corpora with competitive performance. We further propose an efficient method for estimating RedW's confidence, opening the door to applying more demanding methods only to RedW's lower-confidence results. Our experimental results support the validity of the proposed approach.
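The abstract outlines a two-stage design: a fast redirect-based lookup plus a confidence estimate that routes only low-confidence links to a heavier linker. Below is a minimal Python sketch of that cascade pattern; all names (build_redirect_map, wikify_mention, the confidence and fallback callables, the 0.5 threshold) are illustrative assumptions, not the paper's actual interfaces.

# Illustrative sketch only (not the authors' implementation): wikify surface
# forms via a precomputed Wikipedia redirect map, and send low-confidence
# links to a slower, higher-quality fallback linker.
from typing import Callable, Dict, Iterable, Optional, Tuple

def build_redirect_map(redirect_pairs: Iterable[Tuple[str, str]]) -> Dict[str, str]:
    """Map lowercased surface forms (redirect sources) to canonical page titles."""
    redirects: Dict[str, str] = {}
    for source, target in redirect_pairs:
        redirects[source.lower()] = target
    return redirects

def wikify_mention(mention: str,
                   redirects: Dict[str, str],
                   confidence: Callable[[str, str], float],
                   fallback: Callable[[str], Optional[str]],
                   threshold: float = 0.5) -> Optional[str]:
    """Try the cheap redirect lookup first; defer to the expensive linker
    only when the estimated confidence of the redirect-based link is low."""
    target = redirects.get(mention.lower())
    if target is not None and confidence(mention, target) >= threshold:
        return target
    return fallback(mention)  # demanding method, applied only to the residue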
Revanth Reddy, Jaehyeok Doo, et al.
EMNLP 2024
Massimiliano Pronesti, Joao Bettencourt-Silva, et al.
ACL 2025
Nandana Mihindukulasooriya, Sarthak Dash, et al.
ISWC 2023