Learning from Novel Knowledge: Continual Few-shot Knowledge Graph Completion

Abstract

Knowledge graph (KG) completion has been increasingly recognized as a vital approach for uncovering missing knowledge and addressing the incompleteness of KGs. To enhance inference on rare relations and mitigate the impact of the long-tail distribution, the dominant strategy designs few-shot models following the meta-learning paradigm. However, these approaches typically assume that the entire KG is available up front, disregarding relations that newly emerge as the KG is enriched. The emergence of such novel relations therefore requires few-shot models to learn continually from emerging knowledge. Although promising, two significant obstacles, i.e., catastrophic forgetting and the scarcity of novel relations, hinder effective learning from newly emerging relations. In this paper, we propose a novel framework that equips the few-shot model with the ability to learn sequentially from novel relations. Specifically, we introduce strategies at both the data and model levels: data-level rehearsal and model-level modulation to address catastrophic forgetting, alongside multi-view relation augmentation to resolve the scarcity of novel relations. Extensive experiments on real-world KGs validate the effectiveness of our proposed method.
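The data-level rehearsal idea mentioned above can be illustrated with a generic replay buffer: keep a handful of support triples per previously seen relation and mix them into each new training batch, so the model revisits old relations while learning new ones. The sketch below is a minimal, hypothetical illustration of rehearsal in general, not the paper's actual mechanism; the class and method names (`RehearsalBuffer`, `mixed_batch`) are assumptions for exposition.

```python
import random


class RehearsalBuffer:
    """Illustrative rehearsal buffer for continual KG completion.

    Stores a few (head, relation, tail) triples per past relation and
    replays them alongside new-relation triples to mitigate forgetting.
    A generic sketch, not the paper's exact method.
    """

    def __init__(self, per_relation=2, seed=0):
        self.per_relation = per_relation
        self.memory = {}  # relation -> list of (head, relation, tail)
        self.rng = random.Random(seed)

    def store(self, relation, triples):
        # Keep at most `per_relation` randomly chosen triples per relation.
        sample = list(triples)
        self.rng.shuffle(sample)
        self.memory[relation] = sample[: self.per_relation]

    def replay(self, k):
        # Draw up to k triples from the memory of previously seen relations.
        pool = [t for ts in self.memory.values() for t in ts]
        return self.rng.sample(pool, min(k, len(pool)))

    def mixed_batch(self, new_triples, replay_k):
        # Training batch = new-task triples + rehearsed old-task triples.
        return list(new_triples) + self.replay(replay_k)
```

In a continual few-shot setting, `store` would be called once per completed task and `mixed_batch` at every training step on the next task, keeping memory cost bounded by the number of relations seen so far.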

Publication
Accepted by CIKM 2024 Full Research Paper Track
Zhuofeng Li
Fourth Year Undergraduate Student

My research interests include multimodal AI, large language models, and graph representation learning.