KGCRR: An Effective Metric-Driven Knowledge Graph Completion Framework by Designing a Novel Upper Bound Function with Adaptive Approximation to Reciprocal Rank
Abstract
Knowledge Graph Embedding (KGE) methods have achieved great success in predicting missing links in knowledge graphs, a task also known as Knowledge Graph Completion (KGC). For this task, the Reciprocal Rank (RR) of the ground-truth item serves as a key indicator of a method's performance. However, most existing studies overlook the inconsistency between the ranking metric, RR, and the optimization objective, resulting in sub-optimal KGC performance. To address this issue, we propose a KGC framework called KGCRR, built around a novel upper bound function named CRR. By introducing a pressure parameter ρ that shifts the sigmoid function, CRR approximates RR more closely than existing objective functions, and we theoretically prove that adjusting ρ makes this approximation more effective. By narrowing the discrepancy with RR and alleviating the gradient-vanishing issue associated with directly optimizing the RR loss, CRR offers a clear advantage in optimizing RR. CRR serves as a plug-and-play objective that can be seamlessly integrated into various KGE methods. Extensive experiments on the FB15k-237 and WN18RR datasets show an average improvement of 19.06% in MRR, indicating that CRR significantly enhances the performance of existing methods.
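To make the idea of a sigmoid-shifted surrogate concrete, the following is a minimal sketch, under our own assumptions rather than the exact CRR definition given later in the paper, of how a shift ρ can turn the non-differentiable RR metric into a smooth upper bound on the RR loss: the 0/1 comparisons inside the rank are replaced by sigmoids, and a positive ρ inflates each term so that the smoothed rank tends to dominate the true rank.

\[
\mathrm{RR}(i) = \frac{1}{\operatorname{rank}(i)}, \qquad
\operatorname{rank}(i) = 1 + \sum_{j \neq i} \mathbb{1}\!\left[s_j \geq s_i\right],
\]
\[
\widehat{\operatorname{rank}}_{\rho}(i) = 1 + \sum_{j \neq i} \sigma\!\left(s_j - s_i + \rho\right), \qquad
\mathcal{L}_{\rho}(i) = 1 - \frac{1}{\widehat{\operatorname{rank}}_{\rho}(i)},
\]

where \(s_i\) is the score of the ground-truth item, \(s_j\) are the scores of candidate items, and \(\sigma\) is the sigmoid. When \(\widehat{\operatorname{rank}}_{\rho}(i) \geq \operatorname{rank}(i)\), which a sufficiently large ρ encourages, \(\mathcal{L}_{\rho}(i) \geq 1 - \mathrm{RR}(i)\), so minimizing the smooth loss minimizes an upper bound of the RR loss; the notation \(\mathcal{L}_{\rho}\) here is illustrative and not the paper's formal definition of CRR.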