Geometric Foundation Model for Knowledge Graph Reasoning
Analysis
This paper introduces Gamma, a novel foundation model for knowledge graph reasoning that improves upon existing models like Ultra by using multi-head geometric attention. The key innovation is the use of multiple parallel relational transformations (real, complex, split-complex, and dual-number based) combined with a relation-conditioned attention fusion mechanism. This design aims to capture diverse relational and structural patterns, leading to improved performance in zero-shot inductive link prediction.
Key Takeaways
- Proposes Gamma, a new foundation model for knowledge graph reasoning.
- Employs multi-head geometric attention with diverse relational transformations.
- Uses a relation-conditioned attention fusion mechanism.
- Achieves significant performance improvements in zero-shot inductive link prediction compared to existing models.
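The core idea described above can be sketched in code. The snippet below is a hypothetical illustration, not the paper's actual architecture: each "geometric head" multiplies paired entity features by a relation embedding under a different number system (real, complex, split-complex, and dual numbers), and a softmax gate conditioned on the relation embedding fuses the head outputs. All names, shapes, and the gating scheme are assumptions for illustration.

```python
import numpy as np

# Features are stored as (d, 2) arrays: each row is one pair of components,
# interpreted differently by each head (e.g. real/imaginary for the complex head).

def complex_mul(a, b):
    # Complex numbers: i^2 = -1.
    re = a[..., 0] * b[..., 0] - a[..., 1] * b[..., 1]
    im = a[..., 0] * b[..., 1] + a[..., 1] * b[..., 0]
    return np.stack([re, im], axis=-1)

def split_complex_mul(a, b):
    # Split-complex (hyperbolic) numbers: j^2 = +1.
    re = a[..., 0] * b[..., 0] + a[..., 1] * b[..., 1]
    hy = a[..., 0] * b[..., 1] + a[..., 1] * b[..., 0]
    return np.stack([re, hy], axis=-1)

def dual_mul(a, b):
    # Dual numbers: eps^2 = 0.
    re = a[..., 0] * b[..., 0]
    ep = a[..., 0] * b[..., 1] + a[..., 1] * b[..., 0]
    return np.stack([re, ep], axis=-1)

def real_mul(a, b):
    # Element-wise real scaling by the relation's first component.
    return a * b[..., :1]

HEADS = [real_mul, complex_mul, split_complex_mul, dual_mul]

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def geometric_attention(entity, relation, gate_weights):
    """Fuse per-head transformed features with relation-conditioned attention.

    entity, relation: (d, 2) paired feature arrays.
    gate_weights: (4, 2*d) matrix scoring each head from the relation embedding
    (a hypothetical stand-in for the paper's learned fusion parameters).
    """
    heads = np.stack([h(entity, relation) for h in HEADS])   # (4, d, 2)
    scores = gate_weights @ relation.reshape(-1)             # (4,)
    alpha = softmax(scores)                                  # head weights sum to 1
    return np.tensordot(alpha, heads, axes=1)                # (d, 2) fused output
```

In this sketch, the four algebras give each head a distinct inductive bias (e.g. rotation-like patterns for the complex head, hyperbolic patterns for the split-complex head), and the relation-conditioned gate lets each relation emphasize the transformation that fits it best.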
“Gamma consistently outperforms Ultra in zero-shot inductive link prediction, with a 5.5% improvement in mean reciprocal rank on the inductive benchmarks and a 4.4% improvement across all benchmarks.”