Multi-hop reasoning over knowledge graphs: methods, applications and large-scale systems

Hongyu Ren (Stanford)

Learning low-dimensional embeddings of knowledge graphs (KGs) is a powerful approach for predicting unobserved or missing relations between entities. However, an open challenge in this area is developing techniques that go beyond single link prediction and handle more complex multi-hop logical queries, which may involve multiple unobserved edges, entities, and variables. In this talk I present a framework for efficiently and robustly answering multi-hop logical queries on knowledge graphs. Building on prior work that learns entity and relation embeddings on KGs, our key insight is to embed queries in the same latent space and design neural logical operators that simulate the corresponding logical operations. This gives rise to the first multi-hop reasoning framework that can handle all first-order logic queries on large-scale KGs. We demonstrate the effectiveness and robustness of our approach to noise and missing relations in query answering, as well as a direct application of the approach to answering natural language questions over KGs. Finally, I will introduce SrKG, an efficient codebase and the first framework that scales these algorithms to KGs with over 90 million nodes.
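To make the core idea concrete, here is a minimal sketch of query embedding with neural logical operators. This is an illustration of the general technique only, not the speaker's actual implementation: the toy graph, the translation-style projection, and the mean-based intersection are all assumptions for the example (real systems learn these operators end-to-end).

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8

# Toy embeddings: entities and relations share one latent space.
# In a trained system these vectors come from KG embedding learning.
entity_emb = {name: rng.normal(size=DIM) for name in ["Turing", "UK", "London"]}
relation_emb = {name: rng.normal(size=DIM) for name in ["born_in", "capital_of"]}

def project(query_vec, relation):
    """Projection operator: follow one relation hop in latent space.
    Here a simple translation; learned neural operators are used in practice."""
    return query_vec + relation_emb[relation]

def intersect(*query_vecs):
    """Intersection (conjunction) operator: a permutation-invariant
    aggregate of the input query embeddings (mean, for this sketch)."""
    return np.mean(query_vecs, axis=0)

def answer(query_vec, k=1):
    """Decode by nearest-neighbor search: rank all entities by distance
    to the final query embedding and return the top-k candidates."""
    ranked = sorted(entity_emb,
                    key=lambda e: np.linalg.norm(entity_emb[e] - query_vec))
    return ranked[:k]

# Two-hop conjunctive query: places where "Turing" was born, intersected
# with capitals of the "UK" -- answered purely in embedding space, so it
# tolerates edges that are missing from the observed graph.
q1 = project(entity_emb["Turing"], "born_in")
q2 = project(entity_emb["UK"], "capital_of")
q = intersect(q1, q2)
print(answer(q))
```

Because the query never touches the symbolic graph at answer time, missing or noisy edges degrade the ranking gracefully instead of making the query unanswerable, which is the robustness property the abstract refers to.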

Speaker bio

Hongyu Ren is a fourth-year CS Ph.D. student at Stanford, advised by Prof. Jure Leskovec. His research interests lie at the intersection of graph representation learning and neural-symbolic reasoning over structured data. His recent work includes learning knowledge representations and advancing multi-hop reasoning on large-scale knowledge graphs. His research is supported by the Masason Foundation Fellowship and the Apple PhD Fellowship.
