Description
"We all know that LLMs hallucinate, and that RAG can help by providing current, relevant information to the model for generative tasks.
But can we do better than plain vector retrieval? A knowledge graph can represent data (and reality) at high fidelity and make this rich context available based on the user's questions. But how do you turn your text data into graph data structures?
This is where the language skills of LLMs can help: extracting entities and relationships from text, which you can then correlate with sources, cluster into communities, and navigate while answering questions.
In this talk we will dive into Microsoft Research's GraphRAG approach and run the indexing and search live with Neo4j and LangChain."
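The extract-correlate-cluster-navigate flow described above can be sketched in plain Python. This is a toy illustration only: the triples, chunk IDs, and helper names are invented, a stub stands in for the LLM extraction prompt, and simple connected components stand in for GraphRAG's Leiden community detection.

```python
from collections import defaultdict

# Hypothetical stand-in for the LLM extraction step: in a real pipeline an
# LLM is prompted to emit (subject, relation, object) triples per text chunk.
def extract_triples(chunk: str) -> list[tuple[str, str, str]]:
    sample = {
        "doc1": [("GraphRAG", "DEVELOPED_BY", "Microsoft Research"),
                 ("GraphRAG", "USES", "knowledge graph")],
        "doc2": [("Neo4j", "STORES", "knowledge graph")],
    }
    return sample.get(chunk, [])

# Build an in-memory graph; each edge remembers its source chunk so answers
# can be correlated back to the documents they came from.
graph = defaultdict(list)
for chunk in ("doc1", "doc2"):
    for subj, rel, obj in extract_triples(chunk):
        graph[subj].append((rel, obj, chunk))
        graph[obj].append((rel + "_INV", subj, chunk))  # inverse edge for navigation

# Toy community detection via connected components (GraphRAG uses Leiden).
def communities(graph):
    seen, comms = set(), []
    for node in list(graph):
        if node in seen:
            continue
        stack, comm = [node], set()
        while stack:
            n = stack.pop()
            if n in comm:
                continue
            comm.add(n)
            seen.add(n)
            stack.extend(obj for _, obj, _ in graph[n])
        comms.append(comm)
    return comms

print(communities(graph))
```

Because both chunks mention "knowledge graph", the toy graph collapses into a single community that a retriever could then navigate when answering a question.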