GraphRAG App Project using Neo4j, Langchain, GPT-4o, and Streamlit

In this tutorial, I walk you through the development of Graphy v1, a real-time GraphRAG (Graph Retrieval-Augmented Generation) application. Using LangChain, Neo4j, and OpenAI's GPT-4o and text-embedding-ada-002 models, I'll show you how to extract knowledge from documents and enable natural language querying over a graph database.

What You'll Learn:
1. Setting up a modular app where users can input their own credentials.
2. Using LangChain's LLMGraphTransformer to convert documents into graph data (see the first sketch below).
3. Integrating with Neo4j to store and query graph data.
4. Implementing natural language querying using OpenAI's GPT-4o (see the second sketch below).
5. Enhancing the app's UI with Streamlit, including adding a sidebar, logo, and interactive elements.
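
The extraction side of steps 1-3 can be wired up roughly as follows. This is a minimal sketch rather than the exact code from the video: the file path, credential values, and model names are placeholders, and it assumes the langchain, langchain-openai, langchain-experimental, langchain-community, pypdf, and neo4j packages are installed.

```python
from langchain_community.document_loaders import PyPDFLoader
from langchain_community.graphs import Neo4jGraph
from langchain_experimental.graph_transformers import LLMGraphTransformer
from langchain_openai import ChatOpenAI

# User-supplied credentials (in the app these come from Streamlit inputs) -- placeholders
NEO4J_URL = "bolt://localhost:7687"
NEO4J_USERNAME = "neo4j"
NEO4J_PASSWORD = "your-password"
OPENAI_API_KEY = "sk-..."

# Connect to the Neo4j instance
graph = Neo4jGraph(url=NEO4J_URL, username=NEO4J_USERNAME, password=NEO4J_PASSWORD)

# Load the uploaded PDF into LangChain documents
docs = PyPDFLoader("uploaded.pdf").load()

# Let the LLM extract entities and relationships as graph documents
llm = ChatOpenAI(model="gpt-4o", api_key=OPENAI_API_KEY, temperature=0)
transformer = LLMGraphTransformer(llm=llm)
graph_documents = transformer.convert_to_graph_documents(docs)

# Persist the extracted nodes and relationships in Neo4j,
# keeping a link back to the source document chunks
graph.add_graph_documents(graph_documents, baseEntityLabel=True, include_source=True)
```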

By the end of this video, you'll have a functional app that allows users to upload PDF documents, extract their content into a Neo4j graph database, and interact with the data using natural language queries.
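
For the querying and UI side (steps 4-5), a hedged sketch of how the pieces typically fit together in Streamlit is shown below; the widget labels and layout are placeholders, not the video's exact interface, and it assumes the graph schema created by the previous snippet.

```python
import streamlit as st
from langchain.chains import GraphCypherQAChain
from langchain_community.graphs import Neo4jGraph
from langchain_openai import ChatOpenAI

st.title("Graphy v1")

# Sidebar for user-supplied credentials (placeholder layout)
with st.sidebar:
    neo4j_url = st.text_input("Neo4j URL", value="bolt://localhost:7687")
    neo4j_user = st.text_input("Neo4j username", value="neo4j")
    neo4j_password = st.text_input("Neo4j password", type="password")
    openai_key = st.text_input("OpenAI API key", type="password")

question = st.text_input("Ask a question about your documents")

if question and neo4j_password and openai_key:
    graph = Neo4jGraph(url=neo4j_url, username=neo4j_user, password=neo4j_password)
    llm = ChatOpenAI(model="gpt-4o", api_key=openai_key, temperature=0)

    # The chain generates a Cypher query from the question, runs it against Neo4j,
    # and phrases the result in natural language
    chain = GraphCypherQAChain.from_llm(
        llm=llm,
        graph=graph,
        verbose=True,
        allow_dangerous_requests=True,  # recent LangChain versions require this opt-in to execute generated Cypher
    )
    result = chain.invoke({"query": question})
    st.write(result["result"])
```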

If you found this video helpful, please like, comment, and subscribe to my channel for more tutorials like this! Your support helps me create more content to help you in your development journey.

Join this channel to get access to perks:

To further support the channel, you can contribute via the following methods:

Bitcoin Address: 32zhmo5T9jvu8gJDGW3LTuKBM1KPMHoCsW

#graphrag #rag #ai
Comments

Great! Can't wait for the Ollama v2 version :D

artur

Can you please make a video on hybrid RAG (graph + vector)? It would be really useful.

DARK-fsrz

Can you please do the same thing using local models from Ollama?
It would be really useful.

vamsivamsi

Please do it using Gemini or any other open-source LLM.

zanefalcao

Can you please make a video about contextual retrieval RAG with n8n or Flowise?

liviuspinu

Hi, I want to know whether this can work if our company's Neo4j database is on another server that we access through a remote desktop connection, specifically a jumphost. We have a lot of restrictions when working on that server.

Aspirant