Large Language Models and Knowledge Graphs: Merging Flexibility and Structure

We discuss how to infuse Large Language Models (LLMs) with Knowledge Graphs (KGs)! This is a very exciting approach, as we can combine the flexibility and generalisability of LLMs with the structure and reliability of KGs, and it is a first step towards neurosymbolic architectures!

I will also go through a LangChain implementation of LLMs with knowledge graphs as inputs, demonstrate some of the limitations currently faced, and show how we can better prompt engineer KG usage with LLMs using my very own StrictJSON Framework.
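
For reference, here is a minimal sketch of this kind of Graph QA setup, assuming the GraphQAChain API from the LangChain versions around the time of this talk (the triplets and model below are just placeholders):

from langchain.chains import GraphQAChain
from langchain.graphs.networkx_graph import NetworkxEntityGraph, KnowledgeTriple
from langchain.llms import OpenAI

# Build a small knowledge graph by hand from (subject, predicate, object) triplets
graph = NetworkxEntityGraph()
graph.add_triple(KnowledgeTriple("Apple", "makes", "MacBook Pro"))
graph.add_triple(KnowledgeTriple("MacBook Pro", "was announced in", "2006"))

# The chain looks up entities from the question in the graph,
# then passes the matching triplets to the LLM as context
chain = GraphQAChain.from_llm(OpenAI(temperature=0), graph=graph, verbose=True)
print(chain.run("When was the MacBook Pro announced?"))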

~~~~~~~~~~~~~

LLMs as Graph Neural Networks / Embeddings

~~~~~~~~~~~~~

0:00 Introduction
1:55 Pros and Cons of LLMs and Knowledge Graphs (KGs)
4:55 Retrieval Augmented Generation (RAG)
8:10 Problems with LLMs and RAG
17:40 Basics of KG
26:09 Hierarchy in KG
31:13 KGs can be structurally parsed
33:17 KG can represent environmental transitions
33:58 KG as tool/memory for LLM
39:16 3 approaches to integrate KG and LLMs
40:21 Approach 1: KG-augmented LLMs
59:05 Approach 2: LLM-augmented KG
1:05:37 Approach 3: LLMs and KG two-way interaction
1:10:16 LangChain Graph QA Example
1:16:35 StrictJSON Framework Graph QA Example
1:23:00 Discussion

~~~~~~~~~~~~~

AI and ML enthusiast. Likes to think about the essence behind AI breakthroughs and explain it in a simple and relatable way. I am also an avid game creator.

Comments

Thanks for this, this talk was excellent. I've been looking to combine LLMs with KGs and have very similar intuitions when it comes to using the same embedding space for the KG as for the LLM. I really like your framing of having the right abstraction spaces to solve the problem at hand. Having written countless prompts, as well as looking at how humans have solved problems over the years, it seems to me that fostering the right context (abstraction space) is vital when trying to solve a new problem. Einstein's discoveries were, in part, possible due to the context of his life experience, which gave him intuitions for solving a certain type of problem. The cool thing with LLMs is that we can bootload intuition at will, allowing us to swap out abstraction spaces until we find a combination that gives us the right context to solve a problem. Great work!

TheHoinoel

Knowledge-packed video and excellent teaching skills.

snehotoshbanerjee

You stretched my mind, thank you for taking the time to share.

AaronEden

You're the first person I've heard mentioning this concept of context-dependent embeddings. I started tinkering with the same idea back in December of last year, but never had a name for it. I was doing some self-reflection and thought about how some of my own behaviors and thoughts were sometimes contradictory, depending on my emotions and such. If I could make a certain perspective of mine a 'node', its embedding would very likely change given different contexts.

leonlysak

While I also appreciate the flexibility of knowledge graphs (KGs) in easily representing relationships, I agree with you that KGs are not necessarily the best or most effective way to represent intelligence. I will stay tuned to your work. I hope to publish on this in the near future. Thanks for the presentation.

chrisogonas

Amazing work. Many thanks for your efforts in sharing your knowledge.

AyaAya-fhwx

1:31:30 "Mix and match" - I'd be interested in understanding how the AI might decide which space(s) to interrogate based on the input/prompt.

AndrewNeeson-vpti

1:13:25 "When was the MacNCheese Pro announced?" --> Fail!
But I wonder: what if the source text were stored alongside the triplet? That way the graph could be used to efficiently identify the relevant records, while the original language would be retained.
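
A minimal sketch of that idea, with a hypothetical Triplet record and lookup helper (none of this is from the video, and the data is made up): each triplet keeps the sentence it was extracted from, so matching happens on the structured fields but the raw text is what gets handed back to the LLM.

from dataclasses import dataclass

@dataclass
class Triplet:
    subject: str
    predicate: str
    object_: str
    source_text: str  # the original sentence the triplet was extracted from

kg = [
    Triplet("MacNCheese Pro", "announced_in", "June 2023",
            "The MacNCheese Pro was announced in June 2023."),
]

def lookup(entity: str) -> list[str]:
    # Match on the structured fields, but return the raw source sentences
    return [t.source_text for t in kg
            if entity.lower() in (t.subject.lower(), t.object_.lower())]

print(lookup("MacNCheese Pro"))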

AndrewNeeson-vpti

Valuable content! Thank you for sharing :)

polarbear

Where are these livestreams done? Is this a college course? I've actually never been this hyped for a presentation-driven video; you've done a really good job walking through the papers :)

jimhrelb

Forgive me if my question is not correct: are we using LLMs to build a knowledge graph here?

rajathslr

At around 23:00 it is said that current knowledge graphs are 'too restrictive' because of context, but the way I see it, they are too broad. You still want all that knowledge available even if it's not currently relevant; we just want to filter it down... right?

judgeomega

Update: StrictJSON is now a Python package.
Simply "pip install strictjson"
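
A minimal usage sketch (the output_format fields below are just an illustration, and depending on the version you may also need to pass in your own llm function):

from strictjson import strict_json

# Ask the LLM to extract KG triplets, with the output guaranteed to parse as JSON
res = strict_json(
    system_prompt="You are a knowledge graph builder",
    user_prompt="Apple announced the MacBook Pro in 2006",
    output_format={"Triplets": "List of (subject, predicate, object) triplets"},
)
print(res["Triplets"])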

johntanchongmin

Great info. I have a small question.

The KG parser you are talking about expects a KG as input; however, if we have a huge dataset, won't constructing and sending such a KG cost more?

chakradharkasturi

Suppose we have a large PDF about some person and we want to rate him based on the skills described in the PDF. Those kinds of questions rely not just on a subpart of the text but on the whole text. How can we approach this problem with KGs and vector embeddings? We can't call the LangChain summarization API (chain_type: stuff) every time because it's costly, so how can we solve this?

nikhilshingadiya

Curious how far along you are 😅 I re-listened to this, and now I suspect we are probably working towards the same solution.

mulderbm

"maybe the text itself is way more expressive than the knowledge graph"

ouch 1:16:10

in the age of LLM, it seems that any representation that deviates from the source text is a serious gamble

bastabey

What about Graphormers? No need for different embedding spaces: a language token sequence is just a specialization of a graph.

JReuben

Does most of this apply to decentralized knowledge graphs?

AEVMU

Thanks for the content. It really lays out the challenges for the present and future. He somehow forgets about OpenAI, though, and casts Google as the good guy, recalling that "don't be evil" motto... but is there something else behind that old motto?

rodrigosantosalvarez