Are LLMs Still Lost in the Middle?
Published at: 11/1/2024
Categories: ai, rag, llm, opensource
Author: Daniel Davis
A few days ago, I talked about some of the inconsistencies I've seen when varying LLM temperature for knowledge extraction tasks.
What does LLM Temperature Actually Mean? (Daniel Davis for TrustGraph, Oct 28) #ai #aiops #rag #opensource
I decided to revisit this topic and talk through the behavior I'm seeing. Not only did Gemini-1.5-Flash-002 not disappoint, producing yet more unexpected results, but I also saw strong evidence that long context windows still ignore data in the middle. Below is the notebook I used during the video.
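In outline, the probe is simple: hide a single known fact at different depths in a long run of filler text, then ask the model to retrieve it. Here's a minimal sketch of that idea, assuming the google-generativeai Python client; the filler text, needle fact, and prompt wording are illustrative, not the notebook's actual contents.

```python
# Minimal "lost in the middle" probe: hide one fact at different depths in a
# long filler context and check whether the model can retrieve it.
# Assumes the google-generativeai client and an API key in GEMINI_API_KEY.
# The filler, needle, and question are illustrative, not the original notebook.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GEMINI_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-flash-002")

NEEDLE = "The maintenance code for the auxiliary pump is 7342."
FILLER = ("The quick brown fox jumps over the lazy dog. " * 2000).strip()
QUESTION = "What is the maintenance code for the auxiliary pump? Answer with the number only."

def probe(depth: float) -> str:
    """Insert the needle at a relative depth (0.0 = start, 1.0 = end) and query."""
    cut = int(len(FILLER) * depth)
    context = FILLER[:cut] + "\n" + NEEDLE + "\n" + FILLER[cut:]
    prompt = f"{context}\n\n{QUESTION}"
    response = model.generate_content(
        prompt,
        generation_config={"temperature": 0.0},
    )
    return response.text.strip()

for depth in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"depth={depth:.2f} -> {probe(depth)}")
```

If the middle depths come back wrong or empty while the start and end succeed, that's the "lost in the middle" pattern in miniature.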