23 - LLM theory | Mike Erlihson
Description
What are transformers, why is it so expensive to train a Transformer-based model, and what will the architecture of future LLMs look like?
More Episodes
Advanced RAG techniques like RAPTOR and using Clues
Published 11/24/24
Cursor, RepoAgent, Aider, and some more coding agents - overview and interesting architectural concepts
Published 10/19/24