AI Summary
- Jack Morris focuses on an information-theoretic understanding of LLMs, including embeddings and latent-space representations, producing underrated research that he explains accessibly for a broad audience
- His research shows that embeddings reveal nearly as much as the full text; contextual document embeddings and the evolution of transformer architectures directly inform practical AI engineering decisions
- Stepping back from trendy agent and benchmark work to study foundational information theory provides a valuable counterweight to engineering-driven research cycles
Guests on This Episode
Jack Morris