PodcastIntel
Latent Space: The AI Engineer Podcast

Information Theory for Language Models: Jack Morris

Jul 2, 2025 · 1h 18m
AI Summary
  • Jack Morris works on information-theoretic understanding of LLMs, including embeddings and latent-space representations, and explains this often-underrated research in accessible terms
  • His research shows that embeddings reveal nearly as much information as the full text; work on contextual document embeddings and the evolution of transformer architectures directly informs practical AI engineering decisions
  • Stepping away from trendy agent and benchmark work to study foundational information theory offers a valuable counterweight to engineering-driven research cycles

Guests on This Episode

Jack Morris
1 podcast appearance
