#490 – State of AI in 2026: LLMs, Coding, Scaling Laws, China, Agents, GPUs, AGI

01.02.2026
Listen to the episode on your favorite platforms:
  • Apple Podcasts
  • YouTube
  • Spotify
  • Castbox
  • Pocket Casts
  • Stitcher
  • iHeart
  • PlayerFM
  • Overcast
  • Castro
  • RadioPublic

Nathan Lambert and Sebastian Raschka are machine learning researchers, engineers, and educators. Nathan is the post-training lead at the Allen Institute for AI (Ai2) and the author of The RLHF Book. Sebastian Raschka is the author of Build a Large Language Model (From Scratch) and Build a Reasoning Model (From Scratch).
Thank you for listening ❤ Check out our sponsors: https://lexfridman.com/sponsors/ep490-sc
See below for timestamps, transcript, and to give feedback, submit questions, contact Lex, etc.

Transcript:
https://lexfridman.com/ai-sota-2026-transcript

CONTACT LEX:
Feedback – give feedback to Lex: https://lexfridman.com/survey
AMA – submit questions, videos or call-in: https://lexfridman.com/ama
Hiring – join our team: https://lexfridman.com/hiring
Other – other ways to get in touch: https://lexfridman.com/contact

SPONSORS:
To support this podcast, check out our sponsors & get discounts:
Box: Intelligent content management platform.
Go to https://box.com/ai
Quo: Phone system (calls, texts, contacts) for businesses.
Go to https://quo.com/lex
UPLIFT Desk: Standing desks and office ergonomics.
Go to https://upliftdesk.com/lex
Fin: AI agent for customer service.
Go to https://fin.ai/lex
Shopify: Sell stuff online.
Go to https://shopify.com/lex
CodeRabbit: AI-powered code reviews.
Go to https://coderabbit.ai/lex
LMNT: Zero-sugar electrolyte drink mix.
Go to https://drinkLMNT.com/lex
Perplexity: AI-powered answer engine.
Go to https://perplexity.ai/

OUTLINE:
() – Introduction
() – Sponsors, Comments, and Reflections
() – China vs US: Who wins the AI race?
() – ChatGPT vs Claude vs Gemini vs Grok: Who is winning?
() – Best AI for coding
() – Open Source vs Closed Source LLMs
() – Transformers: Evolution of LLMs since 2019
() – AI Scaling Laws: Are they dead or still holding?
() – How AI is trained: Pre-training, Mid-training, and Post-training
() – Post-training explained: Exciting new research directions in LLMs
() – Advice for beginners on how to get into AI development & research
() – Work culture in AI (72+ hour weeks)
() – Silicon Valley bubble
() – Text diffusion models and other new research directions
() – Tool use
() – Continual learning
() – Long context
() – Robotics
() – Timeline to AGI
() – Will AI replace programmers?
() – Is the dream of AGI dying?
() – How will AI make money?
() – Big acquisitions in 2026
() – Future of OpenAI, Anthropic, Google DeepMind, xAI, Meta
() – Manhattan Project for AI
() – Future of NVIDIA, GPUs, and AI compute clusters
() – Future of human civilization