AI for Atoms: How Periodic Labs is Revolutionizing Materials Engineering with Co-Founder Liam Fedus

Key Takeaways

  • Scientific AI requires a closed-loop interface with the physical world - progress in materials science is bottlenecked by the inability of purely digital models to conduct experiments and learn from real-world feedback loops.

    Science ultimately isn't sitting in a room thinking really hard. You have to conduct experiments. You have to learn from them. You have to interface with reality.

    Liam Fedus
  • Domain-specific labs should leverage existing LLM priors - Periodic Labs focuses its R&D exclusively on materials science while utilizing third-party foundation models for coding and general reasoning to accelerate development.

    Periodic spends zero effort on improving coding models. We're incredibly impressed by Codex, Claude Code, and so that's been a huge accelerator for the company.

    Liam Fedus
  • Physical experimentation provides the ground truth missing from literature - because scientific papers often contain noisy or contradictory data spanning multiple orders of magnitude, physical labs are required to ground ML models in reality.

    One of the engineers on our team was looking at a reported material property. And it was just sort of extracted values from literature. It was really interesting to see the reported value spanned many orders of magnitude. And so you train an ML system on that and it's like, well, the best you can do is model this distribution, but you're no closer to like a ground truth.

    Liam Fedus
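
The last takeaway can be made concrete with a minimal sketch. All numbers below are synthetic: we fabricate "reported" values for a single material property, drawn log-uniformly across four orders of magnitude to mimic the contradictory literature data described in the episode, and show that a model fit to these reports alone can only characterize their spread, not recover a ground-truth value.

```python
import math
import random
import statistics

random.seed(0)

# Hypothetical literature-extracted values for one material property,
# scattered across roughly four orders of magnitude (synthetic data).
reported = [10 ** random.uniform(-2, 2) for _ in range(200)]

# Fit the simplest possible "model": the distribution of reports in
# log space. This is the best any model can do without lab ground truth.
log_vals = [math.log10(v) for v in reported]
mu = statistics.mean(log_vals)
sigma = statistics.stdev(log_vals)

print(f"geometric-mean estimate: {10 ** mu:.3g}")
print(f"multiplicative spread:   ~10^{sigma:.2f} per std. dev.")
print(f"max/min reported ratio:  {max(reported) / min(reported):.3g}")
```

The point of the sketch: the fitted "estimate" carries an order-of-magnitude-scale uncertainty inherited from the literature itself, which is exactly why Periodic runs physical experiments to pin down the true value rather than training only on extracted reports.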

Episode Description

What happens when you apply the scaling laws of large language models to the physical world of atoms? Elad Gil sits down with Liam Fedus, co-founder at Periodic Labs, which is pioneering an AI foundation lab for atoms. Liam discusses how he pivoted from dark matter physics research to the front lines of artificial intelligence, including stints at Google Brain and working on ChatGPT at OpenAI. He talks about how Periodic is connecting massive language models to the physical world to overcome data bottlenecks in materials science. Liam also shares how they use language models as an orchestration layer operating alongside specialized neural nets to run closed-loop physical experiments. They also explore the future of AGI and ASI, as well as the role of robotics in lab automation.

Sign up for new podcasts every week. Email feedback to show@no-priors.com. Follow us on Twitter: @NoPriorsPod | @Saranormous | @EladGil | @LiamFedus | @periodiclabs

Chapters:

  • 00:00 – Cold Open
  • 00:05 – Liam Fedus Introduction
  • 00:39 – Liam’s Background at Google Brain, OpenAI
  • 05:14 – From ChatGPT to Materials and Atoms
  • 06:34 – Training Data in the Physical World
  • 09:52 – Generalization Across Domains
  • 11:31 – Models as an Orchestration Layer
  • 12:48 – Commercialization and Business Model
  • 16:10 – How Periodic’s Success May Shape the Future
  • 17:45 – Multidisciplinary Scaling
  • 19:41 – Capital and Compute
  • 21:12 – Hiring at Periodic
  • 21:44 – Thoughts on AGI and ASI
  • 23:30 – Timeline for Machine-Directed Self-Improvement
  • 25:39 – Automation and Data Generation
  • 27:59 – Why Liam is Excited About the Future of Robotics
  • 29:25 – Conclusion

