Exa Laboratories recently launched!

Launch YC: Exa Laboratories - 27.6x more efficient chips for AI 🔥🚀

"The next paradigm of sustainable computing"

TL;DR: Exa Laboratories is building reconfigurable chips for AI that are up to 27.6x more efficient and powerful than H100 GPUs. This could save data centers hundreds of millions to billions of dollars in annual energy costs.

Founded by Elias Almqvist & Prithvi Raj

Meet the Team

Meet Elias and Prithvi from Exa. They're developing reconfigurable chips for AI that are up to 27.6x* more efficient and performant than modern H100 GPUs.

*: Read their litepaper!

CEO, Elias Almqvist (right): Self-taught engineer who studied computer science and computer engineering at Chalmers University of Technology (dropped out to found Exa, btw). Previously worked in the embedded software space and on various aerospace projects at university.

CTO, Prithvi Raj (left): Holds an MEng from the world-leading Computational Stats & ML Lab at Cambridge, where he fell in love with scientific machine learning, a field that demands bespoke neural network architectures and extreme hardware efficiency. He also interned at Microsoft as a software engineer.

The problems!

The AI industry faces critical challenges threatening its sustainable growth:

  1. Unsustainable Energy Consumption: Modern GPUs consume 600-1000 W per unit, creating massive scaling issues for data centers. Large data centers face energy costs in the hundreds of millions to potentially billions each year. GPU power draw seems to be increasing with each new release, while compute per area has remained the same for the past 5 years.
  2. Exponential Compute Demand: With AI advancements, computational power demand is rapidly increasing. Unchecked, this trend could lead to an energy crisis, impeding AI progress and costing data centers billions of dollars.
  3. Hardware Limitations: Current fixed architectures constrain AI innovation. They lack the versatility to efficiently support diverse AI architectures and custom neural network designs crucial for solving real-world problems.

The solution.

Exa's polymorphic computing technology addresses these challenges:

  • Reconfigures for each AI model architecture, maximizing efficiency and versatility.
  • Supports diverse approaches, from transformers and GPTs to novel AI architectures, such as the new Kolmogorov-Arnold Networks (KANs); see the toy sketch after this list.
  • Early simulations indicate potential efficiency gains of up to 27.6x over H100 GPUs.
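
For readers who haven't met KANs, here is a toy sketch of a KAN-style layer. This is our illustration under stated assumptions, not Exa's code or hardware: in a KAN, every edge carries its own learnable univariate function instead of a single scalar weight, which is exactly the kind of bespoke structure the "Hardware Limitations" point above refers to. The basis choice (Gaussian bumps) and sizes below are arbitrary.

```python
# Toy KAN-style layer (illustrative only; not Exa's design or code).
# Each edge (i, j) applies a learnable univariate function phi_ij to x_i,
# and outputs are y_j = sum_i phi_ij(x_i). Here each phi_ij is a learnable
# combination of a small fixed Gaussian basis.
import numpy as np

class ToyKANLayer:
    def __init__(self, in_dim, out_dim, n_basis=8, seed=0):
        rng = np.random.default_rng(seed)
        # One coefficient vector per edge: phi_ij(x) = sum_k coef[i, j, k] * b_k(x)
        self.coef = rng.normal(scale=0.1, size=(in_dim, out_dim, n_basis))
        # Shared fixed basis: Gaussian bumps spread over [-1, 1]
        self.centers = np.linspace(-1.0, 1.0, n_basis)
        self.width = 2.0 / n_basis

    def forward(self, x):  # x: (batch, in_dim)
        # Evaluate every basis function at every input value: (batch, in_dim, n_basis)
        basis = np.exp(-((x[..., None] - self.centers) / self.width) ** 2)
        # Apply each edge's learned univariate function and sum over inputs
        return np.einsum("bik,iok->bo", basis, self.coef)

layer = ToyKANLayer(in_dim=4, out_dim=3)
x = np.random.default_rng(1).uniform(-1, 1, size=(2, 4))
print(layer.forward(x).shape)  # (2, 3)
```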

This technology could save data centers hundreds of millions to billions of dollars in annual energy costs, significantly reducing operational expenses and environmental impact.
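
To put that claim in rough numbers, here is an illustrative back-of-envelope sketch. The fleet size, PUE, and electricity price below are our assumptions, not Exa's; only the ~700 W per GPU figure (the post cites 600-1,000 W per unit) and the 27.6x factor come from the post and litepaper.

```python
# Back-of-envelope GPU fleet energy cost (illustrative assumptions only).
GPUS           = 500_000   # hypothetical fleet size for a large AI operator
WATTS_PER_GPU  = 700       # H100-class draw; the post cites 600-1,000 W per unit
PUE            = 1.3       # assumed data-center power usage effectiveness
USD_PER_KWH    = 0.10      # assumed industrial electricity price
HOURS_PER_YEAR = 24 * 365

annual_kwh  = GPUS * WATTS_PER_GPU * PUE / 1_000 * HOURS_PER_YEAR
annual_cost = annual_kwh * USD_PER_KWH
print(f"Annual energy cost at ~700 W/GPU:       ${annual_cost / 1e6:,.0f}M")

# If the same workload ran 27.6x more efficiently, the bill shrinks proportionally.
print(f"Annual energy cost at 27.6x efficiency: ${annual_cost / 27.6 / 1e6:,.0f}M")
```

The totals scale linearly with fleet size, which is roughly where the "hundreds of millions to billions" range comes from across today's largest deployments.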

Image Credits: Exa Laboratories

For a somewhat deeper technical dive, refer to Exa's litepaper!

Learn More

🌐 Visit exalaboratories.com to learn more.

📖 Read Exa's litepaper! All feedback welcome!

🤝 Introduce the founders to anyone in the scientific machine learning space and/or anyone conducting research in AI, particularly those with very "cursed model architectures."

📧 Get Exa in contact with any data center, AI research organization, or GPU cloud provider (e.g., AWS, OpenAI, Anthropic, DeepMind, Lambda).

🙌 Give the team intros to semiconductor industry professionals, particularly those interested in bringing chip manufacturing back to the US!

📝 Feel free to reach the founders via email; they would love to hear your feedback and answer your questions!

👣 Follow Exa Laboratories on LinkedIn.

Posted October 11, 2024 in the Launch category