In this episode, Françoise von Trapp speaks with Chee Ping Lee of Lam Research about the critical role of high bandwidth memory (HBM) in generative AI, emphasizing its high bandwidth and compact design.
HBM has received a lot of attention as one of the first technologies to implement 2.5D and 3D stacking. Lee explains how HBM uses advanced packaging technologies such as through-silicon vias (TSVs) and microbumps to achieve high memory capacity and performance, and how Lam Research's solutions contribute to HBM's success.
Listen to the full episode to learn more.
Contact Chee Ping Lee on LinkedIn
Learn more about why HBM is a critical enabler for generative AI in this blog post.
Lam Research
Become a sustaining member!
Like what you hear? Follow us on LinkedIn and Twitter.
Interested in reaching a qualified audience of microelectronics industry decision-makers? Invest in host-read advertisements, and promote your company in upcoming episodes. Contact Françoise von Trapp to learn more.
Interested in becoming a sponsor of the 3D InCites Podcast? Check out our 2024 Media Kit. Learn more about the 3D InCites Community and how you can become more involved.