After NVIDIA launched its Rubin AI GPUs last month, we decided to interview Larry Yang, the chief product officer at Phononic. We wanted to learn about the new chips' cooling requirements, given that energy constraints are closely tied to the AI rollout.
Larry is an industry veteran with more than 30 years of experience. He has previously worked at Google, IBM, Microsoft, and Cisco. Our conversation revolved around the cooling requirements of NVIDIA's and other AI chips, and it also covered AI ASICs, commonly known as custom AI processors.
Larry outlined that high-bandwidth memory (HBM) chips are one reason why AI chips require significant cooling. He also explained how data centers are cooled and the role Phononic's unique thermoelectric coolers (TECs) can play for AI companies.

WCCFTECH News
