Qualcomm has announced its latest AI chips, which are designed to scale up to a purpose-built, rack-level AI inference solution; interestingly, they rely on mobile-class memory onboard.
Qualcomm's New AI Chips Take a 'Daring' Pivot Away From HBM To Target Efficient Inferencing Workloads
Qualcomm has come a long way from being a mobile-focused firm, and in recent years, the San Diego chipmaker has expanded into new segments, including consumer computing and AI infrastructure. Now, the firm has announced its newest AI200 and AI250 chip solutions, which are reportedly designed for rack-scale configurations. This not only marks the entry of a new player into a segment dominated by NVIDIA and AMD, but it also sees Qualcomm take a unique approach by opting for mobile-focused LPDDR memory instead of HBM.