Google has unveiled a new AI chip technology called TurboQuant, designed to accelerate AI inference tasks. The technology focuses on improving the efficiency of running large language models by using advanced numerical formats and computational methods.
Industry analysts note that a key feature of such AI-specific hardware is its potential to reduce the system's dependency on external high-bandwidth memory (HBM). This memory is a critical and expensive component currently supplied by companies like Micron, Samsung, and SK Hynix.
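Google has not published TurboQuant's internals, but the link between numerical formats and memory bandwidth is straightforward: storing model weights in fewer bits means proportionally less data must move through HBM per inference step. The sketch below is purely illustrative (the function names and the symmetric int8 scheme are assumptions, not Google's actual method) and shows the 4x reduction in weight storage when going from float32 to int8.

```python
import numpy as np

def quantize_int8(x: np.ndarray) -> tuple[np.ndarray, float]:
    """Symmetric int8 quantization: map floats into [-127, 127]."""
    scale = float(np.abs(x).max()) / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Approximate reconstruction of the original floats."""
    return q.astype(np.float32) * scale

# Illustrative weight tensor, standing in for one layer of a model.
rng = np.random.default_rng(0)
weights = rng.standard_normal(1024).astype(np.float32)

q, scale = quantize_int8(weights)
print(weights.nbytes, "->", q.nbytes)  # 4096 -> 1024 bytes: 4x less memory traffic
```

Roughly a quarter of the bytes now need to cross the memory bus for the same weights, at the cost of a small, bounded rounding error per value, which is the basic trade-off behind reduced-precision inference hardware.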
If widely adopted, Google's in-house advancements could slow the growth of HBM demand from major cloud providers, directly affecting memory chip manufacturers. However, the broader market for AI memory remains robust, and Micron and its peers continue to innovate and ramp up production to meet industry-wide needs.
The long-term impact on Micron is uncertain and will depend on the adoption rate of Google's proprietary technology versus the continued industry-wide reliance on standardized, high-performance memory solutions for AI training and inference.