MEX Markets: Outlook for the ASIC Chip Market


The recent surge in Broadcom's stock, which soared nearly 25% and pushed its market capitalization past the remarkable threshold of $1 trillion, highlights the robust competition in artificial intelligence (AI) chips. The landscape is undergoing a significant transformation as tech giants innovate and compete to dominate this pivotal sector. Amazon's latest AI chips have demonstrated a remarkable cost-performance ratio, with ASIC-based instances outperforming traditional GPU-based counterparts. Apple's rumored collaboration with Broadcom to develop its own AI chip, potentially ready for production by 2026, further underscores the growing interest in custom ASIC solutions among major tech players. As the AI technology race intensifies, Broadcom is well positioned to secure a more prominent role in this burgeoning market.

Recently, Google has also made significant waves in the AI chip landscape

Just a few days ago, it introduced its most powerful AI model yet, Gemini 2.0, effectively "cutting in line" ahead of OpenAI. The backbone of this formidable model is Google's proprietary AI chip, the Trillium TPU, which falls under the category of application-specific integrated circuit (ASIC) technology. Coinciding with the model's launch, Google rolled out the Trillium TPU for broader market availability. The chip is particularly strong at training-intensive large language models and Mixture of Experts (MoE) frameworks, with performance that far surpasses competing products, and it offers enhanced cost-effectiveness for AI training and inference tasks. This presents a compelling case for enterprises and research institutions seeking efficient, budget-friendly AI solutions.


At the beginning of this month, Amazon Web Services (AWS) stepped aggressively into the AI chip race, unveiling the Trn2 UltraServer and the Amazon EC2 Trn2 instance

These products, based on ASIC technology, represent a significant leap in cost-performance metrics. For instance, the Amazon EC2 Trn2 instance integrates 16 Trainium2 chips, delivering an impressive computing capacity of up to 20.8 PFLOPS. Compared with mainstream GPU-based EC2 instances currently available, the cost-performance advantage is striking, with savings estimated at 30% to 40%. Such enhancements amplify AWS's competitiveness in the AI computing segment, positioning it as a more attractive option for clients embarking on complex AI projects, from large-scale deep learning training to real-time AI inference, all achievable within a controlled budget.
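As a back-of-envelope check on the figures above, the per-chip compute can be derived from the instance totals; the "compute per dollar" conversion below assumes one common reading of the cited savings (same work at 30-40% lower cost), which the article does not spell out:

```python
# Back-of-envelope arithmetic on the EC2 Trn2 figures cited in the article.
# The per-chip value and the per-dollar conversion are derived, not quoted.

TOTAL_PFLOPS = 20.8  # aggregate compute of one EC2 Trn2 instance (from the article)
NUM_CHIPS = 16       # Trainium2 chips per instance (from the article)

per_chip_pflops = TOTAL_PFLOPS / NUM_CHIPS
print(f"Per-chip compute: {per_chip_pflops:.2f} PFLOPS")  # 1.30 PFLOPS

# If "30-40% savings" means the same workload costs 30-40% less than on a
# GPU baseline, the implied compute-per-dollar advantage is:
for savings in (0.30, 0.40):
    relative = 1 / (1 - savings)
    print(f"{savings:.0%} cost savings ≈ {relative:.2f}x compute per dollar")
```

Under that assumption, a 30-40% cost reduction corresponds to roughly 1.4x to 1.7x more compute per dollar versus the GPU baseline.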


The ASIC chip supplier space is witnessing a renaissance, with companies like Broadcom and Marvell hitting their performance stride

On December 12, Broadcom released a remarkable earnings report reflecting explosive growth in AI revenue, which surged 220% during fiscal year 2024 to eclipse $12.2 billion. Thanks to favorable market trends, Broadcom anticipates continued growth in its AI product sales, with estimates suggesting a year-on-year increase of up to 65% in the first fiscal quarter of 2025. Additionally, Broadcom is reportedly collaborating with three significant clients on AI chip development. The company envisions an AI chip market of $15 billion to $20 billion for the upcoming year, demonstrating impressive confidence in its technological capabilities and market positioning.


Meanwhile, Marvell recently reported earnings for its fiscal 2025 third quarter, which ended on November 2, 2024. Revenue rose 7% year over year and 19% quarter over quarter to $1.516 billion


Both its quarterly performance and its outlook for the fourth quarter surpassed analysts' expectations. CEO Matt Murphy attributed this success to the sales of new custom AI chips, particularly for Amazon and other data center enterprises.

Market analysts at MEXMARKETS predict a significant shift in demand toward inference computation, estimating that it could outpace training computation by as much as 4.5 times. A separate report from Barclays projects rapid growth in AI inference demand, suggesting it may account for over 70% of total computation needs in general AI applications. Currently, NVIDIA's GPUs dominate the inference market with around 80% market share; however, as more major tech companies roll out custom ASIC chips, forecasts indicate this figure may decline to about 50% by 2028. This growing preference for ASICs over traditional GPU architectures reflects a shift in market dynamics as manufacturers seek to optimize performance and cut costs in AI-related projects.
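A quick sanity check shows the two forecasts above are consistent with each other, assuming the 4.5x figure is the ratio of inference demand to training demand:

```python
# Sanity check on the demand forecasts cited in the article.
# Assumption (not stated in the source): "outpace by 4.5 times" means
# inference demand = 4.5 x training demand, so its share of the total is:
ratio = 4.5
inference_share = ratio / (ratio + 1)
print(f"Implied inference share of total compute: {inference_share:.1%}")  # 81.8%

# Barclays' separate projection of "over 70%" is compatible with this figure.
print(f"Exceeds Barclays' 70% floor: {inference_share > 0.70}")  # True
```

The same paragraph's market-share numbers imply that custom ASICs and other non-NVIDIA silicon would grow from roughly 20% to about 50% of inference compute by 2028.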