Key takeaways
VC investors have already shifted their focus to categories in which startups can carve out AI chip market share. In 2021 and 2022, AI/ML VC deal value for inference-focused chips became significantly larger than for training-focused chips, breaking with the historical trend.
Both the PC and automotive AI chip markets are growing at over 30% annually, faster than the data center AI chip market; at this pace they will surpass the data center market's size by 2025.
Edge inferencing is likely to be dominated by existing vendors, all of which are investing heavily in supporting transformers and LLMs. So, what opportunities exist for new entrants? First, automotive partnerships; second, supplying IP or chiplets to one of the SoC vendors; and third, creating customized chips for intelligent edge devices that can absorb the cost.
PC and automotive AI chip markets
AI computing remains a major growth driver for the semiconductor industry, and at $43.6 billion in 2022, the market is large enough to support large private companies. The AI semiconductor market is also split between companies in China and those outside China, owing to the current geopolitical situation.
Nvidia is clearly the leader in the market for training chips, but that makes up only about 10% to 20% of the demand for AI chips. Inference chips run trained models and respond to user queries. This segment is much bigger and quite fragmented; not even Nvidia has a lock on it. Techspot estimates that the market for AI silicon will comprise about 15% for training, 45% for data center inference, and 40% for edge inference. The serviceable market for foundation model training will likely remain too small to support large companies, so acquisition offers for training-focused startups are likely to be relatively low. Where is the opportunity?
Data center, automotive, and PC together account for 90% of the AI chip market, excluding smartphones and smartwatches (to avoid the data being skewed by Apple and Samsung). In the data center segment, however, six vendors hold 99% of the market share; that market is saturated.
Both the PC and automotive AI semiconductor markets are growing at over 30% annually, faster than the data center AI semiconductor market; at this pace they will surpass the data center market's size by 2025.
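To make the growth claim concrete, the sketch below works through the compound-growth crossover arithmetic. The starting sizes and growth rates are purely hypothetical (none of these figures come from the report); it only illustrates how a segment growing at over 30% annually can overtake a larger but slower-growing one within a few years.

```python
# Minimal sketch with hypothetical figures (not from the report): how a faster-growing
# market segment overtakes a larger, slower-growing one under compound growth.

def years_to_surpass(challenger, challenger_growth, incumbent, incumbent_growth):
    """Count whole years of compounding until the challenger market exceeds the incumbent."""
    years = 0
    while challenger <= incumbent:
        challenger *= 1 + challenger_growth
        incumbent *= 1 + incumbent_growth
        years += 1
    return years

# Illustrative 2022 sizes (USD billions) and annual growth rates, chosen for the example only.
print(years_to_surpass(challenger=10.0, challenger_growth=0.35,
                       incumbent=20.0, incumbent_growth=0.10))  # -> 4, i.e. around 2026
```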
Inferencing at the Edge
Over the past two years, year-over-year VC funding for AI chip startups outside of China declined 69.0%, and VC investors have shifted their focus to categories in which startups can carve out market share. In 2021 and 2022, AI/ML VC deal value for inference-focused chips became significantly larger than for training-focused chips, breaking with the historical trend. Edge computing demand is also driving more commercial partnerships for inference-focused chips than for cloud training chips.
Custom chips and startups can outperform the chip giant on specific inference tasks, which will become crucial as large language models are rolled out from cloud data centers to customer environments, i.e. inferencing at the edge. The term 'edge' refers to any device in the hands of an end user (phones, PCs, cameras, robots, industrial systems, and cars). These chips are likely to be bundled into a System on a Chip (SoC) that executes all the functions of such devices.
What opportunities exist for new entrants?
Edge inferencing is likely to be dominated by existing vendors of traditional silicon, all of which are investing heavily in supporting transformers and LLMs. So, what opportunities exist for new entrants?
- Supply IP or chiplets to one of the SoC vendors. This approach has the advantage of relatively low capital requirements; let your customer handle payments to TSMC. There are plenty of potential customers aiming to build SoCs.
- Find new edge devices that could benefit from a tailored solution. Shift focus from phones and laptops to cameras, robots, drones, industrial systems, etc. Some of these devices are extremely cheap, however, and thus cannot accommodate chips with high average selling prices (ASPs). A few years ago, there were many pitches from companies looking to do low-power AI on cameras and drones; very few have survived. But edge computing has become more prevalent with the trend toward "smart everything", computing platforms now extend into wearables such as mixed reality headsets, and technology advancements keep opening new possibilities.
- Pursue automotive partnerships. This market is still highly fragmented, but the opportunity is substantial. In Q2 2022, edge AI chip startup Hailo announced a partnership with leading automotive chipmaker Renesas for self-driving applications.
As the world goes through a major wave of electrification for decarbonization and of automated optimization of energy usage, edge AI chips with the right upstream and downstream partnerships are promising opportunities for investors and startups. The financial downturn may encourage M&A for startups that align with the product needs of incumbents; historical examples include Annapurna Labs' $370.0 million exit to Amazon and Habana Labs' $1.7 billion exit to Intel.
References:
PitchBook, "Inferring the Future of AI Chips."