TSMC founder Morris Chang recently revealed that customers have asked the company to build as many as ten new fabs for AI processors, a sign of just how sharply demand for AI silicon has risen. The surge is hardly surprising given the booming market for AI processors and the well-documented inability of market leader Nvidia to satisfy it. Companies such as OpenAI, unable to secure enough AI compute, are pressing existing suppliers for more processors and weighing building their own silicon.

The reported conversations with customers have included requests for anywhere from three to ten additional fabs. While that level of demand is unprecedented, Chang believes the actual need for AI processors lies somewhere between tens of thousands of wafers and tens of fabs' worth of capacity. Building such facilities also carries substantial financial implications: a large 3nm-capable fab can cost over $20 billion and takes years to construct.

TSMC's capital expenditure budget for 2024 sits between $28 billion and $32 billion, which makes constructing multiple Gigafabs every year impractical. Building ten leading-edge fabs would cost well over $200 billion before accounting for the supporting supply chain and infrastructure, far exceeding TSMC's investment capacity. That scale also puts into perspective rumors that OpenAI CEO Sam Altman is seeking to raise trillions of dollars to build a network of fabs operated by leading chipmakers, a plan that faces significant economic challenges.
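As a rough illustration of that mismatch, the back-of-the-envelope sketch below uses only the figures cited above (an assumed lower bound of roughly $20 billion per leading-edge fab and TSMC's stated $28 billion to $32 billion 2024 capex range); these are approximations for comparison, not TSMC guidance.

```python
# Back-of-the-envelope comparison using figures cited in the article.
# All values in billions of US dollars; treated as rough assumptions.

FAB_COST_B = 20                      # assumed lower bound per 3nm-class fab
NUM_FABS = 10                        # upper end of reported customer requests
CAPEX_LOW_B, CAPEX_HIGH_B = 28, 32   # TSMC's 2024 capital expenditure range

total_cost_b = FAB_COST_B * NUM_FABS             # at least ~$200B for fabs alone
years_best = total_cost_b / CAPEX_HIGH_B         # if the entire capex went to these fabs
years_worst = total_cost_b / CAPEX_LOW_B

print(f"Ten leading-edge fabs: at least ${total_cost_b}B")
print(f"Equivalent to {years_best:.1f}-{years_worst:.1f} years of TSMC's entire 2024 capex")
```

Even under these generous assumptions, the request would consume six to seven years of TSMC's entire annual capital budget, before any spending on the supporting supply chain.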

Because advanced AI and HPC processors have short lifecycles, foundries such as TSMC may hesitate to build excess leading-edge capacity just to meet today's AI demand, since idle capacity could translate into significant losses for the foundry business later on. An increase in output is therefore likely, but within rational limits rather than through ten new leading-edge fabs dedicated to AI chips in the near future.