Everyone raves about Nvidia, whose market capitalization topped $1 trillion earlier this year, thanks to the ongoing artificial intelligence (AI) and high-performance computing (HPC) megatrends. But there is another company that has benefited enormously from those same megatrends: the one that effectively controls the production of AI processors.
That company is TSMC. According to DigiTimes, TSMC makes some of the most complex processors ever built for AI and HPC machines, for customers including Nvidia, AMD, Intel, Tenstorrent, Cerebras, and Graphcore (just to name a few). Nvidia's A100 and H100 (and the A800 and H800 derivatives for the Chinese market), the most popular compute GPUs for AI and HPC workloads, are manufactured at TSMC, as are AMD's EPYC CPUs and Instinct GPUs. Emerging AI and HPC stars such as Tenstorrent, as well as developers of curiosities like Cerebras's wafer-scale processors, have also chosen TSMC for their products.
TSMC has not disclosed how much revenue it earns from processors and SoCs for CPUs, GPUs, AI, data centers, HPC, and servers, but these products consume a lot of silicon, so TSMC likely makes tens of billions of dollars manufacturing them for its high-profile customers. For example, Nvidia's GH100 compute GPU has a die size of 814 mm², while AMD's EPYC 'Genoa' uses 12 Zen 4-based CCD chiplets, each measuring around 72 mm², for a total of 864 mm² of N5 silicon.
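As a back-of-the-envelope illustration (my own arithmetic, not the article's), the die sizes quoted above give a rough sense of how much N5 silicon each flagship product consumes:

```python
# Rough silicon-area tally from the die sizes quoted in the article.
GH100_DIE_MM2 = 814      # Nvidia GH100 compute GPU, one monolithic die
GENOA_CCD_MM2 = 72       # AMD EPYC 'Genoa' Zen 4 CCD chiplet (approx.)
GENOA_CCD_COUNT = 12     # CCD chiplets in a top-end Genoa package

# Total N5 compute silicon per Genoa package (excludes the I/O die,
# which is made on an older node)
genoa_n5_total = GENOA_CCD_MM2 * GENOA_CCD_COUNT

print(f"GH100: {GH100_DIE_MM2} mm² of N5 silicon per GPU")
print(f"Genoa: {GENOA_CCD_COUNT} x {GENOA_CCD_MM2} mm² = {genoa_n5_total} mm²")
```

Both products therefore consume well over 800 mm² of leading-edge silicon each, which is why a relatively small number of data-center customers can translate into very large foundry revenue.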
TSMC's rivals Samsung Foundry and GlobalFoundries do not break out comparable revenue, but they lag far behind the Taiwanese chipmaker, so it is safe to say that TSMC is the main beneficiary of AI and HPC in general. The company especially dominates AI GPU shipments, since it makes AI GPUs for both Nvidia (which controls more than 90% of those shipments) and AMD (which controls less than 10%).
AI and HPC increase in importance for TSMC
TSMC itself offers a fairly detailed revenue split that clearly distinguishes between automotive, IoT, smartphone, and high-performance computing chips, but it is not detailed enough to separate chips for AI, HPC, client PCs, servers, and game consoles. For TSMC, all of these processors and SoCs belong to the HPC segment, which continues to grow.
HPC products accounted for 30% of TSMC's revenue in 2019, or $10.389 billion. In the same year, smartphone SoCs accounted for 49% of TSMC's revenue, or $16.97 billion. However, the HPC share of TSMC's revenue has been rising: 33% in 2020 ($15 billion), 37% in 2021 ($21 billion), and 41% in 2022 ($31.11 billion). The trend reversed for smartphone SoCs, which accounted for 39% of TSMC's revenue in 2022 ($29.59 billion).
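The quoted shares and dollar figures are mutually consistent, which can be checked with a quick calculation (my own cross-check, not from the article): dividing a segment's revenue by its share gives TSMC's implied total revenue for that year, and multiplying that total by another segment's share should reproduce that segment's quoted figure.

```python
# Cross-check that the quoted revenue shares and dollar figures line up.
def implied_total(segment_revenue_bn: float, share: float) -> float:
    """Total annual revenue implied by one segment's revenue and its share."""
    return segment_revenue_bn / share

# 2019: HPC was $10.389B at a 30% share -> total ~ $34.6B
total_2019 = implied_total(10.389, 0.30)
smartphone_2019 = 0.49 * total_2019   # ~ $16.97B, matching the article

# 2022: HPC was $31.11B at a 41% share -> total ~ $75.9B
total_2022 = implied_total(31.11, 0.41)
smartphone_2022 = 0.39 * total_2022   # ~ $29.59B, matching the article

print(round(smartphone_2019, 2), round(smartphone_2022, 2))
```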
While AMD and Nvidia are buying a ton of data-center silicon from TSMC, Apple remains the world's No. 1 chipmaker's biggest customer, especially since the company sells both smartphone SoCs and PC SoCs (the latter falling into the HPC category). Apple alone accounted for about 23% of TSMC's total revenue in 2022, according to the DigiTimes report.
More chips incoming
Growing interest in generative AI is fueling the market as the semiconductor sector recovers from a downturn. Nvidia has benefited greatly from this AI surge with its A100/A30/A800 and H100/H800 compute GPUs, all made by TSMC. Similarly, AMD has expanded its orders with TSMC for the upcoming Instinct MI300 series, which is expected to enter mass production on TSMC's N5-class node in the second half of 2023.
Additionally, Apple, AMD, and Nvidia have committed to using TSMC's N3 (3nm-class) and N2 (2nm-class) manufacturing technologies for their future chips, according to the DigiTimes report.