The top ten artificial intelligence chip manufacturing companies in 2024

2024-06-13

This article will introduce top AI chip suppliers to help businesses choose the right chips.

The number of parameters in neural networks (i.e., their width and depth) and overall model size keep growing. To build better deep learning models and more powerful artificial intelligence applications, organizations need greater computational power and memory bandwidth.
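To make the scale concrete, here is a rough back-of-the-envelope sketch in Python (the parameter counts and precisions are illustrative assumptions, not figures from this article) showing why memory capacity and bandwidth quickly become the bottleneck:

```python
# Back-of-the-envelope sketch: memory needed just to hold model weights
# at different numeric precisions (illustrative numbers only).
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1}

def weight_memory_gb(num_params: float, dtype: str) -> float:
    """Gigabytes required to store the weights alone, excluding activations."""
    return num_params * BYTES_PER_PARAM[dtype] / 1e9

for params in (7e9, 70e9, 175e9):
    for dtype in ("fp32", "fp16"):
        print(f"{params / 1e9:.0f}B params @ {dtype}: "
              f"{weight_memory_gb(params, dtype):.0f} GB")
```

A 70-billion-parameter model already needs on the order of 140 GB just for fp16 weights, which is why a single general-purpose processor is not enough.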

Powerful general-purpose chips such as CPUs cannot efficiently run highly parallel deep learning models. Demand for AI chips built for parallel computing is therefore growing, and McKinsey expects this trend to continue.

However, even Intel, with its many world-class engineers and strong research background, needs roughly three years to develop its own AI chips. For most companies, therefore, purchasing chips from these suppliers or renting capacity from cloud GPU providers is the only practical way to develop powerful deep learning models.

Who are the leading artificial intelligence chip manufacturers?

1. NVIDIA

NVIDIA has been producing graphics processing units (GPUs) for the gaming sector since the 1990s. Both the PlayStation 3 and the Xbox use NVIDIA graphics hardware. The company also makes AI chips such as Volta, Xavier, and Tesla. Riding the generative AI boom, NVIDIA delivered outstanding results in 2023, with its market capitalization passing the trillion-dollar mark and cementing its position as the leader in the GPU and AI hardware market.

NVIDIA's chipsets address business problems across many industries. Xavier, for instance, underpins autonomous driving solutions, while Volta targets data centers. The A100 and H100 are NVIDIA's successful flagship AI chips, designed for AI training and inference in data centers. NVIDIA has since released the H200, B200, and GB200 chips; HGX servers such as the HGX H200 and HGX B200, which combine eight of these chips; the NVL series; and the GB200 SuperPOD, which links even more chips into large clusters.

Cloud GPUs

For AI workloads in the cloud, NVIDIA holds an almost monopolistic position, with most cloud vendors offering only NVIDIA GPUs as cloud GPUs. NVIDIA has also launched DGX Cloud, giving enterprises direct access to cloud GPU infrastructure.
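Renting a cloud GPU instance typically means the familiar CUDA software stack is already in place. As a minimal sketch (assuming a CUDA-enabled build of PyTorch is installed on the instance), you can confirm which GPU the provider exposes before launching a training job:

```python
# Minimal sketch: check which NVIDIA GPU a cloud instance exposes.
# Assumes a CUDA-enabled build of PyTorch is installed.
import torch

if torch.cuda.is_available():
    name = torch.cuda.get_device_name(0)
    mem_gb = torch.cuda.get_device_properties(0).total_memory / 1e9
    print(f"GPU: {name}, {mem_gb:.0f} GB of memory")
else:
    print("No CUDA GPU visible; falling back to CPU.")
```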

2. AMD

AMD is a chip manufacturer with products spanning CPUs, GPUs, and AI accelerators. Its Alveo U50 data center accelerator card, for example, has 50 billion transistors; the accelerator can process datasets of 10 million embeddings and execute graph algorithms in milliseconds.

In June 2023, AMD launched the MI300 for AI training workloads, competing with NVIDIA for market share in this segment. The rise of generative AI, exemplified by ChatGPT, increased demand so quickly that NVIDIA's AI hardware became difficult to procure, leading startups, research institutions, enterprises, and tech giants to adopt AMD hardware in 2023.

AMD also collaborates with machine learning companies such as Hugging Face, enabling data scientists to use its hardware more effectively.
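One practical consequence is that ROCm builds of PyTorch expose AMD GPUs through the same torch.cuda interface used for NVIDIA hardware, so typical Hugging Face code needs no changes. Here is a minimal sketch (assuming a ROCm build of PyTorch and the transformers library are installed; the model name is only an illustrative example):

```python
# Minimal sketch: run a Hugging Face model on an AMD GPU via ROCm.
# ROCm builds of PyTorch report AMD GPUs through the torch.cuda namespace.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained("gpt2")          # illustrative model
model = AutoModelForCausalLM.from_pretrained("gpt2").to(device)

inputs = tokenizer("AI accelerators are", return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```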

The software ecosystem is crucial, since hardware performance depends heavily on software optimization. AMD and NVIDIA, for instance, have publicly disagreed over H100 versus MI300 benchmarks; the dispute centers on the software packages and floating-point formats used. According to the latest benchmarks, for inference of a 70B LLM, the MI300 appears comparable to or better than the H100.

3. Intel

Intel is the largest manufacturer in the CPU market and has a long history of semiconductor development. In 2017, Intel became the world's first AI chip company to break the $10 billion sales mark.

Intel's Xeon CPUs are suitable for a variety of tasks, including data center processing, and have contributed to the company's commercial success.

Gaudi 3 is Intel's latest AI accelerator processor. Since its public release in April 2024, only limited benchmarks of its performance are available.

4. Alphabet/Google Cloud Platform

Google Cloud TPU is a purpose-built machine learning accelerator chip that powers Google products such as Translate, Photos, Search, Assistant, and Gmail, and it is also available to customers through Google Cloud. Google first released the TPU in 2016; the latest generation is Trillium, the sixth-generation TPU.
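On a Cloud TPU VM, frameworks such as JAX target the TPU cores transparently through XLA. A minimal sketch (assuming a Cloud TPU VM where JAX with TPU support is installed) might look like this:

```python
# Minimal sketch: run a JIT-compiled matrix multiply on whatever
# accelerator JAX sees (TPU cores on a Cloud TPU VM).
import jax
import jax.numpy as jnp

print(jax.devices())  # lists TPU cores when running on a Cloud TPU VM

@jax.jit
def matmul(a, b):
    return jnp.dot(a, b)

a = jnp.ones((1024, 1024))
b = jnp.ones((1024, 1024))
print(matmul(a, b).shape)
```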

The Edge TPU is another accelerator chip from Alphabet/Google. Smaller than a one-cent coin, it is designed for edge devices such as smartphones, tablets, and IoT devices.

5. AWS

AWS produces Trainium chips for model training and Inferentia chips for inference. Although AWS is the leader in the public cloud market, it began building its own chips after Google.

6. IBM

IBM unveiled its latest deep learning chip, the Artificial Intelligence Unit (AIU), in 2022. IBM is considering using these chips to power its watsonx generative artificial intelligence platform.

The AIU builds on the IBM Telum processor, which provides the AI processing capabilities of IBM Z mainframe servers. At launch, the Telum processor's flagship use case was fraud detection.

IBM has also shown, with its NorthPole processor prototype, that combining compute and memory on the same chip can improve efficiency.

7. Alibaba

Alibaba produces inference chips such as the Hanguang 800.

What are the leading artificial intelligence chip startups?

Here we also introduce some startups in the AI chip industry whose names you may hear frequently in the near future. Although these companies were founded only recently, they have already raised hundreds of millions of dollars each.

8. SambaNova Systems

SambaNova Systems was founded in 2017 with the goal of developing high-performance, high-precision hardware and software systems for large-scale generative AI workloads. The company has developed the SN40L chip and raised over $1.1 billion in funding.

It is worth noting that SambaNova Systems also leases its platforms to enterprises. SambaNova Systems' AI platform-as-a-service approach makes its systems more accessible and encourages the reuse of hardware in a circular economy.

9. Cerebras Systems

Cerebras Systems was founded in 2015. In April 2021, the company announced its new AI chip, the Cerebras WSE-2, which has 850,000 cores and 2.6 trillion transistors. The WSE-2 is a significant improvement over the WSE-1, which had 1.2 trillion transistors and 400,000 processing cores.

Cerebras has partnered with several pharmaceutical companies, including AstraZeneca and GlaxoSmithKline, because the WSE-1's technology accelerates genetic and genomic research and shortens the time required for drug discovery.

10. Groq

Groq was founded by former Google employees. The company has introduced the LPU (Language Processing Unit), a new AI chip architecture designed to make it easier for companies to adopt its systems. The startup has raised approximately $350 million and has produced its first products, including the GroqChip processor and the GroqCard accelerator.

The company focuses on LLM inference and has published benchmark results for Llama-2 70B.
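Groq serves these models through a hosted API that follows the familiar chat-completions pattern. A minimal sketch (assuming the groq Python SDK is installed, a GROQ_API_KEY environment variable is set, and the Llama-2 70B model identifier shown is available; the identifier is an assumption, not taken from this article):

```python
# Minimal sketch: query a Llama-2 70B model hosted on Groq's LPU cloud.
# Assumes `pip install groq` and GROQ_API_KEY in the environment.
from groq import Groq

client = Groq()  # reads GROQ_API_KEY from the environment
response = client.chat.completions.create(
    model="llama2-70b-4096",  # assumed model identifier
    messages=[{"role": "user", "content": "Explain LPU inference in one sentence."}],
)
print(response.choices[0].message.content)
```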

The company stated that in the first quarter of 2024, 70,000 developers registered on its cloud platform and built 19,000 new applications.

On March 1, 2022, Groq acquired Maxeler, which provides high-performance computing (HPC) solutions for financial services.
