Cisco has launched a network chip for AI supercomputers, which will compete head-on with products from Broadcom and Marvell.
The new chip belongs to the Cisco Silicon One series, and five of the six largest cloud computing providers are already testing it, though Cisco did not name them. In the US market, AWS, Microsoft Azure, and Google Cloud dominate cloud computing.
AI applications are becoming increasingly popular, with ChatGPT leading the way. ChatGPT runs on networks of specialized GPU chips, and the faster those chips can communicate with one another, the better the system performs.
Cisco is a major supplier of network equipment, and its Ethernet switches are well established in the market. Its new-generation G200 and G202 switch chips double the performance of the previous generation and can connect up to 32,000 GPUs together.
Cisco researcher Rakesh Chopra said, "The G200 and G202 will be the most powerful network chips on the market, providing support for AI/ML workloads and helping to build the most energy-efficient networks."
According to Cisco, the new chips require 40% fewer switches for AI and machine learning tasks, reduce latency, and are more energy efficient.
In April, Broadcom launched its Jericho3-AI chip, which can likewise connect up to 32,000 GPUs.