Cisco Systems on Tuesday launched networking chips for AI supercomputers that will compete with offerings from Broadcom and Marvell Technology.
Chips from its SiliconOne series are being tested by five of the six major cloud providers, Cisco said, without naming the firms. Key cloud players include Amazon Web Services, Microsoft Azure and Google Cloud, which together dominate the market for cloud computing, according to BofA Global Research.
The growing popularity of AI applications such as ChatGPT, which is powered by a network of specialized chips called graphics processing units (GPUs), has made the speed at which these individual chips communicate extremely important.
Cisco is a major supplier of networking equipment including ethernet switches, which connect devices such as computers, laptops, routers, servers and printers to a local area network.
It said the latest generation of its ethernet switches, called G200 and G202, have double the performance compared with the previous generation and can connect up to 32,000 GPUs together.
“G200 & G202 are going to be probably the most highly effective networking chips out there fueling AI/ML workloads enabling probably the most power-efficient community,” Cisco fellow and previously principal engineer Rakesh Chopra stated.
Cisco said the chips would help carry out AI and machine learning tasks with 40% fewer switches and lower latency, while being more power efficient.
In April, Broadcom announced its Jericho3-AI chip, which can connect up to 32,000 GPU chips together.